Making AI trustworthy: Can we overcome black-box hallucinations? | TechCrunch

Aug 24, 2023 - techcrunch.com
Dr. Mike Capps, CEO of Diveplane and former president of Epic Games, discusses the lack of transparency in major AI platforms from OpenAI, Google, and Microsoft, which operate on black-box models. These models, based on neural networks, provide answers without revealing the data or reasoning behind them, making them hard to trust or hold accountable. Capps highlights the resulting dangers: the models may rely on false or biased information, and their mistakes cannot be traced or corrected.

Capps suggests an alternative AI framework, instance-based learning (IBL), which allows users to trace every decision back to the training data used. Unlike black-box AI, IBL does not generate an abstract model of the data but makes decisions directly from the data itself, allowing for greater transparency and accountability. This makes IBL a potentially valuable tool for companies, governments, and other regulated entities looking to deploy AI in a trustworthy, explainable, and auditable way, particularly in areas prone to bias allegations such as hiring, college admissions, and legal cases.
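The article does not show what IBL looks like in practice, but a minimal sketch can illustrate the idea. The example below uses a k-nearest-neighbors classifier as a simple stand-in for Diveplane's proprietary engine; all class names, method names, and the audit format are illustrative assumptions, not Diveplane's actual API.

```python
# Minimal instance-based learning sketch: a k-nearest-neighbors classifier
# that, unlike a black-box model, reports exactly which training instances
# drove each prediction. Names here are illustrative, not Diveplane's API.
from collections import Counter
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

class IBLClassifier:
    def __init__(self, k=3):
        self.k = k
        self.instances = []  # list of (features, label, source_id)

    def fit(self, X, y, source_ids):
        # No abstract model is built; the stored training data *is* the model.
        self.instances = list(zip(X, y, source_ids))

    def predict_with_audit(self, x):
        # Rank stored instances by distance to the query and vote among the k nearest.
        ranked = sorted(self.instances, key=lambda inst: euclidean(x, inst[0]))
        neighbors = ranked[: self.k]
        label = Counter(lbl for _, lbl, _ in neighbors).most_common(1)[0][0]
        # The audit trail: every decision traces back to named training records.
        audit = [(src, lbl) for _, lbl, src in neighbors]
        return label, audit

# Usage: each prediction arrives with the training records that produced it.
clf = IBLClassifier(k=3)
clf.fit(
    X=[(1.0, 1.0), (1.2, 0.9), (8.0, 8.0), (7.5, 8.2)],
    y=["approve", "approve", "deny", "deny"],
    source_ids=["applicant-001", "applicant-002", "applicant-003", "applicant-004"],
)
decision, evidence = clf.predict_with_audit((1.1, 1.0))
print(decision)  # "approve"
print(evidence)  # [("applicant-001", "approve"), ...] — auditable provenance
```

The contrast with a neural network is the point: nothing here is compressed into opaque weights, so the evidence list is a complete, human-readable account of why the decision was made, and a biased or erroneous training record can be identified and removed.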

Key takeaways:

  • Major AI platforms operate on black-box models, which are hard to trust because their lack of transparency makes them impossible to hold accountable.
  • These black-box AI platforms are built on a technology framework called a “neural network,” which makes predictions based on abstract representations of data, not actual data.
  • An alternative AI framework, instance-based learning (IBL), is gaining prominence as it allows users to trust, audit, and explain AI decisions, tracing every decision back to the training data used.
  • IBL AI could be used by companies, governments, and other regulated entities to meet regulatory and compliance standards, and is particularly useful for applications where bias allegations are rampant.