
Nvidia's Jensen Huang says AI hallucinations are solvable, artificial general intelligence is 5 years away | TechCrunch

Mar 19, 2024 - techcrunch.com
The article discusses the concept of Artificial General Intelligence (AGI), often referred to as "strong AI," "full AI," "human-level AI," or "general intelligent action." Unlike narrow AI, AGI is designed to perform a broad spectrum of cognitive tasks at or above human levels. Nvidia's CEO, Jensen Huang, addressed the topic at the company's annual GTC developer conference, stating that predicting the arrival of AGI depends on how it is defined. If AGI is defined by a set of tests on which a software program can outperform most people, Huang believes it could be achieved within five years.

The article also covers the issue of AI hallucinations, where AIs generate plausible but factually incorrect answers. Huang suggests this issue can be solved by ensuring AI systems research their answers thoroughly, a practice known as retrieval-augmented generation (RAG). For critical answers, such as health advice, Huang recommends checking multiple resources and known sources of truth. He also emphasizes that AI systems should have the option to admit when they do not know the answer to a question.
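The two practices described above, grounding answers in retrieved sources and admitting ignorance when retrieval fails, can be sketched in a few lines of Python. This is a minimal illustration only: it uses a toy keyword-overlap retriever over a hypothetical two-entry corpus, standing in for the vector stores and language models a real RAG system would use.

```python
# Toy sketch of retrieval-augmented generation (RAG).
# Assumptions: keyword overlap stands in for a real retriever; the
# corpus entries and source names are invented for illustration.

def retrieve(query, corpus, min_overlap=2):
    """Return (source, passage) pairs sharing at least `min_overlap` words with the query."""
    query_words = set(query.lower().split())
    hits = []
    for source, passage in corpus:
        overlap = len(query_words & set(passage.lower().split()))
        if overlap >= min_overlap:
            hits.append((overlap, source, passage))
    # Best-matching passages first.
    return [(src, psg) for _, src, psg in sorted(hits, reverse=True)]

def answer(query, corpus):
    """Ground the answer in a retrieved passage, or admit ignorance."""
    hits = retrieve(query, corpus)
    if not hits:
        return "I don't know."  # explicit refusal instead of hallucinating
    source, passage = hits[0]
    return f"{passage} [source: {source}]"

corpus = [
    ("who.int", "Adults should get at least 150 minutes of moderate exercise per week."),
    ("nasa.gov", "The James Webb telescope observes primarily in the infrared."),
]

print(answer("How many minutes of exercise should adults get per week?", corpus))
print(answer("What is the capital of Atlantis?", corpus))
```

The key design point Huang highlights is the fallback branch: when no trusted source supports an answer, the system returns "I don't know" rather than generating a plausible-sounding guess.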

Key takeaways:

  • Artificial General Intelligence (AGI) represents a significant future leap in the field of artificial intelligence, capable of performing a broad spectrum of cognitive tasks at or above human levels.
  • Nvidia CEO Jensen Huang believes that predicting when AGI will be achieved depends on how it is defined and measured.
  • Huang suggests that if AGI is defined as a software program that can perform specific tests better than most people, it could be achieved within five years.
  • Huang also addressed the issue of AI hallucinations, suggesting that they can be solved by ensuring that AI systems research their answers and cross-check them with multiple sources.
