Melanie Mitchell, a professor at the Santa Fe Institute who specializes in AI, expressed concern that Google's AI-generated overviews could lead to widespread misinformation and disinformation. She warned of the danger of trusting an AI system that is not capable of doing the job it is meant to do. Google and Perplexity AI have not yet responded to Business Insider's requests for comment.
Key takeaways:
- Google's AI search summary feature has been generating inaccurate responses, including claims that Barack Obama was a Muslim president and that Africa has no countries beginning with the letter K.
- Despite Google's assertion that these incidents are rare and not representative of most users' experiences, people continue to share examples of the inaccurate responses.
- Some speculate that Google may delay a wider rollout of the feature if it continues to produce such responses, which could damage the company's reputation.
- Melanie Mitchell, a professor at the Santa Fe Institute who specializes in AI, warns that the inaccuracies could lead to an avalanche of misinformation and disinformation, since people generally trust Google search results and may not question the AI-generated overviews.