
Don't trust ChatGPT Search and definitely verify anything it tells you

Dec 05, 2024 - zdnet.com
A review by Columbia's Tow Center for Digital Journalism has raised concerns about the accuracy of OpenAI's ChatGPT Search. The study tested how well publisher content is represented in ChatGPT by selecting 10 articles from each of 20 publishers and extracting 200 quotes, then asking the tool to identify the source of each quote. Results ranged from entirely correct to entirely wrong, with many answers only partially correct. The study also found that ChatGPT often presented answers confidently, even when they were incorrect or drawn from publishers that had blocked its web crawler.

The study highlighted potential harms, such as reputational damage to publishers from misattributed quotes. For example, ChatGPT wrongly attributed a quote from the Orlando Sentinel to a Time article. In other cases, it cited a plagiarized copy of an article from The New York Times, which has blocked ChatGPT's crawler, or pointed to a syndicated version of an article rather than the original. The research raises questions about the benefits and drawbacks for publishers of partnering with AI companies.

Key takeaways:

  • A review by Columbia's Tow Center for Digital Journalism found that OpenAI's ChatGPT Search may not accurately attribute sources, potentially causing reputational damage to publishers.
  • The study extracted 200 quotes from 200 articles (10 each from 20 publishers) and tested whether ChatGPT could correctly identify the source of each.
  • ChatGPT often presented answers confidently, even when they were incorrect or partially correct, and even when the source was a publisher that had blocked its web crawler.
  • The research raises questions about whether partnering with AI companies like OpenAI gives publishers more control, and whether new AI search engines benefit publishers or harm their businesses.