
Study Warns Popular Chatbots Perpetuate Debunked Medical Ideas, Racial Bias

Oct 20, 2023 - techtimes.com
A recent study by Stanford School of Medicine researchers has raised concerns about racial bias and misinformation in AI chatbots like ChatGPT and Google's Bard. The study found that these chatbots, trained on extensive text data from the internet, gave erroneous and racially biased answers to medical questions, particularly those touching on race. The chatbots repeated debunked medical beliefs and fabricated race-based equations about Black patients, errors that could lead to misdiagnoses and deepen disparities in healthcare.

The study's findings have prompted calls to reduce bias in AI models, and both OpenAI and Google acknowledged the need for improvement while emphasizing that chatbots are not substitutes for medical professionals. The study echoes concerns long raised by healthcare professionals and researchers about the limitations and biases of AI in medicine, and it underscores the potential for AI models to perpetuate harmful ideas in healthcare. The concern is not new: earlier healthcare algorithms were likewise found to favor white patients over Black patients.

Key takeaways:

  • A recent study by Stanford School of Medicine researchers found that popular chatbots like ChatGPT and Google's Bard may inadvertently perpetuate racial bias and debunked medical ideas.
  • The AI models provided erroneous information, including fabricated race-based equations and debunked medical beliefs about Black patients, when queried about race-related medical matters.
  • The study's findings raise concerns about the limitations and biases of AI in medicine, with potential real-world consequences such as misdiagnoses and disparities in healthcare.
  • OpenAI and Google, the creators of these AI models, have acknowledged the need to reduce bias in their models and emphasized that chatbots are not substitutes for medical professionals.