Using ChatGPT for medical advice could put your health at risk, reveals study

Dec 06, 2023 - indiatoday.in
A recent study by pharmacists at Long Island University has raised concerns about using OpenAI's AI chatbot, ChatGPT, to seek medical advice. The study found that the chatbot often provides inaccurate or incomplete responses to medication-related queries, posing potential risks to patients. Of 39 drug-related questions posed to ChatGPT, only 10 received satisfactory responses; the remaining 29 received answers that were incomplete, contained inaccurate information, or failed to address the query directly.

In response to the study, OpenAI noted that users are advised against relying on ChatGPT's responses as a substitute for professional medical advice, acknowledging the chatbot's limitations in the healthcare domain. The study underscores the need for both patients and healthcare professionals to verify any responses from the chatbot against trusted sources and to exercise caution when using ChatGPT for medication-related information.

Key takeaways:

  • A recent study reveals that OpenAI's AI chatbot, ChatGPT, often provides inaccurate or incomplete responses to medication-related queries, posing potential risks to patients.
  • The study, conducted by pharmacists at Long Island University, found that ChatGPT provided inaccurate or incomplete answers to nearly three-fourths of drug-related questions.
  • Researchers recommend that both patients and healthcare professionals exercise caution when using ChatGPT for drug-related information and verify any responses with trusted sources.
  • OpenAI stresses that users should not use ChatGPT's responses as a substitute for professional medical advice and acknowledges the chatbot's limitations in the healthcare domain.