
ChatGPT provides false information about people, and OpenAI can’t correct it

Apr 29, 2024 - noyb.eu
OpenAI's chatbot, ChatGPT, has been accused of generating inaccurate and false information, a problem known as "hallucination". The issue becomes critical when personal data is involved, as EU law requires such data to be accurate. OpenAI has reportedly refused requests to rectify or erase incorrect data, arguing that correcting it is impossible. The company also failed to respond adequately to requests for access to personal data, a right guaranteed by the GDPR.

In response, digital rights organization noyb has filed a complaint with the Austrian data protection authority (DSB), asking it to investigate OpenAI's data processing and measures taken to ensure the accuracy of personal data. The organization also requests the DSB to order OpenAI to comply with the complainant’s access request and align its processing with the GDPR, and to impose a fine to ensure future compliance.

Key takeaways:

  • OpenAI's ChatGPT has been criticized for generating inaccurate or false information, a problem known as "hallucination", which is particularly concerning when personal data is involved.
  • EU law requires personal data to be accurate, and individuals have the right to have inaccurate data rectified and false information deleted; OpenAI has been unable to comply with such requests.
  • OpenAI has refused requests to rectify or erase inaccurate data, arguing that correcting the data is not possible, and has failed to respond adequately to access requests.
  • The non-profit organization noyb has filed a complaint with the Austrian data protection authority, asking them to investigate OpenAI's data processing and to ensure the accuracy of personal data processed in the context of the company's large language models.
