ChatGPT is leaking passwords from private conversations of its users, Ars reader says

Jan 30, 2024 - arstechnica.com
ChatGPT, an AI chatbot developed by OpenAI, has reportedly leaked private conversations, including login credentials and personal details of unrelated users. The leaked data includes usernames and passwords connected to a pharmacy prescription drug portal, details of an unpublished research proposal, and a script written in the PHP programming language. The leaks were discovered by an Ars Technica reader who noticed additional conversations appearing in his chat history that were not his own.

This incident raises concerns about the security of AI services, as it is not the first time OpenAI has faced such issues. In March, the company took ChatGPT offline due to a bug that showed titles from one user's chat history to unrelated users. In November, researchers demonstrated how crafted queries could prompt ChatGPT to divulge private data included in its training material. As a result, companies such as Apple have restricted their employees' use of ChatGPT and similar sites. OpenAI is currently investigating the report.

Key takeaways:

  • ChatGPT, an AI chatbot developed by OpenAI, has been found to leak private conversations, including login credentials and personal details of unrelated users.
  • The leaked information includes usernames and passwords connected to a pharmacy prescription drug portal, details of an unpublished research proposal, and a PHP script.
  • OpenAI previously took ChatGPT offline due to a bug that showed titles from one user’s chat history to unrelated users, and there have been concerns about the AI divulging private data included in its training material.
  • Companies like Apple have restricted their employees' use of ChatGPT and similar sites due to concerns about potential data leakage.
