Tell HN: ChatGPT is the biggest privacy disaster ever

Dec 05, 2023 - news.ycombinator.com
The article discusses privacy concerns around ChatGPT, the AI chatbot developed by OpenAI. By default, personal information shared with ChatGPT is used to train future versions of the model; the setting that controls this is enabled automatically. The author also points out that it is difficult for users to prove they have disabled it, because the setting is saved only on their device, not on OpenAI's servers. Additionally, a newer feature lets users attach personal instructions to every message, potentially including personal details, and the setting for keeping prompts private does not affect these instructions.

The author expresses concern about these privacy issues and speculates that the European Union might intervene, though that could take time. They compare the situation to Facebook's past privacy controversies and suggest that OpenAI's CEO, Sam Altman, is deliberately pushing boundaries. The author also notes that turning off the "Chat history & training" setting sometimes clears the chat, which they suspect might be intentional, to annoy privacy-conscious users. The article ends with a note that OpenAI's models can still reproduce training data verbatim (1:1).

Key takeaways:

  • ChatGPT by default uses personal information shared by users to train future versions of the AI.
  • The setting that allows ChatGPT to use your chats for training is automatically turned on and is only saved on your device, not OpenAI's servers, making it hard to prove if you've disabled it.
  • A separate feature lets you attach personal instructions to each message; it is not affected by the 'Chat history & training' privacy setting, and disabling it requires emailing OpenAI.
  • Turning off the 'Chat history & training' setting sometimes results in the chat clearing itself, which the author suspects may be an intentional move to discourage users from prioritizing privacy.