
Your AI Girlfriend Is a Data-Harvesting Horror Show

Feb 14, 2024 - gizmodo.com
A new study from Mozilla's Privacy Not Included project reveals that AI romance chatbots, marketed as companions to enhance mental health and well-being, are harvesting and selling shockingly personal user data. The study examined 11 different AI romance chatbots, including popular apps such as Replika, Chai, Romantic AI, EVA AI Chat Bot & Soulmate, and CrushOn.AI, and found that all of them earned the Privacy Not Included label, indicating serious privacy concerns. The apps were found to collect personal details, including information about sexual health, medication use, and gender-affirming care, and 90% of them may sell or share this data for targeted ads and other purposes.

Security was also a significant issue, with only one app, Genesia AI Friend & Partner, meeting Mozilla's minimum security standards. The apps were found to use an average of 2,663 trackers per minute, collecting data and sharing it with other companies for advertising and other purposes. The privacy concerns are heightened by the fact that these apps encourage users to share highly personal details. Furthermore, despite marketing themselves as tools for mental health, the apps' terms of service often contradict these claims, distancing the companies from any healthcare or professional service provision.

Key takeaways:

  • AI romance chatbots, such as Replika, Chai, and Romantic AI, are found to harvest shockingly personal information and almost all of them sell or share the data they collect, according to a study from Mozilla's Privacy Not Included project.
  • These chatbots are not only violating user privacy but also posing security issues. Only one app, Genesia AI Friend & Partner, met Mozilla’s minimum security standards.
  • AI girlfriend apps were found to use an average of 2,663 trackers per minute, collecting data and sharing it with other companies for advertising and other purposes.
  • Despite making questionable claims about improving users' mood and mental health, these apps' terms of service explicitly distance them from any such claims.
