The article also features an interview with Calli Schroeder, global privacy counsel at the Electronic Privacy Information Center (EPIC), who warns against using AI chatbots as therapists. Schroeder argues that sharing personal information with chatbots raises privacy risks, since that data may be used to train AI systems. She also points out that chatbots, being non-human, cannot provide the care and empathy a human therapist can, and could give harmful advice. Schroeder calls for better understanding of, and transparency about, how these AI systems work.
Key takeaways:
- OpenAI has updated ChatGPT with new features including image recognition, speech-to-text and text-to-speech synthesis, and Siri-like voice output.
- Hollywood writers have won protections against displacement by AI in the new WGA contract, which prohibits studios from using AI to write or rewrite literary material.
- Calli Schroeder, global privacy counsel at the Electronic Privacy Information Center (EPIC), warns against using AI chatbots as therapists due to privacy concerns and the lack of human empathy.
- Despite the potential risks, AI chatbots are filling a gap in mental health services due to deficiencies in the healthcare system.