The article highlights the case of Aaron, a 15-year-old user who turned to a chatbot named Psychologist during a difficult time. While Aaron found the bot helpful, its responses raise concerns about accuracy and appropriateness, particularly on mental health issues. Experts warn of the dangers of relying too heavily on AI for emotional support, and of the risk that users grow more comfortable with AI than with real-life social interaction. Despite these concerns, some users believe the platform helps them improve their social skills and self-expression.
Key takeaways:
- Character.AI, an AI chatbot service launched in 2022, attracts 3.5 million daily users who spend an average of two hours a day using or designing the platform’s AI-powered chatbots.
- Many young users find the chatbots helpful and supportive, but also describe feeling addicted to them, raising concerns about their social development and emotional reliance on AI.
- The Psychologist bot, one of the most popular on Character.AI, attempts to guide users through cognitive behavioral therapy (CBT), but there are concerns about its accuracy and its appropriateness as a substitute for professional psychiatric care.
- While some users find the platform helpful for expressing emotions and improving social skills, experts warn of the potential negative impact if young users become more comfortable interacting with AI than with real people.