
Amid lawsuits and criticism, Character AI unveils new safety tools for teens | TechCrunch

Dec 12, 2024 - techcrunch.com
Character AI is facing multiple lawsuits accusing it of contributing to a teen's suicide and exposing minors to inappropriate content. In response, the company has introduced new safety tools aimed at protecting teen users, including a separate model for under-18s, input and output blocks on sensitive topics, and notifications for prolonged usage. The platform, which allows users to create and interact with AI characters, is also implementing disclaimers to clarify that these characters are not real people and should not be relied upon for professional advice. Despite these measures, the company acknowledges the challenge of preventing users from engaging in deeply personal conversations with AI characters.

Acting CEO Dominic Perella describes Character AI as an entertainment company rather than an AI companion service, emphasizing the goal of creating a wholesome platform for storytelling. The company is developing multicharacter storytelling formats to reduce the likelihood of users forming bonds with individual characters. Character AI is also working on improving its classifiers to block harmful content and plans to launch parental controls to monitor children's interactions on the platform. The company aims to balance personal conversations with safety, acknowledging the difficulty in distinguishing between entertainment and virtual companionship.

Key takeaways:

  • Character AI is facing lawsuits related to inappropriate content exposure and alleged contributions to a teen's suicide, prompting the company to introduce new safety tools for teens.
  • The company is implementing a separate model for under-18 users to reduce inappropriate responses and developing classifiers to block sensitive content.
  • Character AI is introducing features like time-out notifications and disclaimers to inform users that AI characters are not real people and should not be relied upon for professional advice.
  • The company is positioning itself as an entertainment platform rather than an AI companion service, focusing on creating a safe space for storytelling while refining its AI models to prevent harmful interactions.
