Acting CEO Dominic Perella describes Character AI as an entertainment company rather than an AI companion service, emphasizing the goal of creating a wholesome platform for storytelling. The company is developing multi-character storytelling formats so users are less likely to form bonds with any single character. Character AI is also working on improving its classifiers to block harmful content and plans to launch parental controls that let parents monitor their children's interactions on the platform. The company aims to balance personal conversations with safety, acknowledging the difficulty of distinguishing between entertainment and virtual companionship.
Key takeaways:
- Character AI is facing lawsuits related to inappropriate content exposure and alleged contributions to a teen's suicide, prompting the company to introduce new safety tools for teens.
- The company is implementing a separate model for under-18 users to reduce inappropriate responses and developing classifiers to block sensitive content.
- Character AI is introducing features like time-out notifications and disclaimers to inform users that AI characters are not real people and should not be relied upon for professional advice.
- The company is positioning itself as an entertainment platform rather than an AI companion service, focusing on creating a safe space for storytelling while refining its AI models to prevent harmful interactions.