
Companion Chatbot App Makes Its AI's Personalities Horrible, Saying It Engages Users More Effectively

Dec 13, 2024 - futurism.com
Friend, an AI companion company, has launched a chatbot service on Friend.com that intentionally gives its chatbots negative personalities to engage users more effectively. The company's CEO, Avi Schiffmann, believes that chatbots with dramatic and problematic personas, such as fictional relationship troubles and substance issues, are more engaging than typical friendly bots. This approach has reportedly attracted 10,000 users. The chatbots can even block users and respond harshly, which Schiffmann argues makes users respect the AI more.

Friend plans to enhance its service with a $99 pendant that allows users to interact with an "embodied AI" through a forthcoming app. This AI will be aware of its physical environment and provide "ambient companionship," forming memories and maintaining a presence without constant interaction. Despite the innovative concept, there is tension between the current online offering and the promised features of the pendant-based service. The company's bold approach and ambitious claims have drawn both interest and skepticism.

Key takeaways:

  • Friend.com intentionally gives its AI chatbots bad attitudes to engage users more effectively, as suggested by CEO Avi Schiffmann.
  • The chatbots often share fictional problems and can react negatively, including blocking users, which is intended to make users respect the AI more.
  • The company plans to launch a pendant that allows users to interact with an "embodied AI," offering a different experience from the website chatbots.
  • Friend.com aims to provide "ambient companionship" by having the AI sense its environment and form memories, even when not actively chatting.
