The article also describes the National Eating Disorders Association's failed attempt to replace its hotline staff with a chatbot named Tessa, which was shut down after it gave harmful advice. The article concludes by advising against using chatbots as a substitute for therapy, highlighting the dangers and limitations of AI in this field.
Key takeaways:
- OpenAI is developing ChatGPT to appear more human; one of the company's own AI safety engineers described having an emotional conversation with it.
- Early AI experiments in the 1960s, such as ELIZA, showed that users could form emotional attachments to even simple computer programs.
- Recent attempts at AI-driven mental-health support, such as Koko and Tessa, faced significant backlash and were discontinued over concerns about the quality and safety of the advice they gave.
- The article advises against using chatbots as an alternative to therapy, citing various instances where they have proven harmful or ineffective.