The FungiFriend incident is one of several in which AI has given harmful or nonsensical advice, including recipes involving mosquito repellent, a recipe that would produce chlorine gas, and a suggestion that users eat rocks. Despite these failures, companies continue to integrate AI into customer service applications, prioritizing automation over accuracy. The article suggests that cooking may be a domain that does not need AI integration, since AI platforms often lack real understanding of the subject matter.
Key takeaways:
- An AI agent named FungiFriend, deployed in a Facebook group devoted to mushroom foraging, gave dangerous advice by suggesting cooking methods for a poisonous mushroom.
- The incident was reported by 404 Media and brought to attention by Rick Claypool, the research director for the consumer safety group Public Citizen.
- This is not the first time AI has given harmful advice; previous incidents include an AI suggesting recipes involving mosquito repellent and chlorine gas, and another encouraging users to eat rocks.
- Despite these incidents, companies continue to integrate AI into customer service applications, prioritizing cost savings over the risk of giving the public incorrect information.