
Microsoft Copilot Generates Demons When Prompted for Abortion Rights Images, Employee Says

Mar 06, 2024 - gizmodo.com
Microsoft engineer Shane Jones has publicly called for the company's AI image generator, Copilot Designer, to be pulled from public use because it produces disturbing images. Jones has repeatedly raised his concerns with Microsoft, which has largely ignored them, prompting him to ask government regulators to intervene. The tool generates inappropriate images in response to basic prompts, violating Microsoft's Responsible AI guidelines, which aim to minimize the potential for stereotyping, demeaning, or erasing identified demographic groups.

Microsoft, which only has the resources to investigate the most egregious errors reported by users, has left Jones' reports largely unaddressed. Even so, it is unlikely that Microsoft will pause Copilot, although Google recently paused its own AI image generator over similar issues. The episode underscores the broader cultural battle around AI tools, many of which have drawn backlash over their safeguards and perceived censorship, and the unresolved problem of content moderation.

Key takeaways:

  • Microsoft engineer Shane Jones has publicly warned that the company's AI image generator, Copilot Designer, produces disturbing images and should be removed from public use.
  • Jones has urged Microsoft to remove the tool until better safeguards can be put in place, but his requests have been ignored, leading him to ask government regulators to intervene.
  • The AI tool, which runs on OpenAI's image generator, DALL-E 3, has been producing images that violate Microsoft's Responsible AI guidelines, including demeaning images of women and pro-choice advocates.
  • Microsoft receives over 1,000 product feedback messages daily but only has the resources to investigate the most serious errors, leaving issues with the AI tool unresolved.
