Microsoft Engineer Sickened by Images Its AI Produces

Mar 07, 2024 - futurism.com
A Microsoft AI engineer, Shane Jones, has warned the Federal Trade Commission (FTC) and Microsoft's board that the company's Copilot Designer AI image generator, previously known as Bing Image Creator, produces disturbing and unsafe imagery. Jones, who tests Microsoft's AI products for potential dangers, found that the tool was generating violent and illicit images, including images that reinforce harmful biases and conspiracy theories. Despite his raising these concerns internally, Microsoft reportedly failed to take action or investigate.

Jones has since escalated his concerns, publicly urging Microsoft to withdraw Copilot Designer from public use until better safeguards are in place. He also called for a change to the product's "E for everyone" rating, arguing that it is not safe for children. While Microsoft has stated it is committed to addressing employee concerns, Jones pointed to the lack of a mechanism for reporting harmful images and the absence of regulation limiting what AI companies can release.

Key takeaways:

  • Microsoft AI engineer Shane Jones has warned the Federal Trade Commission (FTC) and Microsoft's board that the company's Copilot Designer AI image generator produces disturbing and unsafe imagery.
  • Jones, who tests Microsoft's AI products for potentially dangerous behavior, found that the tool was generating violent and illicit underage imagery, as well as images reinforcing destructive biases and conspiracy theories.
  • Despite raising the issue with Microsoft, Jones claims the company failed to take action or conduct an investigation, leading him to escalate his concerns to government officials.
  • Jones has called for Copilot Designer to be removed from public use until better safeguards are in place, and for Microsoft to amend its "E for everyone" rating in app stores, arguing that the AI is not safe for children.
