Microsoft accused of selling AI tool that spews violent, sexual images to kids

Mar 06, 2024 - arstechnica.com
Microsoft has been accused of ignoring warnings about its AI text-to-image generator, Copilot Designer, which allegedly produces violent and sexual imagery. Shane Jones, a Microsoft engineer, claimed that despite his repeated warnings about the tool's alarming output, Microsoft failed to take action or implement safeguards. Jones also said that Microsoft referred him to OpenAI, the creator of the DALL-E model that powers Copilot Designer, but that he received no response.

Jones has taken steps to alert the public and relevant authorities about the issue, including writing an open letter and sending letters to lawmakers, the Federal Trade Commission, and Microsoft's board of directors. He urged the FTC to intervene and Microsoft's board to conduct an independent review of the company's AI decision-making. Microsoft has not confirmed whether it is currently filtering images, and OpenAI has not responded to requests for comment.

Key takeaways:

  • Microsoft is accused of ignoring warnings about its AI text-to-image generator, Copilot Designer, which allegedly creates violent and sexual imagery.
  • Shane Jones, a Microsoft engineer, claims he repeatedly warned the company about the issue and was ignored. He also says OpenAI, the maker of the DALL-E model that powers Copilot Designer, did not respond to his concerns.
  • Jones has taken steps to alert the public and regulatory bodies about the issue, including writing an open letter on LinkedIn, contacting lawmakers, and sending letters to the Federal Trade Commission and Microsoft's board of directors.
  • Microsoft has not confirmed whether it is taking steps to filter images from Copilot Designer. The company provided a statement saying it is committed to addressing employee concerns and has established feedback tools and reporting channels to investigate and remediate any issues.
