Microsoft Engineer Warns Company's AI Tool Creates Violent, Sexual Images, Ignores Copyrights - Slashdot

Mar 06, 2024 - it.slashdot.org
Shane Jones, an AI engineer at Microsoft, has raised concerns about Copilot Designer, the company's AI image generator built on OpenAI's technology. Jones, who has been testing the product for vulnerabilities, found that it generated inappropriate images, including violent and sexualized scenes, in violation of Microsoft's responsible AI principles. Despite his reporting the findings internally, Microsoft has not removed the product from the market.

Jones, who has been with Microsoft for six years, is part of a group of employees and outsiders who test the company's AI technology in their free time. After Microsoft acknowledged his concerns but did not act on them, Jones was referred to OpenAI. When he received no response, he posted an open letter on LinkedIn calling for DALL-E 3, the latest version of the AI model, to be taken down and investigated.

Key takeaways:

  • Shane Jones, an AI engineer at Microsoft, has been testing the company's AI image generator, Copilot Designer, and has found it to be generating inappropriate and disturbing images.
  • The AI service has produced images depicting violent and sexualized scenes, underage drinking and drug use, and controversial political topics, which are against Microsoft's responsible AI principles.
  • Jones has been reporting his findings internally since December, but Microsoft has been unwilling to take the product off the market and instead referred him to OpenAI.
  • After not hearing back from OpenAI, Jones posted an open letter on LinkedIn asking the startup's board to take down DALL-E 3, the latest version of the AI model, for an investigation.
