Copilot image generation 'systemically' flawed, says Microsoft whistleblower

Mar 06, 2024 - theregister.com
Shane Jones, a machine-learning engineer at Microsoft, has publicly voiced his concerns about the safety of the text-to-image tool in Microsoft's Copilot. Jones alleges that he found vulnerabilities in OpenAI's DALL-E 3, which Copilot Designer uses to generate images from text, allowing him to bypass safety measures and generate inappropriate images. Despite raising these issues internally and urging Microsoft to remove Copilot Designer from public use until better safeguards are in place, Jones says his concerns have been ignored.

Jones has since taken his concerns to the US Senate and House of Representatives, leading to meetings with the Senate Committee on Commerce, Science, and Transportation. He criticizes Microsoft for not having appropriate reporting tools for potential problems with its AI products. In contrast, Google paused the text-to-image capabilities of its Gemini tool to address similar complaints. Microsoft and OpenAI have not yet responded to Jones' allegations.

Key takeaways:

  • Shane Jones, a machine-learning engineer at Microsoft, has raised serious safety concerns about the text-to-image tool in Microsoft's Copilot, alleging that it can generate inappropriate and objectionable images.
  • Jones claims that despite his repeated warnings, neither Microsoft nor OpenAI, which supplies the underlying AI technology for Copilot, has addressed these issues.
  • He has taken his concerns to the FTC and to the US Senate and House of Representatives, and has met with the Senate Committee on Commerce, Science, and Transportation.
  • Jones also criticized Microsoft's lack of appropriate reporting tools for potential problems with its AI products, stating that the company's Office of Responsible AI doesn't have any reporting tool aside from an email alias that resolves to five Microsoft employees.