Microsoft AI engineer warns FTC about Copilot Designer safety concerns

Mar 06, 2024 - theverge.com
A Microsoft engineer, Shane Jones, has reported safety concerns about Microsoft's AI image generator, Copilot Designer, to the Federal Trade Commission (FTC). Jones claims that the tool can generate harmful images, including violent and sexualized depictions, and that Microsoft has refused to take it down despite his warnings. He says he has been trying to alert the company to issues with DALL-E 3, the model that powers the tool, since December. Microsoft's legal team reportedly asked Jones to remove a LinkedIn post about the issue.

In January, Jones wrote to US senators about his concerns after the tool generated explicit images of Taylor Swift. Microsoft CEO Satya Nadella called the images "alarming and terrible" and promised to add more safety measures. Despite this, Jones claims that Microsoft continues to market the product without implementing the necessary safeguards. This comes after Google temporarily disabled its own AI image generator last month after it created historically inaccurate and offensive images.

Key takeaways:

  • A Microsoft engineer, Shane Jones, has raised safety concerns about Microsoft's AI image generator, Copilot Designer, to the Federal Trade Commission.
  • Jones claims that the tool can generate harmful images, including violent and sexualized scenes, and that Microsoft has refused to take it down despite his warnings.
  • Jones has been trying to warn Microsoft about issues with DALL-E 3, the model used by Copilot Designer, since December, and has also written to US senators about his concerns.
  • Last month, Google temporarily disabled its own AI image generator due to similar concerns when it was found to create historically inaccurate and offensive images.
