Microsoft engineer sounds alarm on company's AI image generator in letter to FTC

Mar 06, 2024 - businessinsider.com
A Microsoft employee, Shane Jones, has written a letter to the Federal Trade Commission (FTC) expressing concerns about the safety of Microsoft's AI image creator, Copilot Designer. Jones says the tool produces harmful content involving sexual imagery, violence, bias, and other inappropriate themes. He alleges that Microsoft denied his requests to make the tool safer, and he is now asking the FTC to educate the public about the risks of using Copilot Designer, especially in educational settings.

Jones also says Microsoft failed to implement the changes he suggested, such as adding disclosures to the product and changing its Android app rating from "E for Everyone" to "Mature 17+". He has previously voiced his concerns publicly and written to US senators about the public safety risks posed by AI image generators. The letter comes after Google paused access to Gemini's image generation feature over similar concerns.

Key takeaways:

  • A Microsoft employee, Shane Jones, has written a letter to the Federal Trade Commission (FTC) expressing concerns about Microsoft's AI image creator, Copilot Designer, which he claims produces harmful content involving sexual imagery, violence, bias, and other inappropriate themes.
  • Jones alleges that despite his repeated requests to Microsoft to remove the tool from public use until better safeguards are in place, the company has failed to act.
  • He has also accused Microsoft of marketing Copilot Designer as a safe AI product for all users, including children, despite being aware of its potential to generate harmful images.
  • This is not the first time Jones has raised concerns about AI image generators; he previously urged AI giant OpenAI to remove its model DALL-E, which powers Copilot Designer, from public use.