In January, Jones wrote to US senators about his concerns after the tool generated explicit images of Taylor Swift. Microsoft CEO Satya Nadella called the images "alarming and terrible" and promised to add more safety measures. Despite this, Jones claims that Microsoft continues to market the product without implementing the necessary safeguards. This comes after Google temporarily disabled its own AI image generator last month because it created historically inaccurate and offensive images.
Key takeaways:
- A Microsoft engineer, Shane Jones, has raised safety concerns about Microsoft's AI image generator, Copilot Designer, to the Federal Trade Commission.
- Jones claims that the tool can generate harmful images, including violent and sexualized scenes, and that Microsoft has refused to take it down despite his warnings.
- Jones has been trying to warn Microsoft since December about issues with DALL-E 3, the model underlying Copilot Designer, and has even written to US senators about his concerns.
- Last month, Google temporarily disabled its own AI image generator over similar concerns after it was found to create historically inaccurate and offensive images.