Jones has taken steps to alert the public and relevant authorities about the issue, including publishing an open letter and writing to lawmakers, the Federal Trade Commission, and Microsoft's board of directors. He urged the FTC to intervene and asked Microsoft's board to conduct an independent review of the company's AI decision-making. Microsoft has not confirmed whether it is currently filtering images generated by Copilot Designer, and OpenAI has not responded to requests for comment.
Key takeaways:
- Microsoft is accused of ignoring warnings about its AI text-to-image generator, Copilot Designer, which allegedly creates violent and sexual imagery.
- Shane Jones, a Microsoft engineer, claims he repeatedly warned the company about the issue and was ignored. He also says OpenAI, the maker of the DALL-E model that powers Copilot Designer, did not respond to his concerns.
- Jones has taken steps to alert the public and regulatory bodies about the issue, including writing an open letter on LinkedIn, contacting lawmakers, and sending letters to the Federal Trade Commission and Microsoft's board of directors.
- Microsoft has not confirmed whether it is taking steps to filter the images Copilot Designer produces. The company provided a statement saying it is committed to addressing employee concerns and has established feedback tools and reporting channels to investigate and remediate any issues.