Jones claims that Copilot Designer has "systemic problems" in generating harmful content and urges its removal from public use until these issues are resolved. He points out that the tool lacks appropriate restrictions, often generating images that sexually objectify women even in response to unrelated prompts. In response, Microsoft stated that it is "committed to addressing any and all concerns employees have in accordance with our company policies" and appreciates its employees' efforts in "studying and testing our latest technology to further enhance its safety."
Key takeaways:
- A Microsoft AI engineer, Shane Jones, has raised concerns about the safety of the company's AI image-generation tool, Copilot Designer, claiming it can generate violent and sexualised images.
- Jones has reported his concerns to the Federal Trade Commission and Microsoft's board of directors, but claims no action has been taken.
- Despite the company's promotion of Copilot Designer as a safe tool for businesses and creative endeavours, Jones argues that it has systemic problems and should be removed from public use until those problems are resolved.
- Microsoft responded to Jones' claims by stating that it is committed to addressing any concerns employees raise and that it appreciates their efforts in studying and testing its latest technology to enhance its safety.