Jones has since taken his concerns to the US Senate and House of Representatives, leading to meetings with the Senate Committee on Commerce, Science, and Transportation. He criticizes Microsoft for not having appropriate reporting tools for potential problems with its AI products. In contrast, Google paused the text-to-image capabilities of its Gemini tool to address similar complaints. Microsoft and OpenAI have not yet responded to Jones' allegations.
Key takeaways:
- Shane Jones, a machine-learning engineer at Microsoft, has raised serious safety concerns about the text-to-image tool in Microsoft's Copilot, alleging that it can generate inappropriate and objectionable images.
- Jones claims that despite his repeated warnings, neither Microsoft nor OpenAI, which supplies the underlying AI technology for Copilot, has addressed these issues.
- He has taken his concerns to the FTC and to the US Senate and House of Representatives, and has met with the Senate Committee on Commerce, Science, and Transportation.
- Jones also criticized Microsoft's lack of appropriate reporting tools for potential problems with its AI products, stating that the company's Office of Responsible AI has no reporting channel aside from an email alias that resolves to five Microsoft employees.