The company has large teams developing safety systems in line with its responsible AI principles. These include content filtering, operational monitoring, and abuse detection, intended to prevent misuse of the system and create a safer environment for users. Microsoft's Code of Conduct prohibits the use of its tools to create adult or nonconsensual intimate content.
Key takeaways:
- Microsoft has introduced more protections to Designer, an AI text-to-image generation tool, after it was used to create nonconsensual sexual images of celebrities.
- The AI-generated nude images of Taylor Swift that went viral on Twitter last week reportedly originated from 4chan and a Telegram channel where people were using Designer.
- Microsoft's Code of Conduct prohibits the use of its tools to create adult or nonconsensual intimate content, and repeated attempts to produce such content may result in loss of access to the service.
- Microsoft has large teams developing guardrails and other safety systems in line with its responsible AI principles, including content filtering, operational monitoring, and abuse detection, to mitigate misuse of the system.