
Microsoft Closes Loophole That Created AI Porn of Taylor Swift

Jan 29, 2024 - 404media.co
Microsoft has introduced additional safeguards to its AI text-to-image generation tool, Designer, which was being misused to create nonconsensual explicit images of celebrities. The move comes after 404 Media reported that AI-generated explicit images of Taylor Swift, which recently went viral on Twitter, originated from 4chan and a Telegram channel where Designer was being used. Microsoft is investigating the reports and has warned that any repeated misuse of its tools for creating adult or nonconsensual content may result in loss of access to the service.

The company has large teams working on developing safety systems in line with its responsible AI principles. These include content filtering, operational monitoring, and abuse detection to prevent misuse of the system and create a safer environment for users. Microsoft's Code of Conduct prohibits the use of its tools for creating adult or nonconsensual intimate content.

Key takeaways:

  • Microsoft has introduced more protections to Designer, an AI text-to-image generation tool, after it was used to create nonconsensual sexual images of celebrities.
  • The AI-generated nude images of Taylor Swift that went viral on Twitter last week were reported to have come from 4chan and a Telegram channel where people were using Designer.
  • Microsoft's Code of Conduct prohibits the use of its tools to create adult or nonconsensual intimate content, and repeated attempts to produce such content may result in loss of access to the service.
  • Microsoft has large teams working on guardrails and other safety systems in line with its responsible AI principles, including content filtering, operational monitoring, and abuse detection to mitigate misuse of the system.