
Microsoft CEO Horrified by AI-Generated Taylor Swift Images

Jan 29, 2024 - futurism.com
Microsoft CEO Satya Nadella has expressed concern over the proliferation of pornographic deepfakes of Taylor Swift on social media, particularly on X. In an interview with NBC, he called for more "guardrails" to ensure safer content and said that tech companies can do more to govern such content than they tend to assume. This comes after Microsoft was implicated in the scandal: users on 4chan and Telegram reportedly manipulated Microsoft's AI image generator to create the explicit images of Swift. Microsoft has since addressed the issue with an update.

The situation has caused widespread outrage, prompting even the White House to call on social media companies to reassess their role in preventing the spread of such content. In response, X-formerly-Twitter blocked all searches for "Taylor Swift" on its platform, a move that has proven ineffective. The article criticizes tech companies for being unprepared for the harmful content enabled by the AI technology they fund. It also highlights how laws governing the use of such technology lag behind, leaving victims like Swift with little legal recourse.

Key takeaways:

  • Microsoft CEO Satya Nadella has called for more "guardrails" to ensure safer content, following the spread of pornographic deepfakes of Taylor Swift on social media.
  • Microsoft's AI image generator, Designer, was used to create these explicit images, but the company says it has addressed the issue with an update.
  • The White House has described the situation as "alarming" and called on social media companies to reexamine their role in preventing such content from spreading.
  • Despite Nadella's call to action, the article suggests that tech companies are ill-prepared for the reality of the AI they're funding, and that laws surrounding the use of this technology are lagging far behind.
