Over the weekend, users noticed they could no longer search for Taylor Swift's name, marking one of X's most significant content moderation efforts since Musk's takeover. The actors' union, SAG-AFTRA, and Swift herself have condemned the AI-generated images, with the union stating that such content should be illegal if created without consent. X has policies against "non-consensual nudity" and "synthetic and manipulated media," both of which cover the Swift AI images, and the platform appears to be acting in accordance with those guidelines.
Key takeaways:
- Elon Musk's platform X halted searches for Taylor Swift after AI-generated explicit photos of the pop star went viral, marking one of the few times the platform has blocked controversial content.
- The photos drew immediate backlash from Swift's fans, prompting X to block accounts sharing the content and remove the most viral posts.
- The platform has recently faced content moderation controversies, with a rise in anti-Semitic and white-supremacist content driving away advertisers.
- X has policies against "non-consensual nudity" and "synthetic and manipulated media," and limiting searches around the Taylor Swift AI images is one of the platform's most notable enforcement actions under those rules.