The incident has highlighted the growing problem of deepfake pornography, which makes up 98% of all deepfake videos online, with women disproportionately targeted. Despite a 550% rise since 2019 in the number of deepfake videos posted online, victims currently have little legal recourse. The misuse of Swift's image may prompt more urgent action against AI-generated explicit content.
Key takeaways:
- Taylor Swift's likeness was used in a series of AI-generated explicit posts that went viral, sparking renewed calls for legislation to combat the threats posed by AI and deepfakes.
- Democratic Rep. Joseph Morelle has proposed a bill, the Preventing Deepfakes of Intimate Images Act, which seeks to make it illegal to share deepfake pornography without consent and would allow victims to sue the creators and distributors of such material while maintaining anonymity.
- According to the State of Deepfakes report published in 2023, over 95,000 deepfake videos were posted online last year, a 550% increase over 2019, with deepfake pornography making up 98% of all deepfake videos online, and women being disproportionately targeted.
- Despite the increasing prevalence of deepfakes, there is little existing legislation to protect victims, leaving many women with no effective recourse even after reporting the creators and distributors of deepfakes made of them.