In response to this issue, there is a push for legislative action, such as the "Take It Down Act," which aims to criminalize the publication of non-consensual intimate images and mandate their removal from platforms within 48 hours. Dorota Mani, the mother of a deepfake victim, is advocating for greater education and involvement from schools and officials to address the problem. The survey underscores the urgent need for a societal response and legal measures to combat the spread of deepfake pornography among teens.
Key takeaways:
- One in eight American teenagers under 18 personally know someone who has had an AI-generated pornographic deepfake made of them, and one in 17 has been directly victimized by such deepfakes.
- The survey, conducted by Thorn, included 1,200 people aged 13 to 20 and highlighted the ease with which deepfake content can be created and shared, with teen girls and women disproportionately affected.
- Recent incidents at schools, such as Westfield High School in New Jersey and Lancaster Country Day School in Pennsylvania, illustrate the growing prevalence of deepfake pornographic images among teens.
- The "Take It Down Act," if enacted, would criminalize the publication of non-consensual intimate images, including deepfakes, and require platforms to remove them within 48 hours, with potential fines for non-compliance.