The American Sunlight Project (ASP) is advocating for Congress to pass federal legislation, such as the DEFIANCE Act and the Take It Down Act, to allow victims to sue creators and distributors of such imagery and to impose criminal liability. The study also emphasizes the broader societal impact of deepfakes, noting that 41% of women aged 18-29 self-censor to avoid online harassment. This digital sexual violence poses a unique risk for women in public roles and can have severe mental health effects on victims. The White House has been working with the private sector to find solutions, but there is skepticism about Big Tech's ability to self-regulate effectively. The study urges immediate legislative action to address the harm caused by AI-generated NCII.
Key takeaways:
- More than two dozen members of Congress, predominantly women, have been targeted by sexually explicit deepfakes, highlighting a significant gender disparity in the use of this technology.
- The American Sunlight Project found over 35,000 mentions of nonconsensual intimate imagery (NCII) involving 26 members of Congress, with women being 70 times more likely to be targeted than men.
- There is currently no federal law imposing criminal or civil penalties for creating or distributing AI-generated NCII, though some states have enacted civil penalties.
- Legislation such as the DEFIANCE Act and the Take It Down Act is being considered to address the issue but faces challenges related to free speech and the legal definition of harm.