AI enters Congress: Sexually explicit deepfakes target women lawmakers

Dec 11, 2024 - 19thnews.org
A recent study by the American Sunlight Project (ASP) reveals that over two dozen members of Congress, predominantly women, have been targeted by sexually explicit deepfakes. The study highlights a significant gender disparity in the use of this technology, with women in Congress being 70 times more likely than men to be victimized. The findings, which identified over 35,000 mentions of nonconsensual intimate imagery (NCII) involving 26 members of Congress, underscore the evolving risks for women's participation in politics and civic engagement. Despite the removal of much of the imagery following the study's release, the issue persists, with researchers noting that such removals do not prevent the material from being shared again. The study calls attention to the lack of federal legislation addressing AI-generated NCII and the broader implications for democracy and free speech.

The ASP is advocating for Congress to pass federal legislation, such as the DEFIANCE Act and the Take It Down Act, to allow victims to sue creators and distributors of such imagery and to impose criminal liability. The study also emphasizes the broader societal impact of deepfakes, noting that 41% of women aged 18-29 self-censor to avoid online harassment. This digital sexual violence poses a unique risk for women in public roles and can have severe mental health effects on victims. The White House has been working with the private sector to find solutions, but there is skepticism about Big Tech's ability to self-regulate effectively. The study urges immediate legislative action to address the harm caused by AI-generated NCII.

Key takeaways:

  • More than two dozen members of Congress, predominantly women, have been targeted by sexually explicit deepfakes, highlighting a significant gender disparity in the use of this technology.
  • The American Sunlight Project found over 35,000 mentions of nonconsensual intimate imagery (NCII) involving 26 members of Congress, with women being 70 times more likely to be targeted than men.
  • There is currently no federal law imposing criminal or civil penalties for creating or distributing AI-generated NCII, though some states have enacted civil penalties.
  • Legislation such as the DEFIANCE Act and the Take It Down Act is being considered to address the issue, but faces challenges related to free speech and the definition of harm.