The American Sunlight Project (ASP) is advocating for federal legislation to address this issue, with proposed bills like the DEFIANCE Act and the Take It Down Act aiming to establish penalties for creating and sharing such imagery. Despite bipartisan support in the Senate, these bills face challenges in the House over free speech concerns and disputes about how harm should be defined. The study underscores the urgent need for action: AI-generated NCII not only affects public figures but also poses broader risks to national security and discourages women from participating in politics. In the absence of legislative action, the White House is working with the private sector on solutions, though there is skepticism about Big Tech's ability to self-regulate effectively.
Key takeaways:
- More than two dozen members of Congress, primarily women, have been targeted by sexually explicit deepfakes, highlighting a significant gender disparity in the use of this technology.
- The American Sunlight Project (ASP) found over 35,000 mentions of nonconsensual intimate imagery involving 26 members of Congress on deepfake websites, with women being 70 times more likely to be targeted than men.
- There is currently no federal law imposing criminal or civil penalties for creating and distributing AI-generated nonconsensual intimate imagery, though some states have enacted civil penalties.
- Bills such as the DEFIANCE Act and the Take It Down Act, which aim to address these issues, have passed the Senate but face challenges in the House over free speech concerns and how harm is defined.