The article also highlights the case of YouTube influencer Gabi Belle, who found fake nude images of herself online, and that of a 14-year-old girl in Spain whose clothed photos were turned into nude images with an AI "nudifier" app. It calls for stronger regulations, arguing that current rules do little to protect victims of deepfake porn.
Key takeaways:
- Artificial intelligence is being used to create fake pornographic images and videos, fueled by a rise in AI tools that can "undress" people in photographs or swap their faces into pornographic videos.
- Victims have little recourse: there is no federal law governing deepfake porn, and only a handful of states have enacted regulations. Legal scholars warn that AI-generated fake images may not fall under copyright protections for personal likenesses.
- AI-generated porn disproportionately targets women and teens: a 2019 study by Sensity AI found that 96 percent of deepfake images are pornography, and 99 percent of those photos target women.
- Google is working on more expansive safeguards to keep nonconsensual sexual images out of search results, but its protections against deepfake images are less robust: deepfake porn and the tools to make it still show up prominently in the company’s search results.