The rise in AI-generated child sexual abuse imagery has led to calls for legislation to make this type of pornography illegal. However, the effectiveness of such measures remains uncertain. Last year, the National Center for Missing & Exploited Children received 4,700 reports of AI-generated child porn. A 2023 Stanford University study also found hundreds of child sex abuse images in widely used generative AI image data sets. The problem is further complicated by the use of open-source software to generate such content, which makes the material difficult to control.
Key takeaways:
- A Florida man, Phillip Michael McCorkle, is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography.
- The investigation began after the Indian River County Sheriff's Office received tips that McCorkle was using an AI image generator to create child sexual abuse imagery and distributing it via the social media app Kik.
- The increasing prevalence of AI-generated child sexual abuse imagery has prompted lawmakers to push for legislation making this type of porn illegal, but the effectiveness of such measures is uncertain.
- A 2023 Stanford University study found hundreds of child sex abuse images in widely used generative AI image data sets, indicating the severity and ubiquity of the problem.