The use of AI to generate CSAM is growing rapidly, with 13,500 AI-generated images of child sexual abuse and exploitation flagged in a shared global database. Offenders are taking older versions of AI models and fine-tuning them with existing abuse images or photos of people's faces to produce new illegal material. The IWF has observed perpetrators exchanging hundreds of new images of existing victims and requesting material depicting specific individuals on dark web forums.
Key takeaways:
- Offenders are using AI models to generate ultra-realistic child sexual abuse images, including hundreds of new images of children who have previously been abused.
- The Internet Watch Foundation (IWF) has found almost 3,000 AI-generated images on a dark web CSAM forum that are considered illegal under UK law, including images of babies and toddlers being raped, famous preteen children being abused, and BDSM content featuring teenagers.
- AI-generated CSAM is growing rapidly, with 13,500 AI-generated images of child sexual abuse and exploitation flagged in a database shared by investigators around the world.
- Criminals are using older versions of AI models and fine-tuning them to create illegal material depicting children, feeding the models existing abuse images or photos of people's faces to generate imagery of specific individuals.