
The AI-Generated Child Abuse Nightmare Is Here

Oct 24, 2023 - wired.com
A new report from the Internet Watch Foundation (IWF) warns of the increasing use of AI technology to generate child sexual abuse material (CSAM). Offenders are using open-source AI models to create and share new images of previous abuse victims, with some even selling subscriptions to AI-generated CSAM. On a single dark web CSAM forum, the IWF found nearly 3,000 AI-generated images considered illegal under UK law, including depictions of babies and toddlers being abused, images of famous preteen children, and BDSM content featuring teenagers.

The use of AI to generate CSAM is growing rapidly: 13,500 AI-generated images of child sexual abuse and exploitation have been flagged in a shared global database. Offenders are taking older versions of AI models and fine-tuning them on existing abuse images or photos of people's faces to create new illegal material. The IWF has observed perpetrators exchanging hundreds of new images of existing victims and making requests about specific individuals on dark web forums.

Key takeaways:

  • Offenders are using AI models to generate ultrarealistic child sexual abuse images, with hundreds of new images being created of children who have previously been abused.
  • The Internet Watch Foundation (IWF) has found almost 3,000 AI-generated images on a dark web CSAM forum that are considered illegal under UK law, including images of babies and toddlers being raped, famous preteen children being abused, and BDSM content featuring teenagers.
  • AI-generated CSAM is growing rapidly, with 13,500 AI-generated images of child sexual abuse and exploitation flagged in a shared database by investigators around the world.
  • Criminals are using older versions of AI models and fine-tuning them to create illegal material of children, feeding the models existing abuse images or photos of people’s faces to create images of specific individuals.