A letter from US state attorneys general to Congress highlights how the growing prevalence of AI-generated child sexual abuse material (CSAM) is making it harder to identify and help real victims of child sexual abuse. It also notes that AI can create realistic sexualized images of children who do not exist but may resemble actual children. The prosecutors argue that while Congress has been considering AI regulation, the safety of children should not be overlooked, as AI is already being used to harm them.
Key takeaways:
- The attorneys general from all 50 US states have signed a letter urging Congress to take action against the proliferation of AI-generated child sexual abuse material (CSAM).
- The bipartisan letter asks political leaders to establish an expert commission to study the means and methods of AI that can be used to exploit children and propose solutions to deter and address such exploitation.
- The prosecutors call on US lawmakers to expand existing CSAM laws, which do not yet explicitly cover the creation and distribution of synthetic child abuse content.
- The letter also warns that AI can combine data from photographs of both abused and nonabused children to generate new, realistic sexualized images of children who do not exist but may resemble actual children.