A report from the Stanford Internet Observatory recommends bolstering the current reporting system for online child exploitation rather than imposing stricter regulations on online platforms. It suggests increasing the budget for the National Center for Missing & Exploited Children (NCMEC), which operates the CyberTipline; clarifying laws around AI-generated child sexual abuse material (CSAM); and encouraging tech companies to invest more in detecting and reporting CSAM. It also calls for better training of law enforcement in investigating CSAM reports. The authors argue that these measures could improve the system without infringing on privacy rights.
Key takeaways:
- The nation’s system for tracking and prosecuting people who sexually exploit children online is overwhelmed and buckling, according to a report by the Stanford Internet Observatory.
- The report warns that artificial intelligence could exacerbate the problem by generating sexual imagery of virtual children, which could divert resources from actual children in need of rescue.
- The report recommends bolstering the current reporting system, increasing the budget for the National Center for Missing & Exploited Children, and clarifying laws around AI-generated child sexual abuse material.
- Despite the challenges, the report emphasizes the importance of tech companies investing more in detecting and carefully reporting child sexual abuse material, and of law enforcement agencies training officers to investigate such reports.