The Commission's proposal, which would require digital services to use automated technologies to detect and report CSAM, has been under negotiation since May 2022. However, the proposal has been criticised as potentially ineffective and as posing a risk to fundamental freedoms. The Commission has yet to respond to the European Ombudsman's recommendation, and it remains to be seen where the CSAM file will land.
Key takeaways:
- The European Ombudsman has urged the Commission to release more information about its communications with Thorn, a US company that sells AI technologies for detecting child sexual abuse material (CSAM).
- The recommendation follows a complaint by a journalist who had requested public access to documents sent to the Commission by Thorn.
- Critics have suggested that the Commission's proposal to use surveillance technologies to detect CSAM has been influenced by lobbyists promoting child safety tech who could benefit commercially from such laws.
- The Commission's proposal has also raised concerns about the potential infringement of citizens' right to privacy and about how effective the technology would be in combating child sexual abuse.