The open letter, signed by 270 experts, addresses recent amendments the European Council has made to the draft CSAM-scanning regulation, arguing that they fail to fix the plan's fundamental flaws. The signatories warn that the proposal could create "unprecedented capabilities for surveillance and control of Internet users" and undermine a secure digital future. They also argue that the amended approach to reducing false positives, which relies on an automated assessment to flag a "person of interest", is still likely to generate a high volume of false alarms.
Key takeaways:
- A push by the European Union to require messaging platforms to scan private communications for child sexual abuse material (CSAM) could lead to millions of false positives per day, according to security and privacy experts.
- The EU proposal would require platforms to scan for known CSAM and to deploy detection technologies to identify unknown CSAM and grooming activity, drawing accusations of technosolutionism and raising the prospect of large-scale privacy breaches.
- Experts argue that the proposal is technologically infeasible and will not achieve its aim of protecting children from abuse; instead, it will undermine internet security and privacy by forcing platforms to deploy blanket surveillance of private communications.
- The latest amendment proposed by the European Council still fails to address fundamental flaws with the plan, according to an open letter signed by hundreds of academics and researchers, including those from Harvard Kennedy School, Johns Hopkins University, IBM, Intel, and Microsoft.
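The scale argument behind the "millions of false positives per day" warning can be illustrated with a back-of-the-envelope calculation. The message volume and classifier false positive rate below are illustrative assumptions for the sketch, not figures taken from the proposal or the open letter:

```python
# Back-of-the-envelope estimate of daily false alarms from blanket scanning.
# Both inputs are hypothetical values chosen for illustration only.

DAILY_MESSAGES = 10_000_000_000   # assumed: ~10 billion private messages sent per day
FALSE_POSITIVE_RATE = 0.001       # assumed: a 0.1% error rate for an "unknown CSAM" classifier

# Even a seemingly tiny error rate, multiplied by the volume of all private
# messages, yields an enormous absolute number of wrongly flagged messages.
false_positives_per_day = DAILY_MESSAGES * FALSE_POSITIVE_RATE
print(f"{false_positives_per_day:,.0f} false alarms per day")  # prints "10,000,000 false alarms per day"
```

The point of the sketch is the base-rate effect: because scanning applies to every message rather than to a suspect subset, even a highly accurate classifier produces false alarms at a scale that would overwhelm human review.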