
European scholars call for a reevaluation of the contentious plan for scanning Child Sexual Abuse Material within the EU

Around 500 researchers warn that the proposed EU child sexual abuse regulation could undermine encryption and privacy while still failing to protect children.

The European Union's Chat Control proposal, which includes mandatory age verification and risk-assessment obligations for service providers, has sparked a heated debate among researchers, civil liberties groups, and data protection authorities.

In a recent open letter, approximately 500 scientists and researchers from 34 countries, including Bart Preneel, a renowned cryptographer from KU Leuven, have expressed their concerns about the proposal. The signatories hail from esteemed institutions such as ETH Zurich, Johns Hopkins University, and the Max Planck Institute for Security and Privacy.

The revised draft of the regulation, published on July 24, narrows the scope of scanning requirements to images and URLs, as opposed to earlier drafts that included detection of text and audio communications. However, the researchers argue that the fundamental flaws remain in the latest version of the proposal.

One of the primary concerns is that the proposed age verification measures could be evaded through VPNs or alternative services. The researchers further caution that such systems will be prone to errors and easily circumvented by those intent on sharing illegal material.

Another challenge is the plan to use machine learning to identify previously unseen child sexual abuse material. The researchers question the effectiveness of this approach, as there is no evidence that AI can distinguish CSAM from other private images with the accuracy needed for enforcement. They also point out that even minor alterations to an image can bypass state-of-the-art detectors.
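To make that brittleness concrete, the toy sketch below uses a simple 8x8 "average hash" fingerprint written purely for illustration; it is not any detector named in the proposal or the letter. Production systems rely on perceptual hashes or neural classifiers, but the researchers' point is that small edits can still change what the detector sees.

```python
# Toy illustration only: a simple 8x8 average-hash fingerprint, showing that a
# few small pixel edits already change the hash. Real detectors are far more
# sophisticated, but remain exposed to the same kind of evasion.
import random

def average_hash(pixels):
    """64-bit fingerprint: each bit records whether a pixel is above the image mean."""
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return sum(x != y for x, y in zip(a, b))

random.seed(0)
image = [random.randint(0, 255) for _ in range(64)]   # stand-in 8x8 grayscale image
original_hash = average_hash(image)

# Nudge five pixels, roughly what re-encoding, cropping or a slight filter might do.
perturbed = image[:]
for i in random.sample(range(64), 5):
    perturbed[i] = max(0, min(255, perturbed[i] + random.choice([-40, 40])))

changed = hamming(original_hash, average_hash(perturbed))
print(f"{changed} of 64 fingerprint bits changed by a five-pixel edit")
```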

The technology required by the proposal may not reliably detect child sexual abuse material at the scale of hundreds of millions of users. Current systems produce too many false positives and false negatives to be effective, the researchers assert.
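A back-of-the-envelope calculation illustrates the scale problem. The figures below are assumptions chosen for illustration, not numbers from the proposal or the open letter: even with an optimistically low false positive rate, EU-scale scanning would flag hundreds of thousands of innocent images for every few hundred genuine detections.

```python
# Sketch with assumed figures (none of these numbers come from the proposal or
# the open letter): at EU scale, even a very accurate classifier produces far
# more false alarms than genuine detections.
daily_images = 400_000_000     # assumed images shared per day on scanned services
csam_prevalence = 1e-6         # assumed fraction of traffic that is actually illegal
false_positive_rate = 0.001    # assumed 0.1% of innocent images wrongly flagged
true_positive_rate = 0.90      # assumed 90% of illegal images correctly flagged

true_hits = daily_images * csam_prevalence * true_positive_rate
false_alarms = daily_images * (1 - csam_prevalence) * false_positive_rate

print(f"genuine detections per day: {true_hits:,.0f}")     # ~360
print(f"false alarms per day:       {false_alarms:,.0f}")  # ~400,000
print(f"share of flags that are correct: {true_hits / (true_hits + false_alarms):.2%}")
```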

Instead of mass surveillance and breaking end-to-end encryption, the signatories propose alternatives to the controversial Chat Control regulation, such as online safety education, trauma-sensitive reporting hotlines, and rapid removal of illegal content.

Mandatory age verification could create dependencies on untested solutions provided by large technology companies. Moreover, it could undermine online anonymity and freedom of expression. The researchers call for a shift away from a "techno-solutionist" approach focused on scanning and emphasise that eliminating abuse requires addressing its root causes rather than weakening digital security for everyone.

Signal, a popular messaging app, has already stated it would withdraw its service from the EU if the regulation requires mandatory on-device scanning. The debate surrounding the EU's Chat Control Proposal continues, with both supporters and critics voicing their opinions and concerns.
