Chat control won’t protect children. Instead, it will erode our rights

Coercion is not consent: the recent CSAR proposal remains unacceptable and ineffective.

Jun 5, 2024

In 2022, the European Commission proposed a regulation laying down rules to prevent and combat child sexual abuse, dubbed the ‘child sexual abuse regulation’ or CSAR. The proposal includes plans to mandate the scanning of encrypted private communications (so-called ‘chat control’) in order to detect both known and unknown child sexual abuse material (CSAM). 

The Commission’s plans have faced extremely broad opposition from law enforcement agencies, IT and legal experts, member state governments, civil society organisations, Volt and many others, most notably a range of child protection organisations and abuse survivors. Listening to those fighting for child protection every day makes it abundantly clear: chat control infringes fundamental rights without contributing to the important goal of safeguarding children from exploitation and sexual abuse.

The European Parliament’s position, too, leaves no room for misunderstanding: it rejects mass scanning of messages and rules out measures that would undermine the end-to-end encryption of private messengers such as Signal or WhatsApp.

The Council, meanwhile, has been unable to agree on its position despite several attempts to find a compromise. However, recently leaked documents suggest that a majority may be within reach for the latest compromise proposed by the Belgian Council presidency.

While the scanning of texts and voice messages has been dropped in the latest proposal, the scanning of visual material, such as images and videos, as well as URLs, remains. The proposed compromise requires users of services deemed to be high-risk, such as encrypted messaging services, to ‘consent’ to having visual content and links they want to share scanned for CSAM.

Let us be clear: coercing users into giving their ‘consent’ to have their communications scanned if they want to keep sending images, videos or even URLs to their friends is not a valid form of consent. Not only is it an unacceptable invasion of privacy and of freedom of expression and information; it does not even meet the requirements for consent laid out in other EU legislation, notably the GDPR’s requirement that consent be freely given.

Besides the issues of privacy and consent, chat control may even hinder law enforcement efforts to fight sexual abuse effectively. Given the sheer volume of visual content sent via messengers in the EU, even a scanning system with detection accuracy far beyond anything achievable in practice would produce an abundance of false positives that would eat up law enforcement resources. And while both automated systems and human moderators may correctly recognise what an image depicts, they often misjudge its criminal relevance, for example when parents send images of their children to a doctor or when teenagers share nude images with one another.
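To see why this false-positive problem is structural rather than a matter of better technology, here is a minimal back-of-the-envelope sketch. All numbers below are purely hypothetical assumptions chosen for illustration; the daily volume, the prevalence of CSAM and the error rates are not figures from the proposal or from law enforcement:

```python
# Illustrative base-rate calculation with purely hypothetical numbers.
# None of these figures come from the CSAR proposal or official statistics.

daily_items_scanned = 1_000_000_000   # assumed images/videos/URLs shared per day in the EU
csam_prevalence = 1 / 1_000_000       # assumed fraction of scanned items that are actually CSAM
detection_rate = 0.99                 # assumed share of genuine CSAM the scanner flags
false_positive_rate = 0.001           # assumed 0.1% of innocent items wrongly flagged

genuine_csam = daily_items_scanned * csam_prevalence
innocent_items = daily_items_scanned - genuine_csam

true_positives = genuine_csam * detection_rate
false_positives = innocent_items * false_positive_rate

print(f"Genuine CSAM flagged per day:    {true_positives:,.0f}")
print(f"Innocent items flagged per day:  {false_positives:,.0f}")
print(f"Share of flags that are genuine: {true_positives / (true_positives + false_positives):.2%}")
```

Under these illustrative assumptions, investigators would receive roughly a thousand wrongly flagged innocent items for every genuine one, which is exactly the resource problem described above.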

Instead of continuing to pursue the fundamentally flawed approach of chat control and client-side scanning, we must support measures that can actually protect children from sexual abuse. During the development of the CSA regulation, numerous child protection and civil society organisations that object to chat control have proposed alternatives for limiting the distribution of CSAM, such as having law enforcement delete CSAM found online once an investigation has been concluded, or increasing the resources available to existing law enforcement agencies and civil society organisations. It is essential to recognise that such measures must also respect the fundamental rights of children and young people themselves, especially those facing increased risks online as a result of belonging to a marginalised group.

Let’s drop the idea of chat control and client-side scanning of encrypted messages once and for all, and instead focus our attention on how we can actually protect children from sexual abuse while still safeguarding everyone’s fundamental rights.

Read about Volt’s plan to improve digital rights for everyone here!

(Article by Sascha Mann, Policy Shaper for Digitalisation and Digital Rights)