EU wants to screen private chats in the future

The sexual abuse of children has increased significantly over the past decade, particularly on the Internet, EU Home Affairs Commissioner Ylva Johansson explained at the presentation in Brussels on Wednesday. Citing a report, she said 85 million images and videos were reported in a single year, even though reporting is currently voluntary and only a handful of companies take part. The Commission wants to counteract this in the future: “We will find you,” said the commissioner, addressing the perpetrators.

At the presentation, Johansson said that as things stand it is not possible to “protect children”. She also pointed out that more than half of all child abuse material is stored on servers in Europe. According to Johansson, the package now presented should make it easier to identify such material.

Johansson presented the EU plans on Wednesday in Brussels (photo: APA/AFP/Kenzo Tribouillard)

New EU center to work with Europol

The EU Commission’s proposals are meant to combat three types of sexual abuse: the dissemination of already known depictions of child abuse, new, previously unknown material, and grooming, i.e. the initiation of contact with children.

According to the EU, the new rules apply to communication services such as messengers and social networks (for example WhatsApp, Instagram and Signal), as well as app stores, Internet providers and hosting services. The prerequisite is that they offer their services in the EU.

The newly established EU Center against Child Abuse is responsible for the technical implementation; it is to be an independent authority that works closely with Europol. In addition, monitoring may only be carried out under a “detection order”, which also requires the involvement of an independent authority in the respective member state.

Expert: EU proposal “avoids sensitive decisions”

At the press conference, it was emphasized several times that the screening should be “as minimally intrusive as possible” for citizens’ privacy. According to Johansson, the proposal is compatible with the General Data Protection Regulation (GDPR), among other things. She also referred to the ePrivacy Regulation, which already allows providers to scan data traffic for spam and malware. Companies already do this “for profit” reasons; in the future, they should be obliged to check for child abuse material as well.

It is not mentioned how the proposals are to be implemented technically (photo: Getty Images/EyeEm/Rohane Hamilton)

But the question of how exactly the EU Commissioner envisions this went unanswered on Wednesday, even when asked directly. Officials repeatedly pointed to the result that counts, not the way to get there. “The EU Commission’s proposal avoids sensitive decisions that can have a major impact on privacy,” said journalist Alexander Fanta. “The Commission is thus passing the ball on to the platforms and the administrative level – an evasive maneuver intended to take the wind out of critics’ sails.”

Client-side scanning not excluded

Fanta also pointed out that the EU does not explicitly rule out any method: “This means that the new EU Center against Child Abuse could even propose scanning all images and videos directly on users’ devices, so-called client-side scanning. Against this there is considerable resistance from data protection advocates, because even encrypted communication could then be monitored across the board.”
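To make the term concrete: client-side scanning means checking content on the user’s device, before encryption and transmission, typically by matching it against a list of fingerprints of known material. The sketch below is purely illustrative and not the method the EU proposal prescribes (it prescribes none); all names are hypothetical, and it uses a plain cryptographic hash, whereas deployed systems use perceptual hashes that also catch slightly altered images.

```python
import hashlib

# Hypothetical blocklist of SHA-256 fingerprints of known illegal
# material, as a central authority might distribute it. The entry
# below is simply the hash of the bytes b"foo" for demonstration.
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def fingerprint(data: bytes) -> str:
    """Return the SHA-256 digest of raw file bytes as a hex string."""
    return hashlib.sha256(data).hexdigest()

def scan_before_send(data: bytes) -> bool:
    """Check outgoing content against the blocklist *before* it is
    encrypted and sent -- the step that makes this 'client-side'."""
    return fingerprint(data) in KNOWN_HASHES

print(scan_before_send(b"foo"))       # True: matches the blocklist entry
print(scan_before_send(b"harmless"))  # False: no match
```

Note that an exact-hash check like this is trivially evaded by changing a single byte, which is why real systems rely on fuzzier perceptual matching and why critics argue the approach scales poorly against determined offenders.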

Indeed, reactions from privacy advocates had already been largely skeptical: the German Chaos Computer Club (CCC), for example, spoke out against such measures. “This client-side scanning would not be the first excessive and misguided surveillance method justified with the fight against child abuse,” it said in an open letter on Monday.

Privacy advocates: Measures “unprecedented”

“Clearly, victims of child abuse need better help, but chat control is an over-the-top approach, easily bypassed, and hits in all the wrong places,” the CCC wrote. The measures are ineffective on top of that: “Criminals are already using distribution channels that would not be affected by these scans, and will easily evade the scans in the future.”

“The notion of indiscriminate 24/7 scanning of private communications by hundreds of millions of people in the EU that they expect to be private is unprecedented,” said Ella Jakubowska of European Digital Rights (EDRi).

Activists see basis for further measures

Data protection activists are, of course, not opposed to investigations into child abuse; the fear often expressed in this context is that the proposal would create the technical and legal framework for further invasions of privacy. Such methods could also be used against terrorism and extremism, and unauthorized access cannot be completely ruled out either.

In any case, it could still take some time before the Commission’s proposal is implemented: the EU Parliament will deal with it first, and reactions there have so far been mixed. After that, the Commission, Parliament and above all the member states will have to come to an agreement before the regulation takes effect.

Greens and NEOS express concerns

Süleyman Zorba, network policy spokesman for the Greens, criticized the project: “This attack on the fundamental right to family and private life is disproportionate and completely misses the important goal of child protection. Trade in abusive content mainly takes place on illegal platforms and not on common messenger services.”

NEOS data protection spokesman Niki Scherak wrote in a press release: “The planned procedure places all Internet users under general suspicion and endangers free communication and everyone’s privacy.” He warned: “Anyone who can search for a term today can search for anything tomorrow.”
