Europe's privacy watchdogs have torn apart the Commission's initiative in the fight against child pornography.
It is a subject on which the slightest reservation is inherently difficult to express, given what is at stake for child protection and the moral imperative to combat all child pornography content — known in English as CSAM, for child sexual abuse material. Yet it is on this difficult ground that key European bodies have taken a stand.
In a joint press release published on July 29, 2022, and relayed by 01net on August 3, the European Data Protection Supervisor and the European Data Protection Board, which brings together the EU's national authorities, such as the Commission Nationale de l'Informatique et des Libertés (CNIL) in France, issued a warning to Brussels.
At issue? The European Commission's initiative against child pornography, precisely. Presented in May, it contains a provision that has sparked heated debate, since it would require the scanning of all messages, including encrypted ones, in order to detect any CSAM content and take the appropriate measures where applicable.
This initiative comes as the Old Continent seeks to update its regulatory framework for the protection of minors. But when its plan was presented, Brussels drew strong criticism, in particular from computer security specialists and representatives of digital-rights NGOs, for a text that threatens the privacy of Internet users.
A poorly written and excessive text
The opinion issued jointly by the Board and the Supervisor at the end of July points in the same direction. Yes, any sexual abuse of a child is "a particularly serious and heinous crime" that must be fought relentlessly. But this cannot be done by eroding other fundamental principles on which European societies are built.
And while it can be justified to limit privacy and data-protection rights on a case-by-case basis (as happens, for example, when a court authorizes the police to search a home), this must be done while respecting the essence of fundamental rights and limiting these restrictions "to what is strictly necessary and proportionate".
To put the Board's and the Supervisor's view of the Commission's proposal more bluntly: the text is beyond salvaging. "In its current form, [it] may pose more risk to individuals, and by extension to society as a whole, than to the criminals prosecuted for CSAM." The disavowal is strong.
And it does not stop there. "There is a risk that the proposal will become the basis for widespread and indiscriminate scanning of the content of virtually all types of electronic communications," because the drafting is shaky and vague: not clear, detailed, or precise enough, in short. The text must therefore be reworded to prevent such excesses.
The use of automated tools based on "artificial intelligence" for this work is also viewed with suspicion, because such scanning of messages and media is a potential source of errors (false positives abound whenever a task of this nature is automated) and, therefore, of unwarranted intrusions into individuals' privacy.
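To see why false positives matter at the scale of an entire messaging network, here is a back-of-the-envelope sketch. The accuracy and volume figures below are purely hypothetical, not taken from the opinion; the point is the base-rate arithmetic: when illicit content is extremely rare, even a very small error rate flags a huge number of innocent messages.

```python
# Hypothetical base-rate calculation; the figures are illustrative,
# not taken from the EDPB/EDPS opinion or the Commission's proposal.

def false_positives(messages_scanned, csam_rate, false_positive_rate):
    """Number of innocent messages wrongly flagged by an automated scanner."""
    innocent = messages_scanned * (1 - csam_rate)
    return innocent * false_positive_rate

# Assume 1 billion messages scanned per day, 1 in a million containing CSAM,
# and a scanner that misfires on only 0.1% of innocent content.
flagged = false_positives(1_000_000_000, 1e-6, 0.001)
print(f"{flagged:,.0f} innocent messages flagged per day")  # roughly one million
```

Under these (assumed) numbers, nearly a million legitimate communications would be flagged every single day, each one a potential intrusion into someone's private correspondence.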
The opinion of the two supervisory bodies, set out in a 36-page document, is therefore deeply concerned about the consequences this European regulation could have for individuals' privacy and personal data. The draft largely needs to be reworked, even though the two parties also noted some potentially promising proposals.
The two authorities cite a planned European center dedicated to the fight against CSAM. Very well, but they plead for restraint in the exchange of personal data between this center and Europol. Such transfers should be made on a case-by-case basis, to avoid overly loose sharing. As things stand, the proposal provides for full access to the relevant data.
Finally, both the Board and the Supervisor recalled that end-to-end encryption must not be sacrificed on the altar of the fight against child pornography.
Encryption, they note, "fundamentally contributes to privacy and confidentiality of communications, freedom of expression, innovation and the growth of the digital economy". It is out of the question to "prevent or otherwise discourage the use of end-to-end encryption, [which] would seriously weaken the role of encryption in general".