Child sexual abuse online: effective measures, no mass surveillance

On Tuesday, the Civil Liberties Committee adopted its position on new measures to protect children online by preventing and stopping child sexual abuse.

The draft Parliament position was adopted by the Committee on Civil Liberties, Justice and Home Affairs with 51 votes in favour, 2 against, and 1 abstention. Inter-institutional negotiations were authorised with 48 votes in favour, 2 against, and 4 abstentions.

To protect children online, the new rules would require internet providers to assess whether there is a significant risk of their services being misused for online child sexual abuse or for soliciting children, and to take measures to mitigate those risks. MEPs want mitigation measures to be targeted, proportionate and effective, with providers free to decide which ones to use. They also want to ensure that pornographic sites have adequate age verification systems, flagging mechanisms for child sexual abuse material (CSAM) and human content moderation to process these reports.

To stop minors from being solicited online, MEPs propose that services targeting children should, by default, require user consent for unsolicited messages, offer blocking and muting options, and strengthen parental controls.


Detection orders

To avoid mass surveillance or generalised monitoring of the internet, the draft law would allow judicial authorities to authorise time-limited orders, as a last resort, to detect CSAM and take it down or disable access to it, where mitigation measures have not been effective in removing it.

In addition, MEPs emphasise the need to target detection orders at individuals or groups (including subscribers to a channel) linked to child sexual abuse, on the basis of “reasonable grounds of suspicion”.

In the adopted text, MEPs excluded end-to-end encryption from the scope of the detection orders, to guarantee that all users’ communications remain secure and confidential. Providers would be able to choose which technologies to use, as long as they comply with the strong safeguards provided for in the law and are subject to an independent, public audit.


EU Centre for Child Protection

The law would set up an EU Centre for Child Protection to help implement the new rules and support internet providers in detecting CSAM. It would collect, filter and distribute CSAM reports to competent national authorities and Europol. The Centre would develop detection technologies for providers and maintain a database of hashes and other technical indicators of CSAM identified by national authorities.

The Centre would also support national authorities in enforcing the new child sexual abuse rulebook, conducting investigations, and levying fines of up to 6% of worldwide turnover for non-compliance.

Finally, MEPs propose creating a new Victims’ Rights and Survivors Consultative Forum to ensure that victims’ voices are heard.


Quote

Rapporteur Javier Zarzalejos (EPP, Spain) said: “To meet this compelling challenge effectively, we have found a legally sound compromise supported by all political groups. It will create uniform rules to fight the sexual abuse of children online, meaning that all providers will have to assess if there is a risk of abuse in their services and mitigate those with tailor-made measures. As a last resort, detection orders can be used to take down abusive material still circulating on the internet. This agreement strikes a balance between protecting children and protecting privacy.”


Next steps

The draft Parliament position still needs to be endorsed by the plenary. The start of negotiations will be announced on 20 November, and MEPs will have until the end of the following day to object. If a sufficient number do so, a vote will be held during the same session.