By Foo Yun Chee
BRUSSELS (Reuters) – EU lawmakers agreed on Wednesday to draft rules requiring Alphabet’s Google, Meta and other online services to identify and remove online child pornography, saying that end-to-end encryption would not be affected.
The draft rule on child sexual abuse material (CSAM), proposed by the European Commission last year, has been a bone of contention between advocates of online safety measures and privacy activists worried about surveillance.
The European Union executive came up with the CSAM proposal after the current system of voluntary detection and reporting by companies proved to be insufficient to protect children.
EU lawmakers have to thrash out the final details with member states before the draft can become legislation in a process that may be finalised next year.
The proposed legislation forces messaging services, app stores and internet access providers to report and remove known and new images and videos, as well as cases of grooming.
An EU Centre on Child Sexual Abuse will be set up to act as a hub of expertise and to forward reports to the police.
To avoid mass surveillance, EU lawmakers beefed up detection orders to allow judicial authorities to authorise time-limited orders to find and delete CSAM. These can only be issued where there are reasonable grounds to suspect child sexual abuse.
Companies would also be able to choose the technology used to detect such offences, as long as this is subject to an independent, public audit.
The decision by lawmakers to exempt end-to-end encryption from the draft rules drew praise from privacy activists.
“The European Parliament’s position removes indiscriminate chat control and allows only for targeted surveillance of specific individuals and groups reasonably suspicious of being linked to child sexual abuse material with a judicial warrant,” the European Liberal Youth (LYMEC) said.
(Reporting by Foo Yun Chee; Editing by Alexander Smith)