EU plans to mandate scanning of encrypted messages for CSAM

"As is, this proposal would force companies to scan every person's messages and put EU citizens' privacy and security at serious risk."

What you need to know

  • The EU appears set to unveil controversial legislation aimed at stopping the spread of child sexual abuse material.
  • A leaked new proposal appears to mandate the scanning of messages for CSAM content.
  • The provisions appear to cover encrypted messages and include measures for detecting grooming that could extend beyond images to text.
  • WhatsApp chief Will Cathcart said the move "would force companies to scan every person's messages and put EU citizens' privacy and security at serious risk."

Update: 5/11 6:38 am ET: The EU has now officially published its proposal to fight child sexual abuse, matching the leaked document discussed below.

A newly leaked document appears to reveal that the EU is planning to mandate that providers of messaging services like WhatsApp and iMessage scan messages in order to detect child sexual abuse material (CSAM) and the grooming of children.

The document, shared online by Alec Muffett, argues that the voluntary actions of platforms alone "have proven insufficient" to combat child sexual abuse online, and states:

The proposed Regulation consists of two main building blocks: first, it imposes on providers obligations concerning the detection, reporting, removal and blocking of known and new child sexual abuse material, as well as solicitation of children, regardless of the technology used in the online exchanges, and, second, it establishes the EU Centre on Child Sexual Abuse as a decentralised agency to enable the implementation of the new Regulation.

The legislation includes "uniform obligations, applicable to all providers of hosting or interpersonal communication service offering such services in the EU's digital single market, to perform an assessment of risks of misuse of their services for the dissemination of known or new child sexual abuse material or for the solicitation of children (together defined as 'online child sexual abuse')" as well as more targeted obligations for certain providers "to detect such abuse, to report it via the EU Centre, to remove or disable access to, or to block online child sexual abuse material when so ordered."

The legislation also appears to include statements addressing providers who use end-to-end encryption technology (like WhatsApp):

Therefore, this Regulation leaves to the provider concerned the choice of the technologies to be operated to comply effectively with detection orders and should not be understood as incentivising or disincentivising the use of any given technology, provided that the technologies and accompanying measures meet the requirements of this Regulation. That includes the use of end-to-end encryption technology, which is an important tool to guarantee the security and confidentiality of the communications of users, including those of children. When executing the detection order, providers should take all available safeguard measures to ensure that the technologies employed by them cannot be used by them or their employees for purposes other than compliance with this Regulation, nor by third parties, and thus to avoid undermining the security and confidentiality of the communications of users.

Muffett summarized the paragraph as: "We want a backdoor, but we don't want just anyone to be able to use it. Only us good guys."

Matthew Green, a professor of cryptography at Johns Hopkins, stated: "Speaking of actual free speech issues, the EU is proposing a regulation that could mandate scanning of encrypted messages for CSAM material. This is Apple all over again," referring to Apple's own CSAM scanning debacle last year, which would have seen Apple check iCloud images against hashes of known CSAM, a measure possibly taken in anticipation of legislation like the one the EU appears to be working on.
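For context, hash-based detection of the kind Apple proposed works by comparing a fingerprint of each image against a database of fingerprints of known abuse imagery. The sketch below is a purely illustrative, simplified version: it uses an ordinary cryptographic hash, whereas Apple's proposed NeuralHash was a perceptual hash designed to match visually similar images rather than byte-identical files, and the hash values shown are placeholders, not real entries.

```python
import hashlib

# Hypothetical, simplified illustration of hash-list matching.
# Real systems (e.g. Apple's proposed NeuralHash) use perceptual hashes
# that tolerate resizing and re-encoding; SHA-256 here only matches
# byte-identical files.

KNOWN_HASHES = {
    # Placeholder entries standing in for a database of known-image hashes.
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def image_fingerprint(data: bytes) -> str:
    """Return a hex digest acting as the image's fingerprint."""
    return hashlib.sha256(data).hexdigest()

def is_known_image(data: bytes) -> bool:
    """Check whether an image's fingerprint appears in the known-hash list."""
    return image_fingerprint(data) in KNOWN_HASHES
```

The privacy debate centers less on this matching step than on where it runs: moving the check onto the user's device, before or despite encryption, is what critics like Green and Muffett characterize as a backdoor.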

Green said the document "is the most terrifying thing I've ever seen" and that the EU was "proposing a new mass surveillance system that will read private text messages." This refers to provisions that go beyond CSAM detection and attempt to detect the "grooming" of children in messages, the implications of which Green spelled out plainly.

Green called the measure "the most sophisticated mass surveillance machinery ever deployed outside of China and the USSR."

Will Cathcart, the head of WhatsApp, said the plans were "incredibly disappointing" because they fail to protect end-to-end encryption. Cathcart said the measures "would force companies to scan every person's messages and put EU citizens' privacy and security at serious risk" and that mandating a system "built for one purpose in the EU" could be used to undermine human rights "in many different ways globally."

The document is reportedly set to be officially unveiled on May 11.
