All your WhatsApp photos, iMessage texts and Snapchat videos could be scanned to check for child sexual abuse images and videos under newly proposed European rules. Experts warn the plans could undermine the end-to-end encryption that protects billions of messages sent every day and erode people’s privacy online.
Today, the European Commission unveiled long-awaited proposals aimed at tackling the huge volumes of child sexual abuse material, also known as CSAM, uploaded to the web every year. The proposed law creates a new EU center to deal with child abuse content and introduces obligations for tech companies to “detect, report, block and remove” CSAM from their platforms. The law, announced by European Home Affairs Commissioner Ylva Johansson, says tech companies have failed to voluntarily remove abusive content, and it has been welcomed by child protection and safety groups.
Under the plans, technology companies – ranging from web hosting services to messaging platforms – could be ordered to “detect” both new and previously known CSAM, as well as potential “grooming.” Detection could take place in chat messages, in files uploaded to online services, or on websites that host abusive material. The plans echo Apple’s effort last year to scan photos on people’s iPhones for abusive content before they were uploaded to iCloud. Apple paused that effort after widespread backlash.
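Detection of previously known CSAM is generally done by comparing fingerprints (hashes) of uploaded files against a database of hashes of already-identified abusive images, rather than by a human or model inspecting every upload. The sketch below is a toy illustration of that matching step using an ordinary cryptographic hash; real deployments use perceptual hashes (such as Microsoft’s PhotoDNA or the NeuralHash system Apple proposed) so that resized or re-encoded copies still match. The hash database and function names here are illustrative, not part of any real system.

```python
import hashlib

# Hypothetical database of fingerprints of known abusive images,
# as might be maintained by a central authority. (This example entry
# is just the SHA-256 of the bytes b"foo".)
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def fingerprint(data: bytes) -> str:
    """Fingerprint an uploaded file. A real detector would use a
    perceptual hash that survives cropping and re-encoding."""
    return hashlib.sha256(data).hexdigest()

def is_known_match(data: bytes) -> bool:
    """True if the upload's fingerprint appears in the hash database."""
    return fingerprint(data) in KNOWN_HASHES

print(is_known_match(b"foo"))    # True: matches the database entry
print(is_known_match(b"hello"))  # False: unknown content
```

Crucially, this technique can only flag content that is already in the database; detecting *new* material or grooming, as the proposal also demands, would require classifiers that analyze content itself, which is where critics see the greater privacy risk.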
If passed, the European legislation would require technology companies to conduct risk assessments of their services, gauging the levels of CSAM on their platforms and their existing prevention measures. If necessary, regulators or courts could then issue “detection orders” requiring technology companies to begin “installing and operating technologies” to detect CSAM. These detection orders would be issued for specific periods of time. The draft law does not specify what technologies must be installed or how they will work – these will be vetted by the new EU center – but says they must be used even when end-to-end encryption is in place.
The European proposal to scan people’s messages has been met with dismay by civil rights groups and security experts, who say it is likely to undermine the end-to-end encryption that has become standard for messaging apps such as iMessage, WhatsApp and Signal. “It’s incredibly disappointing to see a proposed EU internet regulation fail to protect end-to-end encryption,” WhatsApp chief Will Cathcart tweeted. “This proposal would force companies to scan every person’s messages and put the privacy and security of EU citizens at serious risk.” Any system that weakens end-to-end encryption, researchers say, can be abused or extended to look for other types of content.
“You either have E2EE or you don’t,” says Alan Woodward, professor of cybersecurity at the University of Surrey. End-to-end encryption protects people’s privacy and security by ensuring that only the sender and receiver of messages can see their content. Meta, the owner of WhatsApp, for example, has no way to read your messages or mine their content for data. The draft EU regulation says the detection technologies must not weaken encryption and claims to include safeguards to ensure this does not happen; it does not, however, give details on how that would work.
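The guarantee Woodward describes can be seen in a toy sketch: the message is encrypted on the sender’s device, the provider only ever relays ciphertext, and only the recipient, who holds the key, can recover the text. This uses a one-time-pad XOR purely for illustration; real apps use protocols such as Signal’s, and none of the names below come from any actual implementation.

```python
import secrets

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    """One-time-pad XOR: a toy stand-in for a real E2EE cipher."""
    return bytes(k ^ p for k, p in zip(key, plaintext))

decrypt = encrypt  # XOR with the same key is its own inverse

# Sender and recipient share a secret key; the server never holds it.
message = b"meet at noon"
key = secrets.token_bytes(len(message))

ciphertext = encrypt(key, message)          # all the server ever relays
assert ciphertext != message                # the server sees only noise
assert decrypt(key, ciphertext) == message  # the recipient recovers it
```

This is why the draft regulation’s demand that detection work “even when end-to-end encryption is in place” alarms researchers: since the provider cannot read the ciphertext in transit, scanning would have to happen on the user’s device before encryption, which critics argue amounts to breaking the end-to-end guarantee.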