
UK Rips Facebook’s Encryption Plan for Messenger, Saying It Would Impede Efforts to Stop Child Sexual Abuse

By Robert Bateman

The U.K. government has hit out at Facebook’s plan to implement end-to-end encryption on its Messenger platform, claiming the move would hamper efforts to combat child sexual abuse. 

But experts told Digital Privacy News that the government’s alternative proposals were technically infeasible and would pose an unacceptable risk to privacy.

End-to-end encryption would mean that messages and content sent via Messenger were unintelligible to anyone without access to the sender’s or recipient’s device. The change also would mean that messages could no longer be scanned in transit to detect illegal content.
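The principle can be illustrated with a toy sketch. This is not Messenger’s actual protocol (a real deployment would use an authenticated key-exchange scheme such as the Signal protocol); it simply shows why a relaying server sees only ciphertext when keys live exclusively on the endpoints.

```python
import secrets

# Toy illustration only, NOT a real messaging protocol: a shared key is
# assumed to exist only on the sender's and recipient's devices, and the
# server in the middle relays ciphertext it cannot interpret.
def xor_cipher(key: bytes, data: bytes) -> bytes:
    # One-time-pad-style XOR; applying it twice with the same key
    # recovers the original bytes.
    return bytes(d ^ k for d, k in zip(data, key))

message = b"hello"
device_key = secrets.token_bytes(len(message))  # never leaves the endpoints

ciphertext = xor_cipher(device_key, message)    # all the server ever relays
recovered = xor_cipher(device_key, ciphertext)  # what the recipient decrypts

assert recovered == message
```

Scanning in transit fails for exactly this reason: without `device_key`, the intermediary has no way to recover `message` from `ciphertext`.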

‘Blind Itself’

At a charity event last month, U.K. Home Secretary Priti Patel said that Facebook intended to “blind itself” to child sexual-abuse material on its platform, and that the company should proceed only “in a way in which is also consistent with public protection and child safety.”

But Patel’s remarks have been met with skepticism and alarm by privacy experts — some of whom told Digital Privacy News that her objections were counterproductive to online safety efforts.

Facebook should proceed only “in a way in which is also consistent with public protection and child safety.”

U.K. Home Secretary Priti Patel.

“Encryption is the only thing protecting internet communications from being surveilled by companies, criminals — and worse,” said Danny O’Brien, strategy director for the Electronic Frontier Foundation (EFF) in San Francisco. 

“These calls are like demanding we give up the locks on our houses for the safety of the children inside.

“It doesn’t make any sense technically, and it makes even less sense socially,” he said.

Government Responds

A U.K. government spokesperson told Digital Privacy News: “While we remain in favor of strong encryption, end-to-end encryption can pose an unacceptable risk. 

“If implemented without considering public safety, it will prevent any access to messaging content and severely erode tech companies’ ability to tackle the most serious illegal content on their own platforms, including child abuse,” the spokesperson said.

“These calls are like demanding we give up the locks on our houses for the safety of the children inside.”

Danny O’Brien, Electronic Frontier Foundation.

“Companies should not be blind to this abhorrent issue on their platforms — last year alone, U.S. technology companies made 21.4 million referrals of child sexual abuse, equating to around 65 million images.

“The home secretary has been clear that the industry must step up, and companies must not introduce end-to-end encryption in such a way as to blind themselves and law enforcement to these terrible crimes,” the representative said.

Letter to Facebook

The U.K. repeatedly has attacked Facebook’s proposals to encrypt Messenger communications by default, which were first announced by CEO Mark Zuckerberg in March 2019.

Last October, Home Secretary Patel was among several signatories to an open letter calling on Facebook not to encrypt its platform “without including a means for lawful access to the content of communications.”

“Companies should not be blind to this abhorrent issue on their platforms.”

U.K. government spokesperson.

The letter cited data from the National Center for Missing and Exploited Children (NCMEC), a nonprofit in the U.S. to which tech companies can report child sexual-abuse material, suggesting that implementing end-to-end encryption could slash reports to the center’s tip line by more than half.

However, February research published by Facebook suggested that more than 75% of its NCMEC disclosures related to “non-malicious” users, whose sharing of illegal material likely was motivated by factors other than the intent to harm a child, such as “outrage or poor humor.”

Comments at Charity Event

Home Secretary Patel’s April 12 comments came at an event hosted by the U.K. charity National Society for the Prevention of Cruelty to Children (NSPCC).

Andy Burrows, head of the society’s child-safety online policy, told Digital Privacy News that Facebook played “a crucial role in detecting and disrupting child sexual abuse.” 

“End-to-end encryption in its current form risks engineering away Facebook’s ability to identify abuse,” he said, claiming that the company’s plans “could lead to an estimated 70% drop in global child-abuse reports.”

But Burrows said NSPCC was not “calling for a ‘backdoor’ for law enforcement.” 

“End-to-end encryption in its current form risks engineering away Facebook’s ability to identify abuse.”

Andy Burrows, NSPCC.

“We want Facebook to invest in engineering solutions to ensure child-abuse detection can be maintained in end-to-end encrypted environments and young users can be protected from abuse at an early stage.”

A report last year by UNICEF claimed that end-to-end encryption “impedes efforts to monitor and remove child sexual-abuse materials.” 

But the report also emphasized that it is “currently unclear how many investigations or arrests directly derive” from Facebook’s reports of such material or “how many fewer would have been made with end-to-end encryption implemented.”

No ‘Absolutist Positions’

UNICEF further cautioned against “absolutist positions” on end-to-end encryption, emphasizing that the technology was “necessary to protect the privacy and security of all people using digital communication channels.” 

“It’s right to say we should move away from absolutist positions on a particular technology and start asking questions about the risks that people are identifying, and how we address them,” said Jim Killock, executive director of the U.K. charity Open Rights Group.

“The key thing is to focus on the results required, rather than the methods to achieve them.”

Jim Killock, Open Rights Group.

Killock said he remained confident that there were “methods of detecting abusive actors that do not rely on demands to end or limit encryption.

“The key thing is to focus on the results required, rather than the methods to achieve them,” Killock told Digital Privacy News. 

“It is also necessary that the people involved in these conversations include technology experts beyond the technology companies, to help evaluate whether what they are saying is reasonable or not.”

Robert Bateman is a writer in Brighton, U.K.
