Two senior technical directors at GCHQ, the UK’s cyber intelligence agency, have published a new paper analysing how technology companies could protect children from sexual abuse online.
The impact of child sexual abuse can last a lifetime even if the abuse takes place online. Research by the Independent Inquiry into Child Sexual Abuse found survivors often suffer serious physical and mental health conditions in later life.
One of the challenges in tackling this online abuse is the growing number of services offering end-to-end encryption, technology which often undermines the existing safety features that many companies use to detect child sexual abuse material.
But without end-to-end encryption, anyone who could access the service’s internal systems – a hacker, a lawful authority, or even workers at the messaging company itself – would be able to read those messages.
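The trade-off can be seen in a toy sketch: with end-to-end encryption, the key never leaves the two devices, so anyone with access to the server sees only unreadable ciphertext. The one-time-pad scheme below is a deliberately minimal illustration of that principle, not any real messaging protocol:

```python
import secrets

# Toy sketch (not a real protocol): a one-time pad illustrates the core
# idea of end-to-end encryption -- only the two endpoints hold the key,
# so the relaying server sees nothing it can read.

def encrypt(message: bytes, key: bytes) -> bytes:
    """XOR the message with a single-use random key of equal length."""
    return bytes(m ^ k for m, k in zip(message, key))

def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    """XOR is its own inverse, so decryption reuses the same operation."""
    return encrypt(ciphertext, key)

# The two endpoints share the key; the relaying server never sees it.
message = b"meet at noon"
key = secrets.token_bytes(len(message))

ciphertext = encrypt(message, key)          # all the server ever handles
assert ciphertext != message                # the server-side view is opaque
assert decrypt(ciphertext, key) == message  # the recipient recovers the text
```

This is also why the safety features mentioned above stop working: server-side scanning tools only ever see the ciphertext.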
The new paper is written by Dr Ian Levy, technical director at the UK’s National Cyber Security Centre (NCSC) – a part of GCHQ – and Crispin Robinson, technical director for cryptanalysis at GCHQ, both trained mathematicians and career intelligence officers whose work involves tackling child sexual abuse online.
They describe seven “harm archetypes” to frame the problem in a new way, covering everything from children being groomed by offenders through to adults sharing indecent images of children out of shock, and note how each of these harmful behaviours has a particular technical profile that can be addressed in a specific way.
‘Child sexual abuse is a societal problem’
In particular, the paper recommends revisiting a recent controversial proposal by Apple to pre-emptively scan all iPhones for child sexual abuse material (CSAM) as a potential solution to some of these harms, provided the system were architected to protect against misuse.
Chief among the fears of academics and security experts was that Apple’s system could be modified to search for non-CSAM images that may be of interest to government authorities. The company subsequently indefinitely delayed the proposal.
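Apple’s proposal centred on matching fingerprints of images against a list of hashes of known abuse material before upload. The sketch below illustrates only the matching step, using an ordinary cryptographic hash and placeholder data; the real design used perceptual hashing (so near-duplicates still match) and cryptographic blinding to hide the list itself – and the opacity of that list is precisely what critics feared could be quietly repurposed:

```python
import hashlib

# Simplified sketch of client-side hash matching. Deployed designs such as
# Apple's proposal use perceptual hashes and cryptographic protocols so the
# device never learns the blocklist's contents; here we use plain SHA-256
# and an in-memory set purely to show the matching step.

def fingerprint(image_bytes: bytes) -> str:
    """Derive a fixed-size fingerprint from raw image bytes."""
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical blocklist of fingerprints of known illegal images,
# supplied by a child-protection body (placeholder values only).
blocklist = {fingerprint(b"known-illegal-image-bytes")}

def scan_before_upload(image_bytes: bytes) -> bool:
    """Return True if the image matches the blocklist and would be flagged."""
    return fingerprint(image_bytes) in blocklist

assert scan_before_upload(b"known-illegal-image-bytes") is True
assert scan_before_upload(b"holiday-photo-bytes") is False
```

Because the check runs on the device before encryption, it can coexist with end-to-end encryption – but whoever controls the blocklist controls what is searched for.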
Although the 67-page document is not meant to represent British government policy, the authors say they hope it will help shape policy on tackling online abuse globally.
It is published as the government’s Online Safety Bill faces an extensive delay due in part to criticisms about its unscientific approach to defining the harms that internet users can experience online.
The paper was completed long before the delay to the bill was announced.
It comes as the government proposes an amendment that would give regulators the power to force technology companies to stop sexual abuse of children on their platforms.
Dr Levy and Mr Robinson write: “Child sexual abuse is a societal problem that was not created by the internet and combating it requires an all-of-society response.
“However, online activity uniquely allows offenders to scale their activities, but also enables entirely new online-only harms, the effects of which are just as catastrophic for the victims,” they add.
“We hope this paper will help the debate around combating child sexual abuse on end-to-end encrypted services, for the first time setting out clearly the details and complexities of the problem.”
‘Barriers to child protection are not technical’
The authors say that the problem is “much more complex than other government needs, such as exceptional access” in reference to a previous collaboration in 2018.
Then the pair wrote an article published in Lawfare, a popular US blog about national security, calling for a “more informed” debate about end-to-end encryption and the “exceptional access” that law enforcement might need to those services.
Their proposed solution at the time was to silently add an extra “end” to these messaging services, ensuring that law enforcement could access the communications.
It was only a hypothetical proposal, but it proved extremely controversial and has not been adopted by most platforms that offer end-to-end encryption.
It successfully provoked dozens of high-profile articles discussing the merits of the idea, from academia, civil society, and industry – although most of them were critical and did not offer any solutions to the problem described.
The authors hope their new paper invites more constructive engagement.
Andy Burrows, who heads child safety online policy at the NSPCC, described the paper as an “important and highly credible intervention” which “breaks through the false binary that children’s fundamental right to safety online can only be achieved at the expense of adult privacy.
“The report demonstrates it will be technically feasible to identify child abuse material and grooming in end-to-end encrypted products. It’s clear that barriers to child protection are not technical, but driven by tech companies that don’t want to develop a balanced settlement for their users.
“The Online Safety Bill is an opportunity to tackle child abuse taking place at an industrial scale. Despite the breathless suggestions that the Bill could ‘break’ encryption, it’s clear that legislation can incentivise companies to develop technical solutions and deliver safer and more private online services.”