Alex Murray, NCA’s director of threat leadership, said that the decline on Facebook contrasted with other top social media firms, which had increased detection and reports over the same period.
He said the introduction of encryption had prevented Facebook from seeing the illegal behaviour on its own platform, which then hindered the NCA’s ability to gather evidence, conduct investigations, safeguard sexually abused children and arrest the perpetrators.
“The widespread roll-out of end-to-end encryption by major tech companies, without sufficient consideration for the actual harm it will cause, is putting users in danger,” said Murray.
He said Facebook’s decrease had made children on the site “less safe”. He added: “Tech companies cannot protect children and their other customers and live up to the Online Safety Act when they choose not to see illegal behaviour, often victimising the most vulnerable, on their own systems.
“They are unable to proactively identify offending taking place, or provide evidence of such offending on request. [End-to-end encryption] design choices can massively reduce companies’ ability to detect, prevent and report the abuse of children.”
The data to be published by the National Center for Missing and Exploited Children (NCMEC) will also show a 20% drop in reports of child sexual abuse in 2024 by Elon Musk’s X platform, Google, Discord and Microsoft. It is the first reversal in the overall number of reports after a near doubling over the past six years.
‘Potentially losing a child’
In evidence about Facebook submitted to a Senate committee, seen by The Telegraph, Michelle DeLaune, chief executive of NCMEC, said: “When a platform voluntarily chooses to blind itself to child sexual exploitation by disabling its ability to detect and report abuse, it is not just losing a report – it is potentially losing a child.
“Every lost report can represent a child who may never be identified, rescued, or safeguarded. It means the child’s ongoing abuse and repeated re-victimisation will continue unchecked, while offenders remain free to exploit more victims in the shadows.”
Paul Waugh, a member of the Commons culture committee, said companies like Facebook should not have been allowed to introduce encryption without guaranteed “safeguards for our security agencies and police”.
“Twenty years ago, someone like Gary Glitter had to go to the other side of the world to prey on children. Someone like Jeffrey Epstein had to create his own private paedophile island. Now, these monsters, all they have to do is go on to set up a group on Facebook Messenger,” he said.
Paul Stanfield, chief executive of the Edinburgh-based Childlight – Global Child Safety Institute, accused Mark Zuckerberg, Meta’s chief executive, of putting profit before children’s safety.
“He has left children isolated in the dark, vulnerable to grooming, extortion and abuse, while perpetrators act with impunity,” he said.
Trans-Atlantic legal battle
NCMEC had been expecting a decline in the raw numbers because of a change in reporting methods under which social media firms were asked to “bundle” related reports together to streamline the process and reduce duplication of incidents linked to a single “viral event”.
However, when it “unbundled” and analysed the data to make a year-on-year comparison, it found the overall number of incidents had declined from 36.2 million in 2023 to 29.2 million in 2024, with Facebook accounting for the biggest drop.
NCMEC told the US Senate committee the “likeliest factor” for the fall was Facebook’s implementation of end-to-end encryption that began in December 2023. The decline contrasts with all independent data showing online child sexual abuse is increasing and is being amplified by AI technology.
The disclosure comes amid a trans-Atlantic legal battle in which British security officials have demanded “backdoor” access to Apple users’ encrypted data as part of their efforts to combat child sexual abuse, terrorism and other illegal activities online.
Apple is fighting the demand and has taken the unprecedented step of removing its highest-level data security tool from customers in the UK.
Yvette Cooper, Home Secretary, is understood to have raised the “catastrophic risks” of end-to-end encryption earlier this year when she met Nick Clegg, then vice-president of Meta, the owner of Facebook, Instagram and WhatsApp.
Ministers have been advised that although the Online Safety Act gives them powers to compel firms to develop detection technologies for child abuse, those powers can only be exercised at the end of a lengthy regulatory process that could take a “few years” to complete.
‘No more excuses’
The Government warned it was prepared to introduce further legislation if tech firms failed to remove child abuse content.
“There can be no more excuses for the tragedies that this report highlights. Under the Online Safety Act, technology companies are already obliged to remove child abuse circulating on their networks from their platforms or face significant fines. We will not hesitate to go further if that is what it takes to keep our children safe online,” said a government spokesman.
A spokesman for Facebook said: “We partnered with NCMEC to streamline our reporting process by grouping duplicate viral or meme content into a single cybertip.
“This contributed significantly to the drop in cybertips last year, and allowed NCMEC and law enforcement to more easily manage and prioritise them. We’ll continue working with NCMEC to make our reports as valuable as possible and we expect to continue to report more than any of our peers.”