Tech firms forced to tackle ‘tsunami of online child abuse’ by change to online safety law
New laws will give regulators the power to force tech companies to stop child sexual abuse on their platforms.
The amendment to the Online Safety Bill announced today by the Home Office will allow Ofcom to require big tech companies like Facebook and Google to use their best efforts to prevent, identify and eliminate child sexual abuse.
The move was welcomed by the National Society for the Prevention of Cruelty to Children (NSPCC), which said it would help stem the so-called “tsunami of online child abuse.”
The change is a small but significant boost to the powers of Ofcom, which will become the technology and social media regulator when the proposed Online Safety Bill comes into force.
This allows Ofcom to insist on evidence that child sexual abuse is being tackled, even as the technology behind the platform changes.
Meta, which owns Facebook, WhatsApp and Instagram, has announced plans to effectively lock down Facebook Messenger and Instagram direct messages with end-to-end encryption, a technology that keeps conversations secure but can also make them inaccessible to anyone trying to protect the children using them.
Pros and cons of encryption
Home Secretary Priti Patel has strongly condemned Meta’s encryption plans, calling them “morally wrong and dangerous”, and law enforcement agencies including Interpol and Britain’s National Crime Agency (NCA) have criticized the technology.
But Whitehall officials insist they are not against encryption itself, just the problems it poses for law enforcement and police forces, who need direct evidence of involvement in child sex abuse to investigate and make arrests.
The Internet Watch Foundation successfully blocked 8.8 million attempts by UK internet users to access videos and images of abused children last year.
Faced with exploitation on this scale, officials argue they must at least maintain their current level of access, which depends on tech companies reporting instances of abuse to authorities.
The case of David Wilson, for example, who posed as girls online to elicit sexually explicit images from boys, was launched following a report by Meta. Wilson was sentenced to 25 years in prison in 2021 after admitting 96 offences.
The new law will give Ofcom the power to insist that tech companies inside and outside the UK identify and remove child sexual abuse content, potentially giving the UK regulator the power to break encryption around the world.
But officials argue that doesn’t mean apps and other services can’t be encrypted, saying there are technologies that can give police forces access to the material they need without compromising privacy.
The new law will oblige tech companies to take action against child sexual abuse “where proportionate and necessary”, giving Ofcom the ability to balance user privacy and child safety.
But while this move may sound like a peace settlement on the contentious issue of encryption, it may not mean the end of the conflict.
“Tsunami of Online Child Abuse”
Apple’s attempt to scan iPhone photos for known child sexual abuse imagery was delayed last year after an outcry from privacy activists.
Dubbed NeuralHash, the system is designed to identify images in a privacy-friendly manner by performing the analysis locally on the phone rather than in Apple’s data centers. However, privacy activists argued that the software could be misused by governments or authoritarian states.
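NeuralHash itself is proprietary, but the general technique it belongs to, on-device perceptual hashing, can be sketched in a few lines. The example below uses a simple average-hash variant (not Apple’s algorithm; the function names, the 8x8 grid size and the match threshold are all illustrative) to show the privacy argument in miniature: the photo is reduced to a short fingerprint locally, and only a match against a list of known hashes would ever be reported.

```python
# Illustrative sketch of on-device perceptual-hash matching.
# This is NOT NeuralHash; it is a minimal "average hash" stand-in
# showing the general approach: hash locally, compare to known hashes,
# and report only a match verdict -- the image itself never leaves.

def average_hash(pixels):
    """Build a 64-bit hash from an 8x8 grayscale grid:
    each bit records whether that pixel is above the mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_known_material(image_hash, known_hashes, threshold=5):
    """Flag only when the hash is close to a known entry; a small
    threshold tolerates minor edits like re-compression."""
    return any(hamming(image_hash, h) <= threshold for h in known_hashes)
```

The privacy activists’ objection is visible even in this toy version: whoever controls the `known_hashes` list controls what gets flagged, which is why critics argued the mechanism could be repurposed by governments to search for material other than child abuse imagery.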
Whitehall officials say the fears are overblown, citing findings from the Safety Tech Challenge Fund, a government-funded collaboration with industry to develop technology that can “protect children in end-to-end encrypted environments” – such as an algorithm that automatically switches off the camera when it detects that nudity is being filmed.
The announcement of the law change comes as police data obtained by the NSPCC showed what the charity described as a “tsunami of online child abuse”.
Freedom of Information requests filed by the charity found that offences involving sexual communication with a child have increased by 80% in four years, rising to 6,156 in the last year – the highest figure since records began, and an average of nearly 120 offences per week.
Sir Peter Wanless, NSPCC chief executive, welcomed the change to the Online Safety Bill, saying it would strengthen the protection of children in private messages.
“This positive step shows that there doesn’t have to be a trade-off between privacy and the detection and disruption of child abuse material and grooming,” he told Sky News.