Apple Delays Adopting Controversial Child Safety Features | News from science and technology
Apple has announced that it is postponing the rollout of controversial child safety features that had been slated to launch later this year.
The length of the delay is unclear, but the company faced significant criticism after it announced its child sexual abuse material (CSAM) detection system in August, which would automatically scan iPhone photos before they are uploaded to iCloud.
Chief among the concerns of academics and security professionals was that the system could be modified to search for non-CSAM images that might be of interest to government agencies.
In a statement on Friday, Apple described the plans as “features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of CSAM”.
The statement continues: “Based on feedback from customers, stakeholders, researchers and others, we have decided to take additional time over the coming months to gather input and make improvements before releasing these critically important child safety features.”
Andy Burrows, head of child safety online policy at the NSPCC, said: “This is an incredibly disappointing delay. Apple was on track to roll out really significant technological solutions that would undeniably make a world of difference in keeping children safe from abuse online, and could have set an industry standard.
“They sought to adopt a proportionate approach that scanned for child abuse images in a privacy-preserving way and that balanced user safety and privacy.
“We hope Apple will consider standing firm in the face of criticism rather than delaying important child protection measures,” Burrows added.
In an initial response to the fears expressed, Apple said it would reject any government demands to “force Apple to add non-CSAM images to the hash list” – the list of digital fingerprints used to identify abusive material without sharing the material itself.
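The hash-list mechanism the statement refers to can be sketched in a few lines. This is an illustrative simplification, not Apple’s implementation: the real system uses a perceptual hash (NeuralHash) computed on-device and matched via cryptographic protocols, whereas this sketch uses an ordinary SHA-256 digest checked against a hypothetical list.

```python
import hashlib

# Hypothetical list of fingerprints of known abusive material (digests only,
# never the material itself). Note: real systems use perceptual hashes such
# as Apple's NeuralHash so that near-duplicates also match; a cryptographic
# hash like SHA-256, used here for illustration, only matches exact copies.
KNOWN_HASHES = {
    # This is sha256(b"foo"), standing in for a real fingerprint.
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def fingerprint(data: bytes) -> str:
    """Return a hex digest used as the image's fingerprint."""
    return hashlib.sha256(data).hexdigest()

def matches_known_material(image_bytes: bytes) -> bool:
    """Check an image's fingerprint against the hash list before upload."""
    return fingerprint(image_bytes) in KNOWN_HASHES

print(matches_known_material(b"foo"))  # True: its digest is in the list
print(matches_known_material(b"bar"))  # False: no matching fingerprint
```

The point of the design is that the list holds only fingerprints, so the matching party never needs to possess or distribute the abusive material itself.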
“We have faced demands to build and deploy government-mandated changes that compromise the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future,” the company added.
The company then added that it would use a hash list of material provided by authorities in multiple countries to further reduce the risk that a single authority might attempt to exploit the system for surveillance purposes.
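That multi-jurisdiction safeguard can be sketched as a set intersection. The hash values and authority names below are made up for illustration: only fingerprints present in the lists of at least two independent authorities would be eligible for matching.

```python
# Illustrative sketch of the safeguard described above (hash values and
# authority names are hypothetical): only fingerprints that appear in the
# lists of at least two independent authorities are used, so no single
# government can unilaterally insert a target image into the system.
list_authority_a = {"aaa111", "bbb222", "ccc333"}  # e.g. a US-provided list
list_authority_b = {"bbb222", "ccc333", "ddd444"}  # e.g. a UK-provided list

# Only the intersection of the lists is eligible for on-device matching.
active_hashes = list_authority_a & list_authority_b
print(sorted(active_hashes))  # ['bbb222', 'ccc333']
```

A hash supplied by only one authority (such as "aaa111" above) never enters the active list, which is the risk-reduction Apple described.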
Apple’s ability to resist demands from government agencies has been questioned by Professor Steven Murdoch of University College London, who said that the company’s refusal “to develop new functionality to unlock an iPhone” is “different from adding a hash to an existing database”.
Apple stated, “Let’s be clear that this technology is limited to recognizing CSAM stored in iCloud, and we are not going to comply with any request from any government to expand it.”
Professor Murdoch noted parallels with an existing system in the UK through which internet service providers (ISPs) block abusive material, and which courts later required them to extend to less serious wrongs such as intellectual property infringement.
ISPs – including British Sky Broadcasting Limited, then owner of Sky News – lost in court when they tried to challenge this.
In his ruling, Mr Justice Arnold stated: “The orders do not require the ISPs to acquire new technology: they already have the technology required. In fact, most ISPs now have greater technical capacity to implement such orders than they did three years ago.”