Apple will report child sexual abuse images on iCloud to law enforcement




Apple will report images of child exploitation uploaded to iCloud in the US to law enforcement, the company said on Thursday.

The new system recognizes images known as Child Sexual Abuse Material (CSAM) using a process called hashing, which converts an image into a unique number that corresponds to that image.
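Apple has not published the perceptual-hash function (NeuralHash) it uses, so as a minimal illustration of the general idea, the Swift sketch below derives a fixed-size digest from raw image bytes with CryptoKit's SHA-256. The function name and the use of SHA-256 are assumptions for illustration only; a real perceptual hash is designed to survive resizing and recompression, which a cryptographic hash is not.

```swift
import Foundation
import CryptoKit

// Minimal sketch: convert raw image bytes into a fixed-size digest.
// Apple's real system uses a proprietary perceptual hash (NeuralHash),
// not SHA-256; this only illustrates the general idea of hashing.
func imageDigest(for imageData: Data) -> String {
    let digest = SHA256.hash(data: imageData)
    // Render the digest as a hex string so it can be stored and compared.
    return digest.map { String(format: "%02x", $0) }.joined()
}

// Usage: byte-identical images produce identical digests, so a match can be
// checked by comparing numbers rather than inspecting the pixels themselves.
// let hash = imageDigest(for: try Data(contentsOf: photoURL))
```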

Apple began testing the system on Thursday, but most U.S. iPhone users won’t get the feature until an iOS 15 update later this year, Apple said.

The move aligns Apple with other cloud services that are already scanning user files, often using hashing systems, for content that violates their terms of use, including images depicting child exploitation.

It also represents a test for Apple, which says its system is more private for users than previous approaches to eliminating illegal images of child sexual abuse, because it uses sophisticated cryptography on both Apple’s servers and user devices and does not scan actual images, only hashes.

However, many privacy-conscious users still bristle at software that notifies governments about the contents of a device or a cloud account, and they may react negatively to the announcement, especially since Apple has vocally defended device encryption and operates in countries with weaker speech protections than the U.S.

Law enforcement agencies around the world have also pressured Apple to weaken encryption on iMessage and other software services like iCloud to investigate child exploitation or terrorism. Thursday’s announcement is a way for Apple to address some of these issues without abandoning some of its technical principles related to user privacy.

How it works

Before an image is stored in Apple’s iCloud, Apple matches the image’s hash against a database of hashes provided by the National Center for Missing and Exploited Children (NCMEC). Starting with an update to iOS 15, that database will be distributed as part of the iOS code. The matching takes place on the user’s iPhone, not in the cloud, Apple said.
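To make the on-device matching step concrete, here is a hedged Swift sketch of checking a photo’s digest against a bundled set of known digests before upload. The `loadBundledHashDatabase` helper and the plain-string database are hypothetical; in Apple’s actual protocol the database is blinded and the comparison uses private set intersection, so the device never learns whether an individual photo matched.

```swift
import Foundation

// Hypothetical loader for the hash database distributed with iOS.
// In Apple's design the database is blinded, not plain strings; this
// placeholder exists only so the sketch is self-contained.
func loadBundledHashDatabase() -> Set<String> {
    return []
}

// Returns true if a photo's digest appears in the known-image database.
// Apple's real protocol performs this comparison cryptographically, so
// neither the device nor the server sees the plain result of a single check.
func matchesKnownImage(_ digest: String, in database: Set<String>) -> Bool {
    return database.contains(digest)
}

// Usage sketch: check each photo before it is uploaded to iCloud.
// let database = loadBundledHashDatabase()
// let isMatch = matchesKnownImage(imageDigest(for: photoData), in: database)
```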

Then, when Apple detects a certain number of infringing files in an iCloud account, the system uploads a file that allows Apple to decrypt and view the pictures on that account. A person manually reviews the images to confirm whether there is a match or not.
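Apple has not disclosed the exact number of matching files required before an account becomes reviewable, so the threshold below is a made-up placeholder. The sketch only shows the counting logic; in Apple’s actual design the gate is cryptographic (threshold secret sharing), meaning the server cannot decrypt any match data until enough matches exist.

```swift
import Foundation

// Hypothetical threshold; Apple has not published the real value.
let matchThreshold = 30

// Reports whether an account has accumulated enough matching uploads for
// Apple to be able to decrypt and manually review them. In practice this
// gate is enforced cryptographically, not with a simple counter.
func accountNeedsReview(matchCount: Int, threshold: Int = matchThreshold) -> Bool {
    return matchCount >= threshold
}

// Usage: accountNeedsReview(matchCount: 3)  -> false
//        accountNeedsReview(matchCount: 31) -> true
```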

Apple can only review images that match content already known to and reported in those databases. It would not, for example, be able to recognize a parent’s photos of their child in the bathtub, because those images are not part of the NCMEC database.

If the person doing the manual review concludes the system did not make a mistake, Apple will disable the user’s iCloud account and send a report to NCMEC, or notify law enforcement if necessary. Users can appeal to Apple if they believe their account was flagged by mistake, an Apple representative said.

The system only works on images uploaded to iCloud, which users can turn off, Apple said. Photos or other images on a device that have not been uploaded to Apple’s servers are not part of the system.

Some security researchers have raised concerns that this technology could eventually be used to identify other kinds of images, such as photos of a political protest. Apple said its system is built to work only with images cataloged by NCMEC or other child safety organizations, and that the way its cryptography is constructed prevents it from being used for other purposes.

Apple cannot add additional hashes to the database, it said. Apple said it is presenting its system to cryptography experts to certify that it can detect illegal images of child exploitation without compromising user privacy.

Apple unveiled the feature on Thursday alongside other features intended to protect children from predators. In a separate feature, Apple will use machine learning on a child’s iPhone with a family account to blur images that may contain nudity, and parents can choose to be alerted when a child under 13 receives sexual content in iMessage. Apple also updated Siri with information on how to report child exploitation.

