Apple will inform police if it believes you have abusive images on your iPhone




The Financial Times reports that Apple is working on a system which would scan photos on your iPhone for images of child abuse and then contact police if detected.

The so-called neuralMatch system has been trained on a database from the National Center for Missing and Exploited Children, and photos on your handset and those uploaded to iCloud will be scanned continuously.

If an image suggestive of abuse is detected, it will be referred to a team of human reviewers, who will alert law enforcement if the image is verified.

The system would initially be US-only.

Apple is of course not doing anything different from other cloud storage companies, though scanning on the device itself is an exception.

In its support document, Apple explains the benefits:

Apple does not learn anything about images that do not match the known CSAM database.

Apple can’t access metadata or visual derivatives for matched CSAM images until a threshold of matches is exceeded for an iCloud Photos account.

The risk of the system incorrectly flagging an account is extremely low. In addition, Apple manually reviews all reports made to NCMEC to ensure reporting accuracy.

Users can’t access or view the database of known CSAM images.

Users can’t identify which images were flagged as CSAM by the system.
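Apple has not published the internals of the system, but the threshold behaviour described above can be illustrated with a simplified sketch. This is not Apple's actual protocol (which reportedly uses perceptual hashing and cryptographic techniques); the hash values and the threshold below are hypothetical stand-ins:

```python
# Simplified sketch, NOT Apple's actual implementation: photos are
# reduced to hashes and matched against a known-bad hash set, and an
# account is only surfaced for human review once the number of matches
# exceeds a threshold.
KNOWN_BAD_HASHES = {"hash_a", "hash_b"}  # stand-in for the NCMEC-derived database
THRESHOLD = 3  # hypothetical value; Apple has not published the real one


def matching_hashes(photo_hashes):
    """Return the subset of a user's photo hashes found in the known-bad set."""
    return [h for h in photo_hashes if h in KNOWN_BAD_HASHES]


def should_escalate(photo_hashes, threshold=THRESHOLD):
    """Refer an account to human reviewers only above the match threshold."""
    return len(matching_hashes(photo_hashes)) > threshold
```

Under this scheme a single stray match reveals nothing on its own, which is the property Apple is pointing to when it says it cannot access metadata for matched images below the threshold.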

The big concern, of course, is false positives and their potential consequences. While Apple says the risk is “extremely low”, at this scale even a tiny error rate adds up: if the risk is 1 in 1 million, around 1,000 of Apple’s billion iPhone users may end up having to explain themselves to the police despite not doing anything wrong.
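That back-of-the-envelope figure is just the expected value of a per-user error rate multiplied across the user base; both numbers here are the article's illustrative assumptions, not published Apple statistics:

```python
# Illustrative arithmetic only: expected number of wrongly flagged
# accounts given an assumed per-user false-positive risk.
def expected_false_positives(users: int, risk_per_user: float) -> float:
    """Expected count of innocent accounts flagged, users * per-user risk."""
    return users * risk_per_user


# ~1 billion iPhone users, assumed 1-in-1,000,000 risk per user
print(expected_false_positives(1_000_000_000, 1e-6))  # 1000.0
```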

The other, less immediate concern, expressed by the EFF, is that the system may be broadened by mission creep, or by pressure from totalitarian governments, to include other imagery — for example, images of terrorist acts, symbols used by dissidents, or even LGBT imagery, a favourite target of increasingly right-wing governments in Eastern Europe.

The feature will be rolling out as part of iOS 15.

via Engadget

