Apple today shared a document that provides a more detailed overview of the child safety features that it first announced last week, including design principles, security and privacy requirements, and threat model considerations.

Apple's plan to detect known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos has been particularly controversial and has prompted concerns from some security researchers, the non-profit Electronic Frontier Foundation, and others about the system potentially being abused by governments as a form of mass surveillance.

The document aims to address these concerns and reiterates some details that surfaced earlier in an interview with Apple's software engineering chief Craig Federighi, including that Apple expects to set an initial match threshold of 30 known CSAM images before an iCloud account is flagged for manual review by the company.
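
As a rough illustration of that threshold, and not Apple's actual implementation (the real system is described as using private set intersection and threshold secret sharing, so nothing is revealed to Apple below the threshold), the gating logic amounts to something like this hypothetical Swift sketch:

```swift
// Hypothetical sketch of the match-threshold gate described above; the names are
// illustrative and this is not Apple's code. In the real system the match count is
// protected cryptographically, so Apple learns nothing until the threshold is met.
struct MatchThresholdGate {
    let threshold: Int  // Apple's stated initial value: 30 known-CSAM matches

    /// Flags an account for human review only once enough matches have accumulated.
    func shouldFlagForReview(matchCount: Int) -> Bool {
        matchCount >= threshold
    }
}

let gate = MatchThresholdGate(threshold: 30)
print(gate.shouldFlagForReview(matchCount: 12))  // false: below threshold, nothing is reported
print(gate.shouldFlagForReview(matchCount: 30))  // true: account flagged for manual review
```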

Apple also said that the on-device database of known CSAM image hashes contains only entries that were independently submitted by two or more child safety organizations operating in separate sovereign jurisdictions and not under the control of the same government.
The system is designed so that a user need not trust Apple, any other single entity, or even any set of possibly-colluding entities from the same sovereign jurisdiction (that is, under the control of the same government) to be confident that the system is functioning as advertised. This is achieved through several interlocking mechanisms, including the intrinsic auditability of a single software image distributed worldwide for execution on-device, a requirement that any perceptual image hashes included in the on-device encrypted CSAM database are provided independently by two or more child safety organizations from separate sovereign jurisdictions, and lastly, a human review process to prevent any errant reports.
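
One way to picture that two-jurisdiction requirement, as a hedged sketch rather than Apple's actual pipeline (the types and names below are hypothetical), is a filter that keeps only hashes vouched for by organizations in at least two different jurisdictions:

```swift
// Hypothetical sketch of the intersection rule described above; not Apple's code.
struct HashSubmission {
    let organization: String
    let jurisdiction: String
    let hashes: Set<String>  // perceptual image hashes supplied by this organization
}

/// Keeps only hashes submitted by organizations in at least two distinct jurisdictions.
func buildOnDeviceDatabase(from submissions: [HashSubmission]) -> Set<String> {
    var jurisdictionsPerHash: [String: Set<String>] = [:]
    for submission in submissions {
        for hash in submission.hashes {
            jurisdictionsPerHash[hash, default: []].insert(submission.jurisdiction)
        }
    }
    return Set(jurisdictionsPerHash.filter { $0.value.count >= 2 }.keys)
}
```
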
Apple added that it will publish a support document on its website containing a root hash of the encrypted CSAM hash database included with each version of every Apple operating system that supports the feature. Additionally, Apple said users will be able to inspect the root hash of the encrypted database present on their device, and compare it to the expected root hash in the support document. No timeframe was provided for this.
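
The verification Apple describes comes down to comparing two hash strings. A minimal sketch, assuming a plain SHA-256 digest stands in for whatever root-hash construction Apple actually uses (which the document does not fully specify):

```swift
import CryptoKit
import Foundation

// Hedged sketch of the root-hash comparison described above. Apple has not detailed
// how the root hash over the encrypted CSAM database is computed, so a plain SHA-256
// digest of the database blob stands in for it here.
func rootHashMatches(databaseBlob: Data, publishedRootHashHex: String) -> Bool {
    let deviceHex = SHA256.hash(data: databaseBlob)
        .map { String(format: "%02x", $0) }
        .joined()
    return deviceHex == publishedRootHashHex.lowercased()
}
```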

In a memo obtained by Bloomberg's Mark Gurman, Apple said it will have an independent auditor review the system as well. The memo noted that Apple retail employees may be getting questions from customers about the child safety features and linked to a FAQ that Apple shared earlier this week as a resource the employees can use to address the questions and provide more clarity and transparency to customers.

Apple initially said the new child safety features would be coming to the iPhone, iPad, and Mac with software updates later this year, and the company said the features would be available in the U.S. only at launch. Despite facing criticism, Apple today said it has not made any changes to this timeframe for rolling out the features to users.

Article Link: Apple Outlines Security and Privacy of CSAM Detection System in New Document
 
30 images seems high (I know it could include false positives). And now that all this info is public, those losers will know how to get around it.
Apple is trying so hard to cover themselves. It's not going to work.

They are really trying to brainwash the world with this CSAM system.
 
Mass surveillance of a billion iPhone users for what – now that every criminal has been warned?

Since it is on the device, it looks like a first step; the second step could be a neural network detecting new images (taken with the camera).

It's just unacceptable – I won't update software or hardware.
 
It’s funny how Apple deeply believes that we just don’t understand the feature. I fully understand it, and I’m against it. Not because of the feature itself (which could be far more intrusive than it is) but because of the risk of abuse of that backdoor.
Apple said it will have an independent auditor review the system. Are you kidding me?

Respect our privacy and human rights.

Apple is really treating us like guinea pigs.
 
Apple is trying so hard to cover themselves. It's not going to work.

They are really trying to brainwash the world with this CSAM system.

Weird, because I haven’t seen a single coherent explanation of why it’s a problem if your own device scans your photos for child porn, and only does so if you are trying to upload them to Apple’s servers, and only sends information to Apple if you have at least thirty child porn photos that you are trying to upload.
 
Weird, because I haven’t seen a single coherent explanation of why it’s a problem if your own device scans your photos for child porn, and only does so if you are trying to upload them to Apple’s servers, and only sends information to Apple if you have at least thirty child porn photos that you are trying to upload.
I just don't want Apple to be scanning iCloud, period. It's a way to look into and go through our private data. What if the information gets leaked to the government or to criminals? Who's held responsible for that?

Find an alternative way to catch criminals. And why is Apple even getting involved?
 
I just don't want Apple to be scanning iCloud, period. It's a way to look into and go through our private data.
Weird, because I haven’t seen a single coherent explanation of why it’s a problem if your own device scans your photos for child porn, and only does so if you are trying to upload them to Apple’s servers, and only sends information to Apple if you have at least thirty child porn photos that you are trying to upload.
I think Sally Mann had more than 30 photos in her Immediate Family exhibition. According to many, they would all fall foul of this and count as child abuse. It is just a bad idea, especially on top of all the kerfuffle about Pegasus.
 
And people will refuse to read anything about it and continue to lash out about privacy.

Maybe actually try to understand what is really going on, how it would realistically affect you in a negative way, and what Apple's end game with this tech really is.

And no, "total invasion of your privacy" is not a real answer.
I'm shocked that 99% of them don't actually bother getting to know how it works. They even talk about backdoors without knowing how that would even be possible, lol. But sure, hop on the trend and say you don't like this feature.
 
Lots of difficult words in that explanation. I don’t think that tactic is going to work either.

People read “on-device spying” and that’s that.

It baffles me that a company like Apple monumentally messed up their PR twice this summer: with Apple Music Lossless and now with this.

They should have just kept quiet about the CSAM thing and added to their iCloud T&C that they would scan for CSAM content. Nobody would have cared.
 