Exactly. Nothing is fault-free. It's disgusting that an employee could see a picture if it were flagged incorrectly, a PRIVATE picture of my daughter in the paddling pool, for example. It's just unacceptable. [snip]
I was a little unnerved when the news about CSAM detection initially broke and lots of talking heads took to Twitter to lambaste something they had very little information about. As I've read more about it, I've gotten far more comfortable. I think, unfortunately, there's still way too much FUD (Fear, Uncertainty, Doubt), as illustrated by your comment above.
It doesn't sound like you have, but I would urge you (and others who express similar concerns) to read at least part of Apple's CSAM Detection Technical Summary:
It plainly answers questions such as yours in the introduction. It does start getting a little dense the further you get into the document, especially if you fork off and read the linked Apple PSI (Private Set Intersection) document, but the key takeaways can be found right in the introduction (included below):
CSAM Detection enables Apple to accurately identify and report iCloud users who store known Child Sexual Abuse Material (CSAM) in their iCloud Photos accounts. Apple servers flag accounts exceeding a threshold number of images that match a known database of CSAM image hashes so that Apple can provide relevant information to the National Center for Missing and Exploited Children (NCMEC). This process is secure, and is expressly designed to preserve user privacy.
CSAM Detection provides these privacy and security assurances:
- Apple does not learn anything about images that do not match the known CSAM database.
- Apple can’t access metadata or visual derivatives for matched CSAM images until a threshold of matches is exceeded for an iCloud Photos account.
- The risk of the system incorrectly flagging an account is extremely low. In addition, Apple manually reviews all reports made to NCMEC to ensure reporting accuracy.
- Users can’t access or view the database of known CSAM images.
- Users can’t identify which images were flagged as CSAM by the system.
For detailed information about the cryptographic protocol and security proofs that the CSAM Detection process uses, see The Apple PSI System.
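If the second bullet is the part that's hard to picture, here's a minimal sketch in Swift of just the threshold idea: nothing about an account is surfaced for review until the number of matches exceeds a threshold. This is only a toy with made-up names (SafetyVoucher, ThresholdMatcher); the real design uses NeuralHash perceptual hashes, Private Set Intersection, and threshold secret sharing so the server genuinely cannot learn anything below the threshold, none of which is modeled here.

```swift
// Toy illustration of the threshold property only. None of the actual
// cryptography (PSI, threshold secret sharing, NeuralHash) is modeled;
// all names below are hypothetical.

struct SafetyVoucher {
    let imageID: String
    let imageHash: String   // stand-in for a perceptual hash
}

struct ThresholdMatcher {
    let knownHashes: Set<String>   // stand-in for the known CSAM hash database
    let threshold: Int

    /// Returns the matching vouchers only if the match count exceeds the
    /// threshold, mimicking "nothing is revealed below threshold".
    func review(vouchers: [SafetyVoucher]) -> [SafetyVoucher]? {
        let matches = vouchers.filter { knownHashes.contains($0.imageHash) }
        return matches.count > threshold ? matches : nil
    }
}

// Example: two matches against a threshold of 2 -> nothing is surfaced.
let matcher = ThresholdMatcher(knownHashes: ["h1", "h2", "h3"], threshold: 2)
let vouchers = [
    SafetyVoucher(imageID: "IMG_0001", imageHash: "h1"),
    SafetyVoucher(imageID: "IMG_0002", imageHash: "zz"),
    SafetyVoucher(imageID: "IMG_0003", imageHash: "h3"),
]
print(matcher.review(vouchers: vouchers) == nil)   // true: below threshold, no access
```

Again, that's just to make the threshold bullet concrete; the actual guarantees come from the cryptographic protocol described in the Apple PSI paper linked above.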