“…a database of known CSAM images…”

Wait, that exists?
Yes, I believe I read that NCMEC is going to provide the hashed database to Apple.

Edit: John Gruber said in his piece:

“The database will be part of iOS 15, and is a database of fingerprints, not images. Apple does not have the images in NCMEC’s library of known CSAM, and in fact cannot — NCMEC is the only organization in the U.S. that is legally permitted to possess these photos.”
 
For the first few months of iOS 15, I'm confident that the database will contain only CSAM image fingerprints. However, as time passes (and as Corellium's interest wanes), other authorities will push their agendas and force Apple to include "extra hashes" that are not CSAM at all....
And why would they not do that server-side, with other cloud services?
 
It’s exactly the opposite. Apple has chosen the hard way (client-side) to protect your privacy, as no data leaves your device.
If the check is done server-side, your photos need to be decrypted, leaving them potentially vulnerable and accessible to Apple (and maybe other parties).

There shouldn’t be any type of scanning or “checking” at all.
 
The fact that the analysis is done on device is even worse. That means that your privacy is invaded even with all network connections turned off.
What "analysis is done on device"? If you had actually read what Apple is doing this is only regarding images loaded up the Cloud. Heck, Youtube has something similar regarding copyright flagging videos before they are actually posted. Learn how the flipping tech works.
 
So you don't think the below applies in this case [Link to 'slippery slope' fallacy]?
Only in the very specific case of the post you are replying to. Elsewhere in this debate, plenty of arguments have been presented as to why doing A might increase the risk of Z. Some (knowledgeable) groups are claiming that this scanning would be illegal in the EU, others that it might be unconstitutional in the US, and/or that the clauses in the T&Cs whereby users agree to this are unconscionable - if Apple wins a court ruling on that subject, it could set/break a legal precedent which would make future expansion easier. Then there's the self-evident fact that by possessing the facility, Apple could be legally forced to (ab)use it - true, they could also be legally forced to create such a facility from scratch, but already having the infrastructure makes it easier.

Also, if you follow these threads you'll see some people defending Apple by saying something like "your photos are only checked if you choose to upload them to iCloud"... When did it get to be OK that your photos can be scanned on iCloud? Did people dismiss any fears about that as "slippery slope fallacies?"

Anyway, these "logical fallacies" sites themselves tend to play to the fallacy that every debate is over a "falsifiable" fact that can be proven true or false. They mix up genuine logical fallacies of the "Dogs have 4 legs, Felix has 4 legs, therefore Felix is a dog" variety with common argument tropes that are often used as weak or fallacious arguments. Or to put it another way "Bob used 'slippery slope' in a fallacious argument, therefore all 'slippery slope' arguments are fallacies." - To be fair, most of them cover themselves once you read past the first sentence.

Still, from that site, "Want to share this fallacy on Facebook? Here's a button for you:" is the funniest line I've read all week. :)
 
So which law in the US is Apple mandated to follow? It would be good to know, because none of the articles cite this, and it would mean Google would have to follow it too.
Likely 18 U.S. Code § 2258 - Failure to report child abuse and related laws:
* 18 U.S. Code § 2258A - Reporting requirements of providers
* 18 U.S. Code § 2258B - Limited liability for providers or domain name registrars
* 18 U.S. Code § 2258C
* 18 U.S. Code § 2258D - Limited liability for NCMEC
* 18 U.S. Code § 2258E - Definitions
 
What "analysis is done on device"? If you had actually read what Apple is doing this is only regarding images loaded up the Cloud. Heck, Youtube has something similar regarding copyright flagging videos before they are actually posted. Learn how the flipping tech works.

The hash database is stored on your device, and your photos will be scanned against it as long as iCloud Photos is turned on, even when you have no internet connection. Yes, the scanning is done *on your device*.
 
“The database will be part of iOS 15, and is a database of fingerprints, not images. Apple does not have the images in NCMEC’s library of known CSAM, and in fact cannot — NCMEC is the only organization in the U.S. that is legally permitted to possess these photos.”
So how are Apple going to manually check flagged accounts before reporting them to NCMEC - which they have promised - if they don't have the images that they supposedly matched? And, no, it's not necessarily always going to be a case of "this is a picture of a tree, not a naked child so it is obviously a false match" because the type of "fingerprint" they are using means that a false match is likely to be visually similar to a CSAM image in at least some way. Will Apple's checkers have the authority to make a judgement, or (more likely) will they be required to presume that a match is valid unless it's a glaringly obvious computer fault?
 
That has nothing to do with it. Apple doesn’t scan the photos or look at what’s in them.
The software running on your iPhone analyses your photos before they are uploaded to see if they are visually similar to anything in a third party database of known CSAM images.

They're using a perceptual hash which has more in common with tagging faces in a photo than using a cryptographic hash to verify that two files are identical. The latter would be useless for this purpose, since changing a single pixel in an image would - by design - produce a different cryptographic hash. I'd give it a better-than-even chance that tagging your friends' faces in photos uses something very much like perceptual hashes.
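To make that distinction concrete, here's a minimal sketch (a toy average-hash over a hard-coded 4x4 grayscale "image", nothing like Apple's actual NeuralHash): changing a single pixel completely changes the cryptographic hash but leaves the simple perceptual hash untouched.

```python
import hashlib

def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if the pixel is
    brighter than the image's mean brightness. Visually similar
    images produce similar bit strings."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def crypto_hash(pixels):
    """Cryptographic hash (SHA-256) of the raw pixel bytes."""
    return hashlib.sha256(bytes(p for row in pixels for p in row)).hexdigest()

# A tiny 4x4 grayscale "image" and a copy with one pixel nudged by 1.
original = [[10, 20, 200, 210],
            [15, 25, 205, 215],
            [12, 22, 202, 212],
            [18, 28, 208, 218]]
tweaked = [row[:] for row in original]
tweaked[0][0] += 1  # change a single pixel

print(average_hash(original) == average_hash(tweaked))  # True  - perceptual hash unchanged
print(crypto_hash(original) == crypto_hash(tweaked))    # False - cryptographic hash differs
```

A real perceptual matcher would also compare hashes by Hamming distance rather than exact equality, so near-duplicates (resized, recompressed, slightly cropped) still match.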
 
And why would they not do that server-side, with other cloud services?
Server-side, I’m pretty sure it's already done in some jurisdictions when the data is not end-to-end encrypted. There’s a reason the CCP wants all Chinese data to reside on local servers hosted by local partners (and performance isn’t likely the top one)...

Besides, doing it locally saves money because less server computational power is used (and server GPUs are at least 2x the price of consumer ones for the same computing capacity).
 
So how are Apple going to manually check flagged accounts before reporting them to NCMEC - which they have promised - if they don't have the images that they supposedly matched? And, no, it's not necessarily always going to be a case of "this is a picture of a tree, not a naked child so it is obviously a false match" because the type of "fingerprint" they are using means that a false match is likely to be visually similar to a CSAM image in at least some way. Will Apple's checkers have the authority to make a judgement, or (more likely) will they be required to presume that a match is valid unless it's a glaringly obvious computer fault?
Apple checks the contents of the safety vouchers which contain an encrypted hash of the match plus a "visual derivative" of the image.

Again, from Gruber's article:

"Furthermore, one match isn’t enough to trigger any action. There’s a “threshold” — some number of matches against the CSAM database — that must be met. Apple isn’t saying what this threshold number is, but, for the sake of argument, let’s say that threshold is 10. With 10 or fewer matches, nothing happens, and nothing can happen on Apple’s end. Only after 11 matches (threshold + 1) will Apple be alerted. Even then, someone at Apple will investigate, by examining the contents of the safety vouchers that will accompany each photo in iCloud Photo Library. These vouchers are encrypted such that they can only be decrypted on the server side if threshold + 1 matches have been identified. From Apple’s own description:

Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content. The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account.
Even if your account is — against those one in a trillion odds, if Apple’s math is correct — incorrectly flagged for exceeding the threshold, someone at Apple will examine the contents of the safety vouchers for those flagged images before reporting the incident to law enforcement. Apple is cryptographically only able to examine the safety vouchers for those images whose fingerprints matched items in the CSAM database. The vouchers include a “visual derivative” of the image — basically a low-res version of the image. If innocent photos are somehow wrongly flagged, Apple’s reviewers should notice."

Apple does not have the actual CSAM images - they cannot. Only NCMEC is allowed to have the actual images.
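For intuition on how "threshold secret sharing" behaves, here's a minimal Shamir-style sketch (the prime field, the share count and the threshold of 10 are all invented for illustration; Apple's actual construction is different and more involved). The point is that the key protecting the vouchers simply cannot be reconstructed until enough matching shares exist.

```python
import random

PRIME = 2**127 - 1  # field for a toy (threshold, count) secret-sharing scheme

def make_shares(secret, threshold, count):
    """Split `secret` so that any `threshold` shares recover it,
    but fewer reveal essentially nothing."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, count + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 to reconstruct the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

voucher_key = 123456789  # stand-in for the key that would unlock the vouchers
shares = make_shares(voucher_key, threshold=10, count=30)  # one share per matched photo

print(recover(shares[:9]) == voucher_key)   # almost certainly False - below the threshold
print(recover(shares[:10]) == voucher_key)  # True - threshold reached, vouchers can be opened
```

In this picture, each matched photo contributes one share, so below the threshold the server is holding vouchers it mathematically cannot open.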
 
What "analysis is done on device"? If you had actually read what Apple is doing this is only regarding images loaded up the Cloud. Heck, Youtube has something similar regarding copyright flagging videos before they are actually posted. Learn how the flipping tech works.

How about you read the flipping article I was commenting on?
...Craig Federighi, said that the on-device nature of Apple's CSAM detection method, compared to others such as Google who complete the process in the cloud...
 
None of those require on device scanning.
Agreed. But does having them sitting on an Apple server via iCloud Photos require reporting? If so, I would prefer to know about it before I upload the photo. The only way that it can be caught is by scanning during the upload process but before it is uploaded (i.e. on device). From my understanding, that is what the process is intending to do.
 
What "analysis is done on device"? If you had actually read what Apple is doing this is only regarding images loaded up the Cloud. Heck, Youtube has something similar regarding copyright flagging videos before they are actually posted. Learn how the flipping tech works.
There is a difference here. YouTube allows the video to be uploaded and then scans it for infringing content. In this case, the images are scanned BEFORE they are uploaded (i.e. on device). However, the "on device" scan takes place during the upload process rather than after.
 
But does having them sitting on an Apple server via iCloud Photos require reporting?
I would think so.

If so, I would prefer to know about it before I upload the photo.
You won't really get notified of anything, unless they find something, that is. And you're assuming they still won't scan server-side. Remember, they haven't promised E2EE yet, and they still have to account for iCloud photos not uploaded from an Apple device.

The only way that it can be caught is by scanning during the upload process but before it is uploaded (i.e. on device). From my understanding, that is what the process is intending to do.
That doesn't follow. There are many ways for something uploaded to a server to be sequestered until scanned.
 
None of those require on device scanning.

The law doesn’t necessarily mandate all the technical nitty-gritty of how a certain compliance is achieved.

And it’s not on-device scanning remotely polled by Apple; it’s offline pre-labeling (performed not by Apple but by your device, not tethered to Apple in any way) to prevent illegal material from reaching Apple's servers. Only once the pics have been uploaded to Apple's servers is the output of the scanning collected by Apple. I will never stop stressing that the double-blind, on-device part of this process is just a technicality. Never. Say it's on-device scanning 100 times and I will be there 100 times to correct all of you. I don’t make the rules.
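As a rough illustration of that claimed flow (everything below is invented for the sketch: the SHA-256 stand-in for a perceptual fingerprint, the plaintext "voucher" dict, and the threshold of 10; in particular, the real vouchers are cryptographically sealed, which this toy does not model), the shape is: label on device during upload, and only act server-side once an account crosses the threshold.

```python
import hashlib

# Illustrative only: a toy "known fingerprints" set and an invented threshold.
VOUCHER_THRESHOLD = 10
KNOWN_FINGERPRINTS = {hashlib.sha256(f"known-{i}".encode()).hexdigest() for i in range(1000)}

def device_prepare(photo: bytes) -> dict:
    """Runs on the device during the iCloud Photos upload path: every photo
    gets a voucher attached, match or no match."""
    fingerprint = hashlib.sha256(photo).hexdigest()  # stand-in for a perceptual fingerprint
    return {
        "photo": photo,
        "voucher": {"fingerprint": fingerprint, "visual_derivative": photo[:16]},
    }

def server_review(uploads: list[dict]) -> None:
    """Runs server-side after upload: nothing is surfaced for human review
    until an account's matching vouchers exceed the threshold."""
    matched = [u["voucher"] for u in uploads
               if u["voucher"]["fingerprint"] in KNOWN_FINGERPRINTS]
    if len(matched) > VOUCHER_THRESHOLD:
        print(f"{len(matched)} matches: human review of the visual derivatives")
    else:
        print("below threshold: vouchers stay sealed, nothing is flagged")

uploads = [device_prepare(f"holiday-photo-{i}".encode()) for i in range(50)]
server_review(uploads)  # prints: below threshold: vouchers stay sealed, nothing is flagged
```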
 