They ARE Scanning your photos on device. From the Apple FAQ:

Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.

Use whatever tech babble you want, "On-Device Matching" IS scanning. Apple looks through your photos, finds naughty things, and tells the cops. All without a warrant.
Matching is not scanning; the issue is that 99% of users don't know what a hash is. It is not the photos, just numbers.
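To make the "a hash is not an image" point concrete, here is a rough Python sketch. SHA-256 over the file bytes stands in for Apple's NeuralHash (which is a perceptual hash), and the hash list and file names are made up, so treat this as an illustration of the idea, not Apple's implementation:

```python
import hashlib
from pathlib import Path

# Stand-in for the on-device hash function. Apple's real system uses NeuralHash,
# a perceptual hash; SHA-256 over the raw file bytes is only a placeholder here.
def image_hash(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

# Hypothetical local list of known-CSAM hashes: just opaque strings, not images.
# (In Apple's design the list is additionally blinded, so it can't be read like this.)
KNOWN_HASHES = {
    "3f79bb7b435b05321651daefd374cd21",  # made-up example values
    "9b74c9897bac770ffc029102a200c5de",
}

def matches_known_hash(path: Path) -> bool:
    # Only the derived number is compared; the photo's content is never inspected.
    return image_hash(path) in KNOWN_HASHES

if __name__ == "__main__":
    sample = Path("IMG_0001.jpg")  # hypothetical local photo
    if sample.exists():
        print(matches_known_hash(sample))
```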
 
And if Apple announced that it was scanning iCloud photos for CSAM, there would be a lot less outrage.
Apple has been doing this since 2019. This latest announcement is an evolution of that initiative, and I believe it will lead to E2EE for iCloud Photos.
 
I expect that eventually they will just keep adding to the list of things they scan for.

It's only good for areas where you have a lot of images and people are storing copies of these images.
Basically, people would have to store the same "iconic" photos.

Let's say a government wants to force Apple to use this system to find people engaging in illegal protests. How would they do it? Please describe each step you believe would have to be taken and by whom, and how it would work.
 
Who has appointed Apple as an investigator for paedophilia crimes?

Isn't that a job for the police? They are the ones supposed to investigate on suspicion of a crime.

Why should Apple scan MY mobile or computer? What I have in my phone or computer is MY business.

If I have committed a crime, it is up to the police to investigate it, not Apple, which would be mass-spying on everyone's devices on behalf of the police.

This is a first step for total monitoring.
We are talking about pedophiles. Google and Facebook already do this, and not with a hash, which guarantees privacy. Only the privacy of users with real CSAM images is at risk... poor guys.
 
good point. The rule breakers are free to move on while the rule followers are stuck with this.

Kinda like the people who refuse to get vaccine shots and the rest of us who did now have to put masks back on.
Which raises the question: what's the real purpose of this? Many experts in the field have already pointed out that the impact will be minimal because the root issues lie elsewhere.
 
I really don't get the argument that scanning on the device is more private than doing the scanning in iCloud on Apple's servers.
My three major thoughts are:

a) Now you get the ability to compare pictures you upload to iCloud to the CSAM database on millions of devices. This technology will attract the interest of law enforcement agencies, governments etc.
b) It makes no sense that the scanning is done locally when your pictures aren't end-to-end encrypted in iCloud Photos. Apple could do the scanning in the cloud, because Apple can access your cloud data anyway.
c) It would make no sense to do the scanning locally if iCloud Photos were end-to-end encrypted, because in that case Apple would have no access.

If push came to shove by the government (that may have already happened), which would you prefer:

1. Apple scanning your phone's hashed photos against a table of hashed photos of child porn.

2. The government scanning your photo jpegs on Apple's servers, by government mandate.
 
Yeah but they'd be able to see ALL of them on iCloud. This limits them to only being able to see the ones that are flagged without decrypting anything on iCloud. They'd have to decrypt your entire iCloud library to do the scanning there. This keeps them from having to do that...

Again, I still don't know if I like that any better. But that's the logic behind it. A lot of people are theorizing that this will allow them to make iCloud fully E2EE, without them having any keys. But I still don't know if it's worth it.
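Roughly, that "only the flagged ones" logic hinges on the 30-match threshold, as in the toy simulation below. Apple's actual design uses threshold secret sharing so the server mathematically cannot open any voucher below the threshold; this sketch just counts matches to illustrate the behaviour, and every name in it is made up:

```python
# Toy simulation of the 30-match threshold. In Apple's real design the server
# cannot cryptographically open any voucher below the threshold; a simple
# counter here only illustrates the intended behaviour.
THRESHOLD = 30

class VoucherStore:
    def __init__(self) -> None:
        self.matched_vouchers: list[bytes] = []  # opaque blobs attached to matching uploads

    def add_upload(self, is_match: bool, voucher: bytes) -> None:
        if is_match:
            self.matched_vouchers.append(voucher)

    def reviewable(self) -> bool:
        # Below the threshold, nothing is supposed to be visible to Apple at all.
        return len(self.matched_vouchers) >= THRESHOLD

store = VoucherStore()
for i in range(29):
    store.add_upload(is_match=True, voucher=f"voucher-{i}".encode())
print(store.reviewable())   # False: 29 matches, nothing reviewable
store.add_upload(is_match=True, voucher=b"voucher-29")
print(store.reviewable())   # True: the 30th match crosses the threshold
```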
Yeah for me I don't like that, because sometimes I take private photos and I do not upload them to the cloud. I like to think they stay on the phone, the idea that some guy at Apple could in theory be looking at them, doesn't appeal to me. Even if extremely unlikely, it is possible.

Everything on the cloud I upload knowing someone could look at it, now with this new system, every photo I take I would do so knowing someone could look at it, before it is even uploaded. This is 1984.
 
And the rest of us will still be subject to it...
Only if you have 30 images that match known CSAM images from an independent database. In that case your privacy will be at risk... but then you are a pedophile. Remember, a match is not a scan, and a hash is not an image.
 
No. Apple will be scanning your hashed photos against a table of hashed photos of child porn.

Would you prefer the government scan your photo JPEGs on Apple's servers, by government mandate, should push come to shove?
Yes. Yes I would. That is much clearer. It's not my server, so I'm OK with abiding by the rules to use it. My device, my control.
 
Only if you have 30 images that match known CSAM images from an independent database. In that case your privacy will be at risk... but then you are a pedophile. Remember, a match is not a scan, and a hash is not an image.
That's a cover-up lie and a random number Craig is throwing at us.
 
From the interview, the actual mechanism in place to scan photos doesn’t seem that bad (using hashes and needing 30 hashes to form a match — not actually viewing photos on your phone when they’re being uploaded to iCloud). Is the concern around this being abused by corrupt governments around the world? I’m just trying to understand in which ways this can be abused by authorities around the world.
Correct. The arguments against are almost all slippery slope type arguments. They do have merit, but I'm not sure they warrant the level of hysteria we've seen around this feature, until Apple shows otherwise.

I think what this situation has highlighted for many people is how much we have to trust Apple (or any OS provider). When Apple says they won't put random items in the list to check, and that the list can only be updated with an iOS build, users have to trust them. When Apple says they are not scanning all photos today, users have to trust them. When Apple says this new feature will only be triggered when sending photos to iCloud... I think you get the point.

For a lot of people this was a bit of a wake-up call, because of 1) lack of knowledge and 2) Apple's privacy push.
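If Apple publishes a root hash of the database that ships with each iOS build (something it has suggested as an auditability measure), a check could look roughly like this sketch; the file path, digest function, and published value are all hypothetical:

```python
import hashlib
from pathlib import Path

# Hypothetical: digest of the on-device hash database file, compared against a
# value Apple would publish for the corresponding iOS build. The path and the
# "published" digest below are made up for illustration.
DB_PATH = Path("/path/to/on_device_csam_hash_db.bin")
PUBLISHED_ROOT_HASH = "expected-digest-published-by-apple"

def database_digest(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

def database_matches_published(path: Path = DB_PATH) -> bool:
    # If this check failed, the device would be carrying a database Apple never disclosed.
    return database_digest(path) == PUBLISHED_ROOT_HASH

if __name__ == "__main__":
    if DB_PATH.exists():
        print(database_matches_published())
```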
 
Child porn is a crime. That's Apple's justification for this tool: WARRANT-LESS SCANNING OF YOUR PHONE FOR CRIMES. Today it's one of the more shocking crimes. But what about tomorrow? Monitoring your camera and microphone for crimes in progress? Scanning all your finances for insider trading? Monitoring your GPS for failure to come to a complete stop at known stop-sign intersections?

This also makes Apple liable for content, since they will now be filtering it, and they can be sued when they fail to, forfeiting most of the protections of Section 230.

There is little doubt why Apple is doing this: they are under pressure from law enforcement to unlock phones, which they have already taken a stand against doing. Apple has therefore decided they will police the phones themselves, building in a back door for their own access to data for whatever purpose they decide is necessary.

Let’s not forget hacking. What’s stopping a Russian darknet freelancer from being hired by anyone to hack your iCloud Photos on the web, inserting matched-hash images which propagate to all your devices and result in your arrest and prosecution?
 
This is so sad. Craig was the one guy at Apple I respected the most. He has a great personality, speaking voice, and image that makes me wonder if he should’ve been a marketing executive instead of engineering VP. Apple knows he’s a guy people listen to, so they believe his charisma will calm people down. Even he can’t stop this train wreck.
 
What I don't understand is how Apple can get away with hosting known illegal-to-possess child porn images on its servers. If the fingerprint matches, they are identified illegal images. Legally Apple should reject them immediately once they are identified and not permit them to exist on systems it owns and controls. Or maybe they don't understand that they are blatantly breaking the law themselves, or don't care as they are "the good guys" and consider themselves immune from prosecution. Google rejects them out of hand and doesn't permit them on its servers. What makes Apple special here? And then Apple has a group of non-law-enforcement people looking at those child porn images to see if there are enough of them to report the uploader.
 