No, it does not. Please do a little research before continuing to spread misinformation. Apple servers are required for CSAM detection to occur, so if you don't use iCloud Photo Library, it doesn't work.

You are incorrect. I've read significant amounts of both their technical documentation as well as other "expert" analysis.

The CSAM hashes are on your phone. The system matches your photos' hashes against them, on your phone. That happens no matter what. They don't leave the device unless iCloud Photos is enabled, but the hash matching happens on your device regardless.
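To make that concrete, here's a minimal sketch (in Python) of what on-device matching against a local hash database could look like. Everything here is my own assumption for illustration: Apple's actual pipeline uses NeuralHash plus private set intersection, so the device never learns a plain true/false result the way this toy does.

```python
# Illustrative sketch only -- not Apple's implementation. Assumes each photo
# has a fixed-width perceptual hash (an int here) and the device stores a
# local database of known hashes. In the real system the comparison result
# is blinded by private set intersection, not returned as a boolean.

def hamming_distance(a: int, b: int) -> int:
    """Count the bits that differ between two equal-width hashes."""
    return bin(a ^ b).count("1")

def on_device_match(photo_hash: int, known_hashes: set[int],
                    max_distance: int = 0) -> bool:
    """Check one photo's hash against the on-device database.
    Nothing leaves the device in this step."""
    return any(hamming_distance(photo_hash, h) <= max_distance
               for h in known_hashes)

# Hypothetical hash values, for demonstration only.
known = {0x1A2B3C4D5E6F7081, 0x0123456789ABCDEF}
print(on_device_match(0x1A2B3C4D5E6F7081, known))  # True: in the database
print(on_device_match(0x00000000DEADBEEF, known))  # False: no match
```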
 
Yes, I get that, 100%, and agree 100%. But that's not the point... please make sure to read my entire post, or simply disengage from the thread completely if you're going to lump all counterpoints together and dismiss them.

I explained why I equate the on-device scan to server-side scan, that was not meant as an answer to the whole post.

One thing at a time.

The on-device scan performed on iPhones that don't use iCloud Photos is just a moot technicality, since not even God, the Pope, Tim Cook, Superman, etc. could read those security vouchers until you activate iCloud Photos.

Good that you 100% agree on this.
 
I think I’ve finally gathered my thoughts on this:

1. This whole situation is fishy to me. We know the U.S. government has put pressure on Apple to weaken security and privacy before. Given that CSAM scanning can be thwarted by changing a single setting, it feels like this is malicious compliance.

2. This has put smartphones back in perspective for me. We've known they're privacy nightmares for over a decade now, yet we seem to have deluded ourselves into thinking we were secure.

3. I do appreciate the tightrope Apple is trying to walk here; using hashes to scan for known photos is quite ingenious. Even if I believe they shouldn't scan a single ****ing thing, I can appreciate it from a technical perspective: trying to keep the neural matching off of their own servers.

For me, I think I’ll continue using Apple devices, but be more attentive to what I upload. I’m not a pedo so I have nothing to worry about currently, and if the U.S. gov comes after me for wrongthink, it’s probably because I’ve publicly stated my sentiments about them before. (Glowies can suck it)

I’m not gonna apologize for Apple at all, but I do think we had an unrealistic view of privacy from them that’s now been shattered.

After reading your post I have to ask: why did Apple even do this? It seems like a lot of work for very little gain.
 
No, I'm just not understanding the discrepancy, and I'm providing examples. The person might not have the original picture, only a modified one, but that still gets flagged. Yet a similar picture with an adult/legal subject won't? How? I thought modifications still result in a match. Remember, that person doesn't have the original, ONLY the modified one, yet it's still a match? Is that correct?

And it would be extremely helpful if it weren't for the patronizing responses. I said from the beginning that I didn't fully understand and was looking for help.
But I gave you an answer… it is nearly impossible to match an offending image with an innocent image (Photoshopped or not).

Even if you could match more than one, Apple puts the odds of an account being falsely flagged at one in a trillion per year (see the back-of-the-envelope sketch after this post).

If you want to argue that the information Apple gave, or even the odds they provided, is incorrect, then discuss it with them. If you need a more detailed explanation as to why the hashes from a personal photo cannot match those of a database photo, read the white paper again until you understand it; it was quite clear to me even without grasping the mind-numbing details of how the tech works.

I honestly don't think there is anyone on here who could provide you with the "how" beyond what is written in the white paper.

Plenty of people on here may or may not read it, but will chime in with every incorrect reason why it WON'T work as Apple claims. 🤣
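For anyone curious where a number like that could come from, here's the back-of-the-envelope sketch mentioned above. All of the parameters below (per-image false-positive rate, threshold, library size) are made-up values for illustration; Apple has not published a per-image rate.

```python
# Toy numbers only: p, threshold, and n are assumptions, not Apple's
# published parameters. The point: requiring many independent matches
# before vouchers can be opened makes an account-level false positive
# astronomically unlikely, even if single-image false positives happen.
from math import comb

p = 1e-6          # assumed per-image false-positive probability
threshold = 30    # assumed number of matches before vouchers open
n = 10_000        # assumed number of photos uploaded from the account

# P(at least `threshold` chance matches) is a binomial upper tail.
# Successive terms shrink by roughly n*p/k, so a few dozen terms suffice.
term = comb(n, threshold) * p**threshold * (1 - p)**(n - threshold)
tail = 0.0
for k in range(threshold, threshold + 50):
    tail += term
    term *= (n - k) / (k + 1) * p / (1 - p)  # ratio of consecutive terms
print(f"P(account falsely crosses threshold) ~ {tail:.1e}")  # ~4e-93 here
```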
 
I explained why I equate the on-device scan to server-side scan, that was not meant as an answer to the whole post.

One thing at a time.

The on-device scan performed on iPhones that don't use iCloud Photos is just a moot technicality, since not even God, the Pope, Tim Cook, Superman, etc. could read those security vouchers until you activate iCloud Photos.

Good that you 100% agree on this.

I agree that's how it works. I agree nobody can see or do anything with that information. I don't agree it's moot (per my home/HOA scenario in the previous comment).
 
Read the FAQ, trust Apple, have always encrypted what I consider to be sensitive info, nothing to see here, move along.
 
20 pages of individuals losing their minds over imagined, baseless scenarios? Very rational...

Yeah I agree on that. But there are a few who are actually trying to engage in some rational discussion here. Just because something doesn't fit into your narrative doesn't mean it's irrational and "dismissable".
 
You are incorrect. I've read significant amounts of both their technical documentation as well as other "expert" analysis.

The CSAM hashes are on your phone. The system matches your photos' hashes against them, on your phone. That happens no matter what. They don't leave the device unless iCloud Photos is enabled, but the hash matching happens on your device regardless.

But you 100% agree that this distinction is moot so we're all good.
Those hashes are behind the event horizon of a cryptographic black hole.
It's like they don't exist if a user doesn't activate iCloud Photos.
 
Yeah I agree on that. But there are a few who are actually trying to engage in some rational discussion here. Just because something doesn't fit into your narrative doesn't mean it's irrational and "dismissable".
I haven't dismissed anyone who has posted a rational argument, or claimed they're irrational. Everyone's entitled to their opinion. However, when said individuals don't use facts to back up their thoughts, you have a recipe for chaos.
 
But you 100% agree that this distinction is moot so we're all good.
Those hashes are behind the event horizon of a cryptographic black hole.
It's like they don't exist if a user doesn't activate iCloud Photos.

I just posted right above this, but I'll respond the same here.

I agree that nothing can be done with it while it's on my device. I don't agree that's moot, though (again, per previous comments).
 
Buzzword virtue signalling.
They're using a creative way to pre-label illegal stuff in a double-blind way that doesn't get revealed until you upload the data to their server anyway.
Genius and elegant solution if you ask me.
Sneaky digging in my phone if you ask me. There is nothing elegant about it. It's an invasion, plain and simple.
 
I haven't dismissed anyone who has posted a rational argument, or claimed they're irrational. Everyone's entitled to their opinion. However, when said individuals don't use facts to back up their thoughts, you have a recipe for chaos.

Then what's your take on this? I legitimately want to know, because I legitimately want to come around to Apple's side on this. You dismissed it before, but that's the concern that many have. We don't disagree that Apple is doing some great things from an encryption standpoint, or even that there's a risk in doing it this way.
 
You are incorrect. I've read significant amounts of both their technical documentation as well as other "expert" analysis.

The CSAM hashes are on your phone. The system matches your photos' hashes against them, on your phone. That happens no matter what. They don't leave the device unless iCloud Photos is enabled, but the hash matching happens on your device regardless.


I think you may be splitting hairs here, but you're probably referring to this information from the above-referenced document.

"Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child-safety organizations. Apple further transforms this database into an unreadable set of hashes, which is securely stored on users’ devices." (page 4)


Just a little bit further down on that page:


"Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against th database of known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines whether there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result. It also encrypts the image’s NeuralHash and a visual derivative. This voucher is uploaded to iCloud Photos along with the image.


Using another technology called threshold secret sharing, the system ensures that the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content. Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images."


So, while the hashes are present on an iOS 15+ device, no matching/scanning is done unless the photo is uploaded to iCloud Photo Library.
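Since threshold secret sharing is the piece people seem to gloss over, here's a minimal toy version of the textbook idea (Shamir-style, over a prime field). It is not Apple's construction; it just shows the property the white paper leans on: below the threshold the shares reveal nothing useful, and at the threshold the secret (think: the key material that lets the vouchers be read) falls out.

```python
# Toy Shamir-style threshold secret sharing -- not Apple's actual scheme.
# The secret is the constant term of a random polynomial; each share is a
# point on the polynomial; `threshold` points determine it, fewer do not.
import random

PRIME = 2**127 - 1  # field modulus (a Mersenne prime)

def make_shares(secret: int, threshold: int, count: int):
    """Split `secret` so that any `threshold` of `count` shares recover it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x: int) -> int:
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, count + 1)]

def recover(shares) -> int:
    """Lagrange interpolation at x = 0 yields the constant term."""
    total = 0
    for xi, yi in shares:
        num = den = 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        total = (total + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return total

secret = 123456789
shares = make_shares(secret, threshold=3, count=5)
assert recover(shares[:3]) == secret   # 3 shares: the secret opens
assert recover(shares[:2]) != secret   # 2 shares: still locked (w.h.p.)
```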
 
But I gave you an answer… it is nearly impossible to match an offending image with an innocent image (Photoshopped or not).

But that does not track if modifications to an offending image are flagged. I can superimpose an adult in the picture and it still gets flagged, right? Or are they overstating how much modification still triggers a match?
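Here's one way to see how both of those can be true at once. Perceptual hashes are built so that edits which leave the picture recognizably the same barely move the hash, while a genuinely different picture lands far away. Below is a toy "average hash" on fake 2x2 images; Apple's NeuralHash is a learned neural network, not this, so treat it purely as an illustration of the principle.

```python
# Toy "average hash": one bit per pixel, set if the pixel is brighter than
# the image's mean. Real perceptual hashes (and NeuralHash) are far more
# sophisticated, but the robustness property illustrated here is the same.

def average_hash(pixels):
    """pixels: 2-D list of grayscale values; returns a list of bits."""
    flat = [v for row in pixels for v in row]
    mean = sum(flat) / len(flat)
    return [1 if v > mean else 0 for v in flat]

def hamming(h1, h2):
    return sum(b1 != b2 for b1, b2 in zip(h1, h2))

original   = [[10, 200], [220, 30]]
brightened = [[30, 220], [240, 50]]   # same picture, brightness raised
different  = [[200, 10], [30, 220]]   # a genuinely different picture

h0, h1, h2 = map(average_hash, (original, brightened, different))
print(hamming(h0, h1))  # 0: the edit didn't move the hash at all
print(hamming(h0, h2))  # 4: every bit differs
```

So the matching isn't about subject matter at all; it's about whether the image is a derivative of one specific known picture. That's why an edited copy of a database image can still match while a visually similar but unrelated photo of a legal subject shouldn't.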
 
20 pages of individuals losing their minds over imagined, baseless scenarios? Very rational...
It's not baseless. We saw what happened to the US when Trump was in power. He was very authoritarian, and while Democrats cried out, he didn't give a single s*** because his constituents were cheering him on.

Now take a look at Germany and France. Authoritarian parties are on the rise because there are many people empowering them. If a new hashing requirement is signed into law, will Apple comply or exit the market(s)? And by market(s), I'm not referring to authoritarian states such as China because we know Apple won't exit existing authoritarian markets. I'm talking about western democracies that are reaching a tipping point.
 
It pre-scans on your device just as a courtesy to both you and them, then it wraps the results of the scan into an impossible-to-crack cryptographic vault that can only be opened if 2 things happen
1) you upload the pics to their servers (so the scan being on-device is MOOT, like the sound of a tree falling in a forest if nobody is listening)
2) you score multiple matching kiddie p0rn hashes

No human in this universe can look into those security vouchers if these 2 things don’t both happen.
Not sure about parallel universes, metaverses and multiverses.
Pre-scans? Please. Call it what it is: it's scanning. If it's moot without uploading, why bother? Just do it on the server side.
 
Sneaky digging in my phone if you ask me. There is nothing elegant about it. It's an invasion, plain and simple.
It's hardly "sneaky" if Apple has informed the general public about their plans.

It's also not an invasion, and you have the right to opt in, or even to disable iCloud Photo Library.
 
I bet you the Chinese government will be very interested in this technology.

Notice how Apple keeps changing the wording and being very careful with the document. I hope this backfires and Apple ends up in a massive lawsuit.

Privacy matters. Apple: let us (the consumers) decide if we want you to scan our iPhones.

Apple, you are a TECH company. You are not law enforcement. Don't lose your vision. Stop chasing the $.

Reports like this will be out left and right…

Easiest way to avoid this spying technology:

1. Turn off iCloud Photos and Messages.
2. Do not log in to your iPhone using your iCloud credentials.
3. Possibly use a fake account to continue using your iPhone. Or simply do not log in with your Apple ID at all.


Where is my grandparent Nokia? I might need it... and the Fujifilm camera.
 
It's not baseless. We saw what happened to the US when Trump was in power. He was very authoritarian, and while Democrats cried out, he didn't give a single s*** because his constituents were cheering him on.

Now take a look at Germany and France. Authoritarian parties are on the rise because there are many people empowering them. If a new hashing requirement is signed into law, will Apple comply or exit the market(s)? And by market(s), I'm not referring to authoritarian states such as China because we know Apple won't exit existing authoritarian markets. I'm talking about western democracies that are reaching a tipping point.
No, you're introducing American politics into a discussion about on-device image analysis that you have the right to opt out of.
 