
I think you may be splitting hairs here, but I think you're referring to this information from the above referenced document.

"Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child-safety organizations. Apple further transforms this database into an unreadable set of hashes, which is securely stored on users’ devices." (page 4)


Just a little bit further down on that page:


"Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against th database of known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines whether there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result. It also encrypts the image’s NeuralHash and a visual derivative. This voucher is uploaded to iCloud Photos along with the image.


Using another technology called threshold secret sharing, the system ensures that the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content. Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images."



So, while the hashes are present on an iOS 15+ device, no matching/scanning is done unless the photo is uploaded to iCloud Photo Library.
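
For anyone wondering what the "threshold secret sharing" in the quoted passage actually means, here's a rough Python sketch of the underlying idea (this is a generic Shamir-style construction put together only for illustration, not Apple's implementation; the field prime, the threshold of 30, the share count, and all names here are assumptions): a key is split into shares so that any `threshold` of them reconstruct it, while fewer reveal nothing.

```python
# Illustrative sketch of threshold secret sharing (Shamir-style), NOT Apple's code.
# Idea: a decryption key is split so that any `threshold` shares reconstruct it,
# but fewer shares reveal nothing useful about the key.
import random

PRIME = 2**127 - 1  # a large prime field for the arithmetic (assumed for this sketch)

def split_secret(secret: int, threshold: int, num_shares: int):
    """Split `secret` into points on a random polynomial of degree threshold-1."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    shares = []
    for x in range(1, num_shares + 1):
        y = sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
        shares.append((x, y))
    return shares

def reconstruct(shares):
    """Lagrange interpolation at x=0 recovers the secret, given >= threshold shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

key = 123456789  # stand-in for a per-account decryption key
shares = split_secret(key, threshold=30, num_shares=100)

print(reconstruct(shares[:30]) == key)  # True: threshold reached, key recoverable
print(reconstruct(shares[:29]) == key)  # False: below threshold, key stays hidden
```

In the design Apple describes, each positive match would contribute one share inside its safety voucher, so the server can only interpret voucher contents once an account has crossed the match threshold.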

Thank you for finding that as I was trying to dig it up myself again.

Maybe I am splitting hairs, but I still see no verbiage that states the matching/scanning does not happen if iCloud Photos is disabled. Nothing happens with that match until the photo is uploaded to iCloud Photos, but nothing clearly states the match does not happen if it is disabled.
 
The way I heard it described on a podcast was like this:

Apple is not doing a visual scan of your photos. They're not looking at the actual contents of your photos.

They are, instead, comparing hashes of *known* CSAM images. These are photos that have already been labeled as child porn.

So there's no danger of Apple flagging a photo of your child in the bathtub or whatever.
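
In code terms, the "comparing hashes of known images" part boils down to a set-membership check rather than any analysis of what's in the photo. Here's a deliberately oversimplified Python sketch; the real system uses a perceptual NeuralHash, a blinded database, and private set intersection so the device never learns the result, and the SHA-256 and sample byte strings below are just made-up stand-ins:

```python
# Simplified sketch of "match against known hashes", NOT Apple's pipeline.
# Real system: perceptual NeuralHash + blinded database + private set intersection.
# Here: plain hashes and a set, purely for intuition.
import hashlib

known_bad_hashes = {
    # stand-in for hashes of already-identified images (hypothetical value)
    hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
}

def matches_known_database(image_bytes: bytes) -> bool:
    """A photo only 'matches' if its hash equals a hash already in the database."""
    return hashlib.sha256(image_bytes).hexdigest() in known_bad_hashes

print(matches_known_database(b"known-bad-image-bytes"))    # True: exact known image
print(matches_known_database(b"photo-of-kid-in-bathtub"))  # False: never in the database
```

The point of the sketch is that an ordinary family photo never hashes into the known set; it is only ever compared, never inspected.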

With all that said... no one knows what else Apple could do in the future. Perhaps they could start scanning the actual contents of your photos. So I can see why people are freaked out.

But as others have said... all of the big companies are doing similar things. So I dunno.
This is not about Apple's abilities; it's about the right Apple has to use these abilities.

What's Apple going to do, report a hash positive to the police? I doubt that would hold up in court.

Next comes screening of dubious payments, emails, and messages, used in this or some other context.

It isn't hard to create a file with a hash equal to that of a CSAM image, so this is not a foolproof detection system.

Apple should abide by the law when the law asks this of Apple; if Apple does not agree with the local law, they should not be present there.
 
The fact that it’s already being done doesn’t justify the action. Blocking cloud access does not stop the invasion of privacy, because it is occurring locally on your device. This makes me want to become Amish. While I like the idea of the Freedom Phone, the lack of security checks on apps makes it a prime target for spyware, which is the whole issue. Spyware is now built in.
 
But that doesn’t track if modifications to an offending image are flagged. I can superimpose an adult onto the picture and it still gets flagged, right? Or are they overstating how much modification still gets flagged?
Don’t assume the hashes are limited to one part of a picture, or even to a person in the picture. There could be a very identifiable piece of furniture, a cup on a table, or writing on a wall…it’s the combination of these items that the AI looks at to determine whether the probability of a match is high enough to override a “modified” part of the image and flag it as “identical.”

Imagine this: you have one of these horrible images, but you completely wipe out the “horrible” part of the image, so no one is even in the picture anymore, and then upload that to iCloud. It could still be flagged, or it might not be, simply depending on the original parts of the image that remain…they’re not going to give away all of the details on exactly what is or is not considered.

And then if your concern is, “What if someone sends me one of these edited photos that looks completely innocent?”

My next question would be, “Why did they do that and then why did you save it to your camera roll and upload to iCloud?”

The scenario is so preposterous that it isn’t even worth addressing.
 
As much as I love Apple, I'm not convinced. Doing it on a country-by-country basis, well, hmm, what will Apple do if a given country rules that iPhones may no longer be sold without the CSAM detection operational? Will Apple then abandon that market? I note that Apple has not made that commitment to date.

In the US, what if Congress and the FBI force Apple to include "other" hashes?

It is indeed a well-meaning but very slippery slope.
 
It's hardly "sneaky" if Apple has informed the general public about their plans.

It's also not an invasion, and you have the right to opt in or simply disable iCloud Photo Library.
No. I cannot opt out of the presence of the code on my device. That is the problem. Again. Understand. They can scan on their servers. There is no reason to do it on my phone beyond some other reasons they are not telling us about…yet.
 
No. I cannot opt out of the presence of the code on my device.
The presence of code on your device is entirely irrelevant should you opt out of iCloud Photo Library.

That is the problem. Again. Understand. They can scan on their servers.
They could, but they're not. Your data is being kept on your phone.

There is no reason to do it on my phone beyond some other reasons they are not telling us about…yet.
Annnd the compulsory paranoia.
 
Then what?
Suppose a non-CSAM picture is injected by the US govt into the NCMEC repository; then what?
Suppose it even escalates to human review inside Apple; then what?
Apple’s reviewer would see it’s not kiddie p0rn and discard it.
As I said: they need a way to unlock the phone or otherwise get into it.
At the US border, they will just compel you to unlock.
Government contractors have developed and released devices to unlock phones for years.
Alternatively, you can use a software exploit.

Does that give them access to more information than before? Not necessarily.
But it may allow them to scale up searches.

What can CBP do with 10 minutes of access to your phone? They can do superficial checks and searches, look at a couple photos - but they aren‘t going to pull a 50GB backup in a few minutes.

If the phone already contains a database compiled by on-device image analysis, it may allow authorities to do mass checks for certain content in a few moments.
 
Seriously? You think they don't know how? They put that in in the first place. I don't understand why you are refusing to acknowledge that it is an invasion on the device, and that it can have serious consequences down the road.
You literally just said you could remove the code because it's easier than turning off iCloud Photo Library, so...
 
Don’t assume the hashes are limited to one part of a picture, or even to a person in the picture. There could be a very identifiable piece of furniture, a cup on a table, or writing on a wall…it’s the combination of these items that the AI looks at to determine whether the probability of a match is high enough to override a “modified” part of the image and flag it as “identical.”

Imagine this: you have one of these horrible images, but you completely wipe out the “horrible” part of the image, so no one is even in the picture anymore, and then upload that to iCloud. It could still be flagged, or it might not be, simply depending on the original parts of the image that remain…they’re not going to give away all of the details on exactly what is or is not considered.

And then if your concern is, “What if someone sends me one of these edited photos that looks completely innocent?”

My next question would be, “Why did they do that and then why did you save it to your camera roll and upload to iCloud?”

The scenario is so preposterous that it isn’t even worth addressing.

This helps. Thanks! This is all I wanted clarification on, and too bad it took so long. The idea might be preposterous, but as a developer, these details help me understand how something works.

The scenario and Apple’s claims seemed contradictory. They claimed any modification would still result in a match, but that a merely similar picture wouldn’t, which sounds a bit contradictory. Like a bathtub picture, for example: a legitimate photo vs. an offending one, with the same angle and style.
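
The apparent contradiction ("any modification still matches, but a merely similar photo doesn't") is usually resolved by a perceptual hash plus a distance threshold. Here's a toy difference-hash (dHash) sketch in Python that shows the idea; NeuralHash is a learned and far more robust embedding, and the 4x4 "images", the threshold of 3, and everything else below are made-up illustration values, not Apple's parameters:

```python
# Toy difference-hash (dHash) sketch showing why a lightly modified copy still
# matches while a similar-but-different photo does not. All data here is made up.

def dhash_bits(pixels):
    """pixels: 2D list of grayscale values. Each bit: is a pixel brighter than its right neighbor?"""
    bits = []
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits.append(1 if left > right else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

original  = [[10, 20, 30, 40], [40, 30, 20, 10], [15, 25, 35, 45], [45, 35, 25, 15]]
# Same image with a small edit (one region brightened): the hash barely changes.
modified  = [row[:] for row in original]
modified[0][0] = 60
# A different photo that merely looks similar in style: the hash pattern differs a lot.
different = [[50, 10, 60, 20], [5, 55, 15, 65], [60, 20, 50, 10], [15, 65, 5, 55]]

THRESHOLD = 3  # max differing bits still counted as "the same image" (arbitrary here)

print(hamming(dhash_bits(original), dhash_bits(modified)))   # small distance: match
print(hamming(dhash_bits(original), dhash_bits(different)))  # large distance: no match
```

A lightly edited copy of the same image lands within the threshold; a different photo that merely shares an angle or a setting lands well outside it.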
 
The presence of code on your device is entirely irrelevant should you opt out of iCloud Photo Library.
not the point
They could, but they're not. Your data is being kept on your phone.
and why is that? Ask that question. There would be no backlash and no suspicion.
Annnd the compulsory paranoia.
Suuure. Somehow my grandparents are alive because they had that kind of paranoia, living in Germany in the '30s.
 

I, as an Apple employee, think CSAM detection goes against the Apple credo. Apple can't predict what type of government the future holds, but it can control what is possible on its devices.

By installing a backdoor on its devices, Apple makes it known to every government that Apple is willing to forgo privacy. Tackling CP is a great cause, and I truly believe Apple has good intentions, but who's to say the next authoritarian government won't force Apple to censor a minor issue? Apple will comply if it's the law.

A dam doesn't fail in just one day.
 
I just posted right above this, but I'll respond the same here.

I agree that nothing can be done with it while it's on my device. I don't agree that's moot, though (again, per previous comments).

I went back to read the previous comment more carefully since, as you can understand, the first and most urgent part to answer was the one in which I was called out for being factually inaccurate.

I’ve read the analogy (yet to read the linked blog post) and it doesn’t make much sense honestly.

Apple is already inside your sacred space with both feet; they’re already sleeping on your sofa. Since the OS is not open source, you don’t know what’s really happening under the hood.

They will occasionally ask you to participate in initiatives that are not in your direct interest, like submitting error telemetry (you can opt out of this with no loss of functionality).

They will for sure update the terms and conditions to ask your consent about this too. This time, opting out will cost you access to their iCloud Photos service (you will still be able to back up your precious photos to iCloud Backup, from what I’ve gathered), but what are you gonna do; this time the stakes are pretty high: hosting CSAM on their servers or not. Of course it can’t be opted out of by people who want to store stuff on their (Apple’s) property. But as far as what happens within the (not so) sacred space of the local device, they will just stick an unreadable post-it note to the fridge, nothing more. That post-it note will sit there doing nothing and being unreadable. It’s just a technicality like a thousand others, nothing much to get triggered about or to equate to unrelated examples.

As for “why are companies and banks asked by the State to sometimes police and report their customers”, that’s how it works. You can’t bring 1M€ in cash to a bank, and you can’t upload CP to cloud storage hosts. The fact that Apple is using a workaround that makes finding CP even more privacy-minded is frankly just a technicality. (And you can live a perfectly normal digital life by uploading those pics to another service, which is a bit different from the “you can’t leave your home with the stolen items, period” example.)
 
So you’re bringing anti-Semitism into a discussion about on-device image analysis? That’s pretty low, not to mention confusing…
Seriously… your interpretation is infantile and your reading of the facts is heavily impaired. I can understand that you support Apple's move, but you can't deny that it is creating backdoors, compromising devices, and leaving us wondering what's next.
 
Google and others don't sell their products with a promise of privacy. That's the problem here. Apple does.
That’s why they implemented CLIENT-SIDE analysis so no data leaves your phone and Apple doesn’t access your photos.

You prefer server-side analysis where all your encrypted iCloud photos need to be decrypted first???

They're the first company to implement a privacy-oriented solution to ban illegal CSAM content from their servers, and everybody complains. Go figure.
 