Scanning my photos for kiddie porn has no effect on my privacy, since I don’t have any kiddie porn.
Wrong. Scanning your photos IS a violation of privacy, irrespective of whether you own incriminating material or not. In fact, since most people are innocent in that regard, scanning is nothing less than a warrantless search, and thus basically a violation of constitutional rights.
 
Until someone gets screwed over for taking a pic of their naked toddler having fun in the bath.

Or some unexpected glitch happens that causes a bunch of photos to be erroneously flagged, and suddenly parts of your photo library are being sent away for human review. If there's something juicy in there (private in terms of nudity, not illegal), you may care when you find out you were among the 0.001% of people affected by the glitch.

Personally doesn't bother me. I lead a very boring life, I take very few photos, and what is in my library is largely either a few family photos, a few pet photos, or more commonly hundreds and hundreds of screenshots I forgot to delete. If someone has to sift through my library as part of human review, I genuinely pity them. 😀
 
Good! The use of hash values in comparing known CSAM is extremely accurate, and has been in use by multiple cloud providers for years now - and has resulted in thousands of NCMEC leads which have gone out to the local ICACs - which have resulted in numerous arrests and prosecutions. There's no reason to have CSAM on your phone, period. Apple, thank you.
 
Here's the thing. It's great that you don't have any of that on your device!

But if you ever live in a country that, for some reason, wants to find something on your device and have it flagged in order to charge you with a crime, this sets a dangerous precedent.

Surveillance technology, while often well-intentioned, can easily end up in the wrong hands for nefarious purposes.

Fully agree, but it's not just that.

Any user surveillance feature compromises privacy, as it introduces a new attack vector. Once the feature is implemented, there is a risk that a bad actor corrupts the way it works to their own benefit.

To give a simple example, think of those TSA locks on suitcases. They don't infringe your privacy in an unreasonable way, because you don't carry anything illegal in your suitcase and only TSA agents have the keys, strictly to carry out lawful safety inspections, right? Yes … until someone who wasn't supposed to have them manages to get a copy of those keys and opens your suitcase.

Back to this feature: it could, for example, be hackers who manage to trick the phone into using a different hash database which, instead of matching pictures of child abuse, targets images with a certain political meaning. It could also be hackers finding a way to make the phone report its findings to a private server.
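A toy sketch of that first risk (Python, hypothetical names, nothing to do with Apple's actual code): the matching logic has no idea what the hashes in the database represent, so whoever controls the list controls what gets flagged.

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Hex SHA-256 digest of a file's raw bytes."""
    return hashlib.sha256(data).hexdigest()

def scan_library(photos: dict[str, bytes], flagged: set[str]) -> list[str]:
    # Reports every photo whose digest appears in the supplied database.
    # Note: nothing here verifies WHAT the database describes or WHO supplied it.
    return [name for name, data in photos.items() if sha256_hex(data) in flagged]

photos = {"sunset.jpg": b"sunset-pixels", "protest_flyer.png": b"flyer-pixels"}

# Intended database (digests of known abuse imagery): no matches in this library.
print(scan_library(photos, {"0" * 64}))                     # -> []

# Swapped-in database targeting political imagery: same code now flags dissent.
print(scan_library(photos, {sha256_hex(b"flyer-pixels")}))  # -> ['protest_flyer.png']
```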
 
Good! The use of hash values in comparing known CSAM is extremely accurate, and has been in use by multiple cloud providers for years now - and has resulted in thousands of NCMEC leads which have gone out to the local ICACs - which have resulted in numerous arrests and prosecutions. There's no reason to have CSAM on your phone, period. Apple, thank you.

Someone else just mentioned to me that images are not reviewed by a human. If that's the case, and the hash is all they ever need because it's so reliable, then this at least means your naughty nudes with/from the wife are not going out for human review - should there ever be a glitch. Phew 😀
 
Someone else just mentioned to me that images are not reviewed by a human. If that's the case, and the hash is all they ever need because it's so reliable, then this at least means your naughty nudes with/from the wife are not going out for human review - should there ever be a glitch. Phew 😀
Well. I would be more concerned about the images NOT showing the wife 😬😬
 
Oh, good. So all the child pornographers and abusers will just disable it, leaving only the innocent people's photos available to be perused by random strangers in Cupertino. Great feature!
You should read how file hashing works and how it's used to identify CSAM. No one "peruses" your pictures. Your pictures produce a unique hash value using either the MD5 or SHA hash, which doesn't reveal anything about the content of your photo - that hash is then compared to a very large database of known CSAM hash values. If there's a match, it kicks out a report to the appropriate LE agency (in the US, it usually goes to a local ICAC).

There has only ever been one SHA-1 hash collision in history, when researchers at Google and CWI Amsterdam spent a ton of CPU/GPU time trying to do it using a formula, and they were successful only once in roughly nine quintillion attempts. So it's pretty accurate!
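Nine quintillion is about 2^63 (2^63 ≈ 9.2 × 10^18), which matches the work reported for that 2017 "SHAttered" collision. For the curious, a minimal sketch of the exact-match screening described above (the digest list is made up; in real systems the known-image hash lists come from NCMEC):

```python
import hashlib
from pathlib import Path

# Hypothetical known-bad list; this entry is just sha256(b"test").
KNOWN_BAD_SHA256 = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_digest(path: Path) -> str:
    """Hash the file in 1 MiB chunks so large photos need not fit in memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def is_known_match(path: Path) -> bool:
    """True only for byte-for-byte copies of a listed file."""
    return file_digest(path) in KNOWN_BAD_SHA256
```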
 
Dunno, I don't know any. But given those folks are probably not all super bright and tech-savvy… maybe quite a few?
Don't know, really.
That's true, I suppose. Considering most seem to operate on the "dark web", I would have to assume they have some tech experience.

Criminals are dumb though.
 
Wrong. Scanning your photos IS a violation of privacy, irrespective of whether you own incriminating material or not. In fact, since most people are innocent in that regard, scanning is nothing less than a warrantless search, and thus basically a violation of constitutional rights.

And you agree to let them do it when you click Agree on the terms and conditions with every update.
 
You should read how file hashing works and how it's used to identify CSAM. No one "peruses" your pictures. Your pictures produce a unique hash value using either the MD5 or SHA hash, which doesn't reveal anything about the content of your photo - that hash is then compared to a very large database of known CSAM hash values. If there's a match, it kicks out a report to the appropriate LE agency (in the US, it usually goes to a local ICAC).

There has only ever been one SHA-1 hash collision in history, when researchers at Google and CWI Amsterdam spent a ton of CPU/GPU time trying to do it using a formula, and they were successful only once in roughly nine quintillion attempts. So it's pretty accurate!
Yeah. It won't be either of those hashes, because with those it would be impossible to detect images that are only similar or cropped. Those would only detect exact matches, which is not what the system does.
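The difference is easy to demonstrate: a cryptographic hash changes completely when a single byte changes (the avalanche effect), while a perceptual hash is designed to stay stable across small edits. A toy illustration (an 8-"pixel" average hash, nothing like whatever Apple actually ships):

```python
import hashlib

def average_hash(pixels: list[int]) -> int:
    """Toy perceptual hash: one bit per pixel, set when brighter than the mean.
    Real systems (pHash, PDQ, neural hashes) are far more robust, but the idea
    is the same: similar pictures yield similar, often identical, fingerprints."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)
    return bits

original = [10, 200, 30, 180, 90, 160, 20, 210]
tweaked  = [11, 199, 30, 181, 90, 160, 20, 210]  # re-encoded / slightly edited copy

print(average_hash(original) == average_hash(tweaked))   # True: still "the same" image
print(hashlib.sha256(bytes(original)).hexdigest()[:16])  # two completely different
print(hashlib.sha256(bytes(tweaked)).hexdigest()[:16])   # cryptographic digests
```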
 
Good! The use of hash values in comparing known CSAM is extremely accurate, and has been in use by multiple cloud providers for years now - and has resulted in thousands of NCMEC leads which have gone out to the local ICACs - which have resulted in numerous arrests and prosecutions. There's no reason to have CSAM on your phone, period. Apple, thank you.
I agree that such pictures are vile and disgusting and have no place on a sane person's phone. I think pedophiles should be put under the prison.

While I understand the purpose and manner of application by Apple in this regard, I am going to be turning off iCloud just out of principle. I am not going to run the risk of possible tech abuse, whether intentional or not.
 
And you agree to let them do it when you click Agree on the terms and conditions with every update.
Potentially true, depending on local laws.

Still. At least it's a violation morally. No company should even try something like it. If they do, they prove to me they cannot be trusted.

While hitherto I praised Apple for protecting people's privacy at least to some extent (unlike Microsoft or Google, not to mention the worst of them all, which is of course Facebook), I may have to rethink that position. I probably just did.
 
You should read how file hashing works and how it's used to identify CSAM. No one "peruses" your pictures. Your pictures produce a unique hash value using either the MD5 or SHA hash, which doesn't reveal anything about the content of your photo - that hash is then compared to a very large database of known CSAM hash values. If there's a match, it kicks out a report to the appropriate LE agency (in the US, it usually goes to a local ICAC).

There has only ever been one SHA-1 hash collision in history, when researchers at Google and CWI Amsterdam spent a ton of CPU/GPU time trying to do it using a formula, and they were successful only once in roughly nine quintillion attempts. So it's pretty accurate!

You're missing the step where the photos are "reviewed" by Apple (i.e. random strangers in Cupertino) before being sent to law enforcement.
 
I’m willing to give up a slice of my privacy to make that happen.
I am not.

What's next? Constant GPS readings and location data sent to the police if you're over the speed limit; hackers, instead of merely locking you out of your files, planting these images on your devices and demanding a bitcoin ransom; images or text against a political regime resulting in arrest warrants... you name it.
 
You're missing the step where the photos are "reviewed" by Apple (i.e. random strangers in Cupertino) before being sent to law enforcement.
What exactly they are doing would need to be clarified.

But the article indeed says that if flagged pictures are uploaded, Apple "does a manual review" and "If CSAM content is found, the user account is disabled".
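For what it's worth, Apple's published description (heavily simplified here, with hypothetical names; the threshold of roughly 30 matches comes from Apple's later statements and is illustrative only) amounts to a threshold-then-review flow along these lines:

```python
from dataclasses import dataclass, field

MATCH_THRESHOLD = 30  # Apple has said roughly 30 matches before any review; illustrative

@dataclass
class Account:
    match_count: int = 0
    flagged: list[str] = field(default_factory=list)
    disabled: bool = False

def human_confirms_csam(items: list[str]) -> bool:
    """Stand-in for Apple's manual review of low-resolution derivatives."""
    return False  # stub

def file_report(acct: Account) -> None:
    """Stand-in for the report that would go to NCMEC on confirmation."""
    pass  # stub

def record_match(acct: Account, item_id: str) -> None:
    # Called when an uploaded photo's hash matches the known-CSAM database.
    acct.match_count += 1
    acct.flagged.append(item_id)
    if acct.match_count >= MATCH_THRESHOLD and human_confirms_csam(acct.flagged):
        acct.disabled = True  # "the user account is disabled"
        file_report(acct)
```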
 