Quote: I'd be more concerned that their toddler is somehow part of the known CSAM image hashes.
As always, these things rarely work flawlessly.
Quote: Scanning my photos for kiddie porn has no effect on my privacy, since I don’t have any kiddie porn.
Wrong. Scanning your photos IS a violation of privacy, irrespective of whether you own incriminating material or not. In fact, since most people are innocent in that regard, scanning is nothing less than a warrantless search, so basically a violation of constitutional rights.
Quote: Oh, good. So all the child pornographers and abusers will just disable it, leaving only the innocent people's photos available to be perused by random strangers in Cupertino. Great feature!
Yeah, looks like they did not really think this through.
Until someone gets screwed over for taking a pic of their toddler having a fun bath naked.
Quote: Yeah, looks like they did not really think this through.
What child porn producer/viewer would even be dumb enough to store it on iCloud?
Quote: What child porn producer/viewer would even be dumb enough to store it on iCloud?
Dunno, I don't know any. But given those folks are probably not all super bright and tech savvy… maybe quite a few?
Here's the thing. It's great that you don't have any of that on your device!
But if you ever live in a country that, for some reason, wants to find something on your device and have it flagged in order to charge you with a crime, this sets a dangerous precedent.
Surveillance technology, while often well-intentioned, can easily end up in the wrong hands for nefarious purposes.
Good! The use of hash values in comparing known CSAM is extremely accurate, and has been in use by multiple cloud providers for years now - and has resulted in thousands of NCMEC leads which have gone out to the local ICACs - which have resulted in numerous arrests and prosecutions. There's no reason to have CSAM on your phone, period. Apple, thank you.
Quote: Someone else just mentioned to me that images are not reviewed by a human. If that's the case, and the hash is all they ever need because it's so reliable, then this at least means your naughty nudes with/from the wife are not going out for human review - should there ever be a glitch. Phew 😀
Well, I would be more concerned about the images NOT showing the wife 😬😬
Quote: Oh, good. So all the child pornographers and abusers will just disable it, leaving only the innocent people's photos available to be perused by random strangers in Cupertino. Great feature!
You should read how file hashing works and how it's used to identify CSAM. No one "peruses" your pictures. Your pictures produce a unique hash value using either the MD5 or SHA hash, which doesn't reveal anything about the content of your photo - that hash is then compared to a very large database of known CSAM hash values. If there's a match, it kicks out a report to the appropriate LE agency (in the US, it usually goes to a local ICAC).
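Here's roughly what that exact-match idea looks like as a Python sketch - this is just an illustration of hash comparison, not Apple's actual pipeline, and the hash list and file name are made up:

```python
import hashlib
from pathlib import Path

# Hypothetical "known bad" list - real systems get these values from NCMEC;
# this digest is only a placeholder.
KNOWN_BAD_HASHES = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}

def file_sha256(path: Path) -> str:
    """Hash the file's raw bytes; the digest says nothing about what the image shows."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def matches_known_hash(path: Path) -> bool:
    # Exact match only: change a single byte and the digest is completely different.
    return file_sha256(path) in KNOWN_BAD_HASHES

# Example (hypothetical file name):
# print(matches_known_hash(Path("IMG_0001.jpg")))
```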
Quote: Dunno, I don't know any. But given those folks are probably not all super bright and tech savvy… maybe quite a few?
That's true, I suppose. Considering most seem to operate on the "dark web", I would have to assume they have some tech experience.
Don't know, really.
All y'all fighting this. Let's see your photos. HA. 🤣
Quote: I agree with darcyf that if you're not doing anything wrong, then you have nothing to worry about.
That's what everyone tells each other in North Korea.
Quote: You should read how file hashing works and how it's used to identify CSAM. No one "peruses" your pictures. Your pictures produce a unique hash value using either the MD5 or SHA hash, which doesn't reveal anything about the content of your photo - that hash is then compared to a very large database of known CSAM hash values. If there's a match, it kicks out a report to the appropriate LE agency (in the US, it usually goes to a local ICAC).
Yeah. It won't be either of those hashes, because with one of those it would be impossible to detect images that are only similar or cropped. Those would only detect exact matches, which is not what the system does.
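For contrast, here's a toy perceptual hash (a simple 8x8 "average hash" using Pillow, not Apple's NeuralHash) that shows why this kind of fingerprint can survive resizing or re-compression while MD5/SHA can't; the threshold and file names are only illustrative:

```python
from PIL import Image  # pip install Pillow

def average_hash(path: str, size: int = 8) -> int:
    """Toy perceptual hash: shrink to 8x8 grayscale, set one bit per pixel above the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits  # a 64-bit fingerprint

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

# A re-saved or lightly cropped copy lands only a few bits away from the original,
# whereas an exact hash like SHA-256 would change completely.
# if hamming_distance(average_hash("original.jpg"), average_hash("copy.jpg")) <= 5:
#     print("probably the same picture")
```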
There has only ever been one SHA-1 hash collision in history, when scientists at Google and CWI Amsterdam spent a ton of CPU/GPU time trying to do it using a formula, and they were successful only once in what was nine quintillion attempts. So it's pretty accurate!
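For a sense of scale, "nine quintillion" lines up with the roughly 2^63 SHA-1 computations the SHAttered researchers reported for that single collision:

```python
# The one published SHA-1 collision (SHAttered, 2017) cost on the order of 2**63 SHA-1 computations.
attempts = 2 ** 63
print(f"{attempts:,}")  # 9,223,372,036,854,775,808 -> about 9.2 quintillion
```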
Quote: Good! The use of hash values in comparing known CSAM is extremely accurate, and has been in use by multiple cloud providers for years now - and has resulted in thousands of NCMEC leads which have gone out to the local ICACs - which have resulted in numerous arrests and prosecutions. There's no reason to have CSAM on your phone, period. Apple, thank you.
I agree that such pictures are vile and disgusting and have no place on a sane person's phone. I think pedophiles should be put under the prison.
Quote: And you agree to let them do it when you click agree on the terms and conditions every update.
Potentially true, depending on local laws.
Quote: I’m willing to give up a slice of my privacy to make that happen.
I am not.
Quote: I agree with darcyf that if you're not doing anything wrong, then you have nothing to worry about.
Yet you don’t invite strangers into your house of innocence.
Quote: You're missing the step where the photos are "reviewed" by Apple (i.e. random strangers in Cupertino) before being sent to law enforcement.
What exactly they are doing would need to be clarified.