The cat is out of the bag.
Only for people who don’t understand how this works, what was ALREADY in place, what can be done server-side anyway, what other companies routinely do, etc.
No cat in sight. Lots of drama.
Can Apple guarantee that every single person at NCMEC, and every Apple employee involved in the review process, is a saint and sinless?
I'm simply asking because any system can be compromised through the human factor.
I find it scary how many people on this forum are losing their minds over a reasonable policy.
Apple is the proprietor of iCloud. They set the rules. If you don't like it, buy a Windows phone on eBay. They're cheap.
All companies check content stored on their servers for CSAM.
Apple does exactly the same at the moment.
Apple just moved the process from the server to the device. This is more privacy-focused and paves the way for fully encrypted photo libraries while still complying with child-safety laws.
People just go with the hype and don't even bother to read the documentation of how this feature works 🤦‍♂️
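For anyone who actually wants to see the shape of it: stripped of the cryptography, the on-device step is just "fingerprint the photo, check the fingerprint against a list" before upload. Below is a deliberately over-simplified Swift sketch of that idea. Everything in it is illustrative, not Apple's code: the real system uses a perceptual hash (NeuralHash) rather than SHA-256, the hash database is blinded so the device can't read it, and matches only become visible past a threshold.

```swift
import CryptoKit
import Foundation

// Illustrative placeholder list; a real database would come from NCMEC
// and would be blinded, not a plain set of hex strings.
let knownHashes: Set<String> = [
    String(repeating: "0", count: 64) // dummy entry
]

// Exact SHA-256 stands in for a perceptual hash here. A perceptual hash
// would also match resized or recompressed copies of the same image.
func fingerprint(of photo: Data) -> String {
    SHA256.hash(data: photo)
        .map { String(format: "%02x", $0) }
        .joined()
}

// The device-side check that would run before a photo is uploaded.
func matchesKnownList(_ photo: Data) -> Bool {
    knownHashes.contains(fingerprint(of: photo))
}
```

The reason the location of the check matters: if it runs on-device and only match information ever leaves, the server never needs plaintext access to non-matching photos, which is what makes this compatible with a fully encrypted library.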
Exactly my view too. It's a very well-intentioned move, but what stops it from being abused? If it were something that cannot be abused, I'm all for it. Scan away.
People do realise that companies such as Google, Adobe, Facebook et al. already use some form of automated technology to scan for and detect CSAM?
How is Apple invading your privacy? A computer is using a set of A.I. instructions to check whether features of your photos match a separate database, all whilst being encrypted. There is no employee or middle-man "scanning" your photos visually and taking an interest in that restaurant you visited a while back.
It will be reviewed by an Apple employee for further investigation. That is how Apple is invading your personal privacy.
Precisely. Further investigation. If your photo library does not contain indecent images, your privacy isn't being invaded.
Yes, it's hype, absolutely. But justified hype, in my opinion. Don't get me wrong: people who own child pornography are terrible and should be punished severely. But the problem I see is the invasion of privacy of millions of people to convict a few (and those few can simply disable the upload of their pictures, so then the system has no benefit at all, just the destruction of privacy).
Governments in the future could simply force Apple to track down people who are found to possess images with certain hashes. Thinking into the future, aren't you concerned about what this technology can be abused for and how much damage it can do?
Google has been doing this with Gmail since 2014: "Google scans everyone's email for child porn, and it just got a man arrested" ("Search giant trawls photos for illegal 'digital fingerprints'", www.theverge.com). No one bats an eye for that. But when Apple does it, NOW everyone gets upset. Why is this?

When you think of privacy, what company comes to your mind first? Google or Apple?
Oversimplified, I think; a flagged image gets checked for a false positive, and at that point there is a window for Apple employees to access your photo library for 'review'.
Again, you do realize that it would take more than one picture matching to flag an account… and that the chances of multiple pictures "matching", being reviewed by an Apple employee, and NOT being child pornography are one in one trillion?
That happens only if you match enough times to breach the threshold. Apple have stated it's 1 in a trillion for a false positive, so what are the odds, do you think, that you'll match enough false positives to be reviewed?

This is all the kool-aid Apple is feeding to the consumers. I honestly think a trillion was the wrong number to use.
Yes, a reasonable person would be OK with rock spiders being caught.
You have proof of this?
You think? What would be a more reasonable number in your estimation?
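Since people keep arguing about the number itself: the "1 in a trillion" figure is an account-level claim, and the match threshold is what does the heavy lifting. Here's a quick back-of-the-envelope in Swift. The per-image false-positive rate p, library size n, and threshold t are made-up inputs for illustration, not Apple's published parameters; the code only shows the arithmetic, not whether Apple's assumptions hold.

```swift
import Foundation

// Log of the binomial coefficient C(n, k), via log-gamma for stability.
func logChoose(_ n: Int, _ k: Int) -> Double {
    lgamma(Double(n) + 1) - lgamma(Double(k) + 1) - lgamma(Double(n - k) + 1)
}

// Probability that at least `t` of `n` photos each independently
// false-match with probability `p` (binomial upper tail).
func accountFlagProbability(n: Int, p: Double, t: Int) -> Double {
    var tail = 0.0
    for k in t...n {
        tail += exp(logChoose(n, k)
                    + Double(k) * log(p)
                    + Double(n - k) * log(1 - p))
    }
    return tail
}

// Example: 10,000 photos, a 1-in-a-million per-image rate, threshold 30.
print(accountFlagProbability(n: 10_000, p: 1e-6, t: 30))
// ≈ 4e-93 with these inputs: the threshold pushes the account-level
// odds far below the per-image odds.
```

So even a per-image rate vastly worse than one in a trillion can produce an account-level number that small; whether the independence assumption and the estimated per-image rate are honest is the part you'd need proof for.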
A government could demand that another list of pictures be added to the hash list and that the possessors be reported. Say, pictures of a political opponent.
And the Apple employee never makes mistakes? Or simply clicks OK to be able to finish work on time?