I understand why some people don't see a problem with this. On its face it sounds great. Who doesn't want to see people sharing CSAM get (ahem) ****ed?
The problem is, it isn't just "privacy" being mishandled by relatively transparent democratic governments that's at issue here; the real problem is that this technology can and will be weaponized by repressive governments around the world. Apple can deny it all it wants, but it has already demonstrated that it can and will bend to the CCP when its continued business in China is on the line. If you live within the reach of the PRC, or even work in an Asia-adjacent field, you should be concerned about this.
 
I find it scary how many people on this forum are losing their minds over a reasonable policy.

Apple is the proprietor of iCloud. They set the rules. If you don't like it, buy a Windows phone on eBay. They're cheap.
All companies check content stored on their servers for CSAM.
Apple does exactly the same at the moment.
Apple has just moved the process from the server to the device. This is more privacy-focused and paves the way for fully encrypted photo libraries while still complying with child-safety laws.
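For the curious, the matching step can be sketched in a few lines. This is a toy illustration only: it uses an exact cryptographic hash and a readable blocklist, whereas Apple's actual system uses a perceptual hash (NeuralHash) that tolerates resizing and re-encoding, and a blinded database compared via private set intersection so the device can't read the list. All names and sample bytes below are made up.

```python
import hashlib

# Toy sketch of on-device blocklist matching (illustration only).
# Real systems use *perceptual* hashes that survive resizing and
# re-encoding; SHA-256 here only matches byte-identical files.

# Hypothetical blocklist. In the real design this ships as blinded
# hashes the device cannot inspect.
KNOWN_BAD_BYTES = [b"example-flagged-image-bytes"]
KNOWN_HASHES = {hashlib.sha256(b).hexdigest() for b in KNOWN_BAD_BYTES}

def matches_blocklist(photo_bytes: bytes) -> bool:
    """Hash the photo and check membership in the known-hash set."""
    return hashlib.sha256(photo_bytes).hexdigest() in KNOWN_HASHES
```

The key point of the on-device design is that only photos queued for iCloud upload are checked, and a match produces an encrypted "safety voucher" rather than an immediate report.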

People just go with the hype and don't even bother to read the documentation of how this feature works 🤦‍♂️

Yes, it's hype, absolutely. But justified hype, in my opinion. Don't get me wrong: people who own child pornography are terrible and should be punished severely. But the problem I see is the invasion of privacy of millions of people to convict a few (and those few can simply disable photo uploads, leaving no benefit, just the destruction of privacy).

Governments in the future could simply force Apple to track down people who are found to possess images with certain hashes. Thinking into the future, aren't you concerned about what this technology can be abused for and how much damage it can do?
 
Exactly my view too. It's a very well-intentioned move, but what stops it from being abused? If it were something that couldn't be abused, I'm all for it. Scan away.

Everything can be abused. There is no limit to how a technology can be misused, whatever it may be.
 
But when Apple does it, NOW everyone gets upset

Why is this?
People do realise that companies such as Google, Adobe, Facebook et al. already use some form of automated technology to scan for and detect CSAM, right?

The other companies have been doing this in the cloud so far, and you can simply not use the cloud if you care about your data.
Apple, on the other hand, scans on the device. That means data that never reaches the cloud could, in principle, also be scanned; at the very least, the technology is now in place.
 
But the problem I see is the invasion of privacy of millions of people to convict a few (and those few can simply disable photo uploads, leaving no benefit, just the destruction of privacy).
How is Apple invading your privacy? A computer uses a set of AI instructions to check whether features of your photos match a separate database, all whilst encrypted. There is no employee or middleman "scanning" your photos visually and taking an interest in that restaurant you visited a while back.
 
How is Apple invading your privacy? A computer uses a set of AI instructions to check whether features of your photos match a separate database, all whilst encrypted.
Flagged matches will be reviewed by an Apple employee for further investigation. That is how Apple invades your personal privacy.
 
Yes, it's hype, absolutely. But justified hype, in my opinion. Don't get me wrong: people who own child pornography are terrible and should be punished severely. But the problem I see is the invasion of privacy of millions of people to convict a few (and those few can simply disable photo uploads, leaving no benefit, just the destruction of privacy).

Governments in the future could simply force Apple to track down people who are found to possess images with certain hashes. Thinking into the future, aren't you concerned about what this technology can be abused for and how much damage it can do?

We keep talking about privacy, but what about people who keep pulling out their phones to record arguments or incidents that happen to them, "threaten" the other party by saying they'll end up on YouTube/TikTok or whatever, and then actually upload the footage, where it usually ends up in compilation videos like "Karens Go Wild!" or "Watch This Mad Person Hit My Truck With A Banana"?

That, to me, is a potentially bigger invasion of privacy than automated CSAM tools are.
 
Google has been doing this with Gmail since 2014.


No one bats an eye for that

But when Apple does it, NOW everyone gets upset

Why is this?
When you think of privacy, what company comes to your mind first? Google or Apple?
It's like if the US implemented a social credit system and someone said: China has been doing the same and nobody bats an eye.
 
It will be reviewed by an Apple employee for further investigation. That is how Apple is invading your personal privacy.

Again, you do realize that it would take more than one picture matching to flag an account…and that the chances of more than one picture “matching”, being reviewed by an Apple employee, and NOT being child pornography are one in one trillion?

Oh…and for the other idiots up above (not you): they are reviewing the flagged images. They do NOT have access to every pic in your iCloud account.
 
Oversimplified, I think: a flagged image will be reviewed for a false positive, so there is a window for Apple employees to view your photos under the banner of 'review'.

The way I understand it... the hashes are for specific CSAM images. So it should be nearly impossible for one of my innocent images to be flagged as matching a *known* CSAM image.

But yeah... I see your point.

I'm not worried about it personally, though.

They say there's a one-in-a-trillion chance for a false positive to occur.

So even if they did flag one of my photos (unlikely due to the above)... I don't have any CSAM images on my devices. They'll look at my innocent image and move on.

I'm meh on this whole thing. ¯\_(ツ)_/¯

But I see why people think it could evolve into something more.
 
A few red flags for me:
1: Facebook-owned WhatsApp is against it.
2: Epic Games, which is partly owned by a Chinese company (Tencent), is against it.
So two groups we know don't care about your privacy, since your data makes them money. I think they're more scared of what this change would force them to expose than they are about the privacy aspect. What are they hiding?
This whole topic being brought up has already exposed that companies do some sort of CSAM scanning, and a lot of it seems to happen without user (sheeple) knowledge.

As for people's fear of laws being enacted that would force Apple to use the system for something else: any politician who backed such a law/bill would effectively end their career in a democratic society (and for the other types, Apple can simply refuse to turn the feature on, or remove it). Apple has so many eyes on it, and it will fight to keep as much good PR as it can.
This change is a slippery slope, but it isn't just Apple on that slope; every country and politician is right up there with them, and who slips and slides down first matters a lot. It's also a slippery slope for the companies that already scan your stuff, as they'll have to become more transparent about their processes.

I support Apple: I just hope they don’t abuse it and don’t allow it to get abused.
 
Precisely. Further investigation. If your photo library does not contain indecent images, your privacy isn't being invaded.
Again, you do realize that it would take more than one picture matching to flag an account…and that the chances of more than one picture “matching”, being reviewed by an Apple employee, and NOT being child pornography are one in one trillion?
Only if you match enough times to breach the threshold. Apple has stated it's 1 in a trillion for a false-positive account flag, so what do you think the odds are that you'll match enough false positives to be reviewed?
This is all the kool-aid Apple is feeding to consumers. I honestly think a trillion is the wrong number.
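Whether a trillion is plausible depends on the per-image false-match rate and the match threshold, neither of which this thread knows for certain. Here's a toy binomial-tail calculation with assumed numbers (a 100,000-photo library, a per-image false-match rate of one in a million, a threshold of 30 matches — these are illustrative, not Apple's published parameters) just to show how fast a multi-match threshold drives the account-level odds down:

```python
from math import comb

def prob_at_least(n: int, p: float, t: int) -> float:
    """P(at least t false matches among n independent images,
    each with per-image false-match probability p).
    Sums the binomial upper tail; later terms shrink rapidly and
    underflow to zero, at which point we stop."""
    q = 1.0 - p
    term = comb(n, t) * p**t * q**(n - t)  # first tail term
    total = 0.0
    for k in range(t, n + 1):
        total += term
        if term == 0.0:
            break
        # recurrence: next term = term * C-ratio * odds-ratio
        term *= (n - k) / (k + 1) * (p / q)
    return total

# Assumed inputs (not Apple's real parameters):
odds = prob_at_least(100_000, 1e-6, 30)
# With these inputs the result is astronomically small,
# far below one in a trillion.
```

Under independence assumptions like these, requiring many simultaneous false matches makes the account-level error rate collapse even when the per-image rate is modest; the real argument is over whether the per-image rate and independence assumptions hold.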
 
How is Apple invading your privacy? A computer uses a set of AI instructions to check whether features of your photos match a separate database, all whilst encrypted. There is no employee or middleman "scanning" your photos visually and taking an interest in that restaurant you visited a while back.
Yes, a reasonable person would be OK with rock spiders being caught.

I just don't like the back door this creates. A government could demand that another list of pictures be added to the hash list and that the possessors be reported. Say, images of a political opponent. Would you be OK with that? Apple promises it won't let such a government make it do that. I reckon it would be better if Apple didn't put itself in a position where that's possible in the first place.
 
That is just one of the problems with having stuff on somebody else’s computers.
It is often better to own than to rent.
 
A government could demand that another list of pictures be added to the hash list and that the possessors be reported. Say, images of a political opponent.

So is it illegal to own images of political opponents? No, of course not. CSAM images are illegal, hence we are here.
 