Of course Apple will say that. Honestly, who knows? I’m sure when the real one ships someone will be able to make a collision on that one too. I mean, 30 images for Apple to check it? That’s a lot, and implies a lot of false positives. Apple isn’t backing down, so people need to boycott the iPhone 13. They would change their tune real fast then.
 
I wouldn't trust Nicholas Weaver as far as I can throw him.

This whole thing is his brainchild; he wrote up this idea, almost verbatim, back in 2019 for the Lawfare blog, a national security blog.

Since then, he's been zealously defending this move by Apple, using his credentials to lend credibility to this whole mess without actually engaging with his peers in the field on the issues and concerns that have been raised.

His "rebuttals" can typically be summed up as intellectually dishonest at best and, at worst, too lacking in substance to withstand a simple sneeze.

Take his actual argument for example:


This is plainly false, since what has been accomplished is a preimage attack, meaning the researchers managed to create a collision starting from nothing but a hash. The resulting image causes a collision but looks like random noise.

All a malicious actor needs to do is use that noise as a mask over legal pornographic material, and the user is screwed. Apple's human reviewer isn't going to do an entire CSI analysis to verify whether the people depicted in the photo (or their body parts) were underage at the time the photo was taken.

They'll ask themselves one question: could this be CSAM? If the answer is yes, then the account gets blocked and a report is filed.

As for his statement that "it would require the production of over 30 colliding images", that's just intellectually dishonest. They don't have to be 30 unique images; it could be 30 copies of the same one. And even if unique images were required, generating a colliding image is trivial in both effort and time, as has been demonstrated, and applying that colliding image to legal porn is even less of a feat.
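For anyone wondering what "trivial" means here, the rough shape of the attack looks like the sketch below. It assumes you have a differentiable reimplementation of the perceptual hash loaded as a PyTorch module; model, cover_img and target_bits are placeholders I made up for illustration, not anything shipped by Apple.

```python
import torch

def make_collision(cover_img, target_bits, model, steps=2000, lr=1e-2):
    """Nudge cover_img so that sign(model(img)) matches target_bits.

    cover_img:   1x3xHxW tensor in the model's expected input range
    target_bits: tensor of +/-1 values, one per hash bit
    model:       differentiable stand-in for the perceptual hash network
    """
    delta = torch.zeros_like(cover_img, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        out = model(cover_img + delta)          # real-valued pre-threshold output
        # Push every component toward the sign of its target bit (hinge-style loss),
        # while keeping the perturbation small so the result still looks like the cover.
        loss = torch.relu(0.1 - out * target_bits).sum() + 0.01 * delta.abs().mean()
        opt.zero_grad()
        loss.backward()
        opt.step()
    return (cover_img + delta).clamp(-1, 1).detach()
```

Start from a benign cover image instead of a blank canvas and the output is exactly the "legal picture that hashes like a known image" scenario described above.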
On re-reading the Apple docs, you are right on this. Seems strange that Apple wouldn't check that a particular image actually produces that particular hash as a simple test against hash collisions. Or did I miss that step in their description?
 
Wait - I'm confused by this. Does this mean that everyone's iCloud is going to be scanned without users' authorization in the name of child welfare??

While I am sure some people agree with this, it seems one step away from doctors/dentists submitting DNA samples of *every* patient because it is in the public interest.

This program seems just one small moral slip away from being an invasion of privacy on a monumental scale. Given what Snowden revealed, the US government has a huge thirst for data collection like this. It's a short hop to scanning for compromising photos of your political rivals, yes?
You're a little late to the party. You should take some time to read up on the information provided and what public opinion is on both sides. The short answer is they are not scanning your iCloud photos. The feature in question works on-device using a thing called hashes (best to read up first). Then a bunch of other things have to happen before anything even gets flagged for review by humans, and only the images that were flagged are reviewed, nothing more. I'm paraphrasing heavily; there's a rough sketch of the flow at the end of this post.

There are people on both sides of the argument. I'm more optimistic that the feature works as designed and that Apple plans to squash any abuse of it. That isn't universally felt, so it's a good idea to get educated.
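Very roughly, and with almost everything except the 30-image threshold simplified away, the flow being described looks something like this. The names are placeholders; the real system uses NeuralHash plus blinded hashes and threshold secret sharing rather than a plain set lookup.

```python
KNOWN_HASHES = set()   # on-device database of hashes of known images (placeholder)
THRESHOLD = 30         # the stated threshold before human review becomes possible

def on_upload_to_icloud_photos(photo_hashes):
    """Sketch of the gate: nothing is reviewable until the match count crosses the threshold."""
    matches = sum(1 for h in photo_hashes if h in KNOWN_HASHES)
    if matches >= THRESHOLD:
        return "matched vouchers become decryptable; a human reviews only those images"
    return "below threshold; Apple learns nothing about individual matches"
```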
 
Yeah, this is increasingly sounding like it's not meant for children. Not to mention the fact that actual pedos, who are likely very small in number, will either just turn off iCloud Photos or get Android phones. Normal people are left stuck with the bill. Apple are going to have to pull this whole thing.
Don't underestimate the stupidity of people who collect illegal photos, or of the people they share them with, when it comes to leaving iCloud Photos turned on.

I personally am for the feature because I trust that the system is trying to do what it claims, and that Apple will shut down any abuse of the system or patch it out if there are bugs. That's my belief, but you are welcome to yours.
 
People giving Apple the benefit of the doubt here are making a tremendous number of assumptions. This kind of tech never remains only for its intended use. No matter which way you spin it (for the children!), this is invasive. Someone on Twitter mentioned: what happens if someone AirDrops you a bunch of illicit photos and they sync to iCloud in a matter of seconds? Boom, you're flagged. There are a million ways for this system to go wrong, be exploited, or, worse, ruin innocent people's lives. And if you do end up being one of those people, you will have exactly zero recourse to prove your innocence. It's over for you. This entire thing is very stupid on Apple's part.
You should take some time to understand how iCloud Photos works. If someone emails or messages another person all of those illegal photos, even with the feature turned on, it wouldn't work. The photos have to be moved into the photo library before they get uploaded to iCloud Photos; otherwise they just sit in the Messages or Mail app.

If they're a law-abiding person, they'll report the sender to the authorities. In my case I would just hand over my phone to increase the chances of them catching that individual or entity. Inconvenient for me, yes, but worth it IMO to catch that scumbag.

Also, even if it did work that way, it would certainly be an inconvenience and would probably become something some kids would prank other people with. At which point Apple would find ways to patch it out and lawmakers would create laws to prohibit it, which technically already exist. But since that's not how iCloud Photos works, it's just an exercise in showing you that when systems are weak, agencies and companies strengthen them.
 
Yeah, this is increasingly sounding like it's not meant for children. Not to mention the fact that actual pedos, who are likely very small in number, will either just turn off iCloud Photos or get Android phones. Normal people are left stuck with the bill. Apple are going to have to pull this whole thing.
Yeah, they're so careful, aren't they? Facebook reported 20 million CSAM images last year, but that's nothing, right?
 
I would love them to name one security researcher who they made this available to (outside of Apple).
 
This has already been addressed. The hash database is global; there are no regional variations distributed. Every Apple user will have the exact same set of hashes, and those hashes must come from two separate sources. So two governments would need to agree to participate in the corruption and abuse of the system. It's unlikely they would get away with that.
And you believe the CCP doesn't have enough power to influence Apple into making a change? Where are the iCloud servers for China located again? Once Apple rolls out this surveillance software and these countries realize what they could use it for, abuse is just one governmental mandate away.
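For what it's worth, the "two separate sources" requirement from the quoted post boils down to a set intersection before anything ships on-device. A minimal sketch, with made-up hash values purely for illustration:

```python
# Hashes vouched for by two child-safety organisations in different jurisdictions
# (values are invented placeholders, not real hashes).
org_a = {"a1f3", "9bd0", "77c2"}
org_b = {"9bd0", "77c2", "e41a"}

shipped_db = org_a & org_b   # only entries present in both lists are distributed
print(shipped_db)            # {'9bd0', '77c2'}
```

Whether a sufficiently motivated government could corrupt both sources is, of course, exactly the dispute in this thread.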
 
OK, just a hypothetical: Apple scans iCloud, finds CSAM positives, checks these to be genuine, and reports this to, let's say, the police.

Anyone have an idea whether this will hold up in court? Can there be a conviction? Is the evidence acquired usable?
 
This just came to my mind: I'm sure Apple will also use the CSAM tech internally to identify leakers.
They have recently gone after leakers very aggressively, and the CSAM tech is a nice method for doing so.
 
Anyone have an idea whether this will hold up in court? Can there be a conviction? Is the evidence acquired usable?
Well, presumably it would be probable cause for seizure and search of your phone, and a court order to Apple to turn over the contents of your iCloud account.

...by which stage, any career or relationships you have will probably be in tatters, so while what the court ultimately decides is still kind of important, a lot of damage could already be done by then. You really, really don't want innocent people to be accused of CSAM possession, even if you trust justice to prevail eventually.

Also: https://en.wikipedia.org/wiki/Prosecutor's_fallacy

Ultimately, this has to be a risk/benefit analysis: a small chance times the huge consequential damage of a false accusation vs. a small chance times the real but limited benefit of catching a genuine consumer of CSAM. The biggest benefit is probably to Apple, since it reduces the chance of known CSAM ending up on their servers. It ain't gonna wipe out child abuse, unfortunately.
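To make the shape of that trade-off concrete, here's a toy version of the calculation. Every number below is invented purely to show the structure, not an estimate of anything real:

```python
# Invented numbers: this only illustrates "probability times impact" on both sides.
p_false_accusation = 1e-7       # chance an innocent account is wrongly reported
harm_false         = 1_000_000  # damage of a false accusation, arbitrary cost units

p_catch            = 1e-5       # chance a genuine offender is caught, per account
benefit_catch      = 10_000     # benefit of one such catch, same arbitrary units

print("expected harm:   ", p_false_accusation * harm_false)
print("expected benefit:", p_catch * benefit_catch)
```

The disagreement in this thread is essentially over how big or small each of those four numbers really is.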

That is, of course, aside from the fine but important line in the sand that has been crossed by doing this on-device.
 
Hey Apple, you sell hardware, not software, so why not make your software open source, eh?

You can always restrict its use on non-Apple hardware via license.
 
This just came to my mind: I'm sure Apple will also use the CSAM tech internally to identify leakers.
They have recently gone after leakers very aggressively, and the CSAM tech is a nice method for doing so.
🤦‍♂️ They literally can't. Comments like this prove that people really haven't read and understood what Apple is doing here.
 
And you believe the CCP doesn't have enough power to influence Apple into making a change? Where are the iCloud servers for China located again? Once Apple rolls out this surveillance software and these countries realize what they could use it for, abuse is just one governmental mandate away.
China already has Apple's China iCloud servers, so why would they need access to the limited scope of information provided by the CSAM feature? They can already scan for whatever they want, whenever they want.
 
China already has Apple's China iCloud servers, so why would they need access to the limited scope of information provided by the CSAM feature? They can already scan for whatever they want, whenever they want.
The point being, there has already been a case where Apple gave in to government pressure in order to have access to a market, yet there are many posts saying Apple will "just say no" to requests to scan for other images.
 
The point being, there has already been a case where Apple gave in to government pressure in order to have access to a market, yet there are many posts saying Apple will "just say no" to requests to scan for other images.
The point being made is that we all know Apple's practices, such as manufacturing in countries with dubious human rights records. By buying, we as buyers are also enabling this to happen, now and in the future.

What's next, scanning the same way for gay porn pictures in Russia, because they are illegal under Russian law? Or for Uighurs or any other minority anywhere? Or perhaps for known terrorists on our own turf?

Personally, I would encourage Apple to scan for CSAM, or any other illegal stuff, but only if asked to by an entity I trust, and for now (less than ever) Apple isn't one of those.

But look at the poll above; we are still buying and enabling regardless.
 
Make it all public. It’s for the children, right? What do you have to hide, Apple?

It won't help unless you trust Apple.

How would you know for sure that the public source of the algorithm is the one being used by Apple in production?
 
Too much risk is involved. No idea why Apple is even allowing this in the first place.

The fact that anyone can reverse engineer it and find the code inside the platform: scary stuff!

That's why it's better to have it on the device. You can extract the binary code and see what it does.

Apple knows this, so they don't rely on this code being secret.
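That's essentially what people have already done with the NeuralHash model pulled out of recent iOS builds. A rough sketch of running such an extracted model with onnxruntime; the file names, preprocessing, and seed layout below are assumptions based on the public write-ups of that extraction, not anything published by Apple:

```python
import numpy as np
import onnxruntime as ort
from PIL import Image

# Placeholder paths: the ONNX model and projection seed must first be extracted
# from an iOS/macOS install; neither file is distributed by Apple.
MODEL_PATH = "neuralhash_model.onnx"
SEED_PATH = "neuralhash_seed.dat"   # assumed to be raw float32 values with no header

session = ort.InferenceSession(MODEL_PATH)
input_name = session.get_inputs()[0].name

# Reported preprocessing: 360x360 RGB, scaled to [-1, 1], NCHW layout.
img = Image.open("photo.jpg").convert("RGB").resize((360, 360))
x = np.asarray(img, dtype=np.float32) / 255.0 * 2.0 - 1.0
x = x.transpose(2, 0, 1)[np.newaxis, ...]

# The network outputs a descriptor; a fixed projection matrix (the "seed") maps it
# to 96 real values, and the sign of each one becomes a bit of the hash.
desc = session.run(None, {input_name: x})[0].flatten()
seed = np.frombuffer(open(SEED_PATH, "rb").read(), dtype=np.float32).reshape(96, -1)
bits = "".join("1" if v >= 0 else "0" for v in seed @ desc)
print(hex(int(bits, 2)))
```

Which is the point: because it runs on the device, anyone can pull the model apart and test it, and that is exactly how the collision images discussed earlier in the thread were produced.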
 
I'm pretty sure that this is exactly Apple's intention. They have to protect iCloud from hosting CSAM and probably don't have too much invested in actually catching pedophiles. If everyone who posts CSAM stops using iCloud and perhaps all Apple products, this is to Apple's benefit.

And users benefit too. If no CP collector or terrorist used Apple products, we could argue regulations should only cover Android and Windows. And maybe Linux too.
 
This makes it even more concerning, because it must mean that they are not self-critical enough to see the flaws in their system or its potential for abuse. They refuse to allow even the possibility that they have blind spots, because anyone who disagrees is just confused and ignorant.

We know it can be abused. But there are so many other ways to do surveillance which are much more effective, so we're not worried.

It's a utility calculation.

What benefit do I gain vs. what is the probability of abuse, and what effect would that abuse have on me?

Also some of us live in countries where we do trust our public institutions and even some for-profit companies.
 
Then again, the code is installed and has access to your photo library. Whether or not Apple turns off the function when you turn off iCloud, someone who hacks your phone being able to turn it on and use it for something Apple didn't intend, via a zero-day exploit, is a real possibility. It's hard enough to keep things secure without them adding crap like this.

And they could do the same thing for iCloud backup.
 
Yes. The account-takeover and AirDrop threats are both extremely big attack vectors. This is going to be the new ransomware, except when it's done for targeted purposes, the first warning you get will be your door coming off its hinges as the SWAT team breaches it.

If you believe Apple's one-in-a-trillion nonsense on this, you just need to keep reading and paying attention.

Are people really worried about SWAT police teams?

I read that Facebook reported 20 million cases last year. If true, that would be about 55,000 cases per day. Wouldn't the SWAT teams be overwhelmed?
 