if you use iCloud for photos or device backups, they're already going through your photos.

if you don't use iCloud at all, you're in the minority, and Apple has to make the tough choice of catering to the majority of its customers.

In those two cases law enforcement needs a warrant. That means somebody in law enforcement outside of Apple has to have a reason to believe someone has committed a crime, has to have enough evidence to go to a judge, and then has to demonstrate to that judge that there is probable cause to issue a warrant. I'm fine with that. My own device deciding to self-report directly to law enforcement? That's a completely and totally different deal. The phone is deciding whether it thinks I've committed a crime, nobody from law enforcement has to have a reason to suspect anything, and the judge will be predisposed to believe there is probable cause and will sign off on the warrant.

No thank you. That's the very definition of Big Brother. I've disabled iCloud Photos and will never turn it back on. I appreciate the intent, but this is a bridge way too far for a company that claims it's all about privacy. What happens when there is a bug, and there will be bugs, and somebody innocent is reported? The cops will show up on their doorstep, or worse, at their office, throw them in cuffs and toss them in jail, treating them like they're guilty because their phone says they are. Apple has lost all credibility on the privacy front. They no longer protect privacy; they now actively invade it. Noble cause or not, the 4th Amendment exists for a reason. This technology will be abused, as all technology is. It's only a matter of time. They won't turn over the keys to a known killer's phone, but they'll happily install software on all of their devices that actively spies on you and can misreport to law enforcement that you're a pedophile, knowing that there is software out there right now that can pwn your iPhone with a text message without you ever clicking a link? Are you kidding me? Even if attackers can't trigger the reporting themselves, they could certainly load some disgusting photos onto your device once they've pwned it, knowing Uncle Sam will show up and throw you in jail, smearing your reputation in the process.

No thanks, Apple. Figure out how Black Cube, other private "security firms," and foreign bad actors, including governments, can pwn your products without the user opening anything, take them over, and do whatever they want, before you ship something on your products that can self-report to law enforcement innocent end users who may never have actually loaded that data. This will be abused. It's simply a matter of time. I'll be seriously looking at an Android device if Apple doesn't come to their senses and stop this. It's insanity that they would consider doing this knowing that there are companies making millions selling software that can completely and totally own your device and spy on you.
 

You're applying a double standard. On one hand, you're saying Apple and the government currently follow the rules, going through judges, issuing proper warrants, and going through the motions before decrypting a photo library. But when it comes to this new on-device scanning, you're saying it'll be abused left and right by Apple and/or the government as they see fit.

Sorry, but only one can be true. Either they're already abusing and decrypting iCloud Photo Libraries as they see fit today, and therefore this new CSAM detection can't possibly make matters worse, OR they aren't abusing the data today and they won't go full total-invasion-of-your-privacy via on-device CSAM detection in the future.
 
don't bother. we're done. you're deliberately avoiding the fact that it's easier for people to turn off iCloud and continue viewing child porn the old-fashioned way without repercussions, as opposed to filming new content.

have a good one.
As long as we agree that none of this helps children, it doesn't really matter whether you agree that it will get worse. Apple should be held accountable if it does, since it's clearly quite predictable and they feel the need to play data ranger.

It's not just iCloud on or off, and stop pretending it is.
 
Alright, I believe it's my fault that I have contributed to the overuse of analogies to the point where we've all made them useless, as we're all trying to fix and adjust the analogies to suit our own arguments. Analogies are meant to simplify the situation, but instead we've all made it complex and they no longer fit the argument.

I was going to try to introduce another analogy alongside the "police station" analogy, like a bill counter/counterfeit detector, but I can see how that would just add more confusion.

Let's just leave it at that, but glad to see someone properly discussing this as opposed to some of the other people I've talked to on here. Cheers.

I’m guilty of this as well. I love a good analogy. But the one I attempted, while helpful for me, was a perfect example of an analogy getting more and more complicated the more I dug into it.
 
Rene Ritchie (of all people!) actually had an interesting solution. Instead of on-device scanning, Apple moves it to a cloud server that then relays it to iCloud, kind of like Private Relay. This way Apple can claim that they "never see your data" but all spying remains off device. This actually sounds like a decent compromise.
 
Not really. There is no excuse for any spying.
 
Obviously I would prefer that they stood up to the government and tried to implement E2EE in iCloud. Barring that, I would rather they scanned photos on their cloud than bundle spyware onto my device. But if Apple wants to keep claiming "we never see your data" and keep things off device as well, this would be one way to do it. I would still like to see them let researchers look at this proprietary "NeuralHash" technology, but making sure that spyware is not bundled in with iOS has to be the number one issue here. What Apple does with their cloud is (partly) their business. What I do on my iPhone is never their business.
 
I worry that ranking options implies they should have a choice. I stand firm that when they rent me space on their server they treat it as an extension of my device.
 
I'm a full-stack engineer and have set up image-processing pipelines on colocated servers as well as on AWS. I am without a doubt 100% sure it costs more.

Scanning every single photo uploaded by over a billion customers, distributed across many data centers, as opposed to having a billion CPUs already paid for and managed by users do it? It's an easy answer. Server-side scanning is more costly.
I don't think you're thinking this all the way through -- you're only counting one side of the cost. Those billions of phones have an energy cost too, and we users pay it -- that's what I've said -- and it'll end up costing more than your data centers (a lot more). That's one of my main points: they're making *me* pay for it and I don't benefit in any way from it. In fact, it's a threat, since there may be false positives.
 
I don't think you're thinking this all the way through -- you're only counting one side of the cost. Those billions of phones have an energy cost too, and we users pay it -- that's what I've said -- and it'll end up costing more than your data centers (a lot more). That's one of my main points: they're making *me* pay for it and I don't benefit in any way from it. In fact, it's a threat, since there may be false positives.
This is yet another really good point.
 
I worry that ranking options implies they should have a choice. I stand firm that when they rent me space on their server they treat it as an extension of my device.
I strongly believe they need to be ranked. I can choose not to use their servers (although they should make it easier not to use iCloud). I cannot choose not to have a CSAM hash blacklist installed in my OS, short of using another phone (which, if it comes to that, I will). Apple putting spyware onto servers that belong to them because of government pressure is disappointing. Apple putting spyware onto devices that belong to me is abhorrent.
 
There has to be a better solution than this. I don’t know what tech it would be, but there’s no reason to scan my phone.
 
I strongly believe they need to be ranked. I can choose not to use their servers (although they should make it easier not to use iCloud). I cannot choose not to have a CSAM hash blacklist installed in my OS, short of using another phone (which, if it comes to that, I will). Apple putting spyware onto servers that belong to them because of government pressure is disappointing. Apple putting spyware onto devices that belong to me is abhorrent.
You can choose to not use their servers, but you can't choose to be refunded for the cost of the servers. A cost that is factored into all hardware purchases. You paid for it when you purchased a device even if you would have paid the same amount for the device if it didn't include iCloud services.
 
And the cost of the device certainly includes the cost of developing many features I may never use. There is nothing wrong with that. How is this different?

Anyway, if you truly believe there is no functional difference between Apple scanning their own servers and scanning your own iPhone, then you have no reason to be outraged now: Apple has looked for CSAM on their servers before, as have all other cloud storage providers. So for you, nothing has changed.
 
I don't think you're thinking this all the way through -- you're only counting one side of the cost. Those billions of phones have an energy cost too, and we users pay it -- that's what I've said -- and it'll end up costing more than your data centers (a lot more). That's one of my main points: they're making *me* pay for it and I don't benefit in any way from it. In fact, it's a threat, since there may be false positives.

That makes no sense.

Calculate the "overhead" energy used on an energy efficient device for scanning photos vs server grade chips that suck energy whether or not there are photos to scan and you'll see servers would use more energy.

Not to mention the extra transfers between iCloud storage and processing servers while on device scanning requires ZERO extra transfers. And the overhead of keeping the server room cool with air conditioning, maintained by a team of DevOps/IT people at each of the several data centers around the world. Servers literally use more energy cumulatively compared to the overhead of extra energy from efficient phones. All of that extra cost gets factored into the iCloud bill people pay for every month which means the cost gets passed to the consumer. Unless you're surviving off of the 5GB free storage, you're paying for it either way. On device scanning is literally cheaper.

Want to save money? Charge from 0% to 100% at work. Free scanning energy.
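Since neither side has posted actual figures, here is a parameterized back-of-the-envelope sketch of the comparison. Every number in it (photo counts, joules per photo, PUE, transfer cost) is a made-up placeholder meant to be swapped for real measurements; it is not data from Apple or from anyone in this thread, it only shows which quantities the argument turns on.

```python
# Back-of-envelope sketch for the on-device vs. server-side energy debate.
# ALL figures below are illustrative assumptions, not measurements.
PHOTOS_PER_YEAR = 1_000 * 10**9        # assumed: ~1000 new photos/user/year x 1B users

J_PER_PHOTO_PHONE    = 0.05            # assumed energy to hash one photo on a mobile SoC
J_PER_PHOTO_SERVER   = 0.02            # assumed energy on a server chip, before overheads
SERVER_PUE           = 1.5             # assumed data-center overhead (cooling, power delivery)
J_PER_PHOTO_TRANSFER = 0.01            # assumed cost of moving a photo to a scanning tier

phone_total_kwh  = PHOTOS_PER_YEAR * J_PER_PHOTO_PHONE / 3.6e6
server_total_kwh = PHOTOS_PER_YEAR * (J_PER_PHOTO_SERVER * SERVER_PUE
                                      + J_PER_PHOTO_TRANSFER) / 3.6e6

print(f"on-device  : {phone_total_kwh:,.0f} kWh/year (paid by users)")
print(f"server-side: {server_total_kwh:,.0f} kWh/year (paid by Apple)")
# With these made-up inputs the two totals land in the same order of magnitude;
# plug in real measurements and the argument becomes as much about who pays as
# about how much is used.
```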
 
Calculate the "overhead" energy used on an energy efficient device for scanning photos vs server grade chips that suck energy whether or not there are photos to scan and you'll see servers would use more energy.

Maybe you could enlighten us on the numbers?
It sounds like you've done the calculations and it's informing your opinion?
 
Step 1
- the user activates iCloud Photos, basically surrendering his photos to Apple servers, like calling the police and saying “I am going to bring the whole content of my home to the local police station”
True. This is arguably not the perception of iCloud that Apple is going for with its privacy-focused advertising, but caveat emptor. Might change the public discussion of OS merits in the future though. Apple is often equated with privacy and seamless integration. The reality is, pick one.

Step 3
- said fingerprints are compared by a super smart trained AI to the fingerprints in the database
- the AI is needed not to look at the content of the picture (the content is no longer part of the equation since Step 2) but to have some leeway, some wiggle room to be able to catch slightly modified (cropped, etc.) versions of the known offending picture
- the system is engineered to only match the known offending old photos from the NCMEC repository, it can’t look for new/personal children-related content
"Super smart trained AI" - I work with state of the art machine learning models, and even the best of them make the occasional dumb mistakes, because ultimately it is a dumb method still far away from human thinking.

The system is looking at the content. The NeuralHash component (your step 2) works on "features of the image instead of the precise values of pixels," ensuring that "perceptually and semantically similar images" get similar fingerprints. Semantically similar, that is, content matching. NeuralHash analyses the image content. If it were only about matching slight modifications, perceptual similarity would be sufficient. NeuralHash does more. Thus the fingerprint is, among other things, a content summary. A lot depends on the detail here, which in turn depends on the undocumented features Apple is looking for and the undocumented weights and thresholds of the system. "Two pink shapes" is more generic than "two nude humans" is more generic than "two people having sex" is more generic than "a man having sex with a boy" is more generic than "a grey-haired man..." and so on. The more detailed this gets, the closer we get to pixel-perfect image comparison. We know Apple does not want that, so some level of genericness is preserved. Step 3 is comparing these image content summaries with the image content summaries from NCMEC.
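To make the "features instead of precise pixel values" idea concrete, here is a minimal sketch using a generic difference hash and Hamming-distance matching. This is not Apple's NeuralHash (which is a learned, undocumented model); it only illustrates how fuzzy fingerprint matching works in general, and the synthetic images and the 8x8 hash size are arbitrary choices for the demo.

```python
# Generic perceptual-hash sketch (a "difference hash"), NOT Apple's NeuralHash.
# It shows how a fingerprint built from coarse image features stays stable
# under small edits, while unrelated images land far apart.
import numpy as np

def dhash(gray: np.ndarray, size: int = 8) -> np.ndarray:
    """Sample the image down to a (size x size+1) grid and keep only the sign
    of horizontal gradients; returns size*size bits."""
    h, w = gray.shape
    rows = np.linspace(0, h - 1, size).astype(int)      # crude grid sampling,
    cols = np.linspace(0, w - 1, size + 1).astype(int)  # good enough for a sketch
    small = gray[np.ix_(rows, cols)]
    return (small[:, 1:] > small[:, :-1]).astype(np.uint8).ravel()

def hamming(a: np.ndarray, b: np.ndarray) -> int:
    return int(np.count_nonzero(a != b))

rng = np.random.default_rng(0)
original        = rng.integers(0, 256, size=(256, 256)).astype(float)
slightly_edited = original + rng.normal(0, 4, size=original.shape)   # mild noise
unrelated       = rng.integers(0, 256, size=(256, 256)).astype(float)

h0, h1, h2 = dhash(original), dhash(slightly_edited), dhash(unrelated)
print("edited vs original:   ", hamming(h0, h1), "differing bits out of 64")
print("unrelated vs original:", hamming(h0, h2), "differing bits out of 64")
# A "match" is declared when the distance falls below some threshold; exactly
# where that threshold sits is what decides how "semantic" the matching gets.
```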

Step 5
- the user uploads the photos to iCloud Photos just like HE promised to do in Step 1
- now and only now the company known as Apple Inc. is involved in any way
- at this time, Apple Inc. can do one thing and one thing only: count the positive-match security vouchers
- now 2 things can happen
1) the number of positive security vouchers is smaller than the threshold —> go to step 6a
2) the number of positive security vouchers is bigger than the threshold —> go to step 6b
True. The unspecified threshold is interesting, though. We know more than one matching picture is needed (so Apple won't do anything if they have one match, even if it is a perfect match, which is peculiar in its own right), but we do not know how many. Ten? Two?
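As a rough sketch of the counting-and-threshold branch described in Step 5: the threshold value and the voucher fields below are made-up placeholders, since the real vouchers are encrypted and the real threshold was not published at the time of this thread.

```python
# Toy sketch of the Step 5 branch: count positive-match vouchers per account
# and compare against a threshold. THRESHOLD and the voucher layout are
# illustrative placeholders, not Apple's values.
from dataclasses import dataclass

@dataclass
class Voucher:
    photo_id: str
    matched: bool   # in the described design this isn't readable per voucher;
                    # only the count becomes learnable once the threshold is met

THRESHOLD = 10      # placeholder; the real number was unspecified in the thread

def needs_human_review(vouchers: list[Voucher]) -> bool:
    positives = sum(v.matched for v in vouchers)
    return positives >= THRESHOLD   # over threshold -> Step 6b, else Step 6a

account = [Voucher(f"IMG_{i:04d}", matched=(i < 3)) for i in range(1_000)]
print(needs_human_review(account))  # False: 3 matches stay below the placeholder threshold
```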

Step 6a
- the security vouchers remain unreadable gibberish till the end of time, well after we are all dead
- not even Tim Cook, the Pope, God, Thanos with all the stones, etc. can crack their multi-factor encryption; it's like Grampa Abe Simpson's Hellfish unit treasure in that classic Simpsons episode: you need a set number of keys to open the vault. That's why the "threshold" system is not a policy decision that could be changed easily by Apple Inc. but a technical safeguard built into the system: no one could ever end up in Step 6b and Step 7 because of a single unlucky wrong match (or "false positive")
- Apple Inc. says that a good ballpark estimate of the chance of getting enough false positives to surpass the threshold is 1 in 1 trillion per year; some people dismiss this as "yeah, how do I know they're not being too optimistic," but it should be pointed out that Apple Inc. has given 3 external experts some access to the system, and that even if that quote were wrong by tenfold (1 in 10^11 instead of 1 in 10^12) it would still be an extremely rare event (one innocent account flagged every 117 years); moreover, the order of magnitude of said quote is perfectly plausible, since we're talking about the compound probability of multiple rare events (as an example, it would be easy to get to 1 in 10^12 as the compound probability of six 1-in-10^2 rare events)
First of all: Apple trusts its users so little that it suspects all of them of CSA, and it installs a black box into their personal property to check on them. To Apple, users are potential adversaries, who need to be checked and controlled. Information from Apple to its users must be read with this premise in mind. No claim from Apple should be taken at face value.

Your description of 6a assumes that all of this is perfectly implemented, without bugs or undocumented backdoors, and that the calculation is honest. There is no reason to make these assumptions. The trillion is hyperbole even under the most generous readings, as user accounts can differ by many orders of magnitude. External experts matter little - Apple picked them, and Apple has positioned itself as our adversary. There is no basis of trust to fall back on, not anymore. Apple needs to open-source this tool chain, so that we all can see what is going on in there.
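To make the compound-probability point from Step 6a concrete, here is a toy calculation. The per-image false-positive rate, yearly photo count, and match threshold are illustrative placeholders, not Apple's figures; the point is only to show how multiplying several rare events can reach "1 in a trillion" territory.

```python
# Where a "1 in a trillion accounts per year" style figure *could* come from:
# a toy binomial false-positive calculation with made-up inputs.
from math import comb

p = 1e-6        # assumed chance that a single innocent photo falsely matches
n = 20_000      # assumed photos one account uploads in a year
threshold = 6   # assumed number of matches needed before anything is decryptable

# P(at least `threshold` false matches out of n); the far-tail terms are
# negligible, so summing a few dozen terms past the threshold is plenty.
p_flagged = sum(
    comb(n, k) * p**k * (1 - p) ** (n - k)
    for k in range(threshold, threshold + 40)
)
print(f"P(account wrongly flagged in a year) ~ {p_flagged:.1e}")  # ~9e-14 with these inputs

# The quoted post's simpler point: six independent 1-in-100 events compound to
print(f"{0.01**6:.0e}")  # 1e-12, i.e. 1 in a trillion
```

Whether the real parameters actually look anything like these placeholders is exactly what the reply above disputes, since none of them have been independently verified.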


Step 7 - HUMAN REVIEW
- now and only now the positive security vouchers, no longer gibberish, can be looked at by a human reviewer at Apple Inc. HQ
- the human reviewer will be able to look at a low-res version of the user's supposedly offending photo
- if the low-res photo is something innocuous like a sunset, a bucket of sand, a cat, a goldfish, etc. (and remember: the matching is based on hashes, not content, so the flagged content won't necessarily be children-related; that's not the kind of similarity the AI would catch, so don't worry about the pics of your kids, they have no more probability of being accidentally flagged than any other subject), the human reviewer will acknowledge the system made an error and discard it, with no automatic calls to the cops
- if the low-res photo actually looks like actual kiddie p0rn (that's gotta be the worst job on Earth, and these reviewers are sometimes psychologically scarred), then Apple Inc. will disable your iCloud account and maybe report you or maybe not (depending on the follow-up internal investigation)
The matching is described as taking content into account.

Also, you left out option three - the low-res photo looks like, well, the reviewer is not sure. Is it CSA or not? Are all those people adults? Consenting adults? Might be hard to tell with the blur. Is this a picture of a barely dressed kid or a young adult? If the former, is that legal? The reviewers will have to make decisions that are not nearly as clear cut as you describe. If they decide that they cannot rule out CSA and they would rather have the experts take a look, then we get to...

Step 8 - NCMEC Review
Here all bets are off, as we do not know how this works. If the questionable pics are not variants from those in their database, then they should drop the case. The only damage is several strangers having looked at private pictures. If it is a match, off to the police. What if it is not a match, but the NCMEC reviewer thinks this might be a hitherto unknown case of CSA? Can they ask the police to investigate?
 
I will not claim to have read all the “tech experts” posts. However, I would like it if they, or even you, could point to an example of good software that has never failed. Software can have flaws, but the spirit/goal of what Apple is presenting is to make it without flaws and to fix them if they come up. I doubt they spent a few minutes writing some code without testing before they published their results. That's just not how it works in big companies.

I think your issue, and that of many others, is not that the software could be flawed but that you do not trust Apple, or anyone for that matter. I will be the first to admit I did not expect this from Apple. Obviously they feel it's a big enough issue to take all the bad publicity over it and still try to make it work.

I'm more optimistic in that I truly think they are trying to do exactly what they say they are trying to do. Also, 1 in 1 trillion? Yeah, I'll take those odds all day long, every year of my life. If I did get targeted I'd be okay, because I'm not in the market for what they are looking for, so it would be about as bad as getting a parking ticket or pulled over for having a tail light out. Which, by the way, is much more likely to happen, and I'd be pretty civil if it did. I'm not everyone, though, so everyone will feel differently.

As for some of the tech experts I have read so far (so, not all of them), I can't find a single one yet who can prove it will fail or that it will be abused. But hey, I can be wrong. I just don't think Apple is trying to be malicious at all on this. In this case it is society that has the dirty mind.

Thanks for taking the time to reply.

Of course software fails, but in this case we are pretty much talking about "spyware" intentionally installed on iPhones, running at the local level.

Many people are worried about the political implications and how governments will push Apple in the future. I admit I share the same concern. We have already seen how Apple stepped back in China and in Russia (I think).
Apple is a corporation, not a humanitarian organization.

Regarding trust: in general I don't trust companies, I just buy products that I believe are the best purchase at that specific moment. Right now I have invested in the Apple ecosystem and I don't even have the time or energy to bother moving my files somewhere else. I will do it if I have to.

On a personal level, Apple lost my trust (a lot!) when they started selling devices with iCloud photo stream ON by default, without asking the user's permission. Many people who had medical documents or sensitive material in their photo album found out that their files were in the cloud without their consent! But Apple, once again, knew better for all of us.

Regarding Apple's statistics: how many times have we heard about problems that affect "only a small number of users," where we can assume that the number is much higher?
 
This feature they are introducing doesn’t kill people
Objection!

It will be misused, that is 100% sure. Imagine what happens to dissidents in China, Russia, or your preferred authoritarian state if they are snitched out.

I believe the probability that this feature is going to get people killed (not to mention unlawfully imprisoned) is higher than the probability that it won't.
 