They framed the conversation by trying to focus it on "Well, we're just scanning against known hashes of material, so don't worry, your personal photos are safe" (i.e., on the technical details of the scanning implementation they'll be releasing at launch) rather than letting people ask why any form of scanning should be happening in the first place.

Bingo!
And way too much focus on the CSAM part of things -- which, of course, is hard to get anyone to be "against".

The specific content they are scanning for (CSAM in this case) isn't really relevant.
The tool to do this - on users' devices - is the problem.

(I know you know this - just stating for clarity on my point in case it's read in isolation)
 
So your argument is that Apple should not scan for CSAM at all? That becomes more a moral argument than a privacy argument.
If you want to be a diode - thinking in only one direction - about this case, I will give you this question: do you want to completely give up privacy in exchange for this phantom “safety”, knowing you will never be charged over illegal material, even though “legal or illegal” changes from time to time?
 
Looking at the issue from both sides simply shows how ugly humanity has become.

Every tool can and will be used for nefarious purposes. No need for a step-by-step guide. No need to nitpick wording and argue otherwise.

Also, if a private company like Apple goes all in on mass surveillance, the government would more than likely offload its legal burden onto private companies and change the law to take full advantage of that.

It’s just unfortunate that there is no way for Apple or other companies to walk back this level of surveillance, and all Android manufacturers will follow very quickly, ramping up the surveillance war while every single customer becomes a victim sooner or later.

And I fear no amount of PR damage will cause Apple to roll back this surveillance software. They have released multiple articles showing their commitment, with no sign of backing down. And any sales drop will be very minor (if it happens at all) compared to last year, given that most parents would rejoice anyway.

The next thing to look into is how bad this can get once the balance is already lost. Apple will be fine either way, and most Apple users (willing or not) will have no choice but to offer unconditional surrender.
 
I think it is far from guaranteed to stay. A private company searching private devices at will must raise a lot of legal questions in many countries.

I still don't understand why they are doing it this way. Respect for and protection of privacy was a key part of Apple's image. Doing the same scanning on their own US servers would have taken away most of the pain. But this is like planting a bug in your device, and even putting material from an outside database of bad content on your phone.
 
So your argument is that Apple should not scan for CSAM at all? That becomes more a moral argument than a privacy argument.
CSAM is evil. But it’s not the only evil thing in this world. Today they scan for CSAM. Tomorrow they scan for terrorist activities. The day after tomorrow they scan for murder plots. Who could possibly be against that? Where does it stop?

The same applies to what they scan. Today it is your photos. Tomorrow your videos. The day after tomorrow your text messages. Where does it stop?

It’s not even the government doing this but a private unaccountable corporation that cares about nothing but profit.
 
It’s like you’re asking, “Why should the search for stolen property happen at the store where they think they saw you steal it instead of at your home and without a warrant?”

Of course, stores could almost completely eliminate theft if they routinely searched everybody leaving the store - rather than hoping that a member of staff would spot the theft - but (leaving aside the question as to whether it would be legal in your jurisdiction) that would be highly visible, slow and expensive and almost certainly cause a backlash from customers.

The problem is, when you do something analogous in tech, the intrusion is often invisible, fast and relatively cheap... and with most tech products now requiring you to accept a T&C document the size of War and Peace, it is trivial to get people to unwittingly waive their legal rights (...or at least bluff them into thinking that they have)...

Also, don't get too misty-eyed about protecting kids: if they're just checking against hashes of known CSAM then it's pretty trivial to modify an image so it won't have the same hash (...failing when the material has been modified is exactly what hash matching does - try to make it more "fuzzy" and you increase the risk of false positives) - this is really just about helping Apple prove that they've performed reasonable "due diligence" in case known CSAM shows up on their servers. If this were some magic bullet that was going to decimate child abuse then maybe it would be worth risking a bit of privacy for - but I doubt it will put a dent in sophisticated, organised child abuse.
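To illustrate that fragility point, here's a minimal Python sketch. This is not Apple's NeuralHash (which is a proprietary perceptual hash designed to tolerate some modifications); it uses plain SHA-256 to show why naive exact-hash matching misses any altered copy:

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Exact cryptographic hash: any change to the input yields an unrelated digest."""
    return hashlib.sha256(data).hexdigest()

original = b"pretend these are the bytes of an image file"
modified = bytearray(original)
modified[0] ^= 0x01  # flip one bit: visually identical image, brand-new hash

print(sha256_hex(original))
print(sha256_hex(bytes(modified)))  # completely different digest
```

Perceptual hashes close that gap by design, matching images that merely look similar - which is exactly where the false-positive risk mentioned above comes from.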
 
Of course, stores could almost completely eliminate theft if they routinely searched everybody leaving the store - rather than hoping that a member of staff would spot the theft - but (leaving aside the question as to whether it would be legal in your jurisdiction) that would be highly visible, slow and expensive and almost certainly cause a backlash from customers.
Except this feels a little more like an item having a theft-detection tag on it that sets off the detection system as you walk out of the store.
 
You’re trivializing this. It isn’t some weird QAnon theory about the Illuminati. These are very real concerns about scenarios that are extremely plausible. Apple isn’t your friend. If they believe it is in the best interest of the company to flip the script on this, they will in a second. But let’s assume they are telling the truth and they wouldn’t do anything like that.

The next guy in charge might feel very differently. Apple today is not the same company it was 10 years ago, and it won’t be the same 10 years from now. Companies merge and get bought and sold all the time, and long-standing company policies get erased in the process. Remember when Google had “Don’t be evil” as part of its code of conduct, until some new guys came in and removed it?
In addition: the next US government might be totally different (again) from the present one. The same applies to other democracies (albeit perhaps on a longer time frame). So it's delusional to think that because everything looks good at the moment, it has to stay that way. And if things change, nobody would want a digital tool (weapon?) pointed at people formerly regarded as "innocent folks". But by then the tool will already exist and will not perish. Too late then...
 
Marketing is appearance and perception.
Past experience (with the US Government and other nation states) says otherwise to that “nothing”.

IMHO, YMMV.
So you trusted all of their marketing before, but now you suddenly don't?

This is what I don't get. People SUDDENLY think Apple is evil for doing something they've been doing for a while now, and for some reason expect their entire phone to start scanning and uploading data, even though Apple clearly said it's only for CSAM, only for iCloud Photos, and that if iCloud Photos is turned off, no scanning is done whatsoever. I don't know how much clearer they can be about it.
 
So you trusted all of their marketing before, but now you suddenly don't?

This is what I don't get. People SUDDENLY think Apple is evil for doing something they've been doing for a while now, and for some reason expect their entire phone to start scanning and uploading data, even though Apple clearly said it's only for CSAM, only for iCloud Photos, and that if iCloud Photos is turned off, no scanning is done whatsoever. I don't know how much clearer they can be about it.
Scanning always occurs on device...it just doesn't matter if you are not saving to iCloud.

Still irrelevant to me.

Apple tracks device movement for traffic info (every device is defaulted to this setting and I would take a safe guess that 95%+ of users not only leave this on, but couldn't find the spot to turn it off anyway).

They do the same for improving Maps/routing...and in the same spot to turn off.

They scan on-device already and have for years, with privacy measures in place not nearly as progressive as what they have for this, yet practically no one on here that I can see ever posted multiple threads about how Apple is illegally searching their phone.

The last time I checked, speeding and being in a certain place at a certain time could certainly be "illegal" depending on the circumstances, and your phone already sends that info to them (or certainly has that capability). This is no different and has even more security added... and no, it is not a "back door." People need to stop using that term, as it implies that someone else could have easy access into people's phones via this tool, and that is simply not the case... there are already easier ways to do that, hah!
 
You’ve already said in other threads you still plan on getting an iPhone 13. If you still give Apple your money, why would they change course? Talk is cheap.
Hoping they don't implement it and still buying an iPhone aren't mutually exclusive.

Unless he said he WON'T continue to give them their business if they continue with the scanning (I don't think he did).
 
So you trusted all of their marketing before, but now you suddenly don't?

This is what I don't get. People SUDDENLY think Apple is evil for doing something they've been doing for a while now, and for some reason expect their entire phone to start scanning and uploading data, even though Apple clearly said it's only for CSAM, only for iCloud Photos, and that if iCloud Photos is turned off, no scanning is done whatsoever. I don't know how much clearer they can be about it.

Don’t read into it.
I don’t trust Apple’s marketing. I admire what they have pulled off; however, they have had some missteps. Overall they are good at getting the message out.

I am looking at the process and trying to understand the “how” and “why”. Apple’s explanations to date have neither answered my questions in full nor given me that warm and fuzzy trust feeling.

As for the “Evil”: what I, and apparently many others, including professionals with a far better understanding than mine, are seeing and asking is why the apparent shift from “Your device is private” to “Your device is private except for …”. From there, based on past experience regarding the US Government (in my case), I can easily envision what they would do with an access point like this.
 
Is this true? News reports and documentation contradict this.
They WERE scanning all photos in iCloud according to one report (no different than every other company out there), but this new method does the scan (hash comparison) on device for every image you have. If there is a match, it marks the image. If more than one marked image is uploaded to iCloud, Apple reviews the images to verify they are a true match.

At the end of the day, every image you have on your phone is compared to the hashed image database. Nothing happens though unless there are multiple matches AND those images are uploaded to iCloud.

And yes, as Apple has stated, the chance that multiple innocent images match multiple separate database images and even trigger a review by Apple is one in one trillion.
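For what it's worth, the flow described above can be sketched in a few lines. This is a toy model only: Apple's real system uses a proprietary perceptual hash (NeuralHash), blinded hash databases and threshold secret sharing, none of which appears here. SHA-256, the threshold value and all names are illustrative stand-ins to show the control flow:

```python
import hashlib

MATCH_THRESHOLD = 2  # illustrative only; not Apple's actual threshold

# Stand-in for the hashed database shipped as part of the OS image.
known_hashes = {
    hashlib.sha256(b"known-csam-image-1").hexdigest(),
    hashlib.sha256(b"known-csam-image-2").hexdigest(),
}

def should_trigger_review(uploaded_photos: list[bytes]) -> bool:
    """Count marked (matching) images among photos actually uploaded to
    iCloud; nothing happens below the threshold."""
    marked = sum(
        1 for photo in uploaded_photos
        if hashlib.sha256(photo).hexdigest() in known_hashes
    )
    return marked >= MATCH_THRESHOLD

uploads = [b"holiday-photo", b"known-csam-image-1", b"known-csam-image-2"]
print(should_trigger_review(uploads))  # True: multiple marked images uploaded
```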
 
They WERE scanning all photos in iCloud according to one report (no different than every other company out there), but this new method does the scan (hash comparison) on device for every image you have. If there is a match, it marks the image. If more than one marked image is uploaded to iCloud, Apple reviews the images to verify they are a true match.

At the end of the day, every image you have on your phone is compared to the hashed image database. Nothing happens though unless there are multiple matches AND those images are uploaded to iCloud.

And yes, as Apple has stated, the chance that multiple innocent images match multiple separate database images and even trigger a review by Apple is one in one trillion.
The contradiction is that it's been reported that CSAM is not scanned for if iCloud Photos is disabled. Ergo, in this context, it does matter if you are saving to iCloud.
 
The contradiction is that it's been reported that CSAM is not scanned for if iCloud Photos is disabled. Ergo, in this context, it does matter if you are saving to iCloud.
That is incorrect as far as I know.

The photos are scanned regardless...it just doesn't matter if you have iCloud turned off.
 
That is incorrect as far as I know.

The photos are scanned regardless...it just doesn't matter if you have iCloud turned off.

So if iCloud Photos is disabled, the system does not work, which is the public language in the FAQ. I just wanted to ask specifically, when you disable iCloud Photos, does this system continue to create hashes of your photos on device, or is it completely inactive at that point?

If users are not using iCloud Photos, NeuralHash will not run and will not generate any vouchers. CSAM detection is a neural hash being compared against a database of the known CSAM hashes that are part of the operating system image. None of that piece, nor any of the additional parts including the creation of the safety vouchers or the uploading of vouchers to iCloud Photos, is functioning if you’re not using iCloud Photos.
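Taken at face value, that answer describes a hard on/off gate in front of the whole pipeline. Here's a minimal sketch of that control flow, under the assumption that the quote is accurate; every function name is a hypothetical stand-in, not Apple's code:

```python
import hashlib

def neural_hash(photo: bytes) -> str:
    """Stand-in for Apple's proprietary NeuralHash."""
    return hashlib.sha256(photo).hexdigest()

def make_safety_voucher(photo: bytes) -> dict:
    """Stand-in for safety-voucher creation."""
    return {"digest": neural_hash(photo), "size": len(photo)}

def process_library(photos: list[bytes], icloud_photos_enabled: bool) -> list[dict]:
    # Per the quote above: with iCloud Photos off, NeuralHash never runs
    # and no vouchers are generated or uploaded.
    if not icloud_photos_enabled:
        return []
    return [make_safety_voucher(p) for p in photos]

print(process_library([b"some photo bytes"], icloud_photos_enabled=False))  # []
```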

 
As for the “Evil”: what I, and apparently many others, including professionals with a far better understanding than mine, are seeing and asking is why the apparent shift from “Your device is private” to “Your device is private except for …”.
That seems to be the salient point. The practical implications of this specific issue are almost non-existent, but the principle of installing software on your device to scan your files on behalf of a third party - with promises not to expand its use - crosses an important line in the sand.
 
That seems to be the salient point. The practical implications of this specific issue are almost non-existent, but the principle of installing software on your device to scan your files on behalf of a third party - with promises not to expand its use - crosses an important line in the sand.
There is no proof at all that Apple is doing this....
 