Honestly, I think the best way to get this through Apple's thick skull is for them to see a glimpse of the advertising campaign Samsung, LG, and other competitors could bring. It's easy to imagine narrowly targeted 30-second spots saying how much they value your device's security and that they would never install software on your device to spy on you... like Apple does. They could even end with a blurb that they support cloud scanning to combat child endangerment. That would be highly effective and devastating to Apple, because it would all be true. One bad apple ruins the bunch.
Samsung made fun of Apple for not including a charger… until Samsung stopped including a charger. I will let you decide for yourself if you think that pattern might be repeated.
 
You literally do have that choice. Don’t use iCloud sync for photos and nothing gets scanned.
Yes! Pinky promise!
[Image: Apple privacy billboard]
 
You do realize that Apple basically assumed every iPhone user, including yourself, is a potential pedophile. If you like paying a premium for that "privilege," suit yourself.
Apple, Google, Microsoft, Dropbox, etc. already make that assumption about all of their cloud users.
 
2. “Because a user’s photos stored in iCloud are end to end encrypted so that even Apple can’t access the data, NeuralHash instead scans for known CSAM on a user’s device...”

Quote from a TechCrunch article. Either I don't understand this correctly, or I do understand it and it is incorrect, or you are incorrect.
UPDATE: turns out I didn't understand it correctly. I have to admit that, Krizoitz. Still not enthused about this change, but if it can be avoided by not using iCloud (which I don't and never have), then at least there is an effective opt-out option.
 
2. “Because a user’s photos stored in iCloud are end to end encrypted so that even Apple can’t access the data, NeuralHash instead scans for known CSAM on a user’s device...”
Apple has the key to decrypt those iCloud photos.
 
Likely this was (Apple Board Director) Al Gore's brainchild; he's been obsessed with teenage sexuality since the 1980s. See Senator Gore in 1985 scrambling to censor Frank Zappa:

 
"A foreign government could, for example, compel a service to out people sharing disfavored political speech. That's no hypothetical: WeChat, the popular Chinese messaging app, already uses content matching to identify dissident material. India enacted rules this year that could require pre-screening content critical of government policy. Russia recently fined Google, Facebook and Twitter for not removing pro-democracy protest materials."

Without taking a side in this, none of the things in the above warning could be forced on Apple with what Apple says they're doing. Apple has a list of hashes, and they're checking images uploaded to iCloud to see whether they match those hashes. WeChat's content matching would require text content analysis — totally different. The India example appears to be the same kind of thing (or maybe a requirement for human pre-screening, which is more different still). And the Russia example is one where Russia identified posts or pictures and demanded they be removed, which is absolutely not the same thing.

So... this article seems to be people urging Apple not to proceed with its plans, based on warnings that have little to do with what Apple is actually doing.
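For what it's worth, the matching described above boils down to set membership on image digests. A toy sketch (using SHA-256 purely as a stand-in — Apple's NeuralHash is a perceptual hash whose details are not public, and these names are illustrative, not Apple's API):

```python
import hashlib

# Hypothetical blocklist: digests of known-bad images, supplied by a
# third party. The provider only ever sees digests, not image content.
BANNED_HASHES = {
    hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
}

def matches_blocklist(image_bytes: bytes) -> bool:
    """Return True if this image's digest appears in the blocklist."""
    return hashlib.sha256(image_bytes).hexdigest() in BANNED_HASHES
```

The point both sides of the thread are arguing about is who controls the contents of `BANNED_HASHES` — the mechanism itself can't tell a child-safety hash from any other hash.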
So what you’re saying is, there’s no way that China can make a database of banned images with their hashes (essentially what’s happening here, just with abuse organizations making the hash database) and then say “Apple, add these hashes or you can’t sell here”. Yup, no way the Chinese government can come up with such a simple strategy and implementation.

Child abuse is obviously horrible, but is the incidence rate remotely high enough (especially since, currently, simply disabling iCloud defeats it) to be worth risking these other things? Not even remotely, in my opinion.
 
I hate the intrusion and violation of privacy, despite the good intentions, because we all know it will be abused for something other than the original public purpose.

I wonder if this is more Apple protecting itself, as perhaps some government entity has come after them for unknowingly hosting illicit child images/videos in the cloud.

That makes little difference. At least in the USA.
The law specifically states that Apple cannot be forced to scan for CSAM except via subpoena or warrant. Once it is found, they have a legal responsibility to turn it over to the authorities.
Is it on iCloud? Likely. Just look at the other cloud providers that elect to scan for it.
 
Some people are afraid of the government, some aren't.

What people are forgetting is that this CSAM stuff has already been implemented by multiple vendors!
 
Samsung made fun of Apple for not including a charger… until Samsung stopped including a charger. I will let you decide for yourself if you think that pattern might be repeated.
For sure you're right, but that won't make me go any easier on the originator. I'll eventually have to come to terms with the idea of Big Brother being real.
 
Why would anyone dismiss the word "could" in today's world? If it's an advantage to anyone, "could" is the same as "will," and in Apple's case it becomes "compelled."
If you want to believe the sky is falling and reflexively assume will, feel free.

I won’t.
 
What people are forgetting is that this CSAM stuff has already been implemented by multiple vendors!
A lot of the people complaining here would be fine with Apple scanning their cloud servers (as the other vendors do).
 
Could. And in the end that's what it boils down to. Could.

and...

"Apple has strongly pushed back against such a possibility, saying it will refuse any such request from governments."

Some will trust Apple's assertion. Others won't.

Those who don't should VOTE WITH THEIR WALLETS. Of those who don't trust Apple, I suspect a very tiny number will step up and follow through. That requires courage.

Apple is straight up lying. They already self-censor material in Taiwan that would offend China even though it technically isn't illegal in Taiwan and China has not outwardly demanded it. Apple will absolutely, without any question whatsoever, use this technology to breach privacy in any country that mandates they do so by law. There is zero chance they would defy a nation like China or India in the name of privacy.
 
I just REALLY wish people would stop the (false) hot take "oh but Google and MS do it too!!!"

There's a HUGE difference between voluntarily using an online service (an informed decision and a choice) and an OS update on your device that you have no say in.

That is the nuance here. I would have no issue if they scanned iCloud and I KNEW they were doing it. You consent to that and can choose another service. Same with Google and MS.

Have you watched Craig's interview with the WSJ? He told everybody you can easily disable the CSAM scanning by not using iCloud Photos. Simple as that. You can also keep the rest of iCloud going; the scanning applies only to your iCloud Photos library.

PS: I am not defending Apple here, just trying to understand whether there is a real problem and what its scope is.
 
Have you watched Craig's interview with the WSJ? He told everybody you can easily disable the CSAM scanning by not using iCloud Photos. Simple as that. You can also keep the rest of iCloud going; the scanning applies only to your iCloud Photos library.

PS: I am not defending Apple here, just trying to understand whether there is a real problem and what its scope is.
That won’t help you if the photo scanning becomes the norm and is used as a back door to look at everything else on your phone without your knowledge or consent.
 
Apple is straight up lying. They already self-censor material in Taiwan that would offend China even though it technically isn't illegal in Taiwan and China has not outwardly demanded it. Apple will absolutely, without any question whatsoever, use this technology to breach privacy in any country that mandates they do so by law. There is zero chance they would defy a nation like China or India in the name of privacy.

Straight up lying? That's quite an assertion. Please post your proof.
 
That won’t help you if the photo scanning becomes the norm and is used as a back door to look at everything else on your phone without your knowledge or consent.

It is still an assumption, isn't it? Meanwhile, GrayKey (a hardware box) and, more recently, Pegasus (software) were successfully used to extract just about anything from an iPhone. Apple patched those security holes very quickly.
 
That's the challenge. Apple says they "won't" and Apple says they "follow all applicable laws".
In this case they can't do both.
Which do they pick?
$$$$$
As a shareholder I'm totally fine with AAPL picking $$$$$, that's in my best interest, and what they're supposed to be doing.

Enabling warrantless search capabilities for the governments of planet Earth is really not part of building insanely great products.

Should've stuck with: we can't, that's not what we do; please get back to us when authoritarian government has expanded to the point of "you must do this." We're not quite there yet, but big tech colluding with big government is enabling 1984 a whole lot faster.
 