I don’t want ANY part of the scan on my phone. I don’t want any scan AT ALL, ANYWHERE.
It’s a technicality.
It comes with some perks (your pics not being routinely bothered once they’re on server in an encrypted state).
 
No it won’t. The scanning occurs when you upload photos onto iCloud, by creating a voucher that accompanies the uploaded file. If you want to prevent scanning, you don’t even have to turn off your internet connection - just turn off iCloud photo sync.
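The flow described here - scanning only as part of the iCloud upload path, with a voucher attached per file - can be sketched in miniature. Everything below is an illustrative assumption (the function names, the `SafetyVoucher` structure, and the SHA-256 stand-in for NeuralHash are all hypothetical), not Apple's actual API:

```python
# Hedged sketch of the upload-time flow: a voucher is produced only when a
# photo is uploaded to iCloud, so disabling iCloud Photos disables scanning.
# All names and structures here are illustrative assumptions.
import hashlib
from dataclasses import dataclass


@dataclass
class SafetyVoucher:
    image_hash: str          # stand-in for a perceptual hash
    encrypted_payload: bytes  # stand-in for the encrypted match metadata


def make_voucher(image_bytes: bytes) -> SafetyVoucher:
    # SHA-256 is only a placeholder; the real system uses a perceptual
    # hash (NeuralHash), which tolerates small image alterations.
    h = hashlib.sha256(image_bytes).hexdigest()
    return SafetyVoucher(image_hash=h, encrypted_payload=b"<encrypted>")


def upload_to_icloud(image_bytes: bytes, icloud_photos_enabled: bool):
    if not icloud_photos_enabled:
        return None                      # no upload, no voucher, no scan
    voucher = make_voucher(image_bytes)  # voucher created only on upload
    return (image_bytes, voucher)


# With iCloud Photos off, nothing is hashed and nothing leaves the device.
assert upload_to_icloud(b"pic", icloud_photos_enabled=False) is None
```

The design point being argued in the post is exactly the early-return branch: the voucher is a side effect of the upload, not a background process.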

By the way, even if scanning WAS done on your device when the internet connection is off, it wouldn’t be “worse” as you claim. The results would, in that case, not go anywhere, and simply be sitting in the encrypted file store on your phone, where they cannot possibly hurt you.
My understanding from Craig.....

"...Craig Federighi, said that the on-device nature of Apple's CSAM detection method, compared to others such as Google who complete the process in the cloud..."
 
So what happens when Apple comes back with a "clean report." Do the people who are worried about the slippery slope stop worrying?
Nope, by definition the slippery slope and “anything could happen” arguments are unfalsifiable

 
If it can be openly demonstrated Apple did implement CSAM checks with privacy in mind, as they claim, we can put this discussion to rest.

Good one! 🤣 You know, if we could demonstrate that the earth is a sphere, we can put all the flat earth discussions to rest too.

Unfortunately, some people will discount any evidence that goes against their conspiracy theory (or their misunderstanding . . . or their purposeful twisting of the truth) - often by claiming there was a payoff to falsify data/findings, etc.
 
Sadly, as expected, users will just roll over and accept it no matter what Apple is found doing. The public has a short memory. That does not make it any less wrong. It is still an appalling decision which should be rescinded.
This is the kind of comment that makes it look like you have CSAM on your phone.
 
In my opinion Apple will soon be putting spyware on millions and millions of iPhones, so why should I pay more for Apple products if privacy will be the same as or worse than other companies'? Goodbye Apple; hello debloated Windows and an Android phone…
 
It’s a technicality.
It comes with some perks (your pics not being routinely bothered once they’re on server in an encrypted state).
You don't know that they won't be scanned on iCloud too. Unless you have a quote of Apple saying that -- I'm willing to listen on that issue.
 
That's the same nonsense as saying "if you have nothing to hide ..... "

Like you I don’t like people going “you sound like you have CSAM if you say this”.

On the other hand I also feel weird reading the phrase “not for my own benefit”…well…in a roundabout way it’s kinda to any decent person’s benefit (anyone who cares about abused children) if the life of CSAM collectors is made slightly harder.
 
On the other hand I also feel weird reading the phrase “not for my own benefit”…well…in a roundabout way it’s kinda to any decent person’s benefit (anyone who cares about abused children) if the life of CSAM collectors is made slightly harder.
I have no kids to worry about, and it's not my job to worry about CSAM. It really doesn't benefit me at all, and I own my device, I don't own what's in iCloud. (even if it's my stuff)
 
Good one! 🤣 You know, if we could demonstrate that the earth is a sphere, we can put all the flat earth discussions to rest too.

Unfortunately, some people will discount any evidence that goes against their conspiracy theory (or their misunderstanding . . . or their purposeful twisting of the truth) - often by claiming there was a payoff to falsify data/findings, etc.
Wow, this is a new one. So you are associating those who are concerned with their privacy with flat earthers.

Yup, if you don't have anything else to argue with, start calling names.
 
Like you I don’t like people going “you sound like you have CSAM if you say this”.

On the other hand I also feel weird reading the phrase “not for my own benefit”…well…in a roundabout way it’s kinda to any decent person’s benefit (anyone who cares about abused children) if the life of CSAM collectors is made slightly harder.
The last bastion of anyone wanting to push this sort of thing through: "what about the children??" CSAM is abhorrent. That does not justify searching people's property without good reason. Please don't try to peddle the crap that people who care about privacy don't care about abused kids.
 
Good one! 🤣 You know, if we could demonstrate that the earth is a sphere, we can put all the flat earth discussions to rest too.

Unfortunately, some people will discount any evidence that goes against their conspiracy theory (or their misunderstanding . . . or their purposeful twisting of the truth) - often by claiming there was a payoff to falsify data/findings, etc.
It might be that Apple was being a good corporate citizen to fight CSAM, a truly horrific plague. It might be for privacy. The one thing it ain't is required by law.....

In summary, 18 U.S.C. § 2258A clearly states a company must act on "actual knowledge". Moreover, a company is "not required to monitor" and "not required to affirmatively search, screen or scan files and data".

So, the question is why?
 
On a sociological aspect I find this CSAM scandal interesting.

It’s apparent that many people (myself included) see the governments of the world as antagonistic , and saw Apple as an ally against that antagonistic force. People (rightfully) feel betrayed by Apple’s actions.

This raises the question of why we still live under governments we view as repressive. Why did we put our trust in a private entity (one always beholden to the governments of the markets it operates in) to protect us from our own country?
 
It's too late for me; I had the Apple One Family Premium subscription and used over 1TB of my storage space for photos & documents. I've removed everything from iCloud, disabled iCloud Drive and backups on all my devices, and downgraded my plan to the lowest tier.
Such a bummer - I've heavily invested in the Apple ecosystem. All of my family's devices are Apple products (there are 7 of us), but iCloud will be restricted to Calendar & Contacts for us going forward.

I've lost my faith in Apple, after 30 years for me....
 
The last bastion of anyone wanting to push this sort of thing through: "what about the children??" CSAM is abhorrent. That does not justify searching people's property without good reason. Please don't try to peddle the crap that people who care about privacy don't care about abused kids.
Except I didn’t say that.
I just said that the sentence “not for my own benefit” sounds weird to me personally.
And literally 2 posts above yours we have a user that (legitimately) does not super duper care about abused children if they’re not his own, sooo…looks like some people in “your camp” literally consider it of zero personal benefit to fight CSAM as a society.
 
The vouchers include a “visual derivative” of the image — basically a low-res version of the image.
Apple does not have the actual CSAM images - they cannot. Only NCMEC is allowed to have the actual images.
You're really not seeing the contradiction there?

OK, let's take "basically a low-res version of the image" to refer to the user's image, not the NCMEC's illegal-to-possess (even in 'low res' form) image. How are they going to "check" that? They can't compare it with the NCMEC's original (which they aren't allowed to have) and they can't compare it with the user's image (because the whole point of this is that they don't need to 'open' your photos).

What they can do - maybe - is rule out the possibility that they've just found 11 identical copies of the same photo. That's good - it goes some way to avoiding the "1 match is chance, 11 matches is child abuse" fallacy - but it's fairly limited: their "NeuralHash" system is designed to still match images that have been cropped, scaled, or altered in quality, and if these "visual derivatives" are detailed enough to spot that 11 images are different crops/enlargements/colorizations of the same source, how are they better than Apple just looking at your photos?
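The kind of fuzzy matching being argued about can be sketched generically. This is a toy average-hash compared under a Hamming-distance threshold - the standard perceptual-hashing technique - and is NOT Apple's NeuralHash, which is a neural network whose internals differ from this sketch:

```python
# Minimal sketch of threshold-based perceptual-hash matching.
# Toy average-hash only; Apple's NeuralHash is a different (neural) function.

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hash values."""
    return bin(a ^ b).count("1")


def average_hash(pixels: list[list[int]]) -> int:
    """Toy 'average hash': one bit per pixel, set if above the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    h = 0
    for p in flat:
        h = (h << 1) | (1 if p > mean else 0)
    return h


def matches(h: int, database: list[int], max_distance: int = 4) -> bool:
    """A hash 'matches' if it is within max_distance bits of any DB entry."""
    return any(hamming_distance(h, d) <= max_distance for d in database)


# A slightly brightened copy of the same image hashes to the same value,
# which is exactly why such systems survive mild alterations.
original = [[10, 200], [30, 250]]
brightened = [[14, 205], [33, 255]]
assert average_hash(original) == average_hash(brightened)
```

The tension the post points at lives in `max_distance`: the looser the threshold, the more crops and re-encodes it catches, and the more information the "derivative" must carry for a human to verify the match.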

Even if your account is — against those one in a trillion odds, if Apple’s math is correct

Well, that's the question. Look up Sally Clark and Prosecutor's Fallacy for examples of how easily highly qualified people, acting in good faith can get conditional probabilities badly wrong - especially when dealing with emotive subjects.

The stuff you quoted could easily describe a system that just rubber-stamped matches against the database and only protected against cases such as an iPhone being hacked or man-in-the-middled to inject fake vouchers - which is good, of course, but that's a very remote risk compared to Apple getting their 'false positive' math wrong or innocent images getting into the official (third-party and not available for examination) database. A system like this really needs to be examined by independent experts who are being paid to aggressively pick holes in it - not to produce reassuring technobabble.
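The conditional-probability trap mentioned above (the prosecutor's fallacy) can be made concrete with a toy Bayes calculation. Every number below is an illustrative assumption for the sake of the arithmetic, not Apple's claimed figure:

```python
# Toy Bayes calculation illustrating the prosecutor's fallacy.
# All probabilities below are made-up assumptions for illustration only.

def posterior_innocent_given_flag(p_flag_given_innocent: float,
                                  p_flag_given_guilty: float,
                                  base_rate_guilty: float) -> float:
    """P(innocent | flagged) via Bayes' theorem."""
    p_innocent = 1.0 - base_rate_guilty
    p_flag = (p_flag_given_innocent * p_innocent
              + p_flag_given_guilty * base_rate_guilty)
    return (p_flag_given_innocent * p_innocent) / p_flag


# Assume: a 1-in-a-million chance an innocent account trips the threshold,
# a 99% chance a guilty account is flagged, and 1 in 10 million accounts
# actually guilty. Even with those generous numbers, roughly 9 out of 10
# flagged accounts are innocent.
p = posterior_innocent_given_flag(1e-6, 0.99, 1e-7)
print(f"P(innocent | flagged) ≈ {p:.2f}")
```

The point is that when the base rate of guilt is tiny, even a very small per-account false-positive rate means most flags are false - which is why "one in a trillion" claims need independent scrutiny, not just restating.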

Or, there is a far, far simpler solution:

NeuralHash, running locally on this iPhone, has detected that the image you are trying to upload matches a known illegal CSAM image. Although false matches are possible, iCloud will not permit the upload of any image which fails NeuralHash screening. We recommend that you delete the image. No information about this incident has left your iPhone. To prevent future scanning, disable iCloud Photos and upload your filth to Google Drive instead, because we hate them.

Problem solved. Innocent user is protected against false matches or getting trojanned into downloading CSAM, Apple don't get anything they might be obliged to report on their servers, evil paedophile... is probably already encrypting their photos on an Android burner phone and wasn't going to get caught by this anyway.
 
Nope, by definition the slippery slope and “anything could happen” arguments are unfalsifiable

Yeah, and anyone who claims the NSA collected data on millions of Americans is a CONSPIRACY THEORIST.
Oh. Wait. https://web.archive.org/web/2015090...ames-r-clapper-interview-with-andrea-mitchell
 