Yes, once 30 matches are found, Apple uses those 30 safety vouchers to unlock the photos. They then run through a server-side perceptual hashing system that compares them against a totally different database that resides only on Apple's servers, and if they somehow make it through that system as well, they go to human review for further confirmation.
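To make that order of operations easier to picture, here's a minimal sketch (purely illustrative Python with made-up names and structure, not Apple's code) of the three stages: threshold check, second independent server-side hash, then human review.

```python
# Toy sketch of the server-side flow described above -- hypothetical structure,
# not Apple's implementation.

MATCH_THRESHOLD = 30

def review_account(positive_vouchers, second_hash_db):
    """The three stages from the quoted summary, heavily simplified."""
    # Stage 1: nothing can be decrypted until the match threshold is exceeded.
    if len(positive_vouchers) < MATCH_THRESHOLD:
        return "below threshold -- vouchers remain undecryptable"

    # Stage 2: the visual derivatives are re-checked against a second,
    # independent perceptual hash kept only on the server (the safeguard
    # against images adversarially crafted to collide with NeuralHash).
    confirmed = [v for v in positive_vouchers
                 if v["derivative_second_hash"] in second_hash_db]
    if len(confirmed) < MATCH_THRESHOLD:
        return "second hash did not confirm -- no action"

    # Stage 3: only now are the visual derivatives referred to human reviewers.
    return "refer visual derivatives to human review"

# Example: 30 vouchers whose derivatives all re-match the server-side hash set.
db = {"h%d" % i for i in range(30)}
vouchers = [{"derivative_second_hash": "h%d" % i} for i in range(30)]
print(review_account(vouchers, db))   # -> refer visual derivatives to human review
```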

I got all of this information directly from Apple. https://www.apple.com/child-safety/...del_Review_of_Apple_Child_Safety_Features.pdf

"Once Apple's iCloud Photos servers decrypt a set of positive match vouchers for an account that exceeded the match threshold, the visual derivatives of the positively matching images are referred for review by Apple. First, as an additional safeguard, the visual derivatives themselves are matched to the known CSAM database by a second, independent perceptual hash. This independent hash is chosen to reject the unlikely possibility that the match threshold was exceeded due to non-CSAM images that were adversarially perturbed to cause false NeuralHash matches against the on-device encrypted CSAM database. If the CSAM finding is confirmed by this independent hash, the visual derivatives are provided to Apple human reviewers for final confirmation. The reviewers are instructed to confirm that the visual derivatives are CSAM. In that case, the reviewers disable the offending account and report the user to the child safety organization that works with law enforcement to handle the case further." -- Page 13

Thanks. I wanted to ensure I was reading it correctly.

So this functionality, once the vouchers are sent to Apple, uses a function to create a totally separate hash of a different type from the one given to Apple by the Center. So is this second "Center" giving Apple a new set of hashes, or the actual photos? I suspect hashes, as it would be illegal for Apple to have or use the images unless "blessed" by the DOJ.
 
  • Like
Reactions: MozMan68
I'll just give one example of why this kind of site is ridiculous (which, if you really look at it, is nothing more than an attempt to sell dumb t-shirts... there is no new information here...).

He uses the example of the kitten hashed image matching that of the dog hashed image... as if that means anything other than the fact that people can create these different images to "match". Yes, create them from a known image. They never talk about the chances that a random image you may have might match a known hashed image. It's so close to impossible that they never discuss it... just that it CAN happen. And of course, they show a dog and a cat... no chance those could be the same. Does he show you the hashes or how they were created to match? Does that process actually match what Apple is doing? Hmmm... no detail there for some reason.
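To put a rough number on "so close to impossible": suppose, purely as an illustration (this is an assumed figure, not one Apple has published), that any given random photo has a one-in-a-million chance of falsely matching some hash in the database. The odds of a 20,000-photo library accidentally crossing the 30-match threshold are then astronomically small:

```python
from math import exp, factorial

# Illustrative assumptions only: the per-image false-match rate and library
# size are made up; the 30-image threshold is the one Apple has described.
p = 1e-6        # assumed chance a random photo falsely matches a known hash
n = 20_000      # photos in a hypothetical library
lam = n * p     # expected number of false matches (Poisson approximation)
threshold = 30

# Probability of at least `threshold` accidental matches in one library.
tail = sum(exp(-lam) * lam**k / factorial(k)
           for k in range(threshold, threshold + 40))
print(f"P(>= {threshold} false matches) ~ {tail:.2e}")   # on the order of 1e-84
```

Apple's own headline figure is roughly a one-in-a-trillion chance per account per year of incorrectly flagging an account; the point is simply that requiring 30 independent matches makes accidental flags effectively impossible.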

Have you seen, read, or heard of ANY image NOT in the CSAM database being matched with a CSAM image, much less 30?

Why not? Because that database is about as secure as (if not more secure than) your own phone, much less iCloud.

Just because something CAN be done does not mean it is even remotely possible in this particular case.

Yes, someone COULD get access to the CSAM database...and then, grab 30 of those horrible images so they could create 30 innocent images with matching hashes....and then they COULD break into your phone somehow and upload those images....and then they COULD be flagged so Apple reviews them...only to find out what? That they aren't true matching images? You would never even know. It would be easier to go to child porn sites and grab HUNDREDS or THOUSANDS of images hoping they are known and then just upload those to your phone somehow (of course with no way of anyone tracking that you didn't do that).

It would be easier for me to find out where you live, grab a bunch of illegal stuff, put it in your house, and simply call the cops on you. I guarantee you'd get in trouble for that. Why do people even consider the MINUTE chance that someone would or could use this system for anything other than catching predators/child pornographers? It simply makes no sense in the grand scheme of things.

I'll stick to my stance that this is about nothing more than trust. If, for whatever reason, you don't trust Apple anymore, there are other options out there. If you don't want to use iCloud, there are other options out there. If you want to have your stuff more secure and "private" than with any other phone manufacturer or service provider out there, stick with Apple IMHO.

Did you check out some of the links?
The thwart test aside, there is some good information in there.

What it does though is add concern over Apple's silence. This has morphed from "Good solution. Let's see what needs to be fixed." into "Prove to us this works just like you claimed and cannot be used nefariously."
 
  • Like
Reactions: Pummers
Did you check out some of the links?
The thwart test aside, there is some good information in there.

What it does though is add concern over Apple's silence. This has morphed from "Good solution. Let's see what needs to be fixed." into "Prove to us this works just like you claimed and cannot be used nefariously."
Yeah...most of those links were provided over the past few weeks in other threads and even this one.

To me, Apple has proven their side as best they can.

The issue I have is that not one person against this (from a tech side, not a philosophical side) has provided a single piece of real evidence showing that this particular system will be hacked/used/accessed/whatever in any way, beyond stating that it COULD be.

The proof will be in what happens once it is active. And then there will be crickets...;)
 
Yeah...most of those links were provided over the past few weeks in other threads and even this one.

To me, Apple has proven their side as best they can.

The issue I have is that not one person against this (from a tech side, not a philosophical side) has provided a single piece of real evidence showing that this particular system will be hacked/used/accessed/whatever in any way, beyond stating that it COULD be.

The proof will be in what happens once it is active. And then there will be crickets...;)

I suspect it will be quiet once launched; however, if it is ever used anywhere in this fashion, we will have a fiasco.

I still wonder why Apple is inserting itself into the "visual check" process instead of handing this over to the Center. Not only is it NVA, but if they are looking for potential prosecution, it mucks things up.

Odd....
 
  • Like
Reactions: Pummers
I'm sorry, but using iCloud Photo is not a "right"...
Did I suggest somewhere it was?

... and does not cripple anything other than easier access to your photos across Apple devices or via iCloud on the web.
Thank you for conceding my point :)

There are hundreds of different options for backing up photos that do not involve iCloud, and many are available as apps on the phone that make accessing your photos across devices practically as easy as using iCloud (Amazon Photos, Flickr, Google Photos, etc.). Opening those is hardly any more effort than opening the Photos app native to Apple devices.
Making Apple less useful, much less necessary, overall--another of my points :)

At the end of the day, you don't trust Apple because of the way they approached this, ...
Almost correct. Please re-read what I wrote in the post to which you followed-up.

At the end of the day, it is optional and Apple will continue to scan all of your photos for dogs, cats, flowers, POI's and so forth.
There's one significant difference between that and CSAM-scanning. Can you guess it?

Also, outside of those of us who care way too much about Apple and these things, I predict this will disappear from people's minds ...
You're arguing a non sequitur.

I asked my wife what she thought, and she had no clue what he was talking about (and didn't care). hah!
Mine, OTOH, was appalled. As were several other non-techies to whom I mentioned it. Their responses all ran along the lines of "They're going to do what?!?!" (Interestingly, none of them, save my wife, was an Apple user. Hmmm...)

I hate to say it, but 99% of the population will also not care and go about their lives. Yes, I do think they should "care", ...
Again: A non sequitur. What the madding crowd does or does not do has no impact whatsoever on my personal choices. Nor has it any bearing on my arguments. Besides...

"If fifty million people say a stupid thing, it is still a stupid thing." -- Anatole France
 
Did I suggest somewhere it was?


Thank you for conceding my point :)


Making Apple less useful, much less necessary, overall--another of my points :)


Almost correct. Please re-read what I wrote in the post to which you followed-up.


There's one significant difference between that and CSAM-scanning. Can you guess it?


You're arguing a non sequitur.


Mine, OTOH, was appalled. As were several other non-techies to whom I mentioned it. Their responses all ran along the lines of "They're going to do what?!?!" (Interestingly, none of them, save my wife, was an Apple user. Hmmm...)


Again: A non sequitur. What the madding crowd does or does not do has no impact whatsoever on my personal choices. Nor has it any bearing on my arguments. Besides...

"If fifty million people say a stupid thing, it is still a stupid thing." -- Anatole France

Maddening crowd? Again, horrible comparison to what I was stating.

You can’t be mad about something you don’t know about, something you do know about but don’t care about, or something that ends up not affecting your life in any way.

You could argue about that last one to your point, but that would be your opinion, not based on any facts supporting what MIGHT or COULD happen with this addition to iOS 15.
 

Here it is, it's the first time I've heard it explained like that.

Good video.
Some info in here I am unsure of and it requires further digging; however, it does possibly answer a nagging gap I kept running into about how this hash can recognize changes. This feeds into the Messages parental check for kids.

Interesting.
 
You could argue about that last one to your point, but that would be your opinion, not based on any facts supporting what MIGHT or COULD happen with this addition to iOS 15.
Sorry, meant to address this, too.

That wasn't meant to be taken literally in context, but as a way to point out that just because such-and-such number of people do or say a certain thing, that doesn't make it right or correct. E.g.: <mom>If Johnny jumps off a bridge, will you jump off a bridge, too?</mom>
 
*tsk* Shall we do a word count of our respective posts to this thread to see which of us has the higher average word count?
Ah, touché. Don't feel bad though… I had something of an unfair advantage, with half-a-dozen people repeatedly directing their arguments and questions at me. You'll be pleased to know however that I'm tiring of the discussion and will give you a chance to catch up soon.

No. I'm never "ok" with anybody pawing through my stuff, at any time, or for any reason, without cause. (In legal terms: Probable cause, or at least reasonable articulable suspicion [RAS].)
The point I made to which you're responding does not speak at all to the presumption of innocence.
Are you a lawyer? You sound like one. One who likes to quibble over words. If you can't acknowledge any connection between the presumption of innocence and the onus on the prosecution to establish probable cause, then I'll wager you're more interested in scoring points than finding common ground and having a truly constructive discussion.

Or maybe I'm just jealous because I'm not a lawyer and don't really understand all those big words. :confused:

Freedom is actually pretty easy and not "nuanced" at all. Claiming freedom is "nuanced" is code talk from a statist for "Here's this delightful new way we're going to infringe on your freedom, but you shouldn't be upset, because it's for the good of society." (Which, in turn, is really code meaning for their good.)
Bottom line: Freedom means allowing others to do what they want, even if it's something you don't appreciate or with which you don't agree. … It is properly constrained only by "Your right to swing your fist stops at my nose." Thus I'm free to pursue my interests as I see fit, so long as they don't infringe on the rights of others to do the same.
I do envy the simplicity of your world—the clarity with which you see these things. You see, I'm trying to put each of these real-life situations in the appropriate category using the J Seymour Binary System™️ (JSBS): 'free to do', or 'not free to do'. Would you be a good sport and help me out?

  1. Jane owns a dog. She keeps him locked up in her backyard when she goes to work during the day. She owns the dog and the backyard. So I think I know this one: free to do right? (Oh, I forgot to mention that the dog barks all day when she is out. Her neighbour Bob works from home. The constant barking infuriates him and he finds it very hard to concentrate.)
  2. Bob, driven to distraction by the barking, turns his music up loud. He owns his house, and the stereo system, and he streams the music legally from Apple Music. Pretty sure I got this one: free to do! (The music bothers the dog intensely, which adds to the anxiety it already feels from being abandoned by Jane all day. But it's just a dog.)
  3. Bob likes to de-stress at the end of the day by taking a long bath. Unfortunately, he often forgets to close the curtains, and Jill, his other neighbour, has seen things she can't un-see more times than she cares to remember. I need help with this one, as it's Bob's home and you know… Bob's fist (and all his other body parts) stop well short of Jill's nose—they are fully contained within his house at all times.
  4. Jill has decided to erect a wall to block her view of Bob's house (and other… err… belongings). The wall will be fully contained within her property. She decides not to run it by her local council, as she's not a believer in 'those meddling bureaucrats'. (Besides, her uncle Barry is on the council, so she figures if anyone complains, she'll get Barry to pull some strings.) She hires a good engineer who assures her the wall will be structurally sound. Free to do?
  5. Bob hates the wall because it reminds him of all those Mexican drug dealers, criminals and rapists. He doesn't want a confrontation with Jill though, so he decides to paint his side of the wall blue, with some fluffy white clouds, and pretend it's not there. He figures Jill can't see his side of the wall anyway and will never know.
  6. Bob buys his paint from Garry. Bob suspects that Garry gets his merchandise illegally, but he doesn't know for sure, nor does he want to know. The paint is significantly cheaper than the paint at his local hardware store, and Bob needs the extra cash as he's saving up for the iPhone 13. (There are also rumors that Garry sells drugs to children, but that's all they are—unfounded rumors. Bob really doesn't want to go poking around in other peoples' stuff to see if maybe they're doing something with which he disagrees.)
I think if you can show me which JSBS slot to put each of these situations in, that'll really help me to see where you're coming from!
 
  • Like
Reactions: MozMan68

Here it is, it's the first time I've heard it explained like that.
Thanks for posting. I've watched it now, and while it was sensational viewing, I think you guys would do well to apply a liberal dose of that good old skepticism of yours.

Rob Braxman states (with assumed authority) that Apple and Google don't care at all about user privacy, only profit. That's a cynical position, and a little ironic coming from someone who appears to make his money selling products that promise to protect you from them. (i.e. He has a commercial incentive to convince you of this message.) Apple may or may not really care, but keep in mind, Apple makes its money from selling hardware and services, not from selling advertising and user data. What's the commercial incentive for Apple to compromise the privacy of its users?

Now, regarding the AI component of Apple's NeuralHash technology, which is really the meat of Rob's presentation… Yes, Apple uses AI on your device as part of the hashing process, which can be verified by reading Apple's CSAM Detection Technical Summary for ourselves (something I really ought to have read before engaging seriously in this discussion). But here's where he seems to go off the rails. He asserts (starting around 15:40) that Apple isn't really using a single hash for each image at all, but rather, they are using the AI to identify a whole collection of individual characteristics for each image and hashing them separately. He supposes these characteristics would include things like the faces of abused children (through facial recognition), body positions, nudity, and environmental context. Where is his evidence for this? Towards the start of the video, he makes it sound like his suspicions were confirmed by reading Apple's technical data, but later on (around 10:30) he talks like it's more of a conspiracy, postulating about 'something else at work here, something that's underneath the covers that we are not seeing, and they're not telling'. It honestly sounds to me like he's just making stuff up.

Here's what Apple's Technical Summary actually says about the AI component of NeuralHash:

The neural network that generates the descriptor is trained through a self-supervised training scheme. Images are perturbed with transformations that keep them perceptually identical to the original, creating an original/perturbed pair. The neural network is taught to generate descriptors that are close to one another for the original/perturbed pair. Similarly, the network is also taught to generate descriptors that are farther away from one another for an original/distractor pair. A distractor is any image that is not considered identical to the original.

In other words, machine learning was used to train and refine the algorithm to see 'perceptually identical' images (images which are essentially the same, but which might have different resolutions, minor cropping, colour differences, etc) as very similar, and substantively different images as very different. (While the whole system is very complex, that much is conceptually pretty simple and logical, right?) Finally, those numbers are passed to the hash function which is designed to output an identical hash for perceptually identical images.
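For the conceptually minded, the training scheme Apple describes there is essentially a contrastive (triplet-style) objective. Here's a toy NumPy sketch of the idea (my own illustration, not Apple's model): the descriptor of an original and its perturbed copy should end up close together, while the original and an unrelated "distractor" should end up far apart. The hashing step then quantizes those descriptors so that close-enough descriptors collapse to the same hash.

```python
import numpy as np

# Toy illustration of the quoted training idea -- not Apple's NeuralHash.
# Random vectors stand in for the image "descriptors".

def triplet_loss(original, perturbed, distractor, margin=1.0):
    d_pos = np.linalg.norm(original - perturbed)    # should be small
    d_neg = np.linalg.norm(original - distractor)   # should be large
    return max(0.0, d_pos - d_neg + margin)         # zero once the gap is big enough

rng = np.random.default_rng(0)
original = rng.normal(size=128)
perturbed = original + rng.normal(scale=0.01, size=128)  # "perceptually identical"
distractor = rng.normal(size=128)                        # unrelated image

print(triplet_loss(original, perturbed, distractor))     # ~0.0: pair close, distractor far
```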

Okay, but I know you're all reluctant to trust what Apple says. So let me ask you this… Why would Apple do it Rob Braxman's way? It's more complex. It requires the storing of multiple hashes. Heck, it would require that Apple have hoodwinked the NCMEC into storing hash tables multitudes bigger than claimed (or else they're in on it too!), or, as Rob actually claims (around 16:30), Apple has full possession of the database of CSAM images!

None of this makes sense, unless… Apple's real game is to build some kind of nefarious Big Brother surveillance system, the likes of which the world has never seen, and it starts right here folks! Well… they could be… I suppose. They could have been working on it for years. Where's the evidence? Good question. But you see, if you take Rob's word for it that this is how the technology works, that's ample evidence for the conspiracy, and the conspiracy is ample evidence that this is how the technology must work. Makes perfect sense once you start thinking that way.

Apple devs and execs must be banging their heads against that curved glass wall in frustration right now. Well, either that, or out of sheer terror that their evil plans have been foiled again by those dratted kids on MR!

The video is kind of still worth watching, because if nothing else, it makes you stop and think about all that other surveillance tech that is built into our phones to make our lives easier—you know, to show us where we left our phone, our keys, our friends… to give us live traffic reports and all the rest of it. Whether Apple goes ahead with this or not, they (and the same goes for Google) could build the stuff of nightmares if they really wanted to. It's a sobering thought.
 
With those thoughts, I'll sign off now. These days I usually know better than to get caught up in these debates, but I entered this one because it's too important an issue for the discussion to remain so one-sided. Thanks to everyone who kept it civil. Cheers.
 
With those thoughts, I'll sign off now. These days I usually know better than to get caught up in these debates, but I entered this one because it's too important an issue for the discussion to remain so one-sided. Thanks to everyone who kept it civil. Cheers.

Thanks.

For me, at the end of the day, regardless of the impact (min to max), if Apple wants to scan for “illegal” stuff, they need to perform those activities off my device(s).
 
  • Like
Reactions: Pummers and Schismz
Are you a lawyer? You sound like one. One who likes to quibble over words.
Nope, though I do love the law. (Were it not for my age and lack of motivation, I might be inclined to take up law.) I am something of a wordsmith, though.

If you can't acknowledge any connection between the presumption of innocence and the onus on the prosecution to establish probable cause, then I'll wager you're more interested in scoring points than finding common ground and having a truly constructive discussion.
Not at all. But the point I had been making was not one that related to the presumption of innocence. It was related specifically to the right to be left the hell alone. They are related, but not the same issues.

E.g.: I can presume you're innocent of "borrowing" my hedge clippers, but suspect you're not, and improperly invade your privacy to satisfy myself as to whether my presumption of your innocence is justified.

I do envy the simplicity of your world—the clarity with which you see these things. You see, I'm trying to put each of these real-life situations in the appropriate category using the J Seymour Binary System™️ (JSBS): 'free to do', or 'not free to do'. Would you be a good sport and help me out?
I like you :) You're funny :)

Ok, let us pretend I'm Emperor of the World for Life (which would only be right and proper if the universe made sense).

  1. Jane owns a dog. She keeps him locked up in her backyard when she goes to work during the day. She owns the dog and the backyard. So I think I know this one: free to do right? (Oh, I forgot to mention that the dog barks all day when she is out. Her neighbour Bob works from home. The constant barking infuriates him and he finds it very hard to concentrate.)
This one's easy: The dog, being Jane's property, is an extension of Jane's will. Thus Jane is responsible for the dog's behavior. The dog's incessant barking is, metaphorically, "Jane swinging her fist into/past Bob's nose." Jane is obliged to redress the offense.

  2. Bob, driven to distraction by the barking, turns his music up loud. He owns his house, and the stereo system, and he streams the music legally from Apple Music. Pretty sure I got this one: free to do! (The music bothers the dog intensely, which adds to the anxiety it already feels from being abandoned by Jane all day. But it's just a dog.)
Trickier. Pets aren't commonly considered as having "rights" under common law. But, in these enlightened times (and I use the word "enlightened" advisedly), we do recognize animal cruelty. Plus Bob is clearly exacerbating the situation. Unless he's also annoying his neighbors in the process (see response above), this comes down to two things: 1. Whether one recognizes animals have rights, too, and 2. Bob doesn't have "clean hands," because he's clearly exacerbating the very problem from which his complaint arises.

  3. Bob likes to de-stress at the end of the day by taking a long bath. Unfortunately, he often forgets to close the curtains, and Jill, his other neighbour, has seen things she can't un-see more times than she cares to remember. I need help with this one, as it's Bob's home and you know… Bob's fist (and all his other body parts) stop well short of Jill's nose—they are fully contained within his house at all times.
At first blush (see what I did there?) this would seem to be a difficult one, but it's really not. Bob is not obviously exposing himself on purpose. He is within his own domicile. I would say this one is clearly on Jill. Easy fix for her: Don't look.

An analogy, if I may. On a sailing trip many years ago, at two different times, each of two different girls inadvertently (though the other guys were all, "Yeah, right, they did that accidentally") exposed themselves to me. Both cute girls and quite shapely. In each case, as soon as the exposure impinged upon my consciousness, I looked away. In one case, I called the girl's attention to what had happened. Not that I wanted to, but that was the way I was raised.

Same principle I apply to TV shows and movies: If it offends you, don't watch it.

(N.B.: I expect the law would disagree with me on this one.)

  4. Jill has decided to erect a wall to block her view of Bob's house (and other… err… belongings). The wall will be fully contained within her property. She decides not to run it by her local council, as she's not a believer in 'those meddling bureaucrats'. (Besides, her uncle Barry is on the council, so she figures if anyone complains, she'll get Barry to pull some strings.) She hires a good engineer who assures her the wall will be structurally sound. Free to do?
The cruft about zoning ordinances, inside influence and the like aside: Jill is perfectly within her rights to erect said wall.

  5. Bob hates the wall because it reminds him of all those Mexican drug dealers, criminals and rapists. He doesn't want a confrontation with Jill though, so he decides to paint his side of the wall blue, with some fluffy white clouds, and pretend it's not there. He figures Jill can't see his side of the wall anyway and will never know.
Bob does not have the right to paint Jill's wall without Jill's consent.

  6. Bob buys his paint from Garry. Bob suspects that Garry gets his merchandise illegally, but he doesn't know for sure, nor does he want to know. The paint is significantly cheaper than the paint at his local hardware store, and Bob needs the extra cash as he's saving up for the iPhone 13. (There are also rumors that Garry sells drugs to children, but that's all they are—unfounded rumors. Bob really doesn't want to go poking around in other peoples' stuff to see if maybe they're doing something with which he disagrees.)
This is more a question of personal ethics than the law or individual rights. If Bob can sleep soundly with the decisions he's made... ¯\_(ツ)_/¯

(N.B.: The law may disagree with my position. E.g.: "Possession of stolen property," "aiding a criminal enterprise," etc.)

I think if you can show me which JSBS slot to put each of these situations in, that'll really help me to see where you're coming from!
There ya go. Hope this helps :D
 
  • Love
Reactions: kalsta
Thanks.

For me, at the end of the day, regardless of the impact (min to max), if Apple wants to scan for “illegal” stuff, they need to perform those activities off my device(s).
How would that improve things though? Honestly, in what way is that better than the current system?

And don't just say "because it's not on my phone" because that's not really an answer.
 
  • Like
Reactions: kalsta
How would that improve things though? Honestly, in what way is that better than the current system?

And don't just say "because it's not on my phone" because that's not really an answer.

What we know: Apple wants to limit CSAM material on iCloud.
Current broad-use solution by cloud providers: scan said cloud for CSAM.
Solution for Apple: scan iCloud like the other providers.

What I am not seeing is why the proposed Apple solution is better or needed.
 
  • Like
Reactions: Schismz and Pummers
What we know: Apple wants to limit CSAM material on iCloud.
Current broad-use solution by cloud providers: scan said cloud for CSAM.
Solution for Apple: scan iCloud like the other providers.

What I am not seeing is why the proposed Apple solution is better or needed.
But why is it worse than having all of your photos unencrypted in the cloud?

Apple's new implementation gives you more privacy, not less.
 
  • Like
  • Haha
Reactions: kalsta and Pummers
But why is it worse than having all of your photos unencrypted in the cloud?

Apple's new implementation gives you more privacy, not less.

What?
So placing an object on each of my devices to scan for illegal content and, if any is found, report me to the authorities is more private?

No. It is not.
Looks more like a direct invasion to my personal privacy.
 
What?
So placing an object on each of my devices to scan for illegal content and, if any is found, report me to the authorities is more private?

No. It is not.
Looks more like a direct invasion to my personal privacy.
So you'd rather have all your photos out in the open and being scanned for the same exact content and reported to the same authorities?

At least on your device, Apple doesn't know about any of your photos until there are 30 matches. Then they only have access to those 30 photos vs. having access to all your photos.

Plus, if you don't want your device(s) scanned, don't use iCloud Photo Library (which would be the same exact thing if they only scanned on the server).

Again, it's MORE private, not less.
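The "Apple learns nothing until there are 30 matches" property comes from threshold secret sharing. Apple's actual construction is more involved (it layers this together with private set intersection), but a toy Shamir split, sketched below with made-up numbers, shows the principle: any 30 shares reconstruct the key, while 29 reveal nothing. Roughly speaking, as I read the technical summary, each positive voucher carries one share of a per-account key, so the server can only open the inner layer once the threshold is crossed.

```python
import random

# Toy Shamir secret sharing -- illustrates the threshold idea only, not Apple's
# actual construction.
PRIME = 2**61 - 1  # field modulus

def split(secret, n_shares, threshold):
    # Random polynomial of degree threshold-1 with the secret as constant term.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n_shares + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 recovers the constant term (the secret).
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

key = 123456789  # stands in for the per-account decryption key
shares = split(key, n_shares=50, threshold=30)
print(reconstruct(shares[:30]) == key)  # True: 30 shares recover the key
print(reconstruct(shares[:29]) == key)  # False: 29 shares yield garbage
```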
 
  • Like
Reactions: kalsta
Oh, OK, that makes more sense. If Apple was always doing this sort of scanning, then what's different here? Well, on-device vs. in the cloud, which seems like a small difference.

My understanding is that Apple was not always scanning for CSAM in iCloud. They were able to decrypt iCloud for law enforcement, but they weren't simply scanning all photos. My understanding is this is a new program AND it's going to run live on our devices and scan every photo.

But yes, if Apple had been scanning iCloud for CSAM for years, then I can see your confusion around why this picked up so much attention now.
The difference is exactly that.

When I upload my stuff to iCloud, I would expect that it be scanned. I'm making a "conscious choice" to store my stuff OFF of my device. Folks have been using Facebook as a comparison... when you post something to Facebook, you're making a conscious choice to share it with the public. If you're stupid enough to be in possession of, and publicly post those images, you get what you deserve at that point.

Here's the difference; with local scanning, it's no longer my conscious choice. There's a very fine line between "We'll only hash it if you try to upload it to iCloud" and "We'll hash it anyway, even if you don't upload it to iCloud, because being in possession of the image is illegal, so we'll just take a peek to see if you're in possession of them..." Do you see how there's just a frog-hair difference between "Scanning when you upload" and just "Scanning"?

And, once again, it opens up a back door into our personal devices. Not to pick on China, but they've already "demanded the keys" to iCloud for Chinese users. Now, how much more of a stretch is it for the Chinese government to say "Well, CSAM is illegal in the United States... so pictures of (whatever) is illegal in China... turn on scanning for that. Oh, and we don't care if they upload it or not, the dissidents must be stopped! So just report to us if they have those images on their device at all...."

And, once again, since China is Apple's second biggest market, their government can have a LOT of persuasion in that department.

It's not about catching Pedos. It really isn't. Castrate them. It's about a very slippery slope and precedent that Apple is setting.

And before someone says "So, don't use iCloud"... well, I *PAY* for iCloud... This isn't what I signed up for.
 
  • Like
Reactions: Pummers and dk001
Here is something I ran across and it provides some very interesting information.

snip
”According to media reports, the cloud computing industry does not take full advantage of the existing CSAM screening tools to detect images or videos in cloud computing storage.9 For instance, big industry players, such as Apple, do not scan their cloud storage. In 2019, Amazon provided only eight reports to the NCMEC, despite handling cloud storage services with millions of uploads and downloads every second. Others, such as Dropbox, Google and Microsoft perform scans for illegal images, but 'only when someone shares them, not when they are uploaded'.”

This raises even more questions about why Apple went with this solution.
 
The difference is exactly that.

When I upload my stuff to iCloud, I would expect that it be scanned. I'm making a "conscious choice" to store my stuff OFF of my device. Folks have been using Facebook as a comparison... when you post something to Facebook, you're making a conscious choice to share it with the public. If you're stupid enough to be in possession of, and publicly post those images, you get what you deserve at that point.

Here's the difference; with local scanning, it's no longer my conscious choice. There's a very fine line between "We'll only hash it if you try to upload it to iCloud" and "We'll hash it anyway, even if you don't upload it to iCloud, because being in possession of the image is illegal, so we'll just take a peek to see if you're in possession of them..." Do you see how there's just a frog-hair difference between "Scanning when you upload" and just "Scanning"?

And, once again, it opens up a back door into our personal devices. Not to pick on China, but they've already "demanded the keys" to iCloud for Chinese users. Now, how much more of a stretch is it for the Chinese government to say "Well, CSAM is illegal in the United States... so pictures of (whatever) is illegal in China... turn on scanning for that. Oh, and we don't care if they upload it or not, the dissidents must be stopped! So just report to us if they have those images on their device at all...."

And, once again, since China is Apple's second biggest market, their government can have a LOT of persuasion in that department.

It's not about catching Pedos. It really isn't. Castrate them. It's about a very slippery slope and precedent that Apple is setting.

And before someone says "So, don't use iCloud"... well, I *PAY* for iCloud... This isn't what I signed up for.
Wait. Where did you read that your photos are scanned when iCloud Photo Library is disabled? If you don't upload photos, nothing is scanned and no vouchers are generated or sent to Apple. Why the paranoia?

Edit: Haha, in one breath you say it's okay for your stuff to be scanned because you make the conscious choice to upload your content to Apple, then in another breath you say you won't stop using iCloud because you pay for it. Well, if you use iCloud, then you agree to have your photos checked for CSAM. Where's the problem here? Either agree to have your photos checked or don't use iCloud Photo Library... same as it's always been. Hahaha
 
Why is it alright to scan in the cloud but not on your phone? It is still a warrantless search if done in the cloud, yet Facebook, Google, and Microsoft all do it. If you use any of their cloud services, you are being searched already. Worse, no online photo provider offers E2E encryption, because of this need to make sure they are not hosting CSAM.

I see this as a gateway for Apple to be able to finally enable E2E encryption on their iCloud backups because they can reassure authorities that they are not hosting any CSAM material since the scan happens just before upload to the cloud. Then once your photos are there, nobody can access them.

Also, the only material that Apple and the authorities would have access to is the photos flagged as CSAM, and only after you meet the threshold can they be decrypted for review by Apple. With cloud scanning, Apple and other service providers have full access to ALL of your photos all of the time.

I feel there are just too many people who don't understand the technology spreading misleading or false information, be it on purpose or through ignorance.
You're incorrect on it being a warrantless search. They don't need a warrant, because you agreed to their EULA, which is like inviting the police freely into your home.

If the police knock at your door and ask to come in, you have a choice. You can invite them in, or you can tell them to come back with a warrant. If they are invited in, and they see pot sitting on your table in the dining room, you can be arrested. They didn't need a warrant; you invited them in.

The EULA does that. Invites them in.

Plus, a warrant only applies to law enforcement. Whether or not Apple is acting on behalf of law enforcement, or as an agent of law enforcement, and therefore under the 4th amendment, is something that will have to be determined in court. HOWEVER.... when you agree to the EULA, that is what they will produce. Apple's argument will be "The user AGREED to have us scan their stuff. 4th amendment warrant language doesn't apply, because they invited us in."
 
  • Like
Reactions: hans1972