Although they are stored on Apple’s servers in encrypted form, this is done using a generic encryption key that Apple has access to. With the method Apple was attempting to implement, they would be encrypted with a key that Apple DOESN’T have access to.
As far as I know, you can also view photos in a browser, for example by sending someone a link. How would those photos still be encrypted? And the link supposedly isn't protected by a password.

I don't use iCloud. I don't use iCloud Photos. For good reason.
 
Correct. They are encrypted in a way that allows them to be decrypted whenever Apple (or a governmental entity that requests them) wants them to be.
The police need a warrant to do that. In order to get that warrant they have to explain to an independent judge why they want that warrant and what they’re looking for…and the judge has to agree with that. You seem to be awfully concerned about your stored photos being subject to a warranted search? If you’re that concerned then maybe you should be encrypting them yourself using PGP, etc., or just not using online photo storage in the first place?
 
You still don't get the point. It's not about whether or not someone has something to hide.
Let me give you a couple of parallel examples.
Car companies install a chip in your car that calls the police if you're speeding. That's OK, right, as long as you NEVER go over the posted speed limit...

The government wants to install a camera IN YOUR HOUSE. They promise they'll never look at your wife in her panties... it's just there in case you beat your kids. Why should you worry, right? Everyone who works for the government is trustworthy and would never just "check in" on your camera, right? Until the government decides it can also use the cameras to make sure you're never engaging in anal sex... because it IS still illegal in many states. Is that what you signed up for?

Do you see why Apple's on-device scanning technology is a bad idea now? It doesn't matter whether you're doing something bad or not... it's the fact that they're engaging in constant surveillance, and the fact that it can be EASILY altered to scan for other things. Got a picture of a rebel flag? Uh-oh... you're part of a white supremacist hate group. Got a BLM meme? Oh, you naughty boy. Got a picture of Winnie the Pooh in China? Expect a midnight visit...
I'm not objecting to your post, but what about AI/object recognition? That has been a thing on our devices for several years now. Why aren't people all up in arms about that? I mean, everything mentioned here could potentially be identified with AI/object recognition... and even more effectively. How do we know Apple isn't secretly doing that and simply not surfacing the data to us? They are obviously already using this method to identify nudes for the Communication Safety feature.

Looking for a rebel flag? Just recognize the flag using AI and find all photos containing it. Looking for guns? Just recognize things that look like guns using AI and find all photos containing them. There'd be no need to hash known photos and then try to find only the specific images that are visually similar enough to produce a matching hash.
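To make the distinction concrete, here's a minimal, hypothetical sketch of hash matching. Apple's NeuralHash is a proprietary neural perceptual hash, so the 64-bit values, function names, and threshold below are my own illustrative assumptions, not Apple's API. The mechanism is the point: matching can only flag near-copies of images already on a known-hash blocklist; it has no way to "look for" arbitrary concepts the way an object classifier can.

```swift
import Foundation

// Hypothetical sketch: matching a photo's perceptual hash against a
// fixed blocklist of known-image hashes. Apple's real NeuralHash is a
// proprietary neural model, not a plain 64-bit hash.
func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
    // Count the bits where the two hashes differ.
    (a ^ b).nonzeroBitCount
}

func matchesKnownImage(photoHash: UInt64,
                       blocklist: [UInt64],
                       maxDistance: Int = 4) -> Bool {
    // Flags only images whose hash is close to a known hash; there is
    // no notion of "flag", "gun", or "protestor" anywhere in here.
    blocklist.contains { hammingDistance($0, photoHash) <= maxDistance }
}
```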
 
Better still, this was bound to fail from the start: you'd only need one bad actor feeding Apple's system the wrong hashes, and everyone becomes a potential suspect for whatever that bad actor wants to silence, be it criticism, dissent, protestors in Hong Kong, or LGBT minorities in certain regions, you name it. Also, as an EU citizen, I'm glad the system Apple proposed wouldn't have been allowed here anyway because of the strong protections in our GDPR privacy laws.
Except the system can’t search for “protestors” or “LGBT”. You misunderstand how it works fundamentally.
 
As far as I know, you can also view photos in a browser, for example by sending someone a link. How would those photos still be encrypted? And the link supposedly isn't protected by a password.

I don't use iCloud. I don't use iCloud Photos. For good reason.
If that person has an iCloud account, then you’re giving them access to view your images. If they don’t have an iCloud account, do they have to sign up? I’ll have to try that :)
 
What Apple really wanted to say:
"OK, you got us for now, but it definitely wasn't a conspiracy. We'll get smarter and try again later whenever we can piggyback off of some BS bill in Congress. We thought we could take your rights away now under the guise of kiddie porn, but we'll wait since clearly you like kiddie porn."

This was garbage from the start, and I don't want to participate in the scanning of ANY photographic content whatsoever for ANY party. I don't even like on-device facial recognition.
 
The police need a warrant to do that. In order to get that warrant they have to explain to an independent judge why they want that warrant and what they’re looking for…and the judge has to agree with that. You seem to be awfully concerned about your stored photos being subject to a warranted search? If you’re that concerned then maybe you should be encrypting them yourself using PGP, etc., or just not using online photo storage in the first place?
No, I’m not concerned at all. There’s some thinking here that Apple wasn’t performing CSAM checks and suddenly decided to do so. They’ve BEEN doing CSAM checks; EVERYONE does, against every image uploaded to the cloud.

Apple was attempting to implement a process where they could actually leave all your images in the cloud encrypted with a key that they don’t have access to. That’s the only thing that would be different. Apple and everyone else are doing the same CSAM scanning they’ve been doing all along.
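For what it's worth, the "key Apple doesn't have" idea boils down to client-side encryption. Here's a minimal CryptoKit sketch assuming a key generated and kept on the device; the function names are mine, not Apple's iCloud implementation. The server only ever sees ciphertext it cannot open.

```swift
import CryptoKit
import Foundation

// Sketch of client-side encryption: the symmetric key is generated on
// the device and never uploaded, so the server stores only ciphertext
// it cannot decrypt. (In a real app the key would live in the keychain.)
let deviceOnlyKey = SymmetricKey(size: .bits256)

func encryptForUpload(_ photoData: Data) throws -> Data {
    // AES-GCM seal; .combined packs nonce + ciphertext + auth tag.
    let sealed = try AES.GCM.seal(photoData, using: deviceOnlyKey)
    return sealed.combined!
}

func decryptAfterDownload(_ blob: Data) throws -> Data {
    let box = try AES.GCM.SealedBox(combined: blob)
    return try AES.GCM.open(box, using: deviceOnlyKey)
}
```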
 
This was garbage from the start, and I don't want to participate in the scanning of ANY photographic content whatsoever for ANY party. I don't even like on-device facial recognition.
That’s the thing, though. If anyone’s using any cloud image service, they ARE participating in the scanning. This actually would have led to far fewer images being scanned in the cloud, a virtual treasure trove that I’m sure the government is happy remains open and available for search :)
 
I understand all the scepticism here, but my guess is that CSAM detection as initially proposed is dead, at least for now. Apple wouldn’t have removed the wording from their website if they were still planning to go ahead with it any time soon.
 
They will. And, because ALL your images in the cloud are unencrypted, that court order can have access to ALL your images. Instead of, like, none of your images.
Or that court order allows them to search your device, but there has to be a reason to search with a court order -- unlike Apple's way which needs no reason, and indeed, searches everyone's device.
 
No, I’m not concerned at all. There’s some thinking here that Apple wasn’t performing CSAM checks and suddenly decided to do so. They’ve BEEN doing CSAM checks; EVERYONE does, against every image uploaded to the cloud.

Apple was attempting to implement a process where they could actually leave all your images in the cloud encrypted with a key that they don’t have access to. That’s the only thing that would be different. Apple and everyone else are doing the same CSAM scanning they’ve been doing all along.
Why do you care if law enforcement has warrant-only access to your stored online photos? In every other scenario, they’re encrypted.
 
I understand all the scepticism here, but my guess is that CSAM detection as initially proposed is dead, at least for now. Apple wouldn’t have removed the wording from their website if they were still planning to go ahead with it any time soon.
Article updated with a statement from Apple that they are still going ahead. Nothing has changed. I think they will just slip it in under the radar, and that will probably be the way they do these things from now on.
 
Or that court order allows them to search your device, but there has to be a reason to search with a court order -- unlike Apple's way which needs no reason, and indeed, searches everyone's device.
Yes, Apple’s doing that NOW (and the reason is that the government requires it), for every image they have stored in the cloud. Using their proposed method, they’d be doing the SAME search, the same flagging, with the only difference being that anyone getting a court order to go through your images won’t be able to ask Apple for help. Because the images will be encrypted with a key that Apple doesn’t have.
 
Why do you care if law enforcement has warrant-only access to your stored online photos? In every other scenario, they’re encrypted.
Why would anyone care if law enforcement can, through due process, get access to all the images you have stored, without needing you to unlock your device?
 
Yes, Apple’s doing that NOW (and the reason is that the government requires it), for every image they have stored in the cloud. Using their proposed method, they’d be doing the SAME search, the same flagging, with the only difference being that anyone getting a court order to go through your images won’t be able to ask Apple for help. Because the images will be encrypted with a key that Apple doesn’t have.
They won't be scanning my device. I turned off iCloud Photos when Apple announced this whole mess. Apple has turned it back on once, so check your settings often if you don't want it to happen. If they push even more, no more iPhone for me.

The government can search your device with a court order, just like with iCloud, and that's okay; that's the law. But searching my device without a court order? No, never.
 
If Apple is going to implement CSAM scanning on devices, I will stop using Apple's devices altogether. Spyware should not be on devices I pay for. If Apple wants to make sure that iCloud is clean, they should implement the spyware in iCloud.
 
But in the release notes I read on this website yesterday, one of the new features Apple has introduced in 15.2 is:

In iOS 15.2, Apple is enabling Communication Safety in Messages for children. The feature is designed to scan images in incoming messages on children's devices for nudity and warn them that such photos might be harmful.

So it looks to me like they are still going to scan people's photos to look for content that Apple deems unacceptable.

To me, that is unacceptable. Stop scanning people's private data. Period.
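For context on what this kind of on-device analysis looks like in general: Apple's Communication Safety model is private, but the public Vision framework shows the shape of it. This is a sketch using Vision's built-in generic classifier, not a nudity detector, and nothing in it leaves the device.

```swift
import Vision
import CoreGraphics

// Sketch: generic on-device image classification with the public
// Vision API. Communication Safety uses a private, purpose-built
// model; this only illustrates that the analysis runs locally.
func classify(_ image: CGImage) throws -> [(label: String, confidence: Float)] {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])
    let observations = request.results ?? []
    return observations.map { ($0.identifier, $0.confidence) }
}
```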
 
If Apple is going to implement CSAM scanning on devices, I will stop using Apple's devices altogether. Spyware should not be on devices I pay for. If Apple wants to make sure that iCloud is clean, they should implement the spyware in iCloud.
I haven't done any research on alternatives. What do you have in mind?
 
Update: Apple spokesperson Shane Bauer told The Verge that though the CSAM detection feature is no longer mentioned on its website, plans for CSAM detection have not changed since September, which means CSAM detection is still coming in the future.
Boo...hiss...???

So CSAM detection is gonna lie low until the heat dies down, then worm its way back under a different guise. Let's see how they market it so that it sounds like a good, privacy-respecting feature.
 