> I understand how it all works. You keep saying they’re stored unencrypted and they’re not.

Correct. They are encrypted in a way that allows them to be decrypted whenever Apple (or the governmental entity that requests them) wants them to be.
> Although they are stored on Apple’s servers in encrypted form, this is done using a generic encryption key that Apple has access to. With the method Apple was attempting to implement, they would be encrypted with a key that Apple DOESN’T have access to.

As far as I know, you can also view photos via the browser, for example by sending someone a link. Shouldn’t that be encrypted too? And the link is supposedly not protected by a password.
> Correct. They are encrypted in a way that allows them to be decrypted whenever Apple (or the governmental entity that requests them) wants them to be.

The police need a warrant to do that. To get that warrant, they have to explain to an independent judge why they want it and what they’re looking for, and the judge has to agree. You seem awfully concerned about your stored photos being subject to a warranted search. If you’re that concerned, maybe you should be encrypting them yourself with PGP or the like, or just not using online photo storage in the first place.
> You still don't get the point. It's not about whether or not someone has something to hide.
>
> Let me give you a couple of parallel examples.
>
> Car companies installing a chip in your car that calls the police if you're speeding. That's OK, right, if you NEVER go over the posted speed limit?
>
> The government wants to install a camera IN YOUR HOUSE. They promise that they'll never look at your wife in her panties; it's just there in case you beat your kids. Why should you worry, right? Everyone who works for the government is trustworthy and wouldn't just "check in" on your camera, right? Until the government decides that they can also use the cameras to make sure you're never engaging in anal sex, because that IS still illegal in many states. Is that what you signed up for?
>
> Do you see now why Apple's on-device scanning technology is a bad idea? It doesn't matter whether you're doing something bad or not; it's the fact that they're engaging in constant surveillance, and the fact that it can be EASILY altered to scan for other things. Got a picture of a rebel flag? Uh-oh, you're part of a white supremacist hate group. Got a BLM meme? Oh, you naughty boy. Got a picture of Winnie the Pooh in China? Expect a midnight visit.

I'm not objecting to your post, but what about AI/object recognition? That has been a thing on our devices for several years now. Why aren't people up in arms about that? Everything mentioned here could potentially be identified with AI/object recognition, and even more accurately. How do we know Apple isn't secretly doing that and just not surfacing the data to us? They are obviously already using this method to identify nudes for the Communication Safety feature.
> Better, this was bound to fail from the start. You'd only need one bad actor feeding Apple's system wrong hashes and everyone becomes a potential suspect for whatever governmental purpose that bad actor wants to serve, silencing criticism, dissent, protestors in Hong Kong, LGBT minorities in certain regions, you name it. Also, as an EU citizen, I'm glad, as the system Apple proposed wouldn't have been allowed here anyway because of the strong protections in our GDPR privacy laws.

Except the system can’t search for “protestors” or “LGBT”. You fundamentally misunderstand how it works.
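To make the disagreement above concrete, here is a minimal Python sketch of hash-database matching. It uses SHA-256 as a stand-in for Apple's NeuralHash (a perceptual hash that also matches visually similar images), and all names and sample data are invented for illustration. The point both sides are arguing over is visible in the code: the device only compares opaque digests, so what gets flagged depends entirely on who supplies the database.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Toy stand-in for a perceptual hash: an opaque digest of the image.
    return hashlib.sha256(image_bytes).hexdigest()

def build_database(flagged_images: list) -> set:
    # Whoever supplies this list decides what gets flagged; the device
    # cannot tell from the hashes alone what they actually depict.
    return {image_hash(img) for img in flagged_images}

def scan(photo_library: list, database: set) -> list:
    # On-device matching: report any photo whose hash is in the database.
    return [img for img in photo_library if image_hash(img) in database]

# A bad actor only has to slip extra entries into the database:
database = build_database([b"known-csam-sample", b"protest-poster"])
library = [b"cat photo", b"protest-poster"]
flagged = scan(library, database)  # the protest poster is flagged too
```

For what it's worth, Apple's published design tried to mitigate exactly this by only accepting hashes present in the databases of multiple independent child-safety organizations and by requiring human review before any report; whether those safeguards suffice is precisely what the thread is debating.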
> As far as I know, you can also view photos via the browser, for example by sending someone a link. This should then be encrypted? And the link is supposedly not protected by a password.

If that person has an iCloud account, then you’re giving them access to see your images. If they don’t have an iCloud account, do they have to sign up? I’ll have to try that.
I don't use iCloud. I don't use iCloud Photos. For good reason.
> The police need a warrant to do that. In order to get that warrant they have to explain to an independent judge why they want that warrant and what they’re looking for, and the judge has to agree. You seem to be awfully concerned about your stored photos being subject to a warranted search? If you’re that concerned then maybe you should be encrypting them yourself using PGP, etc., or just not using online photo storage in the first place?

No, I’m not concerned at all. There’s some thinking here that Apple wasn’t performing CSAM checks and suddenly decided to start. They’ve BEEN doing CSAM checks; EVERYONE does, against every image uploaded to the cloud.
> This was garbage from the start, and I don't want to participate in the scanning of ANY photographic content whatsoever for ANY party. I don't even like on-device facial recognition.

That’s the thing, though. If anyone’s using any cloud image service, they ARE participating in the scanning. This actually would have led to far fewer images being scanned in the cloud, a virtual treasure trove that I’m sure the government is happy is still open and available for search.
> They will. And, because ALL your images in the cloud are unencrypted, that court order can have access to ALL your images. Instead of, like, none of your images.

Or that court order allows them to search your device. But there has to be a reason to search with a court order, unlike Apple's way, which needs no reason and indeed searches everyone's device.
> No, I’m not concerned at all. There’s some thinking here that Apple wasn’t performing CSAM checks and suddenly decided to do so. They’ve BEEN doing CSAM checks, EVERYONE does against every image uploaded to the cloud.

Why do you care if law enforcement has warrant-only access to your stored online photos? In every other scenario, they’re encrypted.
Apple was attempting to implement a process where they could leave all your images in the cloud encrypted with a key that they don’t have access to. That’s the only thing that would have been different; Apple and everyone else is doing the same CSAM scanning they’ve been doing all along.
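The key-custody distinction this post keeps coming back to can be sketched with symmetric encryption. This is a hypothetical illustration using the third-party Python `cryptography` package's Fernet recipe, not Apple's actual protocol (which involves per-photo keys and threshold escrow); only the who-holds-the-key idea carries over.

```python
# Requires the third-party package: pip install cryptography
from cryptography.fernet import Fernet, InvalidToken

photo = b"raw photo bytes"

# Status quo: the provider generates and holds the key, so the provider
# (or anyone who can compel the provider) can decrypt what it stores.
provider_key = Fernet.generate_key()
ciphertext = Fernet(provider_key).encrypt(photo)
assert Fernet(provider_key).decrypt(ciphertext) == photo  # provider can read it

# Proposed variant: the key is generated on, and never leaves, the device.
user_key = Fernet.generate_key()
ciphertext_e2e = Fernet(user_key).encrypt(photo)
try:
    Fernet(provider_key).decrypt(ciphertext_e2e)  # provider's key is useless
    raise AssertionError("provider should not be able to decrypt")
except InvalidToken:
    pass  # without the user's key the ciphertext is opaque, warrant or not
```

In the second case a court order served on the provider yields only ciphertext; only the device holding `user_key` can recover the photo.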
> I understand all the scepticism here, but my guess is that CSAM detection as initially proposed is dead, at least for now. Apple didn’t need to remove the wording from their website if they were still planning to go ahead with it any time soon.

Article updated with a statement from Apple that they are still going ahead. Nothing has changed. I think they will just slip it in under the radar, and that will probably be the way they do these things from now on.
> Or that court order allows them to search your device, but there has to be a reason to search with a court order, unlike Apple's way which needs no reason, and indeed, searches everyone's device.

Yes, Apple’s doing that NOW (and the reason is that the government requires it), for every image they have stored in the cloud. Using their proposed method, they’d be doing the SAME search and the same flagging, with the only difference being that anyone getting a court order to go through your images won’t be able to ask Apple for help, because the images will be encrypted with a key that Apple doesn’t have.
> Why do you care if law enforcement has warrant only access to your stored online photos? In every other scenario, they’re encrypted.

Why would anyone care if law enforcement can, through process, get access to all the images you have stored without having to have you unlock your device?
> Why would anyone care if law enforcement can, through process, get access to all the images you have stored without having to have you unlock your device?

You seem to care. That’s essentially been your whole argument here.
> Yes, Apple’s doing that NOW (and the reason is the government requires it), for every image they have stored in the cloud. Using their proposed method, they’d be doing the SAME search, the same flagging, with the only difference being that anyone getting a court order to go through your images won’t be able to ask Apple for help. Because the images will be encrypted with a key that Apple doesn’t have.

They won't be scanning my device. I turned off iCloud Photos when Apple announced this whole mess. Apple has turned it back on once, so check your settings often if you don't want it to happen. If they push even more, no more iPhone for me.
> So the update to the article says that they just removed the flair text but will continue with the plan.
>
> Nothing has been won here.

Dangit.
> If Apple is going to implement CSAM scanning on devices, I would stop using Apple's devices altogether. Spyware should not be on devices I pay for. If Apple wants to make sure that iCloud is clean, they should implement the scanning in iCloud.

I haven't done any research on alternatives. What do you have in mind?
> Update: Apple spokesperson Shane Bauer told The Verge that though the CSAM detection feature is no longer mentioned on its website, plans for CSAM detection have not changed since September, which means CSAM detection is still coming in the future.

Boo...hiss...???