> Apple doesn't usually admit any wrongdoing.

That reminds me, I was watching The Shrink Next Door on Apple TV+ the other day. There was a scene with a guy wondering why a friend hadn’t called him during his hospital visit, and he asked a nurse if they had poor signal in the hospital. He was using an iPhone 4. I shouted out “you’re holding it wrong!”, but nobody else in the room got the joke.
> Ok, so hopefully this is a true win for privacy advocates.

Well, they’ve removed the text, but the code is still present.
> It still isn’t a public disavowal from Apple, which is what I want. Documentation can be re-added to the website at any time just as easily as the words were removed. Publicly making a statement is harder to get around.

lol good luck with that. Apple will never admit to having done anything wrong… ever.
> Well, they’ve removed the text, but the code is still present.

I'm not sure they'll necessarily announce it if they're canning it.
This has been so unpopular, I’m sure they’d make an announcement if they were canning it.
> If Apple publicly announced it had listened and changed course because of the backlash, it could be even better PR, tbh.

Seriously. This is one of those times Apple really needs to swallow its pride and just come out and announce that it is canning this "feature". Just put a statement on Apple Newsroom and be done with it. That would be much more effective at getting people's attention and trust back.
> The only difference is this would scan on the device, flag CP images, and leave ALL your other non-CP related images encrypted. They’re still doing scanning in the cloud; they have to, EVERYONE does.

They don't have to scan things in the cloud. This was discussed to death when this came up. It's a choice, and a bad one.
Apple can search my photos and videos; I don't hide anything.
> You still don't get the point. It's not about whether or not someone has something to hide.
>
> Let me give you a couple of parallel examples.
>
> Car companies installing a chip in your car that calls the police if you're speeding. It's OK, right, if you NEVER go over the posted speed limit...
>
> Government wants to install a camera IN YOUR HOUSE. They promise that they'll never look at your wife in her panties... it's just there in case you beat your kids. Why should you worry, right? Everyone who works for the government is trustworthy and wouldn't just "check in" on your camera, right? Until the government decides that they can also use the cameras to make sure that you're never engaging in anal sex... because it IS still illegal in many states. Is that what you signed up for?
>
> Do you see why Apple's on-device scanning technology is a bad idea now? It doesn't matter whether you're doing something bad or not... it's the fact that they're engaging in constant surveillance, and the fact that it can be EASILY altered to scan for other things. Got a picture of a rebel flag? Uh-oh... you're part of a white supremacist hate group. Got a BLM meme? Oh, you naughty boy. Got a picture of Winnie the Pooh in China? Expect a midnight visit...

What an utterly bizarre line in the sand to draw on privacy. Apple has my medical records, my credit card information, my social security number, my emails, and can tell you everywhere I’ve been since 2007. They *could* use any of that information against me. But they haven’t, and I trust that they won’t. Yet somehow scanning my photos on-device for known hash values is where my privacy and freedom are at stake? Riiiight.
> If I ever find out this has been surreptitiously added without my knowledge, then I will sell every Apple device I own and never buy another. Anyone who doesn’t have an issue with this has no clue about the concept of mission creep. If these systems are allowed to exist, then it’s only a matter of time before the Feds batter your door in for having a [insert future dictator here] meme in your iCloud library. The road to hell is paved with good intentions.

It might be easier to just remove your kiddie porn from the cloud.
> It might be easier to just remove your kiddie porn from the cloud.

May as well post all your pictures here.
> They don't have to scan things in the cloud. This was discussed to death when this came up. It's a choice, and a bad one.

EVERY cloud service, Dropbox, Google Drive, OneDrive, ALL of them have to scan. Apple is no different. ALL of them have access to ALL your unencrypted images (and thus are able to provide them to authorities upon request if they’re in the cloud). Apple is no different.
> EVERY cloud service, Dropbox, Google Drive, OneDrive, ALL of them have to scan. Apple is no different. ALL of them have access to ALL your unencrypted images (and thus are able to provide them to authorities upon request if they’re in the cloud). Apple is no different.

Photos are encrypted when stored in iCloud. They’re just not end-to-end encrypted.
Where Apple was different was in flagging the hash of known CP images on-device. No CP? No flagged hashes. And, as a result, that would allow them to encrypt ALL images that they store in the cloud. Apple was mistaken in thinking that people in general placed a high value on having their images encrypted in the cloud.
OR, the government, seeing this as a serious curtailment of their ability to subpoena Apple for non-CP cases, made sure the “right folks” got the wrong idea about how this works. Gotta admit, though, it was very effective because, without this method in place, all iCloud images are unencrypted, just the way the government prefers!
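The flagging step being described is easy to sketch. Here’s a toy version in Python; every name in it is illustrative, and the real design used a perceptual NeuralHash matched through a private set intersection protocol (so neither side learns about non-matches), not the plain SHA-256 set lookup shown here:

```python
import hashlib
from pathlib import Path

# Hypothetical stand-in for the known-image database distributed to devices.
# Apple's actual system used blinded perceptual "NeuralHash" values matched
# via private set intersection; a plain digest set is used here purely to
# illustrate the control flow.
KNOWN_BAD_DIGESTS: set[str] = set()  # digests would ship with the OS

def flag_before_upload(image_path: Path) -> bool:
    """Return True if this image matches the known-hash list.

    Only matches get flagged; everything that does NOT match would be
    uploaded under a key the provider never holds.
    """
    digest = hashlib.sha256(image_path.read_bytes()).hexdigest()
    return digest in KNOWN_BAD_DIGESTS
```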
> Good, let law enforcement get a COURT ORDER like they are supposed to, according to the Constitution of the United States.

They will. And, because ALL your images in the cloud are unencrypted, that court order can have access to ALL your images. Instead of, like, none of your images.
> 🤷‍♂️ I don't think everyone realizes that they will just do the entire CSAM matching on their servers for iCloud Photos, right? You gain nothing by them abandoning the client "voucher" process if they just replace it with the exact same system that Google Photos/FB/MS Teams/Telegram/Reddit/etc. use to process every photo uploaded.
>
> Same hashes, same content, same iCloud Photos servers, same result.
>
> What we lose is the potential to have the pictures client-encrypted such that Apple's servers can't even see their content*. Every single hypothetical dystopian scenario people keep claiming is exactly what is possible today, and the client check would have made it harder to implement instead of easier.
>
> \* unless there are multiple client voucher failures, and a second fingerprint failure, and a manual review of only those photos.

Actually I am happy with that. Apple's servers, Apple's property, Apple's rules. I just don't want surveillance hardware/software in my pocket wherever I go, which is not the same, particularly since there are AI-optimised chips being put in mobile devices. Apple's proposal would have set a precedent of mobile device surveillance that would have normalised the most intrusive behaviours of governments and businesses.
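The footnote about “multiple client voucher failures” in the quote above corresponds to a threshold scheme. A loose sketch follows, with a plain counter standing in for the threshold secret sharing Apple’s technical summary actually described (the roughly-30-match threshold is the figure Apple cited publicly):

```python
from dataclasses import dataclass, field

# Assumption: a simple counter models the behaviour only. In the published
# design, each upload carried a "safety voucher" protected by threshold
# secret sharing, so the server is mathematically unable to decrypt ANY
# voucher until enough matches accumulate.
MATCH_THRESHOLD = 30

@dataclass
class Account:
    vouchers_from_matches: list[bytes] = field(default_factory=list)

    def record_match(self, voucher: bytes) -> None:
        self.vouchers_from_matches.append(voucher)

    def human_review_possible(self) -> bool:
        # Below the threshold nothing is decryptable, so there is nothing
        # for a reviewer to look at; above it, only the matched photos
        # (not the whole library) become reviewable.
        return len(self.vouchers_from_matches) >= MATCH_THRESHOLD
```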
> Photos are encrypted when stored in iCloud. They’re just not end-to-end encrypted.

Although they are stored on Apple’s servers in encrypted form, this is done using a generic encryption key that Apple has access to. With the method Apple was attempting to implement, they would be encrypted with a key that Apple DOESN’T have access to.
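The whole disagreement here turns on who holds the key, which a toy Python example makes concrete; Fernet is only a convenient stand-in, not a claim about the ciphers Apple actually uses:

```python
from cryptography.fernet import Fernet  # pip install cryptography

photo = b"...jpeg bytes..."

# Status quo: encrypted at rest, but with a key the provider controls,
# so the provider can decrypt on its own (e.g., to answer a subpoena).
provider_key = Fernet.generate_key()           # lives on Apple's servers
stored = Fernet(provider_key).encrypt(photo)
assert Fernet(provider_key).decrypt(stored) == photo  # provider can read it

# The abandoned model: the key exists only on the user's devices. The
# provider stores ciphertext it cannot open, so a request served on the
# provider yields nothing readable (flagged matches aside).
user_key = Fernet.generate_key()               # never leaves the device
stored_e2e = Fernet(user_key).encrypt(photo)
```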
> What an utterly bizarre line in the sand to draw on privacy. Apple has my medical records, my credit card information, my social security number, my emails, and can tell you everywhere I’ve been since 2007. They *could* use any of that information against me. But they haven’t, and I trust that they won’t. Yet somehow scanning my photos on-device for known hash values is where my privacy and freedom are at stake? Riiiight.

To borrow boilerplate disclosures from Apple’s quarterly conference calls, “past performance is not indicative of future results.”
Anyone who’s that concerned about privacy should not only throw away all their Apple devices but also disconnect all their Google and MS services, delete all social media, stop accessing the internet, and, for the love of god, not use a cellphone.
On the other hand, the number of child abusers they are going to catch with a publicly advertised program that can be defeated just by not keeping CSAM in iCloud Photos has to be… low.
So, yeah, on a scale of one to ten, my level of concern about this one way or the other is a one.
> Although they are stored on Apple’s servers in encrypted form, this is done using a generic encryption key that Apple has access to. With the method Apple was attempting to implement, they would be encrypted with a key that Apple DOESN’T have access to.

I understand how it all works. You keep saying they’re stored unencrypted, and they’re not.
> Apple's proposal would have set a precedent of mobile device surveillance that would have normalised the most intrusive behaviours of governments and businesses.

Apple’s proposal would have set a precedent for mobile devices where it’s normalized that ALL images are encrypted in the cloud with a key the provider doesn’t have, and thus not available when authorities request them from the provider.