Apple doesn't usually admit any wrongdoing.
That reminds me, I was watching The Shrink Next Door on Apple TV+ the other day. There was a scene with a guy wondering why a friend hadn’t called him during his hospital visit, and he asked a nurse if they had poor signal in the hospital. He was using an iPhone 4. I shouted out “you’re holding it wrong!”, but nobody else in the room got the joke :(
 
While this is a step in the right direction, it’s not nearly enough. I’ll join the chorus of comments requesting that Apple loudly disavow this effort. Apple can also show good faith by firing the people responsible for this disaster, people who have disastrously failed at their jobs.

Until then, I’m still phasing out my Apple usage.
 
Well, they’ve removed the text, but the code is still present.

This has been so unpopular, I’m sure they’d make an announcement if they were canning it.
I'm not sure they'll necessarily announce it if they're canning it.

For example, that device called AirPower was promised several years ago, and apparently technical difficulties kept Apple from ever delivering it. I don't think it made any announcement about not moving forward.

Sometimes it might be best for the company to just scrap a plan without making a public announcement over it as that may bring unwanted attention to them.

I mean, for me, either way is fine whether they announce it or not. I'm just thinking why they may not want to make a public announcement.
 
If Apple publicly announced it had listened and changed course because of the backlash, it could be even better PR, tbh.
Seriously. This is one of those times Apple really needs to swallow its pride and just come out and announce that it is canning this "feature". Just put a statement on Apple Newsroom and be done with it. Would be much more effective at getting people's attention and trust back.
 
"we removed it from the website because it was bad for our image but we are moving forward now we are just being quiet"

Hmmm...not cool apple.
 
The only difference is this would scan on the device, flag CP images, and leave ALL your other non-CP-related images encrypted. They’re still doing scanning in the cloud; they have to. EVERYONE does.
They don't have to scan things in the cloud. This was discussed to death when this came up. It's a choice, and a bad one.
 
Apple searches my photos and videos; I don't hide anything.

Oh, but you have everything to hide.

If you’re conservative you have that to hide if you live in an authoritarian communist country. If you’re liberal you have that to hide if you live in an authoritarian fascist country. If you don’t care about politics you have that to hide in any authoritarian country.

The same goes if you’re religious, non-religious, black, white, yellow, brown, rich, poor…

You don’t live in such countries?

YET. You can’t say that for certain until the day you die.

It is not that you don’t have something to hide. You do. It is that you’re too naive to realize it.
 
You still don't get the point. It's not about whether or not someone has something to hide.
Let me give you a couple of parallel examples.
Car companies installing a chip in your car that calls the police if you're speeding. It's OK, right, if you NEVER go over the posted speed limit....

Government wants to install a camera IN YOUR HOUSE. They promise that they'll never look at your wife in her panties... it's just there in case you beat your kids. Why should you worry, right? Everyone who works for the government is trustworthy, and wouldn't just "check in" to your camera, right? Until the government decides that they can also use the cameras to make sure that you're never engaging in anal sex... because it IS still illegal in many states. Is that what you signed up for?

Do you see why Apple's on-device scanning technology is a bad idea now? It doesn't matter if you're doing something bad or not... it's the fact that they're engaging in constant surveillance, and the fact that it can be EASILY altered to scan for other things. Got a picture of a rebel flag? Uh-oh... you're part of a white supremacist hate group. Got a BLM meme? Oh, you naughty boy you. Got a picture of Winnie the Pooh in China? Expect a midnight visit....
What an utterly bizarre line in the sand to draw on privacy. Apple has my medical records, my credit card information, my social security number, my emails, and can tell you everywhere I’ve been since 2007. They *could* use any of that information against me. But they haven’t, and I trust that they won’t. Yet somehow scanning my photos on-device for known hash values is where my privacy and freedom are at stake? Riiiight.

Anyone who’s that concerned about privacy should not only throw away all their Apple devices but also disconnect all their Google and MS services, delete all social media, stop accessing the internet, and, for the love of god, stop using a cellphone.

On the other hand, I have to imagine the number of child abusers they are going to catch with a publicly advertised program that can be defeated just by not keeping CSAM in iCloud photos has to be… low.

So, yeah, on a scale of one to ten, my level of concern about this one way or the other is a one.
 
If I ever find out this has been surreptitiously added without my knowledge then I will sell every Apple device I own and never buy another. Anyone who doesn’t have an issue with this has no clue of the concept of mission creep. If these systems are allowed to exist then it’s only a matter of time before the Feds batter your door in for having a [insert future dictator here] meme in your iCloud library. The road to hell is paved with good intentions.
It might be easier to just remove your kiddie porn from the cloud.
 
They don't have to scan things in the cloud. This was discussed to death when this came up. It's a choice, and a bad one.
EVERY cloud service, Dropbox, Google Drive, OneDrive, ALL of them have to scan. Apple is no different. ALL of them have access to ALL your unencrypted images (and thus are able to provide them to authorities upon request if they’re in the cloud). Apple is no different.

Where Apple was different was in flagging the hash of known CP images on-device. No CP? No flagged hashes. And, as a result, that would allow them to encrypt ALL images that they store in the cloud. Apple was mistaken in thinking that people in general placed a high value on having their images encrypted in the cloud.

OR, the government, seeing this as being a serious curtailment in their ability to subpoena Apple for non-CP cases made sure the “right folks” got the wrong idea about how this works. Gotta admit, though, it was very effective because, without this method in place, all iCloud images are unencrypted, just the way the government prefers ;)
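Roughly, the design being described looks like this. A simplified Python sketch of my own, not Apple's actual pipeline: a plain SHA-256 digest stands in for the perceptual NeuralHash, and the blinded matching and safety-voucher crypto are left out entirely.

```python
import hashlib

# Illustrative stand-in only: the real design used a perceptual hash
# (NeuralHash) matched against a blinded set the device cannot read,
# not a plain digest comparison like this.
KNOWN_HASHES: set[str] = set()  # known-image hashes shipped with the OS

def photo_hash(photo_bytes: bytes) -> str:
    return hashlib.sha256(photo_bytes).hexdigest()

def scan_before_upload(photo_bytes: bytes) -> bool:
    """On-device check before a photo goes to iCloud Photos.
    True  -> the upload carries a flagged 'safety voucher'.
    False -> nothing about this photo is reported at all."""
    return photo_hash(photo_bytes) in KNOWN_HASHES
```

The point being made above is that only the match result leaves the device per photo, which is what would have let every non-matching photo sit in iCloud under keys Apple doesn't hold.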
 
EVERY cloud service, Dropbox, Google Drive, OneDrive, ALL of them have to scan. Apple is no different. ALL of them have access to ALL your unencrypted images (and thus are able to provide them to authorities upon request if they’re in the cloud). Apple is no different.

Where Apple was different was in flagging the hash of known CP images on-device. No CP? No flagged hashes. And, as a result, that would allow them to encrypt ALL images that they store in the cloud. Apple was mistaken in thinking that people in general placed a high value on having their images encrypted in the cloud.

OR, the government, seeing this as being a serious curtailment in their ability to subpoena Apple for non-CP cases made sure the “right folks” got the wrong idea about how this works. Gotta admit, though, it was very effective because, without this method in place, all iCloud images are unencrypted, just the way the government prefers ;)
Photos are encrypted when stored in iCloud. They’re just not end to end encrypted.
 
It'll just be a silent update rather than the PR push that backfired, but I suspect it's already in place: copy/paste to a network drive on my iPad Pro M1 now has two long pauses (I suspect one is for checksum generation and the other for the paste) versus the usual single long pause for the paste.
 
Good. Let law enforcement get a COURT ORDER, as they are supposed to under the Constitution of the United States.
They will. And, because ALL your images in the cloud are unencrypted, that court order gives them access to ALL your images. Instead of, like, none of your images.
 
I don't think everyone realizes that they will just do the entire CSAM matching on their servers for iCloud Photos, right? You gain nothing by them abandoning the client "voucher" process if they just replace it with the exact same system that Google Photos/FB/MS Teams/Telegram/Reddit/etc. use to process every photo uploaded.

Same hashes, same content, same iCloud Photos servers, same result.

What we lose is the potential to have the pictures client-encrypted such that Apple's servers can't even see their content*. Every single hypothetical dystopian scenario people keep claiming is already possible today, and the client check would have made it harder to implement, not easier.

* unless there are multiple client voucher failures, and a second fingerprint failure, and a manual review of only those photos.
Actually I am happy with that. Apple's servers, Apple's property, Apple's rules. I just don't want surveillance hardware/software in my pocket wherever I go, which is not the same, particularly since there are AI optimised chips being put in mobile devices. Apple's proposal would have set a precedent of mobile device surveillance that would have normalised the most intrusive behaviours of governments and businesses.
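For what it's worth, the asterisked caveat in the quoted post is the threshold part of the design. A tiny sketch of that idea, my own simplification with made-up names; the match threshold Apple cited publicly was roughly 30:

```python
MATCH_THRESHOLD = 30  # roughly the figure Apple cited in its threat-model review

def needs_human_review(voucher_matches: int, independent_recheck_passed: bool) -> bool:
    """Nothing is visible to anyone for a single match; only when the number of
    matched vouchers crosses the threshold AND a second, independent
    perceptual-hash check also matches do those specific photos go to review."""
    return voucher_matches >= MATCH_THRESHOLD and independent_recheck_passed
```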
 
Photos are encrypted when stored in iCloud. They’re just not end to end encrypted.
Although they are stored on Apple’s servers in encrypted form, this is done using a generic encryption key that Apple has access to. With the method Apple was attempting to implement, they would be encrypted with a key that Apple DOESN’T have access to.
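To make that distinction concrete, here is a minimal sketch using the third-party Python cryptography package. Illustrative only; this is simple symmetric Fernet, not Apple's actual iCloud key hierarchy:

```python
from cryptography.fernet import Fernet

# Today ("encrypted, but not end to end"): the provider generates and holds
# the key, so it can decrypt stored photos on request, e.g. under a subpoena.
provider_key = Fernet.generate_key()                  # lives on the provider's servers
blob = Fernet(provider_key).encrypt(b"photo bytes")
photo = Fernet(provider_key).decrypt(blob)            # the provider can always do this

# The proposed end state: the key is generated on, and never leaves, the device.
device_key = Fernet.generate_key()                    # stays on the user's device
blob_e2e = Fernet(device_key).encrypt(b"photo bytes")
# Without device_key, the provider is storing ciphertext it cannot read.
```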
 
What an utterly bizarre line in the sand to draw on privacy. Apple has my medical records, my credit card information, my social security number, my emails, and can tell you everywhere I’ve been since 2007. They *could* use any of that information against me. But they haven’t, and I trust that they won’t. Yet somehow scanning my photos on-device for known hash values is where my privacy and freedom are at stake? Riiiight.

Anyone who’s that concerned about privacy should not only throw away all their Apple devices but also disconnect all their Google and MS services, delete all social media, stop accessing the internet, and, for the love of god, stop using a cellphone.

On the other hand, I have to imagine the number of child abusers they are going to catch with a publicly advertised program that can be defeated just by not keeping CSAM in iCloud photos has to be… low.

So, yeah, on a scale of one to ten, my level of concern about this one way or the other is a one.
To borrow boilerplate disclosures from Apple’s quarterly conference calls, “past performance is not indicative of future results.”
 
Although they are stored on Apple’s servers in encrypted form, this is done using a generic encryption key that Apple has access to. With the method Apple was attempting to implement, they would be encrypted with a key that Apple DOESN’T have access to.
I understand how it all works. You keep saying they’re stored unencrypted and they’re not.
 
Apple's proposal would have set a precedent of mobile device surveillance that would have normalised the most intrusive behaviours of governments and businesses.
Apple’s proposal would have set a precedent of mobile devices where it’s normalized that ALL images would be encrypted in the cloud with a key the provider doesn’t have. Thus, not available when authorities request them from the provider.
 