At the very least, I paid for iCloud Photos as a component of buying the device. Therefore iCloud storage is reasonably an extension of my device and protected from illegal search and seizure. While Apple isn't directly a government agent, even landlords cannot simply enter a tenant's domicile at will.

Turning off iCloud shouldn't even be considered as a solution because it's just another room associated with my devices.
Ideally, you're absolutely right.

But now we have Apple's own privacy head basically saying "If you don't do anything illegal, what's the problem?" Not really a good response.
 
Ideally, you're absolutely right.

But now we have Apple's own privacy head basically saying "If you don't do anything illegal, what's the problem?" Not really a good response.
I keep waiting for Tim to come out and say "we only patented this to make it harder for others to do it. We aren't going to implement it with any of our products or services."
 
Yes. That makes me wonder. How are they implementing the code on the device? They say it's in the OS but only enabled in the US. How is that technically going to work? Different OS distributions depending on localization?
The US part is obviously the server side. The hashes and code itself will be pushed to every iPhone in the world, meaning the local scanning will be done on everyone's iPhones. It's just that those who use US iCloud will go through the whole process. For now. Apple has already stated that they can tailor this on a per-country basis. It should send a chill through everyone.
 
The biggest problem is the fact that you have no way to verify what the CSAM software is doing in the background, and you just have to take Apple's word that turning off iCloud Photos disables this process.
Apple's attitude is already clear: "we are holier than thou" and "you have nothing to hide if you don't do anything illegal." It's quite damning, and sad, especially coming from their so-called "privacy" head.
 
Apple's attitude is already clear: "we are holier than thou" and "you have nothing to hide if you don't do anything illegal." It's quite damning, and sad, especially coming from their so-called "privacy" head.
My only thought to this is, since most other cloud companies do these scans anyway, is this a requirement by the US Govt to do business in the USA? (I don't know, I'm asking). There has to be some motivation behind this prompting these decisions - I'm not the brightest bulb in the bunch but I could have at least told someone that there's going to be people freaking out about this.
 
Don't understand the fuss.

When Apple talked at CES early last year, they said that they were already scanning iCloud photos for CSAM.
Your ****'s already been getting scanned since god knows when, and now suddenly people start freaking out? Imagine caring about your privacy so much that you're crying in these MacRumors threads while not being aware that your photos are already being scanned lmao.
 
For many, for whatever reason, iOS 14.8 will be the end. Apple has decided this is more important than losing the customers who can't handle it. Many are complaining as they go through the stages of grief over this loss. They will move on, and Apple is going to be OK with that. Those who remain will trust Apple at their word. I will happily be among those who trust Apple, and glad that those who don't are gone.
 
There's some layer of trust in everything, right? You trust your car not to fall apart while you drive it; you trust your (insert computer used here) not to constantly take photos of you while you use it and distribute them to everyone you know.

We have to trust that Apple is only comparing hashes to previously identified CP (child-****) images and sending those up for review. We have to.

If we don't agree to it, simple: turn iCloud Photos off. End of story, right? (According to what I'm reading, turning off iCloud Photos disables on-device photo scanning.)

But Apple keeps telling those sick individuals how to get around this, while you and everyone here, including me, are innocent and bear the cost. So what does this system gain from existing? Everyone is talking about it since it's Apple. These criminals know how to avoid detection.
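To make the "comparing hashes to previously identified images" part concrete, here is a minimal sketch in Python of what an exact-match hash check looks like. The database contents and function names here are made up for illustration; real scanning systems (PhotoDNA, NeuralHash) use perceptual hashes rather than plain cryptographic ones, precisely because exact matching fails as soon as an image is re-encoded or resized.

```python
import hashlib

def sha256_of(data: bytes) -> str:
    """Cryptographic hash of the raw file bytes."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of hashes of previously identified images.
known_hashes = {sha256_of(b"known-bad-image-bytes")}

def is_flagged(image_bytes: bytes) -> bool:
    # Exact match only: changing a single byte changes the hash
    # entirely, which is why real systems use perceptual hashes.
    return sha256_of(image_bytes) in known_hashes
```

Note how this scheme, by design, reveals nothing about any photo that isn't already in the database; the criticism in the thread is about moving even this check onto the device itself.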
 
Apple is scrambling now. What a poorly orchestrated communications campaign. A Harvard Business School case study in how to **** up and ruin a company reputation that hasn't been on firm ground for some years now. Down they go...

The weird thing is I haven't seen this being reported on any major news outlets unfortunately - maybe I've missed it? Some local affiliates, and some tech blogs / "news" sites. But it seems to mostly be making the rounds on places like MR, where the "screeching minority" can easily "screech". :)
 
Don't understand the fuss.

When Apple talked at CES early last year, they said that they were already scanning iCloud photos for CSAM.
Your ****'s already been getting scanned since god knows when, and now suddenly people start freaking out? Imagine caring about your privacy so much that you're crying in these MacRumors threads while not being aware that your photos are already being scanned lmao.
That is not the issue here at all... I and most others understand that. But that is done server-side by the given provider. Messing with the device's background activities is a whole new bag of ****.
 
Don't understand the fuss.

When Apple talked at CES early last year, they said that they were already scanning iCloud photos for CSAM.
Your ****'s already been getting scanned since god knows when, and now suddenly people start freaking out? Imagine caring about your privacy so much that you're crying in these MacRumors threads while not being aware that your photos are already being scanned lmao.
It’s a combination of how they are doing it and how they are communicating it.
 
Why are they doubling down on this so hard? Who asked them for this?

Not to sound conspiratorial but it’s getting to the point where it feels like the government put them up to this.

Super disappointed in Apple, and I now don’t feel safe upgrading to iOS 15 or buying a new iPhone once iOS 15 is released.

Agreed. I’m not the conspiracy type, but this is just fishy. I can definitely see Apple using this for something else but, under national security, being unable to say so. So they just invoked the one thing every parent, and everyone else really, can’t argue with: “but the kids!” Something just seems off here.
 
My only thought to this is, since most other cloud companies do these scans anyway, is this a requirement by the US Govt to do business in the USA? (I don't know, I'm asking). There has to be some motivation behind this prompting these decisions - I'm not the brightest bulb in the bunch but I could have at least told someone that there's going to be people freaking out about this.
It is illegal to even have those pictures, so any cloud company can be held liable if someone uploads them to its servers. That's why many understand when the scans are done server-side. But here, Apple is doing the scans on users' iPhones locally. That's what most people are not comfortable with, since the system itself lacks any transparency.

The NCMEC is US based, and since Apple is using their hashes, it makes more sense to focus on US rollout first. Apple already stated that this is US only "for now." I expect worldwide rollout in some similar forms sooner or later, since the system is already in place with iOS15.
 
Apple has very good reason to target accounts with massive amounts of CSAM content. Obviously they’re the ones that should be stopped. Owning one single photo can be a fluke, an accident, hardly an offence you can be convicted for. Typically these guys have massive libraries.


Stop being outraged about something you don’t understand. Apple has found a way to ban CSAM content from their servers while leaving your files encrypted on their servers. That’s a massive achievement. Because they want your data to remain your data. Nothing is leaving your device because they do CLIENT-SIDE verifications.
I am not outraged. I merely disagree with on-device scanning. I do understand what is going on, and I don't need to check with you to determine my understanding of any given subject.
 
How could Apple get hold of my wife pictures? The check happens client-side, no data leaves my device and photos in iCloud remain encrypted.
The on-device check computes the NeuralHash of your picture. NeuralHash is not a traditional hash function, it tries to represent semantic similarity. Apple does not tell us the tolerances in this, but they must be non-zero, otherwise they could just use traditional hashes. If the NeuralHash of your picture is sufficiently close to the NeuralHash of one of the 200,000 NCMEC pictures (maybe one of a teenager in front of a similar wallpaper), your picture goes to an Apple reviewer.

What happens next, Apple does not really explain. The Apple reviewer does not have access to the NCMEC pictures, so s/he will just look at your picture, supposedly even just a low-res derivative. "Hmm, the hash match is close, and this could be a kid, hard to tell. Better show it to NCMEC, let them sort it out." Is this feasible? I don't know, Apple won't tell. But the Apple reviewers must be making some decisions here, otherwise their existence in the process would be pointless.

So maybe NCMEC gets your picture. Here we really do not know what happens. Presumably they compare it to their database and see that it is not one of theirs. Maybe NCMEC drops the case, and the only damage is that two or more strangers have stared at your private picture. But maybe NCMEC says, "wait, this is not in our DB, but she might be underage anyway! Better forward this to the police, let them sort it out." Is this feasible? Only Apple or NCMEC knows.
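The key point above, that NeuralHash matches on *similarity* rather than byte-identity, can be sketched in a few lines of Python. NeuralHash itself is not publicly specified, so the hash values, the 64-bit size, and the distance threshold below are all illustrative assumptions; perceptual-hash systems generally compare hashes by Hamming distance (number of differing bits) against some tolerance.

```python
def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit perceptual hashes."""
    return bin(a ^ b).count("1")

# Hypothetical database hashes and tolerance; Apple publishes neither.
DATABASE = [0xF0F0F0F0F0F0F0F0, 0x123456789ABCDEF0]
THRESHOLD = 6  # max number of differing bits still counted as a "match"

def matches_database(photo_hash: int) -> bool:
    # A photo is flagged if its hash is *close enough* to any database
    # entry, not only if it is identical -- which is exactly where the
    # possibility of false positives comes from.
    return any(hamming_distance(photo_hash, h) <= THRESHOLD
               for h in DATABASE)
```

Whatever the real tolerance is, any non-zero threshold means visually unrelated photos can occasionally land within it, which is why the human review step exists at all.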
 
The on-device check computes the NeuralHash of your picture. NeuralHash is not a traditional hash function, it tries to represent semantic similarity. Apple does not tell us the tolerances in this, but they must be non-zero, otherwise they could just use traditional hashes. If the NeuralHash of your picture is sufficiently close to the NeuralHash of one of the 200,000 NCMEC pictures (maybe one of a teenager in front of a similar wallpaper), your picture goes to an Apple reviewer.

What happens next, Apple does not really explain. The Apple reviewer does not have access to the NCMEC pictures, so s/he will just look at your picture, supposedly even just a low-res derivative. "Hmm, the hash match is close, and this could be a kid, hard to tell. Better show it to NCMEC, let them sort it out." Is this feasible? I don't know, Apple won't tell. But the Apple reviewers must be making some decisions here, otherwise their existence in the process would be pointless.

So maybe NCMEC gets your picture. Here we really do not know what happens. Presumably they compare it to their database and see that it is not one of theirs. Maybe NCMEC drops the case, and the only damage is that two or more strangers have stared at your private picture. But maybe NCMEC says, "wait, this is not in our DB, but she might be underage anyway! Better forward this to the police, let them sort it out." Is this feasible? Only Apple or NCMEC knows.
You pointed out one of the biggest issues: the whole system and process lacks transparency. We already know that Apple was willing to outsource even confidential Siri recordings to contractors. It's concerning.
 
To those who say "I have nothing to hide": let's say I could be having affairs in my personal life. That means I would have many things to hide, nothing illegal, and it would be my business!

Those who say "just go to Android": I spent a lot of money on Apple devices this year alone because I chose privacy as they advertised it, and because those devices will be supported for years, so I do want to upgrade them with future OS releases. I am not going to burn my money just because Apple decides to kill privacy as it was initially advertised (non-stop, for years), which affected my choice of purchase.

If Apple is about to do what Google does, why not go with Google's services, which are much better anyway? Some of us chose Apple's inferior services for privacy.

I keep tons of medical records and photos stored on my phone (iCloud Photos is off). I have no idea if it's legal to have my files/photos scanned on my device.

Some recommend deactivating iCloud Photos. I don't use it anyway, but the majority of users do. Isn't the Apple ecosystem a big selling point so far?

Finally, some personal thoughts.

To those who use terms like "good guys" and "bad guys," I have to say life is way more complex than that.

It's reasonable to assume there are people here in the forum who work for Apple. And I would rather assume that than think there are people so dogmatic and fanatical that they would spend this much of their time defending a company.

We have reached a point of paranoia where someone, in order to express an opinion against Apple, has to begin by saying: "Don't get me wrong, I love Apple..." How many times have we read that phrase here? We have almost reached the point where we need to apologize for having a complaint about Apple.

Thanks for reading my long post. I have posted it in two threads because there are so many open on the same subject.
 
Just wait... they've already transitioned to wanting to make it available to other apps. Soon it will be mandatory for apps that transfer photos to use it, otherwise they won't get into the App Store.
I think that's different. What they want to make available to third-party apps is the picture-censoring feature for minors. That one, to me, is a non-issue since it's opt-in.
 
So these checks only affect US devices and are run against a hash list generated from the US-based NCMEC database. What happens when this is rolled out to other countries? Does the hash list get generated against that particular country's equivalent database? Have they even mentioned this?
 
So these checks only affect US devices and are run against a hash list generated from the US-based NCMEC database. What happens when this is rolled out to other countries? Does the hash list get generated against that particular country's equivalent database? Have they even mentioned this?
The scans are still being done whether it's a US iPhone or not, since the hashes are coded into iOS itself.
As for other countries, NCMEC does have international counterparts, so I assume they also maintain their own databases of hashes. This is assuming that Apple only accepts hashes from these bodies, and that they're not compromised in any shape or form.

The tricky part is that, since this involves judgements of morality, some countries have different standards than others, and thus what is considered illegal can differ.
 