Apple trying to clean up their spyware PR disaster.
Apple using every excuse in the book so as not to admit they screwed up.

And the number of people twisting their brains in knots to defend "looking for CSAM on your device" because "it's illegal" is amazing.

I don't care if content is illegal or not.
I'm not going to defend going through your data on your own device to try to "find stuff"***.

That's not how free society works.

***by "stuff" we really many an endlessly growing subjective list of things some entity doesn't agree with
 
There is no expectation of privacy for data that resides on a third party server. People using iCloud never should have had that expectation with regard to illicit materials ... child abuse materials or otherwise. I think you really do misunderstand the service Apple is providing to users ... in some cases a free service. As you state, Apple is required to respond to warrants and subpoenas. Apple cannot decline to respond to such lawful requests because you think it is technologically impossible to do ... especially when it is quite technologically possible in fact. IMHO.
1) You are deflecting. This proposed capability operates outside of third-party servers, on privately owned devices; that's kind of the entire point. The fact that iCloud is in some cases a free service, with whatever expectation of privacy applies there, is not relevant - even though it turns out that the data is extremely private in that case anyway.

2) Apple can decline requests that are not possible to fulfill. It is not possible for Apple to decrypt iCloud backups without the password, for example. Therefore they cannot comply with such a request and must decline, as they have done before.

3) The actual point - and you've made it for me quite nicely - is that the capability will now exist to access data outside of the encrypted data stored on iCloud servers. Apple can no longer decline to provide that data on the basis that it is impossible, and they must expand the capabilities of on-phone analysis if compelled by authority. Apple opened the door and created the capability themselves. They can hardly refuse to use it afterwards.
 
It isn't necessarily THIS system; it is the precedent that it sets for other systems. That is the point that Apple seems to be missing.
Apple is fully aware of this.
My best guess is Apple just expected people to accept the "for the children" line and not look deeper into what this really is - i.e., spyware. The public understands how this works on a technical level far better than Apple thought it would.

Also, the public is fully aware of the precedent that this spyware sets. That is why people are speaking up against it.

That is why Apple is in full PR damage control now.
 
OR...PERHAPS...Apple is 100% liable for their servers hosting or transmitting child pornography under US law and MUST report such activity. They are meeting this requirement by providing a more secure and private way of identifying it with this method, versus just scanning every single photo you have taken and uploaded to iCloud.
Every internet service is required to do this, either by scanning all photos or by forwarding user-submitted complaints/reports.
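As an aside, the server-side scanning the quoted post describes amounts, at its simplest, to matching uploads against a database of known hashes. A minimal Swift sketch with hypothetical names - note that real systems use a perceptual hash such as PhotoDNA so that resized or recompressed copies still match; SHA-256 below is only a stand-in to keep the example runnable:

```swift
import Foundation
import CryptoKit

// Hypothetical sketch of server-side upload scanning. Real systems use a
// perceptual hash (e.g. PhotoDNA), not a cryptographic one like SHA-256.
let knownHashes: Set<String> = []  // in practice, hashes supplied by NCMEC

func uploadedPhotoMatches(_ photo: Data) -> Bool {
    let digest = SHA256.hash(data: photo)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)  // only matches are surfaced for review
}
```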
If true, would it not have made more sense for Apple to clearly state this as the reason for their action, rather than the nonsense they spouted? Apple's judgement about communicating with customers has been at about a grade 3 level these past couple of years.
 
Also, as an aside, what Apple is doing makes a mockery of police warrants.
If Apple or anyone else wants your data, they can get it through a police search warrant. Warrants are usually only granted if there is probable cause or some other good reason in the first place. Not just searching everyone because you feel like it.
 
Correct, which is a good thing. Better than scanning in the cloud, as that would break encryption.
It's not a good thing to me; in fact, I am very much against the concept, and no, I have no pedo pictures on anything. If it's on iCloud, it's public to Apple, and that's fine by me - they can scan it all they want and even look at the pictures physically - but my device, no way, no how.

I'm not clamoring for E2EE encryption and wouldn't use it anyway.
 
No, like, I'm sorry. I've debated this over and over in my head. But Apple cannot be this naive, this dumb - to have focused this long and this much manpower on a tool aimed at combating CSAM, only to then alert pedos that they can just disable iCloud Photos and they'll be good.

It’s almost so dumb that it makes the very notion look suspect.

Regardless of the tech details, anyone can read the NeuralHash tech papers and understand what they're setting out to do. It just seems completely unbalanced to sacrifice user privacy, or degrade encryption in messaging at all, for a CSAM goal when practically anyone hiding these kinds of photos will have long since disabled iCloud Photos or will be thinking about other storage options by now.

What does make more sense is that Apple is merely trying to meet legal obligations, get this content off iCloud, and wash its hands of it. But to think this will truly combat child abuse - I really don't think so.
 
Can someone comment on this paragraph from the article? Does this sound right?

Because it's on the [phone], security researchers are constantly able to introspect what’s happening in Apple’s [phone] software. So if any changes were made that were to expand the scope of this in some way —in a way that we had committed to not doing—there's verifiability, they can spot that that's happening.

It made me think that perhaps this might be a decent solution to a big problem. At the same time, the whole debacle has made me think seriously about privacy. (I am in Canada, so this doesn't affect us yet, though I suspect it will be arriving shortly... or rather, it will be arriving for certain, since the database ships with every version of iOS 15; it just isn't activated.)

It is now time to accept the fact that there really is no privacy in the online world, and there likely never will be. If you want the best version of privacy, you will never get it on mainstream phones and mainstream operating systems; you are going to have to go dark and use the many non-mainstream OSes and tools available.

However, in the meantime I do think it wise to spread your data and technology around. Use a little of this and a little of that; don't give anyone the whole pizza pie.
 
Apple should instead be going full E2EE and holding firm.
Even people like Rene Ritchie are defending Apple here, saying, "People keep forgetting their iCloud passwords, and full E2EE would make it too hard or impossible for them to recover their data."

They're just making up excuses. Full E2EE can happen with safeguards for lost passwords.
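For what it's worth, one way such a safeguard could look is wrapping the account's E2EE master key with an offline recovery code. A minimal CryptoKit sketch with illustrative names - an assumption about how it could be done, not Apple's actual design:

```swift
import Foundation
import CryptoKit

// Hypothetical "safeguard for lost passwords" under full E2EE: wrap the
// master key with a high-entropy recovery code the user keeps offline.
func makeRecoveryBundle() throws -> (code: SymmetricKey, wrapped: AES.GCM.SealedBox) {
    let masterKey = SymmetricKey(size: .bits256)    // encrypts the user's data
    let recoveryCode = SymmetricKey(size: .bits256) // printed / stored offline
    // The server keeps only the wrapped key; without the recovery code (or a
    // signed-in trusted device) it cannot decrypt anything.
    let wrapped = try AES.GCM.seal(masterKey.withUnsafeBytes { Data($0) },
                                   using: recoveryCode)
    return (recoveryCode, wrapped)
}

// Recovery path: the user re-enters the code and the client unwraps the key.
func recoverMasterKey(code: SymmetricKey, wrapped: AES.GCM.SealedBox) throws -> SymmetricKey {
    SymmetricKey(data: try AES.GCM.open(wrapped, using: code))
}
```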
 
What does make more sense is that Apple is merely trying to meet legal obligations, get this content off iCloud, and wash its hands of it. But to think this will truly combat child abuse - I really don't think so.

I agree.
I guess it would be too "on the nose" to ask them to just say that?

I don't even see the harm at this point.
They're already going out of their way to tell actual pedophiles out there how to avoid any issues here.

It raises the question - why do this? What's really in play here? (to your points)

It's a slap in the face for them to honestly say this will do much about CSAM...
...equally, it's an even bigger slap in the face to have them defend "this is even more privacy!" as a concept now.
 
Even people like Rene Ritchie are defending Apple here, saying, "People keep forgetting their iCloud passwords, and full E2EE would make it too hard or impossible for them to recover their data."

More seriously - I'm glad you mentioned this, though.

We've seen folks like Gruber mention that this is all on the path to full E2EE for iCloud and thus maybe it makes sense?

Notice who doesn't seem to be talking about full E2EE?

Apple.
 
If you read the comments in this thread, it's clear that people still don't understand how it works. So yeah, there is confusion.

I'm on page 3 of 11 and there are at least 5 responses from people who could not have read / understood the article... And this has been going on for days. So yeah, "confusion" fits.

I reckon that by the time I get to the end of the comments, someone will mention, again, "what about my kids' bath pics?"
I don't know those who are posting in this thread, but in my own circle of acquaintances, the "confusion" is best summed up by this meme...

[Attached meme: privacy.png]
They're having a hard time reconciling what they have believed about Apple with this latest move by Apple. The actual text of the announcements was very clear IMO.
 
I'm not against it, but it wouldn't help the people who oppose it on principle.

Apple could reveal the source, then just change it and start using a different version, and it would be very hard to know.

The first part of NeuralHash probably uses the Neural Engine hardware, so you have to trust the hardware to do what the designer says. And who designs the hardware? Apple.
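For context, the hashing step that follows the embedding network in a NeuralHash-style pipeline can be sketched as hyperplane locality-sensitive hashing. Everything below is illustrative - the real model and hyperplane matrix are Apple's, and the embedding itself would run on the Neural Engine:

```swift
import Foundation

// Rough sketch of hyperplane LSH, the hashing step a NeuralHash-style
// pipeline applies to the image embedding. Names are illustrative.
func hashBits(embedding: [Double], hyperplanes: [[Double]]) -> [Bool] {
    hyperplanes.map { plane in
        // The sign of the dot product decides each bit, so similar images
        // (nearby embeddings) tend to produce the same bits.
        zip(embedding, plane).reduce(0.0) { $0 + $1.0 * $1.1 } >= 0
    }
}
```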

Also, iCloud Backup is a much better feature for governments to misuse. Why not demand they open-source that part as well?

What I am reacting to is the demand to open-source a part of the OS that is ill-suited to surveillance, while not asking the same of those parts that are truly great for such surveillance.
I respect your personal opinion, but to me it is more important that this has been asked for by many experts in the industry (including academia) who expressed their concerns and asked for the code.

The way you see things places all the trust in a company without asking to verify anything at all. By that logic, if you verify something there will always be something else you can't verify, so why bother at all...

Anyway, as I stated before, I would really appreciate a PR release from Apple explaining why they caved to China's and Russia's demands.

And I would really like to know why they care specifically about child pornography and not, for example, about children receiving death threats. They could start scanning messages for that. I am extremely worried about why they care about one cause specifically and not the other... Could you speculate why?
 
Oh, please. This isn't a "mass surveillance system" - which implies Apple wants to know about everything that's on our phone. All that's happening is that illegal images are being flagged and Apple is only notified if a good number of those are uploaded to THEIR servers.
How does "mass surveillance system" imply that they want to know everything on my phone? Mass surveillance simply means surveillance of something, whatever that may be, at a very large scale. In fact, reducing the information collected is key when you want to operate at that scale.
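To be fair to the quoted description, the threshold behaves roughly like the toy sketch below. The names are made up, and Apple's published design actually uses threshold secret sharing, so below the threshold the server learns nothing at all - not even the match count:

```swift
// Toy illustration of threshold-gated reporting. Hypothetical names; the
// real scheme (threshold secret sharing) reveals nothing below the threshold.
struct SafetyVoucher { let matchesKnownHash: Bool }

func accountExceedsThreshold(_ vouchers: [SafetyVoucher], threshold: Int = 30) -> Bool {
    // Only when enough uploaded images match known hashes is anything
    // surfaced for human review; isolated matches stay invisible.
    vouchers.filter(\.matchesKnownHash).count >= threshold
}
```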

Yeah, how "naïve" of me to require evidence to support claims of wrongdoing. I guess we should start hanging people before they have a trial too. I find it hilarious that you're trying to claim the moral high ground with this.

There's obviously no reasoning with you. So I'm not going to go back and forth with you on this further, as all we'll be doing is repeating ourselves. You've already made your mind up and closed it. My mind is open, in that if I see actual evidence of wrongdoing, I will acknowledge it. Until then, I presume Apple is innocent.

👋
I'm not suggesting hanging anyone, and I don't need to grant anyone a fair trial, because I'm a private citizen and not the government. And I don't judge Apple's actions here by their outcome, but on principle. And the principle here is clearly violated. If you can only see how that is bad once they have already been violating your privacy for years, then that's just what your consequentialism gets you; good luck with that.
 
As an opt-in parental control, the Messages feature seems less controversial. The iCloud scanning feature certainly sounds more invasive to the average user - but perhaps the big news here is that the other major companies already scan every single image uploaded to their cloud across the board, whereas Apple is attempting to be more selective.
I think the concern is that the feature could be expanded into government surveillance.
 
Gaslighting. I don't think Apple thought they'd be PR spinning this a week later.

It's so interesting that they didn't bring this up - anywhere - at WWDC.

If this feature is so noble and well conceived and implemented... and "increases privacy"

Boy, you'd sure think they would have wanted to shout about it from the rooftops... not backdoor-slide it in right before new iOS and macOS releases.
 