> To be clear, every phone manufacturer does this. Some have even more invasive techniques for filtering illegal content.

Only Apple does the scanning on your device.
Apple trying to clean up their spyware PR disaster.
Apple using every excuse in the book so as not to admit they screwed up.
> Only Apple does the scanning on your device.

Correct, which is a good thing. Better than scanning in the cloud, as that would break encryption.
> There is no expectation of privacy for data that resides on a third party server. People using iCloud never should have had that expectation with regard to illicit materials ... child abuse materials or otherwise. I think you really do misunderstand the service Apple is providing to users ... in some cases a free service. As you state, Apple is required to respond to warrants and subpoenas. Apple cannot decline to respond to such lawful requests because you think it is technologically impossible to do ... especially when it is quite technologically possible in fact. IMHO.

1) You are deflecting. This proposed capability takes place outside of third-party servers, on privately owned devices. That's kind of the entire point. The fact that iCloud provides a free service, and whatever expectation of privacy applies there, is not relevant, even though the data turns out to be extremely private in that case anyway.
> It isn't necessarily THIS system; it is the precedent that it sets for other systems. That is the point that Apple seems to be missing.

Apple are fully aware of this.
> OR...PERHAPS...Apple is 100% liable for their servers hosting or transmitting child pornography by US law and MUST report such activity. They are meeting this requirement by providing a more secure and private way of identifying it with this method versus just scanning every single photo you have taken and uploaded to iCloud.

If true, would it not have made more sense for Apple to clearly state this as the reason for their action rather than the nonsense they spouted? Apple's judgement about communicating with customers has been at about a grade 3 level these past couple of years.
Every internet service is required to do this, either by scanning all photos or by forwarding user-submitted complaints/reports.
> Correct, which is a good thing. Better than scanning in the cloud, as that would break encryption.

It's not a good thing to me, and in fact I am very much against the concept, and no, I have no pedo pictures on anything. If it's on iCloud, it's public to Apple, and that's fine by me; they can scan it all they want and even look at the pictures physically, but my device, no way, no how.
> Correct, which is a good thing. Better than scanning in the cloud, as that would break encryption.

Apple doesn't E2E encrypt iCloud. They break encryption for the FBI all the time.
> Apple should instead be going full E2EE and holding firm.

Even people like Rene Ritchie are defending Apple here, saying "People keep forgetting their iCloud passwords and having full E2EE would make it too hard or impossible for them to recover their data."
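To make the trade-off in that quote concrete: here is a minimal sketch of why full E2EE makes a forgotten password unrecoverable, assuming the backup key is derived solely from the user's password. The function names are made up for illustration, and a real design would use a salted, slow KDF rather than a bare hash.

```swift
import CryptoKit
import Foundation

// Simplified illustration: if the backup key exists only as a function of the
// user's password, the provider never holds a copy, so a forgotten password
// means nobody can decrypt the data. (A real design would use a salted, slow
// KDF such as PBKDF2, not a bare SHA-256 of the password.)
func encryptBackup(_ plaintext: Data, password: String) throws -> Data {
    let key = SymmetricKey(data: SHA256.hash(data: Data(password.utf8)))
    let box = try AES.GCM.seal(plaintext, using: key)
    return box.combined!          // nonce + ciphertext + authentication tag
}

func decryptBackup(_ blob: Data, password: String) throws -> Data {
    let key = SymmetricKey(data: SHA256.hash(data: Data(password.utf8)))
    let box = try AES.GCM.SealedBox(combined: blob)
    return try AES.GCM.open(box, using: key)   // fails if the password is wrong
}
```

With no server-side copy of the key, there is no "forgot password" path, which is exactly the usability objection Ritchie is raising.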
What does make more sense is that Apple is merely trying to meet its legal obligations, get this content off iCloud, and wash its hands of it. But will this truly combat child abuse? I really don't think so.
> If you read the comments in this thread, it's clear that people still don't understand how it works. So yeah, there is confusion.

I don't know those who are posting in this thread, but in my own circle of acquaintances, the "confusion" is best summed up by this meme...
I'm on page 3 of 11 and there are at least 5 responses from people who could not have read / understood the article... And this has been going on for days. So yeah, "confusion" fits.
I reckon that by the time I get to the end of the comments, someone will mention, again, "what about my kids' bath pics?"
> I'm not against it, but it wouldn't help those people who are principled against it.

I respect your personal opinion, but to me it matters more that many experts in the industry (including from academia) have expressed their concerns and asked for the code.
Apple could reveal the source, then just change it and start using a different version, and it would be very hard to know.
The first part of NeuralHash probably runs on the neural engine hardware. So you have to trust the hardware to do what the designer says. And who designed the hardware? Apple.
Also, iCloud Backup is a much better feature for governments to misuse. Why not demand they open-source that part as well?
What I am reacting to is the demand to open-source a part of the OS that is ill-suited for surveillance, while not asking the same of the parts that are truly well suited to such surveillance.
> Oh, please. This isn't a "mass surveillance system" - which implies Apple wants to know about everything that's on our phone. All that's happening is that illegal images are being flagged and Apple is only notified if a good number of those are uploaded to THEIR servers.

How does "mass surveillance system" imply that they want to know everything on my phone? Mass surveillance simply means surveillance of something, whatever that may be, on a very large scale. In fact, information reduction is key when you want to do that.
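For what it's worth, a rough sketch of the mechanism that quote describes: uploads are matched against a fixed set of known-image hashes, and nothing is surfaced until a threshold of matches is crossed. The type, names, and threshold value below are hypothetical, and the plain counter is only for illustration; the actual design is described as using cryptographic safety vouchers so the server learns nothing below the threshold.

```swift
import Foundation

// Illustration only: match uploaded images against a fixed database of
// known-image hashes and flag nothing until enough matches accumulate.
// Names and the threshold are hypothetical, not Apple's actual implementation.
struct UploadMatcher {
    let knownHashes: Set<Data>   // perceptual hashes of known illegal images
    let reportThreshold: Int     // matches required before anything is flagged
    var matchCount = 0

    /// Record one uploaded image's hash; returns true only once this account
    /// has reached the threshold of matching uploads.
    mutating func record(_ uploadHash: Data) -> Bool {
        if knownHashes.contains(uploadHash) {
            matchCount += 1
        }
        return matchCount >= reportThreshold
    }
}

// A single match (or a handful) triggers nothing on its own.
var matcher = UploadMatcher(knownHashes: [Data([0x01, 0x02])], reportThreshold: 30)
print(matcher.record(Data([0x01, 0x02])))   // false: 1 match, well below the threshold
print(matcher.record(Data([0x09, 0x09])))   // false: non-matching image, count unchanged
```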
> Yeah, how "naïve" of me to require evidence to support claims of wrongdoing. I guess we should start hanging people before they have a trial too. I find it hilarious that you're trying to claim the moral high ground with this.

I'm not suggesting hanging anyone, and I don't need to grant anyone a fair trial because I'm a private citizen and not the government. And I don't judge Apple's actions here on their outcome, but on principle. And the principle here is clearly violated. If you can only see how that is bad once they have already violated your privacy for years, then that's just what your consequentialism gets you; good luck with that.
There's obviously no reasoning with you. So I'm not going to go back and forth with you on this further, as all we'll be doing is repeating ourselves. You've already made your mind up and closed it. My mind is open, in that if I see actual evidence of wrongdoing, I will acknowledge it. Until then, I presume Apple is innocent.
👋
> As an opt-in parental control, the Messages feature seems less controversial. The iCloud scanning feature certainly sounds more invasive to the average user - but perhaps the big news here is that the other major companies already scan every single image uploaded to their cloud across the board, whereas Apple is attempting to be more selective.

I think the concern is that the feature could be expanded to government surveillance.
Where on Reddit is this?
Gaslighting. I don't think Apple thought they'd be PR spinning this a week later.