
Opinion | Why Apple wants to peek into your messages and iCloud
Apple needs to be transparent about its future plans for the features it's rolling out.

Exactly. The police are doing great work catching these perverts, with growing success. I really don't understand what Apple has to do with this. Especially now that it is known what will happen with iCloud, could anyone expect criminals to leave their material up there, or even on their devices - if that material was ever on iCloud at all? Find an alternative way to catch criminals. And why is Apple even getting involved?
It is more than evident that there is government pressure behind this fiasco; we just do not know what exactly happened. CSAM is just the excuse that everyone is going to accept. This is why many key experts think there is something more to this - that it is laying the groundwork for something China or law enforcement wants badly. Steve kept USERS first. Will the current Apple? I guess we know.
When you see how many fronts Apple is under pressure on from governments and the EU throughout the world over apps etc., plus the pressure from those same quarters for a backdoor for anti-terrorism and crime fighting - pressure Apple has apparently rejected - you can't blame Apple users for now wondering whether a deal has been struck for that backdoor, which would explain Apple's insistence on having the tools to do it on over a billion devices.
Even the argument Apple uses makes no sense. They suggest it protects privacy because other cloud providers check more than hash information, but that argument is vacuous: on that basis they could run the same NeuralHash system on iCloud. Instead they seem to be defending the indefensible by having it embedded on the hardware of a billion-plus users, which is exactly what some governments and agencies dream of - a backdoor to individual hardware.
With so much clamour to get Apple to assist with these features, many people could be forgiven for concluding that perhaps a deal has been struck, whereby the threat of actions against Apple on various fronts is minimised and, in the case of some nations, it is a condition of being allowed to sell there at all.
Very dark days for Apple. The PR has been awful, and although Apple has made a few hashes (excuse the pun) of things in the past, this seems potentially the worst. Apple's attempts to defend the indefensible dig a deeper hole each time, because no excuse explains why they can't do what they SAY they are doing - preventing child abuse etc. - using the same system in the cloud, NOT on users' hardware. If NeuralHash gives more privacy, as Apple says, nothing makes it impossible to run it on THEIR servers. Putting it on users' hardware tells its own story.
The whole thing sounds like this:
- A door-making company states that in your own home, people can suffer wrongs and very bad treatment at your hands (including wrong things done to children).
- The door company decides you need a universal key which will help those poor souls. Those universal keys to your own home will be owned by the door company, and they can use them to check on you (just in case, and only with good intentions to help others).
- The door company decides that if they 'feel' something is wrong, some other people at the door company (or maybe vendors hired by the door company) will review the evidence (photos?) and, based on it, lock you out of your own home (iCloud). Even though you bought it and it's owned by you, because they decided something is wrong, you will be locked out. Maybe forever? Or at least prepare for a long fight while living somewhere else.
- One more point: this checking is not very proactive. Maybe the same door company will suggest installing a camera in your home. Nobody will have access to it but the door company. All in the name of good things. They will check more proactively, to avoid bad things happening in your house. It's for the sake of those poor souls (e.g. children).
I hope we all agree that the 'door company' desires good things. They want to promote good behaviour. It's not the police. It's not the prosecutors. It's not the courts. It's the door company who will help the world and save it.
Does it ring a bell? Apple, stop pretending. Stop being 'the door company' from this short post.
Have the balls to say why you want to do it, and why you don't want to give us the right to our own privacy on equipment purchased from you.
I'm sorry to say it, but Apple is heading in the wrong direction. It will be one of the reasons I will consider another option for my home and family ecosystem (or at least partition it into 'trust' and 'no trust' categories).
And I'm not even talking about creating a back door to iMessage. I'm not even saying some government will demand that LGBT photos be included, or that more things be reported. PEGASUS and similar tools will simply enjoy their good time with it, and governments will be able to pressure Apple (do they want to sell in China? Indonesia? Saudi Arabia? Other countries with strong views on what is and isn't allowed?) to either leave their market or... kindly give them a way to spy on their citizens and punish those who act against the regime.
Poor job, Apple. I'm ashamed. I'm no longer recommending your ecosystem as privacy-oriented (not even as 'pretending to be').
Sad. Disappointed. Angry (especially at your silly excuses).
Start scanning iCloud, and stop trying to plant 'on device' exploits for you / governments / others.
> It's clearly indicative that the probability any one account would have on the order of 30 such false positives is extremely low. It also shows that having one false positive doesn't really increase the probability of having several more false positives to reach the threshold.

No, it isn't - you are demonstrating both the fallacies I was talking about by (a) confusing the chances of a random false match in a large sample with the chances [...]
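For illustration, here is roughly the calculation behind that "order of 30 false positives" intuition. All the numbers below (per-image false-match rate, library size, the 30-match threshold) are hypothetical placeholders, not Apple's published figures, and the independence assumption is exactly what this exchange disputes. It uses a Poisson approximation to the binomial tail:

```python
from math import exp, factorial

# Hypothetical placeholder numbers - not Apple's published figures.
p = 1e-6        # assumed per-image false-match probability
n = 10_000      # photos in one account's library
threshold = 30  # matches needed before any human review

# If false matches were independent, the number of matches in a library is
# approximately Poisson with mean lam = n * p, and the chance of reaching
# the threshold is the upper tail of that distribution.
lam = n * p
p_tail = sum(exp(-lam) * lam**k / factorial(k)
             for k in range(threshold, threshold + 50))
print(f"P(one account reaches {threshold} false matches by chance) ~ {p_tail:.2e}")
```

Under these assumed numbers the tail probability is astronomically small, which is the point being made; the dispute in the thread is over whether per-image matches really are independent.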
> The famous case you're pointing to involves medical conclusions, which by their very nature are uncertain.

No - as it says in the article - a big part of the issue was the invalid mathematics used to sway the jury with an astronomically low probability: the math for "two rolls of a fair die" assumes independent events. Two genetic siblings, living in the same conditions and eating the same food, are not independent. You could flip the logic and use it to get a guilty person off: only 1 in 10,000 people murder a child, so the chances of someone murdering two children must be 1 in 10,000^2 - a jury would throw that out because it flies in the face of common sense, not because they spotted the math error.
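The independence point can be made concrete with a toy simulation (all numbers invented for illustration): two events driven by a rare shared cause occur together far more often than the naive product of their individual probabilities predicts, which is exactly the error in squaring the sibling statistic.

```python
import random

random.seed(0)
trials = 100_000

# A hidden shared factor (e.g. a common underlying condition) that makes
# both events much more likely when it is present.
both = a_count = b_count = 0
for _ in range(trials):
    shared = random.random() < 0.01        # rare common cause
    p_event = 0.5 if shared else 0.001     # event far likelier given the cause
    a = random.random() < p_event
    b = random.random() < p_event
    a_count += a
    b_count += b
    both += a and b

p_a = a_count / trials
p_b = b_count / trials
p_both = both / trials
# Naive "square it" estimate vs. what actually happens with dependence:
print(f"P(A)*P(B) = {p_a * p_b:.2e}   actual P(A and B) = {p_both:.2e}")
```

With these made-up parameters the joint probability comes out dozens of times larger than the naive product, because observing one event is evidence for the shared cause and so raises the odds of the other.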
> I'm not even sure Apple's derivative of the image would be acceptable in a court case.

...it wouldn't be needed if a convincing expert witness for the prosecution stood up and cited the "3 matches in a trillion" number, while the defence had to try to explain a line of reasoning which sounds for all the world like "1 in a trillion doesn't mean 1 in a trillion". In any case, with something like possession of CSAM images, by the time an innocent person has got as far as court, their life is probably already in ruins.
> Does this constitute a reasonable search? What is the probable cause?

There doesn't need to be probable cause or a warrant if you invite them in. When you agree to the EULA, you're inviting them in. Same as if the police show up at your door and you invite them into your house: they don't need a warrant if the stuff is sitting there in plain sight and you invited them in.
> It does not matter how they try to justify an illegal search with anti-colluding techniques; the search is still illegal. In Europe it constitutes a breach of the GDPR, as explicit consent is required for the use of ANY personal data. The fine for such a breach can be as much as 20 billion euros, and the Europeans are always looking for a way to get money out of Apple.

Once again, it isn't illegal if you agree to it. That's the EULA.
> Poor job Apple. I'm ashamed. I'm not recommending your ecosystem anymore (even as 'pretending to be') as privacy oriented. [...]

While I'm "not happy" about this scanning, I'm also not throwing away my Apple gear and rushing to get Android (or Linux). I don't buy all the slippery-slope arguments, however tempting it is to dive into the whataboutisms those arguments represent. I'll see how this starts to shake out before deciding whether to buy a Galaxy Note or an iPhone 13.
> There doesn't need to be probable cause or a warrant if you invite them in. When you agree to the EULA, you're inviting them in. [...]

You're correct, of course. Probably the only trouble Apple might run into is if they install iOS 15 through automatic updates and don't provide a path back to iOS 14... But I'm not sure how the courts will view any of this; the whole thing should be challenged just to see if there is any shaky legal ground. It seems like some smart lawyer could win on at least the point that something like this would need express permission to run on your phone, and if you refuse, either a refund or a path back to iOS 14 would be the remedy.
> While I'm "not happy" about this scanning I'm also not throwing away Apple gear and rushing to get Android (or Linux). [...]

I dunno why everyone thinks their devices will be scanned. Based on Apple's published docs, only photos that are being uploaded to iCloud Photos will be hashed on-device and matched against the on-device CSAM hash database. If there's a match, a security voucher is generated for that photo and uploaded to iCloud Photos; if there's no match, no voucher is generated and the photo is uploaded as is.
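That upload-gated flow can be sketched as follows. This is a toy illustration with made-up names and a placeholder hash function; the real pipeline (NeuralHash, a blinded hash database, private set intersection, threshold secret sharing) is far more involved. The point is the claimed control flow: hashing only happens on the iCloud upload path.

```python
# Stand-in database of known-image hashes (hypothetical values).
KNOWN_HASHES = {"hash_of_known_image_1", "hash_of_known_image_2"}

def perceptual_hash(photo_bytes: bytes) -> str:
    # Placeholder for a perceptual hash such as NeuralHash.
    return f"hash_{len(photo_bytes)}"

def prepare_upload(photo_bytes: bytes, upload_to_icloud: bool):
    """Return (payload, voucher). Hashing happens only on the upload path."""
    if not upload_to_icloud:
        return None, None                  # no upload: no hashing, no voucher
    h = perceptual_hash(photo_bytes)
    voucher = {"match": True, "hash": h} if h in KNOWN_HASHES else None
    return photo_bytes, voucher
```

So in this sketch, a photo that never goes to iCloud is never hashed, and an uploaded photo only gains a voucher if its hash is in the known-image set.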
> While I'm "not happy" about this scanning I'm also not throwing away Apple gear and rushing to get Android (or Linux). [...]

I would only point naysayers to the Patriot Act... once these tools are in place, they will be utilized. Some in Congress and our intelligence agencies will be salivating for full access.
However, I'm wondering: this is 2021 - do people actually ask for recommendations on whether they should get an iPhone or a Galaxy, an iPad or a Tab, a Mac or Windows? I used to be the go-to person for that type of information, and it's been 5 years since anyone asked me "which gear is best for me?"
> Once again, it isn't illegal if you agree to it. That's the EULA.

In this instance it might just be illegal. I state that with the usual caveats: I am not a constitutional lawyer, and Apple has certainly done its legal homework. HOWEVER, as I noted earlier, the 10th Circuit appeals court sided with Ackerman. The court ruled that since the law mandates information be sent to NCMEC, and since NCMEC's establishment by Congress makes it a de facto law enforcement agency, AOL was effectively participating in an illegal search by law enforcement...
> I dunno why everyone thinks their devices will be scanned. Based on Apple's published docs, only photos that are uploaded to iCloud Photos will be hashed on-device and matched against the on-device CSAM hash database. If there's a match, a security voucher will be generated for that photo and uploaded to iCloud Photos. If there's no match, no voucher is generated and the photo is uploaded as is. If there's no upload to iCloud Photos, nothing is hashed on-device; there is no on-device scanning of photos for CSAM content.

It's the system being created that is making people upset. The Trojan horse here is all the verbiage about intended use and protections... it simply won't matter once it's on everyone's phones; it can be changed with a pen stroke. The skepticism some express that it's no big deal is the very reason they will probably get away with it. By the time the NSA gets involved you probably won't even know, because it will be classified; it would take another Snowden to sound the alarm and make it public. There is probably language in the existing Patriot Act that gives the NSA access to the system on day one, and Apple will not be able to say no to a FISA court order.
> In this instance it might just be illegal. I state that with the usual caveats that I am not a constitutional lawyer and Apple has certainly done its legal homework. [...] However, as I noted, caveats.

It's literally a back door: once the public has agreed to it through the ToS, people will not know; all surveillance warrants at the national level are classified, FISA-court issued. One judge signing off would literally give the NSA full access to everyone's iOS 15 device... you would never know. This initial public step is all that is required.
> It's literally a back door: once the public has agreed to it through the ToS, people will not know; all surveillance warrants at the national level are classified, FISA-court issued. [...]

Yes, and as noted previously, NSLs are an entire 10 levels above FISA. If presented with an NSL, the receiving party is NOT allowed even to disclose the request, let alone its content. Unlike FISA, with its three-judge panel and the need for review and renewal every 90 days(?), NSLs are at the discretion of the alphabet-soup people...
> It's the system being created that is making people upset. The Trojan horse here is all the verbiage about intended use and protections... [...]

But what was stopping Apple from doing 'surveillance' before now? They already had the capability to detect all sorts of document contents years ago, and their capabilities have grown more advanced since. Why only start now?
> But what is stopping Apple from doing 'surveillance' before now? [...]

To me, it seems to be the fact that explicit scanning technology will be placed on phones owned by users. People in general, and indeed I, see this as a step too far. Yes, if data is uploaded to "fill in the blank's" servers, then scan the information there. However, leave my phone, tablet and computer alone.
What is the benefit to Apple of implementing the CSAM feature, other than being compliant with US law? This is effort spent that absolutely does not help them sell more devices.
From Apple's perspective, Apple has the power to push back against government agencies if their requests don't comply with local laws, at least in the US, right? So what is Apple's reason for working with the government to snoop on the users of the devices it sells? So it can get more tax breaks? It doesn't make any sense to me.
> How do you know? There are many other reasons to buy Apple stuff - the cool factor, long-term software support, carrier subsidies, Apple's own promotions (e.g. Back to School), splitting your payment over many months, excellent warranty & repair, etc. I am not sure that privacy is, or was, Apple's main selling point, at least for me. Anyway, as long as your device is connected to anything (Internet, intranet, etc.) there is no guaranteed privacy (intentional breach, human error, or software/firmware bugs).

I know because Apple has spent millions advertising itself as a privacy-focused company.
> But what is stopping Apple from doing 'surveillance' before now? They already have the capability to detect all sorts of document contents, and their capabilities have since grown more advanced. Why only start now? [...]

Hey, as playfoot said, this is about a system designed to live on your phone and search for whatever data they are looking for. It has nothing to do with the normal surveillance we are all used to and somewhat expect... This is the government having permanent access to your device, the only limitation being what they can get a secret FISA court to agree to. I keep saying "Trojan horse" because that's what it is: they are using one universally hated crime to convince people it's for the greater good. But once it's on your phone, they can use it as they see fit.
> It's the system being created that is making people upset. The Trojan horse here is all the verbiage about intended use and protections... it simply won't matter after it's on everyone's phones; it can be changed with a pen stroke. [...]

It can't be changed with a pen stroke, any more than any other feature in iOS could. All that 'verbiage' actually matters, because it describes how the system works, how it has built-in protections against being fooled or abused, and how what it actually does is virtually useless to an attacker. If you don't believe what Apple says, and you don't believe in the system of independent researchers scrutinising how iOS really works, then you shouldn't be storing anything on your phone or using it to make calls. There are dozens if not hundreds of processes on your phone which 'access', 'read', 'scan', 'look at', 'classify', 'index' or 'log' your data and your actions. Any one of them could be changed to send results to the NSA, or the CCP, or the RIAA, if Apple wanted to. This is why the 'slippery slope' is a fallacy: we've been standing on it all along, and every movement up or down it meets an intense amount of friction.
> It can't be changed with a pen stroke, any more than any other feature in iOS could. [...]

A FISA court pen stroke is all they would need: top secret, and Apple can't even appeal, due to national security... Brought to you by your friendly Patriot Act.
> It can't be changed with a pen stroke, any more than any other feature in iOS could. [...]

Well, there are many security experts and privacy advocates who are questioning it, in opposition to it, etc.