Find an alternative way to catch criminals. And why is Apple even getting involved?
Exactly. The police are doing great work catching these perverts, with growing success. I really don't understand what Apple has to do with this. Especially now that it is known what will happen with iCloud, could anyone expect criminals to leave their material up there, or even on their devices? If that material was ever on iCloud at all.

Almost all this system will do is report false positives, with the potential to ruin innocent people's lives. Even if the chances are really slim, we are talking about the lives of innocent people. There should be zero probability of that happening, but even Apple admitted that the probability is not zero.

What if other material to check for is added to the list? And what if hackers figure out how to break the system and wreak havoc by reporting people to the authorities for CSAM, or for anything else that follows? This has to stop now and never, ever, be discussed again.
 
This is why many key experts think that there is something more to this - that it is laying the groundwork for something China or law enforcement wants badly. Steve put USERS first. Will the current Apple? I guess we know. (Edited for typos)
It is more than evident that there is government pressure behind this fiasco. We just do not know what exactly happened. CSAM is just the excuse that everyone is going to accept.
 
The whole thing sounds like:

- A door-making company makes a statement that in your own home, people can suffer wrongs / very bad treatment at your hands (including wrong things done to children).
- The door-making company makes a decision: you need to have a universal key which will help those poor souls. Those universal keys to your own home will be owned by the door company, and they can use them to check on you (just in case, and only with good intentions to help others).
- The door company makes the call that, in case they 'feel' something is wrong, some other people at the door company (or maybe vendors hired by the door company) will check the proof (photos?) and, based on it, lock you out of your home (iCloud) - even though you bought it and it's owned by you. Because they decided it's wrong, you will be locked out. Maybe forever? Or at least prepare for a long fight and living somewhere else.

- Let's think about one more point: this checking is not very proactive. Maybe the same door company will suggest installing a camera in your home. Nobody will have access to it but the door company. All in the name of good things. They will check more proactively to avoid bad things happening in your house. It's all for the sake of those poor souls (e.g. children).

I hope all of us agree that the 'door company' desires good things. They want to promote good behaviour. It's not the police. It's not the prosecutors. It's not the courts. It's the door company who will help the world and save it.

Does it ring a bell? Apple, stop pretending. Stop being 'the door company' from this short post.

Have the balls to say why you want to do it. Why you don't want to give us the right to our own privacy on equipment purchased from you.

I'm sorry to say it - but Apple is heading in the wrong direction. It will be one of the reasons I will consider another option for my home / family ecosystem (or at least reorganise it into 'trust' and 'no trust' categories).

And... I'm not even talking about creating a back door to iMessage. I'm not even saying that some government will come and say: I want LGBT photos to be included. I want more things to be reported. PEGASUS and similar tools will just enjoy their good time with it, and governments will be able to pressure Apple (do they want to sell in China? Indonesia? Saudi Arabia? Other countries with strong views on what's allowed and what isn't?) to either leave the market or... kindly give them a way to spy on their citizens and punish them if something is against the regime.

Poor job, Apple. I'm ashamed. I'm not recommending your ecosystem as privacy-oriented anymore (not even as 'pretending to be').

Sad. Disappointed. Angry (especially at your silly excuses).

Start scanning iCloud and stop trying to open up 'on-device' exploits for you / governments / others.
 
When you see how many fronts Apple is under pressure on from governments and the EU throughout the world over apps etc., and the pressure from those same people for a backdoor for anti-terrorist/crime-fighting purposes etc. - which Apple has apparently rejected - you can't blame Apple users for now wondering whether a deal has been struck for that back door, which would explain Apple's insistence on having the tools to do it on over a billion devices.

Even the argument Apple uses makes no sense, inasmuch as they suggest it protects privacy because other cloud organisations check more than hash information. That argument is vacuous: on the same basis they could use the same NeuralHash system on iCloud. Instead they seem to be defending the indefensible by having a billion-plus users with it embedded on their own hardware, which is exactly what some governments and agencies dream of - a backdoor to individual hardware.

With so much of the clamour being for Apple to assist with these features, I think many people would be forgiven for concluding that perhaps a deal has been struck, whereby the threat of actions against Apple on various fronts is minimised - and in the case of some nations, it's a condition of being allowed to sell in those countries at all.

Very dark days for Apple. The PR has been awful, and although Apple has made a few hashes (excuse the pun) of things in the past, this seems potentially the worst, and Apple's attempts to defend the indefensible seem to dig a deeper hole each time, as no excuse yet explains why they can't do what they SAY they are doing - preventing child abuse, etc. - by using the same system in the cloud, NOT on users' hardware. If NeuralHash gives more privacy, as Apple says, there is nothing that makes it impossible to do that on THEIR servers. Putting it on users' hardware tells its own story.

If the sky's the limit on what Apple could possibly do, and that makes you worried and/or unhappy, isn't it then time to vote with your wallet?
 
The whole thing sounds like:
[...]
Start scanning iCloud and stop trying to open up 'on-device' exploits for you / governments / others.

See above. Vote with your wallet.
 
It clearly indicates that the probability of any one account having on the order of 30 such false positives is extremely low. It also shows that having one false positive doesn't really increase the probability of having the several further false positives needed to reach the threshold.
No, it isn't - you are demonstrating both of the fallacies I was talking about, by (a) confusing the chances of a random false match in a large sample with the chances that a person singled out for having a match could be innocent (edited - what did I say about stats relying on pedantry?), and (b) assuming that multiple matches against the same person dramatically reduce the chances of a mistake.

The articles I linked explain this - with more authority than I could - and also show that it is a widely held fallacy.
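
To make (a) concrete, here's a toy Bayes calculation in Python. Every number in it is invented purely for illustration (per-account false-flag rate, detection rate, fraction of genuinely guilty accounts) - none of them are Apple's figures:

```python
# Toy illustration of the base-rate / prosecutor's fallacy.
# ALL numbers are invented for illustration - none are Apple's.

p_flag_given_innocent = 1e-6   # assumed chance an innocent account gets flagged
p_flag_given_guilty   = 0.5    # assumed chance a guilty account gets flagged
prevalence            = 1e-6   # assumed fraction of accounts that are guilty

# Law of total probability: overall chance an account is flagged.
p_flag = (p_flag_given_guilty * prevalence
          + p_flag_given_innocent * (1 - prevalence))

# Bayes' theorem: chance that a *flagged* account is innocent.
p_innocent_given_flag = p_flag_given_innocent * (1 - prevalence) / p_flag

print(f"P(flag | innocent) = {p_flag_given_innocent:.0e}")   # one in a million
print(f"P(innocent | flag) = {p_innocent_given_flag:.2f}")   # ~0.67
```

With these made-up numbers, a "one in a million" false-positive rate still means two out of three flagged accounts are innocent. The per-test error rate and the probability that a flagged person is innocent are different quantities, and conflating them is exactly the fallacy.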

As for (b) - it assumes independence, and multiple "NeuralHash" matches against an individual's photo collection are not independent, because the typical person's photo album is not a random collection of unrelated images - as anybody who has politely nodded while being shown an endless stream of barely distinguishable photos of the new baby, or various permutations of wedding guests, can attest - and it may well include multiple cropped/resized versions of the same photo.
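
You can put numbers on (b) too. Here's a sketch (per-image match probability and library size completely invented) comparing a library of unique photos with one where each photo exists in five near-duplicate versions that match or miss together:

```python
from math import comb

# Invented numbers - nothing here reflects NeuralHash's real error rate.
P = 1e-4         # assumed false-match probability per unique image
N = 1000         # photos in the library
THRESHOLD = 3    # matches needed before anything happens

def p_at_least(k: int, n: int, p: float) -> float:
    """P(Binomial(n, p) >= k)."""
    return 1 - sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k))

# Case 1: every photo unique -> matches are independent.
independent = p_at_least(THRESHOLD, N, P)

# Case 2: each photo kept in 5 cropped/resized versions that stand or
# fall together -> a single unlucky image already yields 5 matches.
correlated = p_at_least(1, N // 5, P)

print(f"all unique    : {independent:.1e}")   # ~1.5e-04
print(f"5x near-dupes : {correlated:.1e}")    # ~2.0e-02
```

Same per-image error rate, same library size, yet the duplicate-heavy library crosses the threshold over a hundred times more often. That's what dropping the independence assumption does.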

The famous case you're pointing to involves medical conclusions, which by their very nature are uncertain.
No - as it says in the article, a big part of the issue was the invalid mathematics used to sway the jury with an astronomically low probability: the math for "two rolls of a fair die" assumes independent events. Two genetic siblings, living in the same conditions, eating the same food, are not independent. You could flip the logic and use it to get a guilty person off: only 1 in 10,000 people murders a child, so the chances of someone murdering two children must be 1 in 10,000^2 - a jury would throw that out because it flies in the face of common sense, not because they spotted the math error.
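
In symbols: the naive calculation multiplies P(A) by P(B), when the correct quantity is P(A) times P(B given A). With invented numbers, the gap is enormous:

```python
# Invented numbers, chosen only to show the shape of the error.
p_first = 1 / 8500                 # assumed P(one such death in a family)

# Naive "fair die" math: treats the second event as independent.
naive = p_first ** 2               # 1 in ~72 million

# Conditioning on the first event: if shared genetics/environment make
# a second event, say, 50x more likely than the baseline...
p_second_given_first = 50 * p_first
conditioned = p_first * p_second_given_first   # 1 in ~1.4 million

print(f"naive (independent): 1 in {1 / naive:,.0f}")
print(f"conditioned        : 1 in {1 / conditioned:,.0f}")
```

Both numbers are small, but they differ by a factor of 50 - and it's the jury that hears the wrong one.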

...and, as for the uncertainty, that's where this "not all hashes are cryptographic hashes" thing comes in: the techniques used in NeuralHash to stop it being fooled by transformations of the image introduce uncertainty - the "Neural" bit is a giveaway - neural computing techniques pretty much work on uncertainty - there's no algorithm that can be analysed to precisely predict which images will or won't be matched.

...that doesn't mean that such a technique can't be sufficiently reliable, but the onus is on Apple to prove that it is.

I'm not even sure Apple's derivative of the image would be acceptable in a court case.
...it wouldn't be needed if a convincing expert witness for the prosecution stood up and cited the 3-matches-in-a-trillion-cases number, while the defence had to try to explain a line of reasoning which sounds for all the world like "1 in a trillion doesn't mean 1 in a trillion". In any case, with something like possession of CSAM images, by the time an innocent person has got as far as court, their life is probably already in ruins.

Aside:

The problem with stats and probability is that you have to be completely pedantic about it while also understanding the real world - e.g.:
  1. If Bob throws 100 consecutive heads by tossing a fair coin, the probability of his 101st throw being heads is 0.5
  2. If Alice throws 100 consecutive heads by tossing a coin, the probability of her 101st throw being heads is 1
...the first statement is correct because it's basically just a re-statement of what "fair" means in a mathematical context. The second statement is also correct - to an appropriate level of accuracy - because coming across a two-headed coin, while rare, is orders of magnitude more probable than someone throwing 100 consecutive heads on a fair coin.

...Of course, getting people to accept (1) is itself a very common problem, probably because you're saying (1) and they're interpreting it as (2). The danger is that you'll convince them of (1) by gaslighting them into rejecting (2). In fact, there's no contradiction - (1) is just more precisely worded to eliminate "real world" uncertainties.
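
For anyone who wants to see why (2) holds, it's just Bayes' theorem with a prior on trick coins. The one-in-a-million prior below is made up; the conclusion barely depends on it:

```python
from fractions import Fraction

# Made-up prior: one coin in a million is double-headed.
prior_trick = Fraction(1, 1_000_000)
prior_fair  = 1 - prior_trick

# Likelihood of 100 consecutive heads under each hypothesis.
like_trick = Fraction(1)             # a two-headed coin always shows heads
like_fair  = Fraction(1, 2) ** 100   # ~7.9e-31

posterior_trick = (like_trick * prior_trick) / (
    like_trick * prior_trick + like_fair * prior_fair)

print(float(posterior_trick))                                     # 1.0 to float precision
print(f"P(fair | 100 heads) ~ {float(1 - posterior_trick):.1e}")  # ~7.9e-25
```

So Alice's coin is, to some 24 decimal places, certainly two-headed, and her 101st throw is heads with probability 1 "to an appropriate level of accuracy" - no contradiction with (1), which conditions on the coin being fair.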
 
The voting-with-your-wallet thing is critical. I can pretty much immediately shut down iCloud once I send my daughter at college a hard drive to download her data. It takes me out of the upgrade loop this fall too, since new devices will come preinstalled with iOS 15. Anyway, the point is: now that they are moving forward with it, are you going to allow it? I am not. They only succeed if users roll over and let them.
 
Does this constitute a reasonable search? What is the probable cause?
There doesn't need to be probable cause or a warrant if you invite them in. When you agree to the EULA, you're inviting them in. It's the same as if the police show up at your door and you invite them into your house: they don't need a warrant if the stuff is sitting there in plain sight and you invited them in.
 
It does not matter how they try to justify an illegal search with anti-collusion techniques; the search is still illegal. In Europe, it constitutes a breach of the GDPR, as explicit consent is required for the use of ANY personal data. The fine for such a breach can be as much as 20 billion euros, and the Europeans are always looking for a way to get money out of Apple.
Once again, it isn't illegal if you agree to it. That's the EULA.
 
[...]

Poor job, Apple. I'm ashamed. I'm not recommending your ecosystem as privacy-oriented anymore (not even as 'pretending to be').
[...]
While I'm "not happy" about this scanning I'm also not throwing away Apple gear and rushing to get Android (or Linux). I don't buy all the slippery slope arguments, although they are really tempting to dive into the whataboutisms that these arguments represent. I'll see how this starts to shake out before making a move to determine if I buy a Galaxy Note or Iphone 13.

However, and I'm wondering, this is 2021, do people actually ask for a recommendation of whether they should get an iphone or Galaxy or ipad or tab or Mac or Windows? I used to be the goto person for that type of information and it's been 5 years since I've been asked any questions about "which gear is best for me?"
 
There doesn't need to be probable cause or a warrant if you invite them in. When you agree to the EULA, you're inviting them in. It's the same as if the police show up at your door and you invite them into your house: they don't need a warrant if the stuff is sitting there in plain sight and you invited them in.
You're correct, of course. Probably the only trouble Apple might run into is if they install iOS 15 through automatic updates and don't provide a path back to 14. But I'm not sure how the courts will view any of this; the whole thing should be challenged, though, just to see if there is any shaky legal ground. It seems like some smart lawyer could win on at least the point that something like this would need express permission to be running on your phone, and if you refuse, either a refund or a path back to iOS 14 would be the remedy.
 
While I'm "not happy" about this scanning I'm also not throwing away Apple gear and rushing to get Android (or Linux).
I dunno why everyone thinks their devices will be scanned. Based on Apple's published docs, only photos that are being uploaded to iCloud Photos are hashed on-device and matched against the on-device CSAM hash database. A safety voucher carrying the encrypted match result is generated for each photo and uploaded with it to iCloud Photos; the results only become readable to the server once an account passes the match threshold.

If there's no upload to iCloud Photos, nothing is hashed on-device. There is no scanning of photos on-device for CSAM content.
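
For what it's worth, here's the published flow reduced to a runnable toy in Python. This is my paraphrase of Apple's docs, not their code: the hash is a stand-in for NeuralHash, the set is a stand-in for the blinded database, and in the real design the device can't learn the match result (private set intersection) and the threshold is around 30, not 3:

```python
import hashlib

# Toy sketch of the *shape* of the published flow. Every name here is
# made up; this is not NeuralHash, not PSI, and not Apple's code.
BLINDED_DB = {hashlib.sha256(b"known-bad-example").hexdigest()}
THRESHOLD = 3   # the published threshold is ~30; 3 keeps the toy short

def toy_hash(photo: bytes) -> str:
    # Stand-in for NeuralHash (which is perceptual, not cryptographic).
    return hashlib.sha256(photo).hexdigest()

def upload(photo: bytes, icloud_photos_enabled: bool, vouchers: list) -> None:
    if not icloud_photos_enabled:
        return  # no iCloud Photos upload -> no hashing at all, per the docs
    # Real system: the match result inside the voucher is encrypted and
    # the device never learns it; this toy just records a boolean.
    vouchers.append(toy_hash(photo) in BLINDED_DB)

# "Server" side: vouchers ride along with every upload, but results are
# only actionable once an account passes the threshold.
vouchers: list = []
for img in (b"cat", b"dog", b"known-bad-example"):
    upload(img, icloud_photos_enabled=True, vouchers=vouchers)

print("flagged for human review" if sum(vouchers) >= THRESHOLD
      else "below threshold - nothing readable server-side")
```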
 
While I'm "not happy" about this scanning I'm also not throwing away Apple gear and rushing to get Android (or Linux). I don't buy all the slippery slope arguments, although they are really tempting to dive into the whataboutisms that these arguments represent. I'll see how this starts to shake out before making a move to determine if I buy a Galaxy Note or Iphone 13.

However, and I'm wondering, this is 2021, do people actually ask for a recommendation of whether they should get an iphone or Galaxy or ipad or tab or Mac or Windows? I used to be the goto person for that type of information and it's been 5 years since I've been asked any questions about "which gear is best for me?"
I would only point naysayers to the Patriot Act... once these tools are in place, they will be utilized. Some in Congress and our intelligence agencies will be salivating for full access.
 
Once again, it isn't illegal if you agree to it. That's the EULA.
In this instance it might just be illegal. I state that with the usual caveats that I am not a constitutional lawyer and Apple has certainly done its legal homework. HOWEVER, as I noted earlier, the 10th Circuit Court of Appeals sided with Ackerman. The court ruled that because the law mandates information be sent to NCMEC, and because NCMEC's establishment by Congress makes it a de facto law enforcement agency, AOL was effectively participating in an illegal search by law enforcement...

However, as I noted, caveats.
 
I dunno why everyone thinks their devices will be scanned. Based on Apple's published docs, only photos that are being uploaded to iCloud Photos are hashed on-device and matched against the on-device CSAM hash database. A safety voucher carrying the encrypted match result is generated for each photo and uploaded with it to iCloud Photos; the results only become readable to the server once an account passes the match threshold.

If there's no upload to iCloud Photos, nothing is hashed on-device. There is no scanning of photos on-device for CSAM content.
It's the system being created that is making people upset. The Trojan horse here is all the verbiage about its intended use and protections. It simply won't matter after it's on everyone's phones; it can be changed with a pen stroke. The skepticism some are expressing that it's no big deal is the very reason they will probably get away with it. By the time the NSA gets involved you probably won't even know, because it will be classified; it would take another Snowden to sound the alarm and make it public. There is probably language in the existing Patriot Act that gives the NSA access to the system on day one, and Apple will not be able to say no to a FISA court order.
 
In this instance it might just be illegal. I state that with the usual caveats that I am not a constitutional lawyer and Apple has certainly done its legal homework. HOWEVER, as I noted earlier, the 10th Circuit Court of Appeals sided with Ackerman. The court ruled that because the law mandates information be sent to NCMEC, and because NCMEC's establishment by Congress makes it a de facto law enforcement agency, AOL was effectively participating in an illegal search by law enforcement...

However, as I noted, caveats.
It's literally a back door. Once the public has agreed to it through the ToS, people will not know; all surveillance warrants at the national level are classified, FISA-court issued. One judge signing off would literally give the NSA full access to everyone's iOS 15 device. You would never know. This initial public step is all that is required.
 
It's literally a back door. Once the public has agreed to it through the ToS, people will not know; all surveillance warrants at the national level are classified, FISA-court issued. One judge signing off would literally give the NSA full access to everyone's iOS 15 device. You would never know. This initial public step is all that is required.
Yes, and as noted previously, NSLs are an entire 10 levels above FISA. If presented with an NSL, the receiving party is NOT allowed to even disclose the request, let alone the content. Unlike FISA, with a three-judge panel and the need to be reviewed and renewed every 90 days(?), NSLs are at the discretion of the alphabet-soup people....
 
It's the system being created that is making people upset. The Trojan horse here is all the verbiage about its intended use and protections. It simply won't matter after it's on everyone's phones; it can be changed with a pen stroke. The skepticism some are expressing that it's no big deal is the very reason they will probably get away with it. By the time the NSA gets involved you probably won't even know, because it will be classified; it would take another Snowden to sound the alarm and make it public. There is probably language in the existing Patriot Act that gives the NSA access to the system on day one, and Apple will not be able to say no to a FISA court order.
But what was stopping Apple from doing 'surveillance' before now? They already had the capability to detect all sorts of document contents years ago, and their capabilities have only grown more advanced since. Why start now?

What is the benefit to Apple of implementing the CSAM feature, other than being compliant with US law? This is effort spent that absolutely does not help them sell more devices.

From Apple's perspective, Apple has the power to push back against government agencies if their requests are not compliant with local laws, at least in the US, right? So what is Apple's reason for working with the government to snoop on the users of the devices they sell? So that they can get more tax breaks? It doesn't make any sense to me.
 
But what was stopping Apple from doing 'surveillance' before now? They already had the capability to detect all sorts of document contents years ago, and their capabilities have only grown more advanced since. Why start now?

What is the benefit to Apple of implementing the CSAM feature, other than being compliant with US law? This is effort spent that absolutely does not help them sell more devices.

From Apple's perspective, Apple has the power to push back against government agencies if their requests are not compliant with local laws, at least in the US, right? So what is Apple's reason for working with the government to snoop on the users of the devices they sell? So that they can get more tax breaks? It doesn't make any sense to me.
To me, it seems to be the fact that this technology will be placed explicitly on phones owned by users. Many people, and indeed I, see this as a step too far. Yes, if data is uploaded to "fill in the blank's" servers, then scan the information. However, leave my phone, tablet and computer alone.

In thinking about this more, on the non-technical side, I see a number of issues. One, how do we know what NCMEC is giving to Apple? To the best of my knowledge there is no public oversight of the images selected. Two, how do we know that the file given to Apple is the file Apple uses to review our phones? Three, in the end, as you alluded to above, we have absolutely no way to ensure that what Apple states it is doing with the hashes, neural AI, etc. is actually what is done.

I am not being paranoid, just noting there is no effective way for the general public to oversee, to question, or to challenge what is being done on one's property.
 
How do you know? There are many other reasons to buy Apple stuff - the cool factor, long-term software support, carrier subsidies, Apple's own promotions (e.g. Back to School), splitting your payment over many months, excellent warranty & repair service, etc. I am not sure that privacy is/was Apple's main selling point, at least for me.

Anyway, for as long as your device is connected to anything (Internet, intranet, etc.) there is no guaranteed privacy (intentional breach, human error, or software/firmware bugs).
I know because Apple has spent millions advertising itself as a privacy-focused company.

In their 2020 annual report they spoke to the importance of "privacy" 6 times and data security 5 times. It's clear if you read their marketing materials or annual reports that they are selling privacy. https://s2.q4cdn.com/470004039/files/doc_financials/2020/ar/_10-K-2020-(As-Filed).pdf
 
But what was stopping Apple from doing 'surveillance' before now? They already had the capability to detect all sorts of document contents years ago, and their capabilities have only grown more advanced since. Why start now?

What is the benefit to Apple of implementing the CSAM feature, other than being compliant with US law? This is effort spent that absolutely does not help them sell more devices.

From Apple's perspective, Apple has the power to push back against government agencies if their requests are not compliant with local laws, at least in the US, right? So what is Apple's reason for working with the government to snoop on the users of the devices they sell? So that they can get more tax breaks? It doesn't make any sense to me.
Hey, as Playfoot said, this is about a system designed to live on your phone and search for whatever data they are looking for. It has nothing to do with the normal surveillance we are all used to and somewhat expect. This is the government having permanent access to your device, with the only limitations being what they can get a secret FISA court to agree to. I keep saying Trojan horse because that's what it is: they are using one universally hated crime to convince people it's for the greater good. But once it's on your phone, they can use it as they see fit.

——-
Just to add to this: most security folks are pointing out that because Apple is global, lots of governments throughout the world will try to use the system nefariously almost immediately. They are asking the simple question: can you trust Apple not to give China access when China threatens to pull their products off the shelves if they don't? That's how compromises happen. China, however, is probably more interested in using it for espionage than just spying on its own citizens, but would likely use it for both.
 
It's the system being created that is making people upset. The Trojan horse here is all the verbiage about its intended use and protections. It simply won't matter after it's on everyone's phones; it can be changed with a pen stroke. The skepticism some are expressing that it's no big deal is the very reason they will probably get away with it. By the time the NSA gets involved you probably won't even know, because it will be classified; it would take another Snowden to sound the alarm and make it public. There is probably language in the existing Patriot Act that gives the NSA access to the system on day one, and Apple will not be able to say no to a FISA court order.
It can't be changed with a pen stroke, any more than any other feature in iOS could be. All that 'verbiage' actually matters, because it describes how it works, how it has built-in protections against being fooled or abused, and how what it actually does is virtually useless to an attacker. If you don't believe what Apple says, and you don't believe in the system of independent researchers scrutinising how iOS really works, then you shouldn't be storing anything on your phone or using it to make calls. There are dozens if not hundreds of processes on your phone which 'access', 'read', 'scan', 'look at', 'classify', 'index', and 'log' your data and your actions. Any one of them could be changed to send results to the NSA, or the CCP, or the RIAA, if Apple wanted to. This is why the 'slippery slope' is a fallacy: we've been standing on it all along, and every movement either way, up or down, meets an intense amount of friction.
 
It can't be changed with a pen stroke, any more than any other feature in iOS could be. All that 'verbiage' actually matters, because it describes how it works, how it has built-in protections against being fooled or abused, and how what it actually does is virtually useless to an attacker. If you don't believe what Apple says, and you don't believe in the system of independent researchers scrutinising how iOS really works, then you shouldn't be storing anything on your phone or using it to make calls. There are dozens if not hundreds of processes on your phone which 'access', 'read', 'scan', 'look at', 'classify', 'index', and 'log' your data and your actions. Any one of them could be changed to send results to the NSA, or the CCP, or the RIAA, if Apple wanted to. This is why the 'slippery slope' is a fallacy: we've been standing on it all along, and every movement either way, up or down, meets an intense amount of friction.
A FISA court pen stroke is all they would need - top secret, and Apple can't even appeal, due to national security. Brought to you by your friendly Patriot Act.
——-
As for saying Apple can do this already: no, the law requires probable cause for search and seizure without permission. If a cop asks to look in your trunk and you say sure, it's all legal; if you say no, they need a warrant. Owning an iOS 15 device gives them permission.
 
It can't be changed with a pen stroke, any more than any other feature in iOS could be. All that 'verbiage' actually matters, because it describes how it works, how it has built-in protections against being fooled or abused, and how what it actually does is virtually useless to an attacker. If you don't believe what Apple says, and you don't believe in the system of independent researchers scrutinising how iOS really works, then you shouldn't be storing anything on your phone or using it to make calls. There are dozens if not hundreds of processes on your phone which 'access', 'read', 'scan', 'look at', 'classify', 'index', and 'log' your data and your actions. Any one of them could be changed to send results to the NSA, or the CCP, or the RIAA, if Apple wanted to. This is why the 'slippery slope' is a fallacy: we've been standing on it all along, and every movement either way, up or down, meets an intense amount of friction.
Well, there are many security experts and privacy advocates who are questioning it, are in opposition to it, etc.

And we have not been standing on the slope, as claimed above. Rather, we have been sliding down the slope, gaining speed, and will in the not-too-distant future reach a point of no privacy: phones, surveillance cameras, unchecked search, seizure or imprisonment. Hopefully the opposition to arbitrary and overly capricious actions by both government and corporations will increase.

We have reached this point because most people believe that life can and should be 100% safe, and are willing to give up freedoms rather than protect them. And, in concert, they believe that being able to send emojis with ease is more important than privacy. Do not misunderstand me: I am not subscribing to tin-foil-hat theories, only noting what is being done by legislation, often well intentioned, but with many unforeseen consequences.
 