If it's the chances of a "hash collision" - i.e. your cat photo having the same hash as a known illegal image - then it's a well-defined mathematical property of the hash algorithm and will be tiny. One of Apple's blunders was talking about AI detection of nude photos sent to kids in the same breath - which is far more sketchy and prone to overstatement by the AI writers. Hash matching is not AI/machine learning.
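
To make the distinction concrete, here's a minimal sketch of hash-list matching - illustration only, since Apple's actual system uses a perceptual "NeuralHash" rather than a plain cryptographic hash, and the set contents here are hypothetical:

import hashlib

KNOWN_HASHES: set[str] = set()  # would be populated from the supplied hash list

def matches_blocklist(path: str) -> bool:
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    # A "collision" - an innocent file producing a listed digest - has
    # probability around 2^-256 per comparison for SHA-256. It's a pure
    # lookup against known material, no AI involved.
    return digest in KNOWN_HASHES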

Also, they're talking about human confirmation of any matches - done properly, that should reduce the probability of false positives to virtually zero - but the "done properly" bit is the kicker. Ideally, the confirmation should be a blind test in which the "matched" images are mixed in with a stream of other random images (matching and non-matching) - otherwise the tester will look at every image expecting to see a match. Even if they're comparing the image with the one it supposedly matches, you want them to be hitting "false, false, false, false, true, false, false, false...." rather than vice versa.
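
A sketch of that blind-review idea, assuming hypothetical lists of flagged images and decoys:

import random

def build_review_stream(flagged: list[str], decoys: list[str]) -> list[str]:
    # Mix real candidates among decoys so the reviewer can't assume
    # that every image in front of them is a match.
    stream = flagged + decoys
    random.shuffle(stream)
    return stream

# e.g. 5 flagged images hidden among 95 decoys keeps the reviewer's
# default answer at "no match" rather than "match".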

The greatest risk, really, is that the agencies responsible for supplying the list of hashes will get careless or over-zealous in what they deem "illegal" and the checkers will be obliged to report anything that matches, even if it appears to be a picture of a fully-dressed kid wearing a "The President is a Big Silly" T-Shirt.


iPhones are continually downloading software updates - Apple can implement any technology they want at any time. If Apple want to "spy" on your phone, there hasn't been any technical issue stopping them for years - only the law.

The real issue here is not the technology, but the fact that ticking "I accept" on iCloud now includes granting Apple permission to do on-device checking against a third-party hash list. There's no immediate practical upshot - any cloud service will check your photos anyway - but you've crossed a line in the sand (and maybe waived a constitutional right in the US) by granting Apple that permission.
You use their service, you play by their rules. Don’t like it, then leave. They also offer a way out: disable iCloud Photos, or simply don’t update to iOS 15 - in which case your photos get scanned server-side anyway. Either way, they’re still checking for the same content. This is why I don’t understand the uproar.
 
You use their service, you play by their rules.
...which get longer and less comprehensible to the majority of users with every such development - last time I set up an iDevice I had to make a deliberate effort to skip iCloud*. But you're right from a purely selfish point of view that doesn't care about other people understanding the issues or the gradual one-little-step-at-a-time erosion of privacy.

This is why I don’t understand the uproar.

Well, it's making international headlines, so millions of people are now more aware of the rules and (maybe) can make more informed decisions. It's just a pity that some of the reporting is inaccurate - but that's partly because Apple made such a ham-fisted job of the announcement.

(* which I do habitually because I don't like/want iCloud for reasons aside from privacy)
 
If it's the chances of a "hash collision" - i.e. your cat photo having the same hash as a known illegal image - then it's a well-defined mathematical property of the hash algorithm and will be tiny. One of Apple's blunders was talking about AI detection of nude photos sent to kids in the same breath - which is far more sketchy and prone to overstatement by the AI writers. Hash matching is not AI/machine learning.

Also, they're talking about human confirmation of any matches - done properly, that should reduce the probability of false positives to virtually zero - but the "done properly" bit is the kicker. Ideally, the confirmation should be a blind test in which the "matched" images are mixed in with a stream of other random images (matching and non-matching) - otherwise the tester will look at every image expecting to see a match. Even if they're comparing the image with the one it supposedly matches, you want them to be hitting "false, false, false, false, true, false, false, false...." rather than vice versa.

The greatest risk, really, is that the agencies responsible for supplying the list of hashes will get careless or over-zealous in what they deem "illegal" and the checkers will be obliged to report anything that matches, even if it appears to be a picture of a fully-dressed kid wearing a "The President is a Big Silly" T-Shirt.


iPhones are continually downloading software updates - Apple can implement any technology they want at any time. If Apple want to "spy" on your phone, there hasn't been any technical issue stopping them for years - only the law.

The real issue here is not the technology, but the fact that ticking "I accept" on iCloud now includes granting Apple permission to do on-device checking against a third-party hash list. There's no immediate practical upshot - any cloud service will check your photos anyway - but you've crossed a line in the sand (and maybe waived a constitutional right in the US) by granting Apple that permission.

Thanks for the info.
So in the event of a match, what is the "human" at Apple inspecting? I doubt they have the CSAM pictorial database on hand to do a side-by-side comparison.

Slowly digging my way through the info and the more I find, the less sense this direction from Apple makes.
The more concerned I become too.
 
The fact that the scan is done on the user's phone, without their consent, and *prior* to uploading makes this a warrantless search that Apple is conducting as a fishing expedition on behalf of law enforcement.

Law enforcement cannot do this without a warrant, which requires probable cause.

NCMEC is a private foundation, but it is funded by the US Justice Department. Anything Apple refers to them will be reported to the FBI or other agencies. It's also run by longtime infomercial hawker John Walsh.

....



Then there is the mission creep of adding new hash tables of wrongthink to check for, "for the children" or to protect you against terrorists. The precedent that Apple can use our personal resources to incriminate us without cause is intolerable and destructive to the brand.




Superb points
 
Thanks for the info.
So in the event of a match, what is the "human" at Apple inspecting? I doubt they have the CSAM pictorial database on hand to do a side-by-side comparison.

Slowly digging my way through the info and the more I find, the less sense this direction from Apple makes.
The more concerned I become too.
Apple is out to get you. Watch out!

I dislike paranoid conspiracy theorists that only think about "what if". That's like my wife getting mad at me for having thoughts that I haven't even had yet.
 
Apple is out to get you. Watch out!

I dislike paranoid conspiracy theorists that only think about "what if". That's like my wife getting mad at me for having thoughts that I haven't even had yet.

You’re trivializing this. It isn’t some weird QAnon theory about the Illuminati. These are very real concerns about scenarios that are extremely plausible. Apple isn’t your friend. If they believe it is in the best interest of the company to flip the script on this, they will in a second. But let’s assume they are telling the truth and they wouldn’t do anything like that.

The next guy in charge might feel very differently. Apple today is not the same company it was 10 years ago, and it won’t be the same 10 years from now. Companies merge and get bought and sold all the time, and long-standing company policies get erased in the process. Remember when Google used to have “Don’t be evil” as part of its code of conduct, until some new guys came in and removed it?
 
It’s like you’re asking, “Why should the search for stolen property happen at the store where they think they saw you steal it instead of at your home and without a warrant?”

It comes down to a reasonable expectation of privacy. And I know people will say that Apple is not the government, but providers of digital communication have essentially been deputized by NCMEC. But since Apple is a company and not the government, the Fourth Amendment doesn’t apply. It’s a government’s dream to have unfettered access to its citizens’ box of ‘private‘ data.
 
I was just wondering how they decide what counts as a CSAM photo vs. a normal photo? (I am not interested in real CSAM.) So what happens if your child takes a nasty fall and you take a photo of it to show to doctors? How foolproof would a CSAM scan be?
 
I was just wondering how they decide what counts as a CSAM photo vs. a normal photo? (I am not interested in real CSAM.) So what happens if your child takes a nasty fall and you take a photo of it to show to doctors? How foolproof would a CSAM scan be?

CSAM scans use hashes of known CSAM provided by NCMEC; they're not looking for new CSAM.

Also, they’re looking for pedophile stuff, not regular physical child abuse.
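
In other words, the match is against specific known images, using hashes designed to survive resizing or re-encoding. A sketch using the third-party imagehash library (file names hypothetical; Apple's real system uses its own NeuralHash):

from PIL import Image
import imagehash

known = imagehash.average_hash(Image.open("known_image.png"))       # from a hash list
candidate = imagehash.average_hash(Image.open("hospital_photo.jpg"))

# A small Hamming distance means "probably the same known image,
# perhaps re-encoded or resized". A brand-new photo of an injury
# isn't in any list, so there's nothing for it to match against.
if known - candidate <= 5:
    print("probable match with a known image")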
 
A case could be made, without the outrage, that even iCloud data shouldn't be scanned, but that's not what the outrage is about. The primary objection of pundits and some users is that the scan is occurring on device rather than in iCloud.

So my question, and my confusion, is this:

If the only data available for Apple to scan on device is exactly the same as the iCloud data, why does it matter to users where the scan occurs? Especially if the scan is done by on-device AI and Apple is only contacted when there is a BULK of matching hashes per device against the ones on Apple's hash servers.
Users are objecting that the scan isn't occurring on servers. Well, it would be exactly the same data as what's available to scan on the iPhone. All the other data is still locked away from Apple.
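
(For what it's worth, that "bulk of hashes" idea can be sketched as a simple threshold. The figure below is hypothetical; Apple's real design wraps matches in threshold secret sharing so isolated matches stay unreadable server-side:)

MATCH_THRESHOLD = 30  # hypothetical value, for illustration only

def should_escalate(match_count: int) -> bool:
    # Below the threshold, no review happens; only a bulk of matches
    # on one account is supposed to trigger human confirmation.
    return match_count >= MATCH_THRESHOLD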

Another question I would ask: why the outrage against Apple specifically for doing photo-hash-only scanning, when in fact Google scans private emails and all other online content for more categories than CSAM? Microsoft scans all online storage for more than CSAM? Amazon scans private data on all its online drives? FB and Twitter scan private DMs and Facebook Messenger chats for CSAM? So why is the outrage fixated on Apple?
When in fact Apple is not only scanning for a lot less, but has said it will not expand the category of what it scans or who it gives the scans to. You could always argue you don't trust Apple, but if that's the case - if you don't trust Apple - then no tech in the mainstream industry is better for you.

These questions above are different from WHY they are scanning to begin with, but to that I have a question as well: if Apple is not allowed to scan anything, how does it stop CSAM, which is the worst of society?

"Why does it matter where the scan occurs?"

Cloud scanning (TL;DR: it's a garbage practice that's only possible because cloud providers choose to design their tech such that they can read everything that gets uploaded to them. It is NOT CAPABLE of ever reading the content on your device, only what the server receives):
  1. Only ever capable of scanning what the server actually receives. If I turn off cloud services then "what happens on your iPhone stays on your iPhone" is a completely true statement. Anything I do on/with my property stays on/with my property.
  2. Still invasive and anti-user; this shouldn't exist in the first place if companies actually bothered to implement zero-access encryption (no mainstream company is doing this right now, including Apple, outside of a few things like health data syncing, supposedly). This is like random police roadblocks with car searches carried out in the name of stopping criminals. They could justify this by saying "you're using our (government) roads so you have to consent to being searched if you want to use them" but even that would be stretching it.
  3. Only narrowly avoidable. You can turn off cloud services, but then your digital life becomes significantly harder to manage as you don't get the benefits of cloud syncing and storage. This solution is like saying "If you don't like the random police roadblocks then simply don't drive, use a bike instead." Apple and other companies can justify cloud scanning because they can say "If you're using our services and our hard drives you can't ask us to hold onto content that we can clearly see is illegal. US law requires us to report illegal content on our servers if we knowingly encounter it." That detail about "if we knowingly encounter it" is crucial, because it means that if they enabled zero-access encryption and removed on-device scanning (one of the most privacy-respecting combinations you can think of) they would effectively be in the clear, legally speaking at least in the US.
On-device scanning (TL;DR: it's even worse than cloud scanning because it is CAPABLE of reading all content on your device regardless of whether you're using cloud services or not):
  1. Capable of scanning literally every file on your local device regardless of whether or not somebody has iCloud enabled. Apple claiming to only scan photos that are about to be uploaded is simply a policy they chose to go with, one that can easily be overruled with a remote flag change on their servers (i.e., remotely enabling scanning of all content, not just iCloud-destined content, without the user updating any software - see the sketch after this list). Additionally, Apple claiming to only scan for CSAM can be overwritten with a single update to the hash database. The bottom line is it sets the precedent for scanning the content on YOUR device that YOU own even if you choose not to use a server that somebody else owns. If there is a way to abuse a technology then it WILL be abused, that is an absolute. We should be treating this technology as if it were already switched to phase 2, which is to scan all content, not just photos right before they get uploaded. The EU are already considering mandating a similar technology that scans content before/after it gets sent via an encrypted service. Their talking point is the ultimate joke: "You can have your end-to-end encrypted technologies and conversations, but we need to see what you're doing and talking about right before it gets encrypted because you could be sending or receiving CSAM or terrorist material." How do you think it's possible to scan end-to-end encrypted content between two devices? That's right, there's only one way: to enable on-device scanning, the successor to on-cloud scanning.
  2. Outrageously offensive and authoritarian. No sense of private property. This is like if the police had a skeleton key to open any door in the country along with the authority to do so; searching every house whenever they feel like it under the guise of "you could be housing criminals in your home" and installing CCTV in every room of your home in the process. Literally no privacy even in your own bedroom (or the digital equivalent thereof). They can promise all they like to only search homes that they have evidence of criminals coming in and out of but that is completely irrelevant as the technology (for the police this would be the skeleton key, for the iPhone it's an on device scanner) can enable them to abuse this at any point with zero accountability especially if the law is structured to allow them to do this. They only need one legislative update to what is considered "criminal" and boom the infrastructure is already in place for them to come for anyone they want. There is no justification for this ever. The cloud scanning tech in place at the moment has already been adapted to also stop "terrorist material" (who defines this? In certain countries "terrorist material" is another term for "anti-fascist reading material") so good luck believing the lie that this will only ever be used for CSAM. I read an anecdote recently about a college student's Google account being deactivated because she was using Google Drive to upload pictures and videos of human rights abuses in the West Bank for her college assignment. The reason she was given? "Uploading terrorist material" (?!?!?!)
  3. Literally unavoidable, the only recourse here is to simply stop using technology. Even if you turn off cloud services you are still subject to being surveilled. Regardless of what you do "what happens on your iPhone stays on your iPhone" will never be a true statement. Again, there is no justification for this outside of this b.s: "If you buy one of our devices you don't actually have authority over what you can and cannot do with the device. What happens on your property is and will always be our business and the government's business because you could be doing something we don't approve of." Sounds preposterous to anyone that respects right to property but then again that is the EXACT justification they use to prevent anyone from installing apps outside of the App Store. Apple fans and weirdos that like to prioritize legalese and corporate rights over common sense and personal rights legitimately believe that's okay. It's ironic they'll be the first to comment on any Apple China related story about how bad personal freedoms and right to property are over there in "Commie Land" but then write an unironic post about how Apple severely locking down a device you OWN is somehow a good thing because "you own the hardware, not the software buddy." Can they explain why Apple don't provide a way to run alternate operating systems on iPhones then? I thought I own the hardware but not the software, right?
  4. If a car manufacturer came out and said "You can only drive on certain roads we approve of, use tires we manufacture, and whilst we promise not to continuously monitor your location we and the government will receive an 'encrypted ticket' if you visit a location on our secret blacklist multiple times. Don't worry, the location blacklist checking happens 'on device' in the car before we get the alert so that totally makes the act of blacklist checking okay! We can do this because whilst you own the car's metal and plastic we own all the software that's used to actually make that chunk of metal capable of doing anything. If you don't like it buy another car that will soon have the same technology and policies as we do if they don't already." Nobody in their right mind would think that's okay but apparently it's totally fine in the tech world.
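
To illustrate why "policy" is so much weaker than "capability", here's a minimal sketch (all names hypothetical) of how a single server-side flag could widen the scan scope without any software update:

def files_to_scan(all_local_files: list[str],
                  icloud_upload_queue: list[str],
                  server_config: dict) -> list[str]:
    # Today's stated policy: only photos queued for iCloud upload.
    # One remote flag flip is all that separates it from "everything".
    if server_config.get("scan_scope") == "all_local_files":
        return all_local_files
    return icloud_upload_queue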
Apple is not only scanning for a lot less, but has said it will not expand the category of what it scans or who it gives the scans to.

One of the central pillars of their entire marketing scheme is centered around "Privacy", and yet they ignored all their supposed privacy principles when entering the Chinese market. All iCloud services there are hosted and operated by a Chinese state-run server farm - yes, that means all the keys to all encrypted content on iCloud in China are in Chinese government hands, despite what Apple and their apologists might tell you. Do you think Apple would be allowed to sell in China if they did not comply with local regulations?

Similarly, if the US government told Apple to start scanning for other content, do you think Apple would refuse, stop selling products in the USA, and move their entire company elsewhere? You're on crack if you think they would - they already signed up to provide first-party support for the NSA's mass spying programs. The "Apple refuses to help the FBI" story from a few years ago was also a PR stunt, because Apple technically said "We can't help because we don't have the ability to crack open iPhone encryption in the first place." There's a big difference between refusing to do something you're capable of out of principle vs. refusing to do something because you're not capable of doing it in the first place.

You could always argue you don't trust Apple, but if that's the case - if you don't trust Apple - then no tech in the mainstream industry is better for you.

I don't trust Apple or any tech company with my private data - neither should you or anyone else - and that's the beauty of encryption: I don't need to trust anyone except the math. Mainstream tech needs a revolution in data management, ownership, and privacy via encryption. Apple's decisions here are just the tip of the iceberg, but we should never accept the "lesser of two evils" argument, as there is no such thing. A third option exists that is better: zero-access encryption for cloud services. It's possible; smaller companies already offer it with 1/10,000th of Apple's resources.
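
A minimal sketch of the zero-access idea using the Python cryptography package - the key never leaves the user's devices, so the provider only ever stores ciphertext (the upload call is hypothetical):

from cryptography.fernet import Fernet

key = Fernet.generate_key()   # generated and kept on the user's devices only
cipher = Fernet(key)

ciphertext = cipher.encrypt(b"photo bytes here")
# upload_to_cloud(ciphertext)  # hypothetical: the server can store and
#                              # sync this blob but can never read it
assert cipher.decrypt(ciphertext) == b"photo bytes here"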

If Apple is not allowed to scan anything, how does it stop CSAM, which is the worst of society?

We should not have to give up our privacy so authorities can catch bad people. There are a thousand other things we can do to help prevent and end CSAM-related problems in general, many of which involve overhauls to social programs - but of course no politician in the US would support them because 1) they don't make money and 2) they don't come with the side effect of extending the reach of domestic spying tools, which primarily exist to stop anyone that poses a threat to the capitalistic status quo. If you don't believe me, go research how some of these Patriot Act-fueled policies and programs (warrantless searches of property within 100 miles of the US border, for example) have been used to target climate activists, journalists, and technologists rather than just targeting 'terrorists' like they originally claimed.

Ultimately this is all related to how our country tackles crime and other related issues: doubling down on policing and spying by treating everyone as a potential suspect rather than trying to tackle the problems at their core via reformative and progressive policy that treats everyone as a human being with basic human rights.

The potential abuse of Apple's on-device scanning is not hypothetical when other technologies have already been abused for decades.
 
Data on the iCloud server farm is encrypted; you cannot pick out an individual image from the encrypted format (at least it's not supposed to work that way 😄). The AI on the device hashes images shared with iCloud and then compares those hashes to the CSAM hashes; if the hashes match, it sends a safety voucher up for review. If there's no match, it does nothing with the hashes/images. To scan on the server instead, they would have to remove the in-transit encryption.
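
A rough sketch of that described flow, with hashlib standing in for Apple's perceptual NeuralHash (the real design also wraps vouchers in threshold encryption so isolated matches stay unreadable):

import hashlib

def process_before_upload(image_bytes: bytes, known_hashes: set[str]):
    h = hashlib.sha256(image_bytes).hexdigest()  # stand-in for a perceptual hash
    if h in known_hashes:
        # "Safety voucher" sketch: attached to the upload for later review.
        return {"hash": h, "payload": image_bytes}
    return None  # no match: nothing about this photo is flagged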

It's not "encrypted" in any meaningful way if Apple can access it. If you install a lock on your door but give someone you don't know the key, do you really have a lock on your door?
 
What is all of this actually going to accomplish? It opens the door to further privacy intrusions under the banner of "think of the children" while protecting no one from anything. CP *******s aren't sending their goods through iCloud, my man.

Want to search my ****? Get a warrant.

Reminds me of politicians not being able to respond when asked why they need an 'encryption backdoor' into common chat apps (iMessage, WhatsApp, Signal) when there is already irrefutable evidence demonstrating that terrorist groups around the world have been programming their own encrypted chat apps. Same goes for this CSAM stuff: if you've got more than two brain cells to rub together, wouldn't you just not use iCloud in the first place, or start creating your own encrypted chat tools?
 
I hope no one reverses the quest to look for pedophiles to put in jail.

You can private message me your login information for all online services you use. Additionally I'd like for you to mail me your iPhone and passcode so I can browse around that too. I'm only asking because you're potentially a pedophile or terrorist and you clearly have no reservations about someone searching through your stuff so long as it's part of a quest to put criminals in jail.

You can trust me (a stranger) in the same way that you can trust Apple (strangers in a nice spaceship office building that have a history of collaborating with the government) or the government (strangers in a dingy office building that have a history of violating people's rights).
 
It's not "encrypted" in any meaningful way if Apple can access it. If you install a lock on your door but give someone you don't know the key, do you really have a lock on your door?

Aside from that, there are scanning techniques that work for known hashes despite encryption. Apple, Google, Amazon, and others have been using them for a while now.
 
Aside from that, there are scanning techniques that work for known hashes despite encryption. Apple, Google, Amazon, and others have been using them for a while now.

Apple, Google, Amazon, and others have been scanning for content on their servers.

Apple are now about to start scanning content on YOUR device that YOU own. Huge difference. Both are bad; this is catastrophically worse.
 
There’s not enough outrage.

“And while we might all agree that adding this capability is justifiable in the face of child abuse, there are huge questions about what happens when governments around the world, from the UK to China, ask Apple to match up other kinds of images”

 
Apple is out to get you. Watch out!

I dislike paranoid conspiracy theorists that only think about "what if". That's like my wife getting mad at me for having thoughts that I haven't even had yet.

Yeah man, I hate those conspiracy theorists too. They're always harping on about absurd things that could never happen like "What if the US government started doing biological experiments on black people" or "What if the government started profiling us based on everything we do online" or "What if the government started tracking everywhere we went" or "What if the government dropped a nuclear bomb on innocent people and killed 200,000 of them in the process" or "What if the government had a policy to ensure a baseline level of extreme poverty to control inflation" or "What if the government lied to the public in order to invade another country for economic purposes"

The people that like to think about "what if?" are often the people that have the ability to see more than 100 feet beyond themselves. They're the same people that are currently thinking about what our planet will look like 50 years from now, whereas many people don't think beyond where they want to get lunch over the weekend. They're also the people that protect our rights by analyzing historical precedent and extrapolating what an authority could do with a given power. Some words used to describe them might be "investigative journalist" or "academic."

Lumping the minority of people whose version of "what if?" is "what if Gatorade makes my pet dog gay" into the same category as people that ask "what if the government had the capability of spying on everything we do" is extraordinarily ignorant and disrespectful to smart people that actually give a damn about our country and planet at large.

It's not just enthusiasts and ordinary people like myself expressing concern about this, many people smarter than us with a greater understanding of the technology, and importantly history, are also expressing deep concern.
 
There’s not enough outrage.

“And while we might all agree that adding this capability is justifiable in the face of child abuse, there are huge questions about what happens when governments around the world, from the UK to China, ask Apple to match up other kinds of images”


Yeah, there's not enough outrage, in large part because many news outlets have downplayed it. Mainstream news outlets (including MacRumors and The Verge piece you linked) often act as stenographers rather than journalists, so they don't try to think and write about anything beyond the model of the world presented to them by the people they're supposed to be critiquing.

They framed the conversation by trying to focus it on "Well, we're just scanning against known hashes of material, so don't worry, your personal photos are safe" (i.e., focusing on the technical details of the scanning implementation they'll be releasing at launch) rather than letting people ask why any form of scanning should be happening in the first place.
 
They've been scanning every single photo uploaded to iCloud for CSAM since 2019, just like they're going to do in iOS 15.

The reason the on-device part is being so fiercely fought against is that it builds in the capability to scan on device and compare against a black-box database whose contents even Apple doesn't know.

That capability, and the implications of it existing - particularly if it is ever used to match against other future databases and content types, ON the devices - is THE issue here.

It's not specifically about CSAM - at all, in fact.
It isn't even about iCloud Photo Library. It's about this capability existing in the software, on the devices.

This is different from the face/photo matching we have currently.

For the record, I'd be thrilled to opt out of any photo analysis at all being done by Apple.
I don't care a lick for their face detection or memories stuff.
 