I think that cloud scanning is just too pervasive in tech. Every single company/device/service relies on cloud scanning, so the EFF knows that they cannot call for all cloud services to stop in the name of privacy. They have to draw the line at on-device scanning, and Apple seems to be the biggest target these days.

It makes me wonder why Apple even tried the on-device thing.

The last thing they want is people freaked out about snooping in their phones. They even had billboards about it!

What happens in Vegas... :p
 
I don’t understand Apple on this one. They have been pitching themselves as the privacy-friendly company amongst tech giants and still claim that as one of their advantages, yet they're working on a feature that scans all your photos? It just doesn’t make sense to me.
I’m kinda wondering if this was a preemptive move because they saw more intrusive government regulations on the horizon if they didn’t take action. Do note, though, that they’ve been scanning all your photos for years; that part isn’t new - how else do you think they can tell you which pictures contain grandma, or cats? And all the other providers have long been scanning your photos in their cloud services for CSAM. Apple could have handled this better, and certainly could have presented it better, but they’re to some extent getting singled out for criticism here.
 
Apple needs to shutter their CSAM plans forthwith. Just throw it in the bin of products to never release. Well justified privacy arguments aside, I see little talk about unintended consequences, and this goes for other companies too, who are scanning for CSAM in the cloud or wherever.

Widespread CSAM detection will only lead to massive increases in human/sex trafficking and child exploitation/molestation. Do you really think that kiddie pornographers are going to stick to sharing (en masse) images that are already identified, hashed, and tagged in the CSAM database?! These sick people are going to be obtaining and producing new material at an exponentially increased rate to keep ahead of identification and tagging. Do you have any idea what that will do to the levels of abuse of children - the very (vulnerable) group you're professing to want to protect? Your plans will be counter-productive. Tech companies will cause a significant increase in this heinous behavior. Stop now! Crimes of exploitation are best handled by good police/detective work.

Back to privacy: The building-in of what is essentially a back door will inevitably lead to more abuse of power and loss of liberty at the hands of governments around the globe, including in those countries where you thought you were free. We've all already seen Apple and others capitulate to communist China, the world leader in human rights abuse. Do you really think that creating hashes of kiddie porn is where this will end? Hashes can be created and cataloged for any image, any sound, any video, any document, or even any word. What happens when the language of every country is hashed and cataloged (if it isn't already)? Once you can identify the hash of anything being communicated to the "cloud," then you know exactly what the content is - the hash becomes equivalent to the data, so then what is the point of encryption at all? Put the lid back on Pandora's box now.
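To make the "hash becomes equivalent to the data" point concrete, here's a minimal sketch (my own illustration, not any vendor's actual system) of how a catalog of hashes identifies content: identical bytes always produce the identical digest, so whoever holds the catalog effectively knows what you're storing, encryption of the transport notwithstanding. Real photo-matching systems (e.g., Apple's NeuralHash) use perceptual hashes that tolerate resizing and recompression; plain SHA-256 is used here only to show the cataloguing principle.

```python
# Sketch: a server holding a catalog of hashes can recognize content
# without ever storing the content itself. Illustration only.
import hashlib

# The catalog: digests of known material (here, an arbitrary example).
known_hashes = {
    hashlib.sha256(b"some catalogued document").hexdigest(),
}

def is_catalogued(data: bytes) -> bool:
    """True if `data` matches an entry in the hash catalog."""
    return hashlib.sha256(data).hexdigest() in known_hashes

assert is_catalogued(b"some catalogued document")        # exact match found
assert not is_catalogued(b"something never seen before")  # unknown content
```

Note that one flipped bit changes the whole SHA-256 digest, which is exactly why real scanning systems use perceptual rather than cryptographic hashes for images.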
 
CSAM scanning would be an unconstitutional violation of the 4th Amendment, which prohibits unreasonable SEARCHES and seizures. CSAM scanning might be a good tool to use on a suspected target if there is a WARRANT.
 
Definitely no future Apple devices or iOS 15 for us until Apple tells us exactly what they plan to do with this. Our company phone plans are up, and we would already have had a large iPhone 13 order in by now. For security reasons, our company will not pay for devices that willingly give other parties access to our confidential material. (The hashes are not publicly auditable, so there is no way of knowing what authoritarian governments, the NSO Group, or hackers are looking for.) Better act on this soon, Apple, or you just lost a bunch of sales. We are also waiting on the new MacBook Pros. Those will also be a no-go until we understand where Apple plans to go with this technology and their newfound desire to act as police instead of being a corporation that just concentrates on making great devices.
 
It makes me wonder why Apple even tried the on-device thing.

The last thing they want is people freaked out about snooping in their phones. They even had billboards about it!

What happens in Vegas... :p
I still haven't heard an explanation from Apple or anyone else as to why hashing on-device first is superior to, or more effective than, hashing only in the cloud. I know Apple needs to keep doing more on-device than its competitors, since its mobile chips are its showstoppers while its cloud services are not up to speed, but their explanation here is overly complicated and offers no real advantage to users, so of course no one wants this.
 
I don’t think so. This has nothing to do with CSAM but rather with pressure from governments like the USA and China to gain access to iPhones. Both countries are having problems with terrorists and with their own citizens going against the government, so they want to be able to better monitor what is going on.

I think Apple will just quietly implement the feature once the media frenzy has died down. I’m fairly certain the code is already in iOS 15, so it’s probably just a matter of throwing a switch remotely to activate it. There is no indication on the user’s phone, so you wouldn’t know whether it was running or not.
There's no way for them to secretly implement everything they want, however. The proposed scanning functionality in Messages is visible to the user.
 
I don’t know what EFF is, probably a good privacy advocate organization, but I have to say, this would have been a funnier story if it had been NAMBLA or people pretending to be them.
 
I don’t understand Apple on this one, they have been pitching themselves as the privacy friendly company amongst tech giants and still claim that as one of their advantages; yet working on a feature that scans all your photos?
Perhaps someone in a position of power had a personal experience that drove their passion?
 
I'm not waiting anymore for further information about this from Apple. I've already moved over to GrapheneOS on a Pixel 5, replaced my Apple Watch with a Casio G-Shock not-smartwatch, and I'm working on exfiltrating my massive iCloud photo library to a NAS.

I had already replaced my MacBook Pro with a ThinkPad running Ubuntu, but that was because of multiple problems with Apple software, not related to privacy.

Apple destroyed their reputation with this and I'm done with them at this point.
 
I'm not waiting anymore for further information about this from Apple. I've already moved over to GrapheneOS on a Pixel 5, replaced my Apple Watch with a Casio G-Shock not-smartwatch, and I'm working on exfiltrating my massive iCloud photo library to a NAS.

I had already replaced my MacBook Pro with a ThinkPad running Ubuntu, but that was because of multiple problems with Apple software, not related to privacy.

Apple destroyed their reputation with this and I'm done with them at this point.
Pretty darn extreme... that G-Shock has got to suck.

Do you have CSAM on your device? Why so concerned? I get that yes... this could get ugly if governments decide to add their own hashes, but it's not currently ugly (or even happening yet)... seems quite premature.

I could fear a hurricane coming on Thanksgiving, but it's not very likely.
 
Pretty darn extreme... that G-Shock has got to suck.

Do you have CSAM on your device? Why so concerned? I get that yes... this could get ugly if governments decide to add their own hashes, but it's not currently ugly (or even happening yet)... seems quite premature.

I could fear a hurricane coming on Thanksgiving, but it's not very likely.

My new watch has a battery life of many months, recharges from light (sunlight, indoor LEDs, whatever), syncs time automatically with atomic clock signals, and doesn't buzz my wrist with notifications. While I miss having a complication that tells me the outside air temperature, I'm otherwise very happy with switching from a smartwatch to a more traditional watch.

I am concerned because I do not consent to my property being searched. If they wanna scan iCloud photos, they can scan them in iCloud. They don't get to waste my CPU cycles and my battery to surveil my photo library. photoanalysisd is already ******* inefficient and memory-leaky on both iOS and macOS, I don't need them making it even worse for absolutely no benefit to me and at very clear risk to freedom and privacy in general.
 
If you put something on the cloud you are transferring it to someone else's servers, whether Apple, Google, your web hosting account, and so on. Someone else's hardware and responsibility, so yes, a company scanning what you put on their servers is expected and should not be a surprise. Put something on the cloud you do not want to be scanned? Encrypt it.

Quite different from coming into my private space uninvited and looking through my content.
Except this on-device scanning is part of the iCloud upload pipeline. It performs the scan only when you are actively uploading to iCloud. Therefore, it should make NO difference.
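The "encrypt it before uploading" advice above can be sketched in a few lines. This is a toy one-time-pad demo of my own (real applications should use a vetted library such as the `cryptography` package's Fernet): the provider stores only ciphertext it cannot hash-match, while the key never leaves the client.

```python
# Toy client-side encryption sketch: the cloud sees only ciphertext,
# so server-side scanning or hash matching finds nothing recognizable.
# A one-time pad is used purely for illustration -- NOT production crypto.
import secrets

def encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    key = secrets.token_bytes(len(plaintext))  # pad as long as the message
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return key, ciphertext  # the key stays on the client

def decrypt(key: bytes, ciphertext: bytes) -> bytes:
    return bytes(c ^ k for c, k in zip(ciphertext, key))

photo = b"\x89PNG...pretend image bytes..."
key, uploaded = encrypt(photo)          # only `uploaded` goes to the cloud
assert decrypt(key, uploaded) == photo  # client can still recover it
```

The trade-off, of course, is that the provider can no longer offer features that require seeing the plaintext (face grouping, search, CSAM scanning) - which is precisely the point of the advice.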
 
That's not completely true. They scanned iCloud email, but not iCloud photos. That's why FB had CSAM reports in the 20M range vs. Apple's hundreds. Apple execs even said in recently released docs that they know they are a big place for CSAM [1].

Yep, agreed. Apple is seen as a "safe haven" for this stuff; I have always said that. And the article confirmed it. They had a miserable 240 reports vs. Microsoft/Google/Facebook/others in the 5, 6, or 7 digits!
 
The problem is that this same technology could be used for purposes far less honorable than catching pedophiles. The whole point is that back doors are not good, period. If you want to catch pedophiles, that's good, but don't do it by putting in place massive data collection from people's smartphones or cloud storage, even if it's only the hash. If, for example, a hacker breaches your iCloud password, he could upload this type of indecent image and you will have a hard time explaining to the cops that these photos are not yours. This same technology could also be used for different purposes if you substitute child abuse imagery with, for example, images of political pamphlets. Virtually half the world's population lives under some level of totalitarian regime where only a few care about privacy or democracy, so the iPhone should be a fortress as far as personal information is concerned, IMO.
Sorry, but this kind of argument is just ridiculous at this point. If the government wants something SO BADLY, it doesn't matter what we all say. They will force Apple to implement it or exit the US market entirely.

CSAM or not, those kinds of situations will always be a possibility.
 
It is unbelievable to me how much money and mental energy people are spending against what I consider to be a non-issue. All kinds of existing technology can be abused, yet we don’t campaign for it to be eliminated. And it still seems that a huge number of people don’t even understand that 1. the scanning would only be active if you enable iCloud for photos (or elect not to turn it off, as the case may be) and 2. Apple wouldn't be able to see anything on your phone with the scanning process. The only time any scanning information gets exported from your phone is if you upload an illegal image to iCloud, and even then Apple can’t decrypt that until there are 30+ illegal images uploaded.
Yep. Also, when every other major platform scans for it (Dropbox, OneDrive, more), nobody has an issue. But the moment Apple tries to implement something, it's the end of the world. Absolutely ridiculous.
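The 30-match threshold described above can be pictured with threshold secret sharing: the server receives one share of a decryption key per matched image and can reconstruct the key only once it holds enough shares. Below is a toy Shamir-style sketch of my own (the threshold of 3 and all names are illustrative; this is not Apple's actual protocol, which additionally uses private set intersection and synthetic vouchers).

```python
# Toy Shamir secret sharing: fewer than `threshold` shares reveal
# (essentially) nothing about the secret key. Illustration only.
import random

PRIME = 2**61 - 1  # a Mersenne prime, plenty large for this demo

def make_shares(secret, threshold, num_shares):
    # Random polynomial of degree threshold-1 with f(0) = secret.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, num_shares + 1)]

def recover(shares):
    # Lagrange interpolation at x = 0 recovers f(0), i.e. the secret.
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

key = 123456789                       # stand-in for a decryption key
shares = make_shares(key, threshold=3, num_shares=10)
assert recover(shares[:3]) == key     # enough matches: key recovered
assert recover(shares[:2]) != key     # too few matches: key stays hidden
```

Any 3 of the 10 shares reconstruct the key; with only 2, interpolation yields an unrelated value, which is the mathematical mechanism behind "can't decrypt until the threshold is crossed."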
 
Maybe you should ask that of all the experts and privacy organizations worldwide who clearly do not consider this a non-issue. Also, you should ask Apple themselves (who decided to put it on ice for now and at least consider the concerns of all those organizations). According to your logic, and reading your relentless posts in other threads, all those experts and organizations with legitimate concerns must be lunatics with tin foil hats.
Where are their complaints about Dropbox, OneDrive, Facebook and more doing these scans? Why is everything ONLY ABOUT APPLE?
 
CSAM scanning would be an unconstitutional violation of the 4th Amendment, which prohibits unreasonable SEARCHES and seizures. CSAM scanning might be a good tool to use on a suspected target if there is a WARRANT.
Except Google, Microsoft, Facebook, and others are already scanning for the SAME THING. Where were the complaints then? It is built into the iCloud pipeline - meaning ONLY items being uploaded are scanned. Not every single bit on your phone is tracked.
 
I'm not waiting anymore for further information about this from Apple. I've already moved over to GrapheneOS on a Pixel 5, replaced my Apple Watch with a Casio G-Shock not-smartwatch, and I'm working on exfiltrating my massive iCloud photo library to a NAS.

I had already replaced my MacBook Pro with a ThinkPad running Ubuntu, but that was because of multiple problems with Apple software, not related to privacy.

Apple destroyed their reputation with this and I'm done with them at this point.
Look, if the government wants to spy on us THAT BADLY, they will go after GrapheneOS too. They'll make it illegal to offer a system without those tracking features.
 