Yeah.... about that: No competent network security pro would ever state "I'm not going to worry about security because <thing> isn't worth hacking." Trust me: If it's network-accessible, somebody, somewhere, will find it worth hacking--if for nothing other than S&G's or "because it's there" or "because I can" or...
Don’t get me wrong…I worry about it every day.

It’s more of a response to people on here offering outlandish examples, like someone altering the hash database as if that would somehow put anyone at more risk, and then breaking into their phone to upload matching images. It’s literally the stuff movies are made of.

So, yes, should a reporter who is about to break a story about the leader of a country embezzling billions of dollars be worried about some shady spy agency breaking into his/her phone to set them up illegally? Sure….totally could happen and HAS happened.

But me? Nah….(and I’m quite an important person…in my own mind…)
 
It’s more of a response to people on here offering outlandish examples, like someone altering the hash database as if that would somehow put anyone at more risk, and then breaking into their phone to upload matching images. It’s literally the stuff movies are made of.

Exactly.

And people are quick to point out the privacy concern... when there is a 0.0000001% chance of anyone actually seeing your photos. Face it... nobody wants to see your photos.

Meanwhile... there are companies who are literally selling your demographic info, location data, browsing history, online purchases, podcast listening habits, etc.

That's the kind of stuff I think about in terms of privacy.

:)
 
You guys are missing the point.

It's not about scanning for CSAM.
Yes it is.
It's about scanning AT ALL.
No it isn’t. If it were, you would have complained as soon as your phone could recognize that pictures contained dogs or cars, or whatever else it can “see” in your photos. What you are really concerned about is the reporting of what it finds… and that capability was always there for any connected device.
Scanning can be implemented by any bad actor regardless of Apple implementing this. Apple doing this doesn’t change that.
The opportunity for abuse is HUGE.
Yes the opportunity to do abusive scans is huge. And that opportunity has been there for a long time. Apple implementing this feature in iOS 15 isn’t something that has suddenly enabled that.
All it takes is changing the database hash, and now you can scan for political messages, terrorism, BLM, Antifa... you name it.
If you can compromise the system to change the database hashes, or force Apple to do it, you can compromise the system to install a scanner, or force Apple to do it.

Implementing this scanning feature in the first place is not the back door. Image recognition algorithms opened the door for abuse, not this particular iOS feature.

Apple openly doing this isn’t the abuse you are looking for. ML image recognition can be abused, right, I agree. The thing is, it‘s been around for a while now. You can‘t un-invent it. Devices capable of image recognition + malware, there you have your illegal surveillance. Replace malware with governments forcing something you don’t like, there it is again. Apple doing this for CSAM isn’t the enabler. As soon as we had connected devices capable of categorizing your photos by content, the abuse you are concerned about was always a possibility.
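To make the distinction concrete, here is a minimal sketch (Swift, with made-up types and plain string hashes standing in for NeuralHash, and none of Apple’s blinding, vouchers, or thresholds). The matching step is just local computation, exactly like the face and dog recognition your phone already does; the only step anyone is really arguing about is the one where something leaves the device.

```swift
// Hypothetical sketch only: stand-in string hashes, no real NeuralHash or blinded database.
struct OnDeviceMatcher {
    let knownHashes: Set<String>   // the database people worry could be swapped out

    // Purely local: nothing leaves the device here, same as existing photo categorization.
    func matches(in libraryHashes: [String]) -> [String] {
        libraryHashes.filter { knownHashes.contains($0) }
    }
}

// The contested step is this one, not the matching above.
func maybeReport(_ matches: [String]) {
    guard !matches.isEmpty else { return }
    // Only at this point does any information about your library go anywhere.
    print("Would flag \(matches.count) item(s) to the server.")
}
```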
 
From Putin-boy’s latest blog post against the most successful American company:


If you’re an enterprising pedophile with a basement full of CSAM-tainted iPhones, Apple welcomes you to entirely exempt yourself from these scans by simply flipping the “Disable iCloud Photos” switch, a bypass which reveals that this system was never designed to protect children, as they would have you believe, but rather to protect their brand. As long as you keep that material off their servers, and so keep Apple out of the headlines, Apple doesn’t care.


Dumb take: every company keeps CSAM off its servers, so the whole “doing it for the headlines” argument could be applied to any cloud host.

Also a dumb “pedos can opt out of it anyway” take. If they were all that smart, Facebook wouldn’t catch millions of CSAM pics every year.
 
I’m curious, can you provide an example of how it may be subverted? Give us a “real world” example as it applies to this specific technology being added to iOS.
Let me try to provide a "real world" example that is personal to us. This is based on my understanding of the issue and might not be correct.

The issue here is not really how the technology works, but that the way Apple implements it breaks the trust we have in our phone and the trust we have in Apple. Privacy is a type of trust.

1. Your car (iPhone), its speedometer (the CSAM detector on your iPhone), and the road (iCloud)
You own a car, a personal vehicle you bought. In a normal car there is nothing that makes you doubt anything; you trust it by nature. You drive it on the road, which is public property. Sometimes, whether deliberately or accidentally, you make mistakes and drive faster than the law allows on the highway. The highway camera takes a photo of your speeding car and you get fined (this is the equivalent of scanning on the server side of iCloud).

Fine, you still trust that your car is on your side. It never betrays you and you never have to doubt it. The only way you can get fined is by the police placing cameras on the highway, or whatever other methods they use. Your speedometer knows the car's speed at all times, just as the AI on your iPhone analyzes all the photos in your photo library to identify faces and dogs. But your speedometer and car do not report you to the police if they detect you speeding, just as the AI on your phone analyzes your photos only for you and only for your benefit.

So here is what happens if Apple implements this feature. Your car now records every time you speed. Whenever you want to get onto the highway (use iCloud Photos), the car is forced to report you to the police if you have sped more than 30 times on local roads (an example of subversion). Imagine that feeling. It feels like you can't really trust the car. You feel doubt and don't know when it's going to betray you behind your back. Sure, never use the highway again, but you have no secrets now because the car knows how you drive. We all make mistakes sometimes, and what if that mistake is classified as breaking the law one day (an example of subversion, like the "Winnie the Pooh" picture under authoritarian governments)? You cannot hide and fix it before getting reported. You have no privacy, no room for error.

Given the choice, would you rather buy a car that simply drives and shows you your speed, or a car that drives, shows you your speed, and also reports you to the police if you have sped 30 times and want to use the highway?

Some of my arguments here are not well thought out and might change (like the highway and local road analogy; someone might say "well, don't use the highway then," etc.).
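Since I'm already deep in the analogy, here is the same idea as a toy Swift sketch. The 30-event threshold and the "report only when you enter the highway" rule are just my reading of Apple's published description (roughly 30 matches, checked only on the iCloud Photos upload path); the car itself is obviously made up.

```swift
// Toy model of the analogy above; the numbers and names are illustrative only.
struct SelfReportingCar {
    private(set) var speedingEvents = 0
    let reportingThreshold = 30          // roughly Apple's stated match threshold

    mutating func drive(speed: Int, limit: Int) {
        if speed > limit { speedingEvents += 1 }   // silently remembered, never forgotten
    }

    // "Entering the highway" = using iCloud Photos.
    func enterHighway() -> String {
        speedingEvents >= reportingThreshold
            ? "Reported to the police (\(speedingEvents) events on record)"
            : "Waved through, nothing reported"
    }
}

var car = SelfReportingCar()
for _ in 1...31 { car.drive(speed: 70, limit: 50) }
print(car.enterHighway())   // the car turns on you the moment you need the road
```

The recording is constant; only the reporting is gated by whether you use the road. That is exactly what makes the car feel like it is no longer on your side.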

2. Owning a self-reporting iPhone feels like having a friend who likes to tell your mom or teacher every time you do something wrong (in this case, after 30 mistakes).

You just don't feel trust in that relationship. Do you want to spill your secrets or discuss important matters with this friend? No. And what if one day your mom or teacher (authorities, governments) asks your friend to tell them every time you talk about smoking (an example of subversion)?

I have friends whom I trust not to gossip behind my back, no matter what I discuss with them. That is trust and privacy.



Please let me know if these "real world" examples are convincing hahaha.
 
So, yes, should a reporter who is about to break a story about the leader of a country embezzling billions of dollars be worried about some shady spy agency breaking into his/her phone to set them up illegally? Sure….totally could happen and HAS happened.

But me? Nah….(and I’m quite an important person…in my own mind…)
Perhaps think of this as us standing up for all the reporters and journalists out there then. Privacy is a human right.
 
No it isn’t. If it were, you would have complained as soon as your phone could recognize that pictures contained dogs or cars, or whatever else it can “see” in your photos. What you are really concerned about is the reporting of what it finds… and that capability was always there for any connected device.
Scanning can be implemented by any bad actor regardless of Apple implementing this. Apple doing this doesn’t change that.
You might find Edward Snowden's explanation enlightening.


"You might have noticed that I haven’t mentioned which problem it is that Apple is purporting to solve. Why? Because it doesn’t matter."
 
People do realise that companies such as Google, Adobe, Facebook et al. already use some form of automated technology to scan for and detect CSAM? Adobe does it with Creative Cloud:


That's just one example.
There's one big difference: that scanning happens in the cloud, not locally on your device.
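That difference is easy to show. A rough sketch (Swift, with hypothetical function names, nothing like either company's real pipeline): the checking code can be identical in both cases; what changes is whose machine runs it and over which set of photos.

```swift
import Foundation

// Hypothetical sketch; hashValue stands in for a real content hash.

// Cloud-side model (Google/Adobe/Facebook style): the provider inspects only
// the files that have already arrived on its servers.
func serverSideScan(uploadedFiles: [Data], knownHashes: Set<Int>) -> Int {
    uploadedFiles.filter { knownHashes.contains($0.hashValue) }.count
}

// On-device model (what is being debated): your own phone runs the same check
// against your own library as part of preparing the upload.
func clientSideScan(localLibrary: [Data], knownHashes: Set<Int>) -> Int {
    localLibrary.filter { knownHashes.contains($0.hashValue) }.count
}
```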
 
People do realise that companies such as Google, Adobe, Facebook et al. already use some form of automated technology to scan for and detect CSAM? Adobe does it with Creative Cloud:


That's just one example.
One of the many reasons I chose Apple - and not Google, Adobe, or Facebook - to store all my photos is that I trusted Apple alone to keep their grubby mitts off my private matters. This is about to change, and everything will be going back offline if they don't do a full reversal.
 
So, this means our iCloud data is no longer encrypted.
That's interesting, since Apple used to tout personal encryption keys stored only on your device - for your privacy. The same privacy that is now going down the drain.
 
So, this means our iCloud data is no longer encrypted.
That's interesting, since Apple used to tout personal encryption keys stored only on your device - for your privacy. The same privacy that is now going down the drain.
No, that's the point. If Apple does this scan, they have to do it on your phone, because the data is encrypted in transit and on their servers, so they can't do it there without breaking the promise of privacy. If you don't want them running this scan, don't send the files to their servers.
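That is also why, mechanically, the check has to sit on the upload path. A much-simplified sketch (Swift with CryptoKit; SHA-256 standing in for NeuralHash, and none of the real private set intersection or threshold secret sharing): the only moment the plaintext is available for any check is on the device, right before it is encrypted and sent.

```swift
import CryptoKit
import Foundation

// Simplified illustration only: a real safety voucher is cryptographically bound to the
// upload and unreadable below Apple's match threshold; this sketch skips all of that.
struct OutgoingPhoto {
    let ciphertext: Data      // what the server actually stores
    let safetyVoucher: Data?  // attached only if the pre-encryption check matched
}

func prepareForUpload(photo: Data, knownHashes: Set<String>, key: SymmetricKey) throws -> OutgoingPhoto {
    // 1. The only place plaintext exists: hash and compare on the device.
    let hex = SHA256.hash(data: photo).map { String(format: "%02x", $0) }.joined()
    let matched = knownHashes.contains(hex)

    // 2. Encrypt before anything leaves; the server cannot repeat the check later.
    let sealed = try AES.GCM.seal(photo, using: key)

    return OutgoingPhoto(ciphertext: sealed.combined ?? Data(),
                         safetyVoucher: matched ? Data(hex.utf8) : nil)
}
```

Run the same comparison server-side and you would need the plaintext there, which is exactly the promise being discussed.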
 
Stop and think about this little item for a moment ... numbers.
Apple claims they have a major CSAM problem on iCloud.

iCloud has XXXXXXX CSAM items on it.

So Apple is going to put a tool on your Apple device that checks for CSAM only if you have iCloud Photos on. Apple also tells you how not to get caught.
That will catch XX at best.

So what is Apple doing about the XXXXXXX currently on iCloud?
So what is Apple doing about the other ways to load CSAM (XX) onto iCloud?

This whole thing makes little sense if they are trying to combat a major CSAM issue on iCloud.
 
Stop and think about this little item for a moment ... numbers.
Apple claims they have a major CSAM problem on iCloud.

iCloud has XXXXXXX CSAM items on it.

So Apple is going to put a tool on your Apple device that checks for CSAM only if you have iCloud Photos on. Apple also tells you how not to get caught.
That will catch XX at best.

So what is Apple doing about the XXXXXXX currently on iCloud?
So what is Apple doing about the other ways to load CSAM (XX) onto iCloud?

This whole thing makes little sense if they are trying to combat a major CSAM issue on iCloud.
What if Apple really are committed to privacy and don’t want to catch anyone? After all, catching criminals isn’t their job or their speciality. What if they just want to make the best most secure photo service they can, without having to worry that it’s handing powerful secrecy tools to dangerous people? It all makes more sense if you think of the system as a deterrent, like a bag search at a venue.
 
I'm trying to figure out how effective this "feature" will be if it is not implemented on every cloud-based service. Any criminal with half a brain will just remove all of their photos from iCloud. This all sounds good from a PR perspective until you think of the dangers of opening up your private data to monitoring by any government or private organization. The road to hell is paved with good intentions.
 
What if Apple really are committed to privacy and don’t want to catch anyone? After all, catching criminals isn’t their job or their speciality. What if they just want to make the best most secure photo service they can, without having to worry that it’s handing powerful secrecy tools to dangerous people? It all makes more sense if you think of the system as a deterrent, like a bag search at a venue.

Could be. At present we don’t really know.
There are a lot of “if” items out there on this, and Apple has remained kind of mum.

What I am hoping is that the outcry has made Apple step back and take another look.
What I wonder is how Apple missed the concerns that all the security and privacy groups have raised.
Or maybe they didn’t.

Either way, I hope we get a much clearer picture of this whole thing.
 
What if Apple really are committed to privacy and don’t want to catch anyone? After all, catching criminals isn’t their job or their speciality. What if they just want to make the best most secure photo service they can, without having to worry that it’s handing powerful secrecy tools to dangerous people? It all makes more sense if you think of the system as a deterrent, like a bag search at a venue.
Do you want a camera in your home as a deterrent? The point is, I don't want to be spied on and monitored just to make sure I obey the law.
 