Apple should simply scan the servers....

This is a back door, combined with technology that reads chats on the phone.... Remember, even a friendly country such as Australia, albeit a member of the "Five Eyes", has a law that clearly empowers ministers to compel firms to retrain an existing surveillance system on different images, vastly expanding the scope of Apple’s proposed snooping.
Apple cannot scan at the server end if they decide to start E2EE for iCloud Photos. I think this development will allow them to start E2EE, which will further enhance user privacy.

My take is that Apple is doing this to comply with local laws. In your hypothetical situation where a country makes it a legal responsibility to perform mass surveillance, Apple would not be the only one affected. Everyone operating in that country would be affected, because the law dictates it.

Apple gets no benefit at all from doing this unilaterally. I think this point is important when discussing such issues.

If anyone has already made up their mind that Apple is acting for nefarious reasons without asking why, it's difficult to have any meaningful debate.

To me, this development is just Apple complying with US laws while still intending to advance user privacy with proper E2EE for iCloud Photos. Apple is not being altruistic. Making their devices and services privacy-focused is a differentiating factor that helps them sell more. If making more profit were not their objective, then Apple would have no reason to exist.
 
Apple cannot promise to defy national law. What is legal and just for us may be subject to prosecution elsewhere. It would only take a simple law requiring existing systems to scan for all illegal activity, which may very well include political views, sexual orientation, and more.

Understood. If it moves this way, however, would not all tech companies be forced to put similar systems in place? The issue is then not Apple vs. the people, but people being over-watched or oppressed by their respective governments. For now, Apple addresses this kind of concern in their CSAM FAQ paper; check its “Security for CSAM detection for iCloud Photos” section towards the end.


Historically, Apple have proven to protect their users’ privacy and resist external pressure from authorities quite well.
 
Apple cannot scan at the server end if they decide to start E2EE for iCloud Photos. I think this development will allow them to start E2EE, which will further enhance user privacy.
I fail to see how encrypting iCloud Photos would enhance user security when there exists an effective back door on all Apple devices. In the worst-case scenario, the device is scanned, the information read, and the results then uploaded to a secure server.....

Fully admit that I am not a cyber security expert.
 
Historically, Apple have proven to protect their users’ privacy and resist external pressure from authorities quite well.
Not to be overly critical, but when you write "Apple have proven to protect their users' privacy and resist external pressure from authorities quite well", you are of course ignoring instances such as moving all user data to government-controlled servers in China. Or removing FaceTime from devices in Saudi Arabia. I could list South Africa, the Philippines, and the list continues.

Yes, this is business, but at some point it will all come back to us.
 
I fail to see how encrypting iCloud Photos would enhance user security when there exists an effective back door on all Apple devices.
The key is to accept Apple's explanation of how they are doing it. Apple has published quite a few articles explaining what they intend to do. They are using on-device intelligence to detect a very specific set of illegal materials. No human gets to see on-device material unless multiple instances of such material are found and also uploaded to iCloud Photos. Otherwise, nobody will know what content the device has except the device owner.

You can argue that this opens Apple up to 'snoop' on everything as well. Then the question would be: why?

If you think this latest development enables Apple to snoop, why now? What's stopping Apple, or for that matter Samsung, Google, or any device or OS publisher, from doing this? For all we know, iOS has been snooping on users all along. If so, how does it benefit Apple? Most of the time, the simplest explanation is the best one.
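
For what it's worth, the mechanism Apple describes boils down to matching perceptual hashes of photos against a database of known material and escalating to human review only past a match threshold. Here is a minimal sketch of that general idea; the hash function, the database contents, and the helper names are all illustrative stand-ins, not Apple's actual NeuralHash pipeline:

Code:
import hashlib

# Placeholder database and threshold. Apple's real system uses NeuralHash plus
# private set intersection, so the device never even learns its own match count.
KNOWN_HASHES = {"a1b2c3", "d4e5f6"}   # hypothetical hash list, not real data
MATCH_THRESHOLD = 30                  # Apple's stated human-review threshold

def perceptual_hash(image_bytes: bytes) -> str:
    # Stand-in only: a real perceptual hash survives resizing and re-encoding,
    # which a cryptographic hash like SHA-256 deliberately does not.
    return hashlib.sha256(image_bytes).hexdigest()[:6]

def should_escalate(photos_uploaded_to_icloud: list[bytes]) -> bool:
    # No human review unless the number of database matches crosses the threshold.
    matches = sum(1 for p in photos_uploaded_to_icloud
                  if perceptual_hash(p) in KNOWN_HASHES)
    return matches >= MATCH_THRESHOLD

The point of the threshold is exactly what the post says: a single stray match reveals nothing to anyone.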
 
Apple is searching ALL my photos on MY device before I store them on their servers, not only my photo selection AFTER I have stored them on THEIR servers. This is violating my privacy.

My private device is private. Apple has to respect this. This requirement is not limited by their possible intention to make their own end-to-end encryption easier for them on their servers. My device is untouchable to them. (If the police have a search warrant, the police can look for whatever they need to look for, but not Apple, just in case.)
 
Not to be overly critical, but you are of course ignoring instances such as moving all user data to government-controlled servers in China, or removing FaceTime from devices in Saudi Arabia.

Apple have servers in many countries and jurisdictions, and China is just one of them. I am not sure, however, that the Chinese government has access to the data or can decrypt it. As for FaceTime in the Middle East (and not just Saudi Arabia), this is done mostly under pressure from the local telecoms. It is not only Apple’s problem there - Skype, Viber, WhatsApp - all have struggled there for ages. Mostly telecoms greed. Ironically, if I am not mistaken, it was a UK company helping the UAE to block these “free”-to-user communication tools. Go figure. 🤷🏻‍♂️
 
It's all the freaking news has talked about. It's in every one of my newsfeeds... on CNN, WSJ, MSNBC.... I can't say if it's on Fox or Newsmax... assuming not, judging from your tin foil hat manifesto.

I am from the EU and really did not see serious media coverage. And please, stop with this "tin foil hat manifesto" stuff.
It is 2021, and there is a massive push for digital policing (vaccination passports, etc.); check what is going on in France.
Apple is a global corporation. The biggest fears about this implementation come from its potential misuse by authoritarian governments internationally.
 
No Apple. Just No.

You are crossing some serious boundaries here. You want to push 'features' to my purchased devices that scan my data without my consent.

Protecting children is just a narrative you're using to justify violating user privacy. If Apple bothered to do one minute of research, they would know that 93% of child sexual abuse victims KNOW their abuser, either as an acquaintance or, worse, a family member.

And that's where MOST child sexual abuse happens! Not with strangers.

Now you want to alert the very people who are most likely to be sexually abusive about their child sending or receiving nudes?

Why are giant private tech companies that are supposed to SERVE their purchasing customer base now policing them as though guilty until proven innocent? What business is it of yours, Apple? Did you even bother to ask? - No. Of course not. Being the 'first trillion dollar company' must have inflated that sense of untouchability a little too much, huh?

You're a BUSINESS Apple, not a totalitarian regime.

Not that I believe they are, but if Apple were serious about stopping child predators, they'd target their spyware at the 93% most probable abusers: acquaintances and family.

But that just might paint an uglier picture than society is willing to face at this time.
 
Or, they are simply more interested in the revenue streams associated with the demands of totalitarian markets than they are in your privacy. In other words, privacy has become an impediment to greater revenue growth.
It absolutely crossed my mind when they made this puzzling move. The question remains, though: from the get-go they have shown they can do this on a regional or even country-by-country basis, so why subject everyone to this when they have that capability? This is so shady and leaves a very, very bad taste in my mouth.
 
The key is to accept Apple's explanation of how they are doing it.
Well, to be clear, I think Apple is providing a back door, eroding their claims (true or false) of privacy commitments, and potentially enabling all types of agencies and national entities to demand access. And in turn, as I understand it, Apple will no longer be able to claim it can't peek at the device....

I don't believe Apple wants to snoop any more than they already do by scooping up data, and of course by using their customers as test subjects for Siri and other half-baked products launched too early, then charging full whack for the beta tech.
 
I am not sure, however, that the Chinese government has access to the data or can decrypt it.
Yes, as of this year, it has been shown that the data in China is effectively on government servers, with the encryption keys held by the Chinese company, etc.

And while it is the law of China, one, I am simply frustrated by the possibility of ever-greater back-door access, and two, by Apple's ongoing screaming from the rooftops about their commitment to privacy while in reality not seeming to care overly much.

At some point, when do WE, the people, stand up and say enough? Or perhaps those of us who read, follow, and do our best to protect privacy are a minority. Perhaps most are happy to continue to allow the erosion and loss of their privacy for shiny baubles.
 
How Apple will actively keep the chance of erroneously flagging you at 1 in a trillion by tweaking the positives threshold

nice 👍

Literally zero chance of escalating to human review for people not owning CSAM pics.
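
For anyone who wants to sanity-check the arithmetic: under the assumption that false positives are statistically independent, the chance of an account being wrongly flagged is just a binomial tail. A quick sketch, where the per-image false-positive rate is a number I made up for illustration (Apple has published the threshold of 30 but not a per-image rate):

Code:
# P(X >= threshold) for X ~ Binomial(n_photos, p), computed in log-space
# so large libraries don't overflow. The rate p below is an invented example.
from math import lgamma, log, log1p, exp

def account_flag_probability(n_photos: int, p: float, threshold: int = 30) -> float:
    def log_pmf(k: int) -> float:
        return (lgamma(n_photos + 1) - lgamma(k + 1) - lgamma(n_photos - k + 1)
                + k * log(p) + (n_photos - k) * log1p(-p))
    return sum(exp(log_pmf(k)) for k in range(threshold, n_photos + 1))

# A 20,000-photo library at an assumed one-in-a-million per-image rate:
print(account_flag_probability(20_000, 1e-6))  # ~4e-84: effectively zero

So the claim is at least internally consistent, but only if the independence assumption holds.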
 
Slippery slope arguments are logical fallacies. You can deal with these issues separately. Apple should do what it can to stop perverts from hurting kids.
It is not a logical fallacy when there is a history of abuse attached to the action. It is not unreasonable in this case to call it a slippery slope when every previous instance of the "Won't anyone think of the children!" excuse for taking away privacy has led to further abuses of privacy. The slippery slope is real, and just because it is Apple, we should not excuse it, nor should we believe this is the end of it.

It's really, really important that we make a huge stink about this, and victory will only count if we force Apple to abandon it. Otherwise they'll do what all other companies/governments do with unpopular, draconian decisions... wait for it to die down, let apathy kick in, and let it become the new norm. Then press ahead with more unpopular decisions.
 
Thank you. That was exactly what I was implying and waiting for others to write - if it needs 30 attempts, then it seems to me the system is NOT as foolproof as Apple claims.....
It isn't 30 attempts, it is 30 photos.
 
The key is to accept Apple's explanation of how they are doing it. Apple has published quite a few articles explaining what they intend to do. They are using on-device intelligence to detect a very specific set of illegal materials. No human gets to see on-device material unless multiple instances of such material are found and also uploaded to iCloud Photos. Otherwise, nobody will know what content the device has except the device owner.
I don’t care if it’s a man or a machine. These conversations start with “Apple is” and end with “doing it”. I don’t care how groovy or complex or transparent or understandable the spyware is.
It quacks like a motherfloating duck.

You can argue that this opens Apple up to 'snoop' on everything as well. Then the question would be: why?
Not relevant. I don’t care why someone hits me, robs me, or spies on me. The only thing that matters is that they do.

If you think this latest development enables Apple to snoop, why now? What's stopping Apple, or for that matter Samsung, Google, or any device or OS publisher, from doing this?
Abso-freaking-lutely NOTHING.

For all we know, iOS has been snooping on users all along.
Exactly.

If so, how does it benefit Apple? Most of the time, the simplest explanation is the best one.
Again, who cares?

Apple should’ve just shut the fruit up about this whole privacy thing, then, from the start.

They harped on it so hard that they actually convinced us all to believe in the concept, and then they turned around and spent an ENORMOUS amount of resources on violating it, explaining how they are violating it, and explaining why they are violating it.

How thoughtful.
 
I have found a way to explain what this system is with one question:
DO YOU TRUST YOUR GOVERNMENT?
And if you do, do you think the whole world trusts theirs? In the country where I live, this question is rhetorical, with a clear answer: NO.
The technical implementation is done in a way that gives you an impression of "encryption" and "security". But people are not afraid of Apple. People love Apple. They trust Apple.
People are afraid of corrupt politicians who can alter CSAM databases with anything they can think of. This is it.
The law didn't require Apple to scan user devices. But here we are.
"Reporting requirements of providers", 18 USC § 2258A
 
How Apple will actively keep the chance of erroneously flagging you at 1 in a trillion by tweaking the positives threshold

Literally zero chance of escalating to human review for people not owning CSAM pics.
Or in other words: 'We don't know what the false-positive error rates will be in actual practice, other than back-of-the-napkin estimates based on a limited data set. Welcome to the Experiment!' And I'll note they do not say what the false positives they did detect looked like, and those false positives demonstrate what we already know - that there must be errors.

Finally, as I have noted over and over in my posts, if the false positives are statistically independent, then their calculations seem reasonable. However, people often take a series of pictures, which raises the possibility that if one picture in a series triggers a false positive, the other pictures in the series likely will as well. I gather there might be a way around this, basically categorising all false-positive pictures in a series as one 'hit', but Apple really does need to explain this better. For one thing, the broader a perceptual range a template has (the fuzzier it is), the higher the chance of false positives.

Also, I distrust not only the implementation of this surveillance but the principle it establishes of making your mobile phone part of the surveillance apparatus connected to the state. Pandora's Box has truly been opened, and the only defence now from intrusive surveillance is passing laws against it.
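
The series-of-pictures worry is easy to put numbers on. A toy Monte Carlo, assuming (purely for illustration) that photos come in bursts of near-duplicates and that when a burst's content falsely trips the hasher, every frame in the burst matches at once:

Code:
# Toy model of correlated false positives from burst photography.
# Every parameter here is invented for illustration, not a measured value.
import random

def flagged(n_bursts: int, burst_size: int, p_burst: float,
            threshold: int = 30) -> bool:
    # If a burst's shared content trips the hasher, all its frames match together.
    matches = sum(burst_size for _ in range(n_bursts)
                  if random.random() < p_burst)
    return matches >= threshold

def estimate(trials: int = 20_000) -> float:
    return sum(flagged(n_bursts=500, burst_size=40, p_burst=1e-4)
               for _ in range(trials)) / trials

print(estimate())  # ~0.05 with these made-up numbers

Same 20,000 photos as an independent-images calculation, but one unlucky burst of 40 near-duplicates already clears a 30-photo threshold, so the account-level rate collapses to roughly P(any burst trips) ≈ 500 × 1e-4 ≈ 5% - nowhere near one in a trillion. That is exactly why the independence question, and whatever de-duplication Apple does, matters.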
 
Processing capabilities have been going back and forth for 50 years: from the mainframe to clients, then back to servers and clouds, and now some capabilities back to local devices, etc. Let's not make too much drama out of technicalities and implementations. Yes, some processing power on the local device is now part of the crime-surveillance apparatus. Big deal. That's the least of the problems. Science is science, tech is tech; it's about how it's used. Let's wait for at least some hint of actual nefarious use before getting angry for real.
 
There are laws and rights and privacy and respect as well. Not everything that is possible is permissible. That goes for child porn, and for looking for child porn, as well.
 