Exactly what Reuters rightfully points out. Even if Apple's intentions are 100% good, this system does create a backdoor that opens up the possibility that, under the laws of any given country, Apple could be forced by court order to look for images of protestors or political symbols, to filter out political protestors for purposes that are not good.

I'm surprised to see Apple doing this because they seem to be the front runner of this whole privacy mantra. It contradicts everything Apple stands for.

I also find it hard to believe that Apple would pull all of their iPhones out of China if the Chinese government ordered Apple to search for the kinds of content mentioned above.
Agreed. I just can’t understand why they need to implement ANY aspect of this (however unintrusive or innocuous anyone may think it is) on my device, which I paid for. If I choose to use a cloud service for storage, I understand they are required to take some steps to minimize misuse of their assets. Do whatever you have to there. Why the hooks inside my device? And yes, I read Apple's explanation about increased privacy from their perspective, but that does not change what this fundamentally is: it’s a surveillance tool that lives on my device.

And yes, Apple is often up front about trying to “ensure local laws are respected”, as they are quoted in a statement in this article, for example:
https://www.bbc.com/news/technology-58258385

That, combined with the profit potential in certain markets, gives me little hope regarding Apple’s actions if “push comes to shove” with a government down the road. If they were serious about my privacy, they would not implement any part of a surveillance scheme on my device.
 
There are ways for Apple to counter this type of attack, and ultimately there’s human review, so I'm not sure what the attacker would accomplish; maybe force Apple to hire more reviewers to discard cute-dog false positives in a split second.
The problem with human review is that it can be swamped if, as may be the case, they've badly underestimated the number of flagged cases. Partly because there are just more of those images going around, but also... that would be the *point* of being able to produce innocent images that trigger the algorithm: to swamp the human reviewers with false positives. A denial-of-service attack. And then shortcuts get into the process and mistakes are made, both ways. And as for the human reviewers, see Facebook's experience: there's a human cost to *having* to view the kind of images this system is meant to trap, and a shockingly high burnout rate. Especially if they're swamped and can't even keep up.
 
WOW! Scary thought that this choice by Apple may open the floodgates to more spying.
Especially now that they have unified all their products' hardware and software around ARM and iOS.

I still anticipate that once Apple goes entirely ARM on all its products, EVERYTHING will resemble iOS.
With surveillance of some sort and REMOTE LOCK or DEACTIVATE capabilities.
macOS will eventually be TOUCH SCREEN just like iOS, and APPS will only be installed through the App Store.
Unless Congress passes that BILL breaking up the Google and Apple App Store monopoly.
And ALL Apple devices, Mac included, can remotely be locked or deactivated.
 
The problem with human review is that it can be swamped if, as may be the case, they've badly underestimated the number of flagged cases. Partly because there are just more of those images going around, but also... that would be the *point* of being able to produce innocent images that trigger the algorithm: to swamp the human reviewers with false positives. A denial-of-service attack. And then shortcuts get into the process and mistakes are made, both ways. And as for the human reviewers, see Facebook's experience: there's a human cost to *having* to view the kind of images this system is meant to trap, and a shockingly high burnout rate. Especially if they're swamped and can't even keep up.
Or they could get one of these https://en.wikipedia.org/wiki/National_security_letter ordering them to let FBI agents handle the verification.
 
The results of the on-device scanning can’t be read until the pics are uploaded to Apple’s servers.

So it’s actually on-server verification of security vouchers. What happens on-device, let’s call it the “first half of the scan”, is inaccessible to humans, computers and servers alike. It’s like it doesn’t even exist until the photos are uploaded anyway.
Nice try but the scanning is still done on the device.
 
The previous poster implied that people would not sign such protest letters if they had read Apple's documentation. However, people have read and understood* the documents and still protest. Experts alone protesting would not prove much, but the objections are easy to understand, and the documents do not invalidate them.

Claiming that protestors just lack understanding plays into Apple's people-are-confused ruse. People know and understand what Apple intends to do; that is why they protest. That is also why adding more documentation without addressing the core objections comes off as a very blatant diversionary tactic: people do not care whether the gun operator has good references or whether the safety only fails 1 out of a trillion times - they simply do not want that gun pointed at them.

*As far as the documents allow - Apple glosses over many critical aspects of the system. The system is objectionable under any interpretation, though. More details might make it even worse.

Reasonable people also object because The Slippery Slope Risk is truly on full display here.
 
Another thread destined to be jam-packed with misinformation, I see! All the armchair privacy advocates on MacRumors claiming that unencrypted mass scanning in the cloud is better than encrypted client-side methods. Hilariously misinterpreted and the absolute epitome of uninformed bandwagon jumping.
Go read up a bit on the various privacy-conscious methods of doing this stuff, and compare what Apple is doing (by reading Apple's white paper) versus what virtually every other company does.
 
The problem with human review is that it can be swamped if, as may be the case, they've badly underestimated the number of flagged cases. Partly because there are just more of those images going around, but also... that would be the *point* of being able to produce innocent images that trigger the algorithm: to swamp the human reviewers with false positives. A denial-of-service attack. And then shortcuts get into the process and mistakes are made, both ways. And as for the human reviewers, see Facebook's experience: there's a human cost to *having* to view the kind of images this system is meant to trap, and a shockingly high burnout rate. Especially if they're swamped and can't even keep up.

Since when is “it could be DDoSed” a reason to not even try?
You counter that with anti-DDoS techniques (like maybe tweaking the threshold and the synthetic-voucher background noise in this case).
Again, I sense a lot of “presumption of incompetence” towards Apple.
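For the sake of argument, the effect of a match threshold is easy to sketch. Here's a rough Python toy; the library size and per-image false-match rate are made-up illustration numbers, not Apple's published figures, and it assumes false matches are independent, which is a big simplification:

    from fractions import Fraction
    from math import comb

    def p_flagged(n_photos, p_false, threshold):
        # Chance that at least `threshold` of n_photos produce a false match.
        # Exact rational arithmetic so tiny probabilities don't round to zero.
        below = sum(comb(n_photos, k) * p_false**k * (1 - p_false)**(n_photos - k)
                    for k in range(threshold))
        return float(1 - below)

    # Hypothetical numbers, purely for illustration.
    n = 10_000                     # photos in a library
    p = Fraction(1, 1_000_000)     # per-image false-match rate (made up)
    for t in (1, 5, 10, 30):
        print(f"threshold {t:>2}: {p_flagged(n, p, t):.3e}")

Even with those made-up numbers, each bump of the threshold cuts the chance of an innocent account crossing it by many orders of magnitude, which is presumably why a DoS-by-false-positives attack is harder than it first sounds.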
 
Again? Wasn't this article already posted?

Anyway, it's pretty much misleading from the start, since it's not a backdoor in any technical sense, worse-for-privacy cloud scanning is already taking place at least at other photo-library providers, and "scan users' photo libraries" conveniently forgets to mention that only pictures being uploaded to the cloud service are involved.

Perhaps the signatories should read the relevant technical documents and FAQs:

I beg to differ. When all these groups, plus a massive number of complaints from IT specialists, the media, even Apple employees and politicians, refer to it as a backdoor, then I think it's fair to call it a backdoor. Whatever the motives, it initiates surveillance on our hardware, whether anonymised for now or not. It's clear that you could not pick a better excuse than fighting child pornography or child abuse, but doing it on the hardware opens the door, so yes, it is a backdoor. Maybe a backdoor initially designed for altruistic purposes, but I doubt it would stay that way, as the road to hell is paved with good intentions.

My own conclusion is that some here are not posting as individuals but on a company's behalf, and I leave others to guess the company.
 
The number of fanboys with pseudo-technical understanding here is staggering: people reading technical documentation without enough knowledge and making "big assumptions" that Apple is their daddy and will never abuse "the good kids".

I can help you overcome your cognitive tech bias. Watch this and listen.
I think some of you posters are deliberately engaged in obfuscation. You can post about how Apple INTENDS to scan, but that misses the fact that they are scanning at all, which is really the operative fact.
 
Nice try but the scanning is still done on the device.

Spotlight indexing too.
Windows search indexing as well.
You people can’t really distinguish (or think it’s irrelevant) between a scan that sends out data to the outside world (like actual spyware) and a scan (like Apple’s CSAM pre-scanning in the iCloud Photos uploading pipeline) that doesn’t?
Apple does not remotely control the scan, it’s just a local pre-labeling system included with iOS updates, like many other background processes that would be scary IF they communicated with the outside world. But they don’t.
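To make the "local pre-labeling" idea concrete, here's a toy sketch in Python. It uses a classic difference hash rather than Apple's NeuralHash, and the file name and pipeline are hypothetical; the point is just that the device produces a compact fingerprint that means nothing by itself until a server compares it against a reference list:

    from PIL import Image  # pip install Pillow

    def dhash(path, hash_size=8):
        # Classic difference hash: shrink to (hash_size+1) x hash_size grayscale,
        # then record whether each pixel is brighter than its right-hand neighbour.
        # The result is a small fingerprint that survives resizing/recompression.
        img = Image.open(path).convert("L").resize((hash_size + 1, hash_size))
        pixels = list(img.getdata())
        bits = 0
        for row in range(hash_size):
            for col in range(hash_size):
                left = pixels[row * (hash_size + 1) + col]
                right = pixels[row * (hash_size + 1) + col + 1]
                bits = (bits << 1) | int(left > right)
        return bits

    # Hypothetical upload pipeline: the label travels with the photo being uploaded,
    # and only the server holding the reference list can give it any meaning.
    photo = "IMG_0001.jpg"
    upload_queue = [{"photo": photo, "label": dhash(photo)}]

Whether that division of labour counts as "just pre-labeling" or as a scanner living on your device is, of course, exactly what people are arguing about.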
 
Exactly what Reuters rightfully points out. Even if Apple's intentions are 100% good, this system does create a backdoor that opens up the possibility that, under the laws of any given country, Apple could be forced by court order to look for images of protestors or political symbols, to filter out political protestors for purposes that are not good.

I'm surprised to see Apple doing this because they seem to be the front runner of this whole privacy mantra. It contradicts everything Apple stands for.

I also find it hard to believe that Apple would pull all of their iPhones out of China if the Chinese government ordered Apple to search for the kinds of content mentioned above.
You will never get this point across to the Apple defenders.

Those types of people believe Apple to be its own country. They can't fathom that Apple is a corporation that has to abide by local laws in the countries it operates in. They don't believe that Apple will bend over in order to escape persecution/retaliation by governments. Those types of people never think far into the future; their brains can only process present details. If they could think further, they might be able to grasp the concept that while this backdoor is created with good intentions, it is still a backdoor that can be utilized for nefarious reasons in the future. But sadly, their limitation creates a belief that once a company policy is written, it can never be changed, even though there are many examples of companies rewording their policies all the time.
 
So do people think Apple has something to gain by implementing this, or do they think Apple is being forced to do this by some government agency? Because I'm genuinely puzzled by why they would go to all this trouble if they didn't think it was the right thing to do.
After rational analysis, Occam's razor style: Apple is not being forced directly by anyone. They have been preparing this implementation since iOS 14.3 (based on recent data), obviously with Apple management knowing that some governments will apply pressure over the monopolistic nature of Apple's closed ecosystem, and the obvious politically correct solution is to provide a "secure" way of monitoring user behavior under a "for the common good" narrative.

The implementation is designed to give users the impression of "privacy" and at the same time to remove Apple from the loop of eventual legal responsibility. The tipping point is on-device scanning, where your property reduces the cost of the process for Apple and normalizes privacy intrusion and surveillance. This move creates a "new business" opportunity for Apple, done behind closed doors, and will send a clear signal to third parties interested in user data.

Apple is practically moving into data-broker territory, taking market share from Google and Facebook in the process.
The added value is "optimization" of Apple's control over services.

Once users accept the technological solution and are convinced that this is "privacy", implementing perceptual hashing and NeuralHash for other purposes is a no-brainer.

So to summarize: this is the defining moment for the New Apple. Dismissing the rationally thinking user base as the "screeching voices of the minority", implementing a secure "backdoor" for multiple cases of digital policing, using CSAM as a politically and publicly acceptable problem, entering the big data-broker market with a power move, and making sure no government will be compelled to break the company up for monopolistic practices.

I am done with this company, and I fully understand that the majority of normal users will accept this just because.
I am sharing this in the hope of helping the minority of Apple-loyal users (like me, until this stupidity) to understand that it's time to tame our collective addiction to "conveniences" and learn an important lesson: never trust your private data to a closed software/hardware company again.
 
You will never get this point across to the Apple defenders.

Those types of people believe Apple to be its own country. They can't fathom that Apple is a corporation that has to abide by local laws in the countries it operates in. They don't believe that Apple will bend over in order to escape persecution/retaliation by governments. Those types of people never think far into the future; their brains can only process present details. If they could think further, they might be able to grasp the concept that while this backdoor is created with good intentions, it is still a backdoor that can be utilized for nefarious reasons in the future. But sadly, their limitation creates a belief that once a company policy is written, it can never be changed, even though there are many examples of companies rewording their policies all the time.

In the long run we’re all dead.

I make informed predictions about the near future based on the track record of the actors involved and present information, not based on generic slippery slope arguments. Or hot takes full of misplaced buzzwords like “backdoor”.
 
Spotlight indexing too.
Windows search indexing as well.
You people can’t really distinguish (or think it’s irrelevant) between a scan that sends out data to the outside world (like actual spyware) and a scan (like Apple’s CSAM pre-scanning in the iCloud Photos uploading pipeline) that doesn’t?
Apple does not remotely control the scan, it’s just a local pre-labeling system included with iOS updates, like many other background processes that would be scary IF they communicated with the outside world. But they don’t.
"between a scan that sends out data to the outside world (like actual spyware) and a scan (like Apple’s CSAM pre-scanning in the iCloud Photos uploading pipeline) that doesn’t?"

They are both surveillance, and it's interesting to note that you refer to them both as a scan!

1. look at all parts of (something) carefully in order to detect some feature.
"he raised his binoculars to scan the coast"

2. an act of scanning someone or something.

Cambridge Dictionary:
to look at something carefully, with the eyes or with a machine, in order to get information:

to look through a text quickly in order to find a piece of information that you want or to get a general idea of what the text contains:

a careful or quick look through something:

to examine something carefully:
"This technique is used to scan for defective genes."

the act of looking at or through something carefully or quickly:

to use a piece of electronic equipment to get information from something such as a bank card or a product's barcode:

Your word, SCAN, suggests it certainly has the potential for a backdoor. And although Apple has always had a backdoor of sorts because of System Integrity Protection, that is all the more reason NOT to have tools like this on users' hardware. (Predictive text keeps trying to change "Integrity" to "Integration"; perhaps it realises that this step back from Apple's avowed statements on privacy over many years may, to some, represent a loss of integrity, so it changes it.)
 
Well, I guess it's due to US laws.

If you want privacy, US laws are not that convenient, and not only regarding CSAM. For example, one of the selling points of this Swiss-based cloud service is: "NSA non-compatible".

My point being: the discussion about privacy is important, but it shouldn't focus only on this on-device, client-side mechanism. How could server-side scanning be better in any way?

Scanning is bad wherever it is done.
 
Spotlight indexing too.
Windows search indexing as well.
You people can’t really distinguish (or think it’s irrelevant) between a scan that sends out data to the outside world (like actual spyware) and a scan (like Apple’s CSAM pre-scanning in the iCloud Photos uploading pipeline) that doesn’t?
Apple does not remotely control the scan, it’s just a local pre-labeling system included with iOS updates, like many other background processes that would be scary IF they communicated with the outside world. But they don’t.
However, you can use Terminal to switch Spotlight indexing off.
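(On macOS that's typically done with "sudo mdutil -a -i off" in Terminal, and "-i on" to turn it back on; at least that's how it has worked on recent versions, assuming an admin account.)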

The user can also exert control over some of the functions and exclude items.

The other major point is that Spotlight indexing is for the hardware it's on, and it stays there: it doesn't cross-reference your photos or other data (or a hash, because a hash is still data, whether anonymised or not) with a server elsewhere, and it doesn't potentially compromise your system externally.

Having been an Apple enthusiast for decades, a developer, a commercial user, and involved in the UK and internationally in assisting some of the agencies that fight crime and fraud of all sorts, for me to speak out it has to be of real concern.

Of course agencies would love a backdoor, and so would many governments, yet many of those governments and agencies also use Apple kit, and I doubt they are that keen on THEIR equipment being the subject of checks.
 
However, you can use Terminal to switch Spotlight indexing off.

The user can also exert control over some of the functions and exclude items.

The other major point is that Spotlight indexing is for the hardware it's on, and it stays there: it doesn't cross-reference your photos or other data with a server elsewhere, and it doesn't potentially compromise your system externally.

You can’t disable it on iOS/iPadOS.

And how do you know it’s not doing nefarious stuff, cross referencing, etc.? It’s not open source. How can you trust Apple running a local process like Spotlight indexing on YOUR device?? It’s definitely a backdoor. Windows search as well.
 