It becomes a security and privacy issue because Apple now has full access to my entire device and can scan for whatever they want without me knowing about it. That is why I prefer this type of software to stay in the cloud.

You mean like Spotlight indexing? That’s not “Apple”. That’s software created “by Apple”. Not tethered to Apple Inc. until proven otherwise. Proceed to remove the battery from your iPhone to never have to worry about software created “by Apple” again.
 
You know what I think I’d like, Apple? How about a parental control that uses the same image recognition to filter out explicit photos that horny teenagers send to each other.

No off-device scanning necessary, and it would cut down on CSAM, I suspect.
 
You mean like Spotlight indexing? That’s not “Apple”. That’s software created “by Apple”. Not tethered to Apple Inc. until proven otherwise. Proceed to remove the battery from your iPhone to never have to worry about software created “by Apple” again.
You can designate folders for Spotlight to ignore. You cannot with this new system.
 
I see we have some newcomers to the thread that could save themselves a bit of typing energy by simply "reading" the previous pages of the thread. :D


Many of us don't want a tool installed on our device whose purpose is to scan our local content against third-party black-box databases.
 
The local scanning process is not remotely controlled by an external entity; it lives 100% inside the phone and has no way to communicate with the world outside. It’s like a wiretap that’s not connected to anything, just sitting there. It only “wakes up” when the photos are uploaded to Apple’s server. That makes it tricky to consider it a search of personal property without attaching a lot of asterisks and nuances. That’s why the technical implementation is super important to understand (the opposite of what the “it’s not about the tech” people would have you believe with their blunt, hyperbolic, buzzwordy takes).
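
To make the “disconnected wiretap” point concrete, here is a minimal sketch of the gating being described. This is not Apple’s code: every name in it (scan_photo, SafetyVoucher, upload_to_icloud) is hypothetical, and sha256 stands in for a perceptual hash like NeuralHash purely so the example runs. The one thing it illustrates is that the scan result only ever leaves the device packaged inside a user-initiated upload:

```python
import hashlib
from dataclasses import dataclass

@dataclass
class SafetyVoucher:
    photo_id: str
    payload: bytes  # in the described design this is encrypted, unreadable even on-device

# Hypothetical on-device blocklist: hashes of specific known images.
BLOCKLIST = {hashlib.sha256(b"known-bad-image").hexdigest()}

def scan_photo(photo_id: str, image_bytes: bytes) -> SafetyVoucher:
    # sha256 is only a stand-in; a real system would use a perceptual hash.
    digest = hashlib.sha256(image_bytes).hexdigest()
    matched = digest in BLOCKLIST
    return SafetyVoucher(photo_id, str(matched).encode())

def upload_to_icloud(photo_id: str, image_bytes: bytes) -> tuple:
    # The voucher is created and transmitted ONLY as part of a user-initiated
    # upload. With iCloud Photos off, this code path never runs; the scanner
    # by itself has no transmit path of its own.
    voucher = scan_photo(photo_id, image_bytes)
    return (image_bytes, voucher)  # stand-in for the actual network transfer

upload_to_icloud("IMG_0001", b"some photo bytes")
```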

You are getting distracted.
It doesn’t matter if the search is done using hash-matching algorithms, bloodhounds, or black magic. The technical implementation is completely irrelevant. A search is a search.

All that matters is a search is happening, and it’s happening on my personal device, and a warrant for the search was not obtained. End of story.
 
They have always had that power!

Apple can copy everything on the device unless it was encrypted outside Apple's functions. They can delete every app, wipe the entire phone.

Apple already has code on the iPhone that can scan every file on the file system.
They did not have the ability to scan photos on the device and report back to Apple HQ. Not to mention, I am sure this scanning will be expanded and the option to turn it off will not exist.
 
You are getting distracted.
It doesn’t matter if the search is done using hash-matching algorithms, bloodhounds, or black magic. The technical implementation is completely irrelevant. A search is a search.

All that matters is a search is happening, and it’s happening on my personal device, and a warrant for the search was not obtained. End of story.

We’ll see.

Technology sometimes creates completely new circumstances that may make it necessary to revisit previous definitions that no longer capture a new technical reality.

I could see it as half a search on-device and half a search in the cloud.
 
They did not have the ability to scan photos on the device and report back to Apple HQ. Not to mention, I am sure this scanning will be expanded and the option to turn it off will not exist.
They can’t report back to Apple HQ.
Apple HQ can only collect the security vouchers once you upload the images yourself.
 
They did not have the ability to scan photos on the device and report back to Apple HQ. Not to mention, I am sure this scanning will be expanded and the option to turn it off will not exist.
Of course. The implementation they chose does not make any sense if the goal were just to keep CSAM off their iCloud servers. If they wanted just that, they could have implemented server-side scanning like everyone else.

The whole "feature" makes sense only for data stored on the device. Hence it is very likely that the "only when using iCloud Photos" part will disappear once the dust has settled.

What is beyond me is: WHY?

As said: if they only wanted to keep the filth off their servers - which is totally understandable - why not just server-side scanning?

Why go through all the fuss? I mean, they cannot be so disconnected from mothership Earth that they did not expect this "feature" to stir up a lot of dust.

I also do not get why they believe this feature might be to the user's benefit. It must be clear to all of them within the upper echelon of Apple execs that developing this spyware is an evil thing, benefiting literally not a single soul in its currently announced form.

It is also absolutely clear that the feature can and will be misused. They must be aware that they cannot possibly deny FISA requests or anything like them (they bent over for the Chinese and Russians pretty willingly already). Knowing this, they still proceeded. They also must be fully aware that promising otherwise is going to be debunked as an outright lie right away - yet they made this claim.

So why? It does not make sense in this form whatsoever.
 
I'm aware of Google's object detection. To conflate that with what Apple is doing is disingenuous: Apple is detecting nudity in the Messages app for incoming and outgoing messages. This isn't detecting something in your Photos app and telling you it's a cat or a dog. While I concede Google's object detection could be added to their own messaging app to accomplish the same goals, the alerting feature and all the rest would be new and not a fully finished idea, which is what courts would look at when deciding how much to burden a company with a decree.

This is partly how Apple defended itself against the Department of Justice in the FBI phone-unlocking case. Compelling Apple to spend employee time (which costs money and would delay other product releases) to build into the iPhone a function that would allow unlimited attempts at the passcode was an unreasonable request under the 1789 All Writs Act, which the DoJ was attempting to use as a bludgeon to get what it wanted. Ultimately the DoJ withdrew the case against Apple without resolution.

Had Apple already created such a feature, though - for instance, for the CCP in China, one that at the time could only be used on Chinese-sold iPhones - you can bet the court would have sided with the DoJ almost immediately and ordered Apple to make that firmware accessible to the FBI on North American-sold iPhones.

That is the difference. I mean, even Apple is arguing against your point here in its own legal disputes with the US government.

Do you think "detecting nudity" vs. "detecting a <insert nude body part>" is different? It's not. Deep CNNs are scary good at detecting discrete objects and differentiating between minute details (hot dog vs. penis). I'd encourage you to look through some of the convolutional layers of any decently trained open-source deep CNN object-detector network. What I said was not disingenuous at all. The argument that the alerting feature is some novel new thing is what's disingenuous: there have been many implementations of parental-control messaging notifiers (the first search result turned up this: https://useboomerang.com/2019/11/05/monitor-calls-text-messages-android/). We don't see news about Google being required to provide these things, even though it would be minimal work to enable them. I stand by what I said: if Russia et al. wanted these features, it wouldn't have been difficult to get them already, as the technology already exists.
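
For anyone who wants to see how commoditized this is, here is a minimal sketch using an off-the-shelf, COCO-trained torchvision detector - an open-source analogue, to be clear, not Apple's or Google's actual model. Repointing such a network at a different set of classes is a training-data exercise, not a research problem:

```python
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn

# Off-the-shelf open-source detector, pretrained on the COCO classes.
model = fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()

# Random stand-in for a real photo: a 3-channel tensor with values in [0, 1].
image = torch.rand(3, 480, 640)

with torch.no_grad():
    (detections,) = model([image])

# Each detection is a bounding box, a class label index, and a confidence score.
for box, label, score in zip(detections["boxes"], detections["labels"], detections["scores"]):
    if score > 0.5:
        print(f"class {int(label)} at {box.tolist()} (confidence {float(score):.2f})")
```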

I know there are storage laws in China. And I stand by what I said earlier: they folded to Chinese oppression. They decided that the privacy of their customers had a dollar amount attached to it and that they would rather keep selling phones there than exit the market.

They chose money over privacy, and they will continue to do that when the next battle occurs. If the DoJ had won that court case against Apple over the FBI phone, Apple would have put the backdoor in the iPhone; they would never stop selling the iPhone in America.

Bit of a flaw here: had they chosen the alternative (not complying with local jurisdiction), then there would be no Chinese customers' privacy to protect. It's not a mere choice of customer privacy vs. a dollar; it's the choice of customer or no customer.

I don't disagree with your overarching point, though --- if some jurisdiction has laws that require these things, then it's up to Apple to choose whether or not to do business there; but they don't have the freedom to just "protect privacy" while ignoring local laws. Any backdoor-enabling laws that affect Apple would affect every smartphone maker.

I highlighted the part that Tim most feared. Well, guess what? Apple just did exactly what he feared; now it's up to the governments to request that this feature be extended to detect whatever they like in people's messages. Just like the CCP does in China with WeChat. You literally cannot write certain words and phrases in the app; they are blocked by government decree.

Apple's stance that messages are end-to-end encrypted and thus cannot be altered by Apple just had a big brick thrown through it, one that repressive regimes can point to and say, "But you detect this in the messages, so just build a profile of things we want to detect too."

As you say, China already does this with WeChat, so they clearly know the tech exists to block keywords/phrases. The argument Apple has used previously around these issues is that it is a feature within the kernel of the operating system, not a feature that can be attached to certain subgroups of iOS users. If China et al. were able to force Apple to make a custom version of iOS for only their citizens, they already would have.
 
So why? It does not make sense in this form whatsoever.

And the really gross part is that if they do go E2EE for things heading up to the cloud, many will focus on that as a big "security win" -- when it's anything but...

Because they will have now installed the infrastructure to scan for whatever "three-letter agencies" and other entities (state, corporate, or other) would like --- RIGHT ON YOUR OWN DEVICE!

How people are missing this is beyond me.

They've built the methodology and tools needed to surveil you - RIGHT ON YOUR PHONE... and with the black-box methodology of it, even Apple wouldn't necessarily know what the people asking for the scanning are actually looking for.
 
"Says Apple." That is the entire problem: Apple has installed surveillance software on our devices, and we just have to trust Apple about what they are doing with that software.
You realize everything Apple has said (ever) is a "says Apple" thing, right? You know they installed iOS on our devices and have had full capability of doing these things at any point in time, right?
 
You realize everything Apple has said (ever) is a "says Apple" thing, right? You know they installed iOS on our devices and have had full capability of doing these things at any point in time, right?
The problem is they installed surveillance software on the devices and the users have no way to verify what it is doing since iOS is closed source. So all we have is Apple's word on what the software is doing. Do you not see a problem with that and how it can be easily abused?
 
There’s a difference between having the capability in general and it being reasonable to require them to use it. That’s how they got out of providing unlocking firmware under the All Writs Act, and it’s why they haven’t been sued for facilitating piracy.
So if Apple were required to use it, then what? Near-exact matches of child abuse photos would be flagged and then sent to some government agency for review? I'm not against that.

But my original assertion, that this can't be used for finding, e.g., the set of people who attended a Trump rally, still stands. It would only find near-exact matches of hashes of photos stored within iOS's internal database.
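
To illustrate what "near-exact matches of hashes" means, here is a toy perceptual-hash matcher. Everything in it is invented for the example: average_hash is the classic "aHash" trick, a crude stand-in for NeuralHash, and the distance threshold is arbitrary. It recognizes a rescaled copy of one specific known image, but it has no concept of "photos of a rally" as a category:

```python
from PIL import Image

def average_hash(img: Image.Image, size: int = 8) -> int:
    """Toy perceptual hash: grayscale, shrink to 8x8, threshold each pixel at the mean."""
    pixels = list(img.convert("L").resize((size, size)).getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | int(p > mean)
    return bits

def is_near_match(h1: int, h2: int, max_distance: int = 5) -> bool:
    # "Near-exact" means a small Hamming distance between the two 64-bit hashes.
    return bin(h1 ^ h2).count("1") <= max_distance

# Two clearly different synthetic images: a horizontal and a vertical gradient.
horiz = Image.new("L", (64, 64)); horiz.putdata([3 * x for y in range(64) for x in range(64)])
vert = Image.new("L", (64, 64)); vert.putdata([3 * y for y in range(64) for x in range(64)])

# The "database": hashes of specific known images, not descriptions of categories.
blocklist = {average_hash(horiz)}

def flagged(img: Image.Image) -> bool:
    h = average_hash(img)
    return any(is_near_match(h, known) for known in blocklist)

print(flagged(horiz.resize((200, 150))))  # True: the same image, merely rescaled
print(flagged(vert))                      # False: a different image entirely
```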
 
They've built the methodology and tools needed to surveil you - RIGHT ON YOUR PHONE... and with the black-box methodology of it, even Apple wouldn't necessarily know what the people asking for the scanning are actually looking for.
Exactly. Yet the question remains: WHY? I am rather sure they are well aware of this and all the implications. Yet they proceeded. Puzzling indeed.
 
The local scanning process is not remotely controlled by an external entity; it lives 100% inside the phone and has no way to communicate with the world outside. It’s like a wiretap that’s not connected to anything, just sitting there. It only “wakes up” when the photos are uploaded to Apple’s server. That makes it tricky to consider it a search of personal property without attaching a lot of asterisks and nuances. That’s why the technical implementation is super important to understand (the opposite of what the “it’s not about the tech” people would have you believe with their blunt, hyperbolic, buzzwordy takes).

So there is no way the CSAM database will get updated? What’s in there now will remain in there, no more and no less content, for years? If it gets updated, then yes, it is externally controlled.
 
Exactly. Yet the question remains: WHY? I am rather sure they are well aware of this. Yet they proceeded. Puzzling indeed.

I'm a bit of a skeptic myself -- I think they are making deals with the US Gov to avoid antitrust enforcement (or other legal issues).

What folks like Gruber are hinting at (E2EE plus these backdoor methods on device) is literally **perfect** for surveilling people. Folks who don't dig into the details think they are "uber secure", when the truth is the total opposite.


(Quick point of clarity on the piece just above -- if you are OK with having "who knows who" scanning all your content before E2EE, then I guess you are very secure -- some of us just fiercely object to the concept of being subject to warrantless surveillance.)


I know - it seems far-fetched -- but so did all the crap that Snowden revealed.
Life is stranger than fiction, in my experience.
 
This is just Apple's first step toward complete surveillance of Apple users. What will happen? Or what is happening right now? All the kids use Telegram, WhatsApp, Signal - or Snapchat. Hey, Snapchat - that's one of the places to go if you want to see or post some nudity. Or let's talk about Reddit and some subreddits. Nothing illegal, but nudity for sure.

Messages may be involved, but I guess Messages is not the place to be. So the way to go for Apple is to force all messaging services to implement Apple's photo-scanning service and scan for nudity. For God's sake, there are those App Store guidelines - or should I call them "one ring to rule them all"?

So iOS 16 will finally break end-to-end encryption completely by forcing all apps into Apple's chain of surveillance.
 
I'm sure you're right - though other cloud companies already do this? So I don't know how much cost it would add.

I don't think it will affect battery life much at all on the device.

I know it might seem silly, but it's just the principle of them using my device for it and not giving me a choice in the matter, other than disabling a feature I highly value as my family has 200+ GB of photo/video history on iCloud Photos between the 5 of us.
The verbiage suggests that this will only apply to future uploads once iOS 15 is installed -- i.e., your 200+ GB won't be re-scanned on your device unless you download and re-upload your entire library.
 