I don't think a lot of people understand what they're asking for here. If Apple abandons on-device scanning of uploads to iCloud (which is the only time this scan would happen), then they will have to implement scanning on their servers, which in my opinion opens up a much bigger can of worms, including having your photos sit unencrypted on their servers. They have to scan at some point, for legal reasons, to prevent child porn from ending up on their servers, so isn't it better that it happens on your device with as much privacy and as many checks and balances built in as possible?
How about they just don't scan at all??? It's not complicated. They are under no legal obligation to implement this feature. They should just ditch it. It has the potential for much more harm than good.
 
You're changing your story from one post to the next. I'm going to end this discussion (with you).

LOL! I haven't changed anything. You know I haven't, and you can't provide evidence that I did, which is why you're just throwing that baseless accusation out there and then running away. How convenient. But, hey, fine by me! All you were doing was twisting/misinterpreting everything I said or bringing up irrelevant points.
 
I can't believe all this misunderstanding and baseless paranoia! Does he not understand that Apple, Google, Microsoft, etc. are already scanning for CSAM? So if they wanted to search for other types of images instead, they could already do that. The only thing this new method does is make things MORE private, by hiding all scanning data from Apple except that related to a sizable collection of CSAM being uploaded to their servers. If people are still paranoid about that and don't trust Apple, then they should immediately disable iCloud for photos.
Apple has not been scanning photos for CSAM on its iCloud servers. They have been scanning e-mails for CSAM, potentially fighting the spread of CSAM, but that is not the same thing.

What is at issue here is whether it is wise to engineer spyware, which is exactly what this is, into an OS that runs on customers' devices. I care far less about what Apple does on their servers compared to what they are opening the door to on a mobile device that I carry everywhere, and which has my most private and sensitive information on it.

As for your paranoia comment, a colleague of mine once said that as a general rule the most important thing is not f(x) (how things are now) but f′(x) (how things are changing). Apple's previous advertising strongly suggested they would never even contemplate this kind of spyware, which amounts to search without probable cause. And yet here we are.
 
This is the best post on this matter I have seen on this site.
Thank you. As someone who loves technology (I worked at an Apple Store for a little while) and who is also using these tools as a conservative pastor, I've tried to think through these things logically… I even wrote a paper for one of my master's degree classes called 'A Biblical Theology of Technology.'
 
Again… nobody cares about iCloud scanning; this is about putting the spyware on your device…. It seems like everyone taking up for Apple is moving to different issues. We all accepted long ago that nothing is private in the cloud, but we won't accept monitoring of devices we own. It's just a red line, hence all the discussions on how to circumvent it. Again, it's our device; the spyware should not be there in the first place. As far as Apple trust goes… it eroded further this morning after learning the spyware was installed on my device as early as iOS 14.3, without my knowledge or permission.
 
The way the system is designed, they would need to have separate versions of iOS for each country (which they do not) and the different databases would have different root hashes. They would get caught right away. Again: it's much easier to comply with a government request from the server, where they can implement any kind of procedure without raising red flags.

That is the point of this whole system. Apple implemented this in such a way that any change in how it operates needs to be reflected in iOS, which security researchers can check. This will never be possible on a server. This system helps prevent government overreach by making part of the scan procedure auditable.

If you don't want to play with it, disable iCloud. Again, security researchers can check Apple's claim that the hashes are only produced when photos are being uploaded to their servers.
Come on - a database is just a file. Sync it with the Chinese iCloud, done. People who don't even have a basic understanding of how things work should be protected from companies like Apple.

Who will surveil the surveillers?

 
First half of the sentence: And you know that how?
Second part of the sentence: Of course! Securing evidence like this does not belong in the hands of a privately owned company without any control over what actually happens with the collected data and whom it is handed over to in the end...
Looking just at the technical details of Apple's implementation and what the politician says, it's quite clear that he doesn't fully understand what's going on at Apple's end.

His main concern seems to be about "secure and confidential communication", which indicates that he - like many others - is confused by what Apple is actually doing.

Apple is not scanning messages; this entire process takes place through iCloud Photos.

This confusion, of course, is entirely Apple's fault for announcing CSAM detection and the parental controls around explicit messages at the same time.

That said, simply looking at the technical implementation of CSAM detection and this politician's statements strongly suggests he doesn't fully understand what he is talking about.

If someone has more accurate facts on this topic (note: NOT opinions), I'd be happy to learn more.
 
If Apple follows through with this, then they weaken their stance against a back door for agencies that want access to a criminal's phone. Apple can't have it both ways.

The photo scanning doesn't prevent indecent photos from being uploaded; it only detects the sharing and circulation of known images, and it does nothing about new, previously unknown illegal photos of children.

iCloud should prevent the upload of that photo, nothing more.
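Here's a tiny sketch of why that is, assuming a plain hash-set lookup. (Apple actually uses a perceptual hash, NeuralHash, against a blinded database, not SHA-256, so this only illustrates the "known images only" logic, nothing more.)

```swift
import Foundation
import CryptoKit

// Toy sketch: matching against a database of *known* image hashes can only
// flag images that are already in that database. A brand-new photo produces
// a hash nobody has seen before, so it can never match.
// (SHA-256 stands in for Apple's perceptual NeuralHash; the set is empty here.)
let knownHashes: Set<Data> = []   // in the real system, supplied by NCMEC and others

func isKnownImage(_ imageBytes: Data) -> Bool {
    let digest = Data(SHA256.hash(data: imageBytes))
    return knownHashes.contains(digest)
}

print(isKnownImage(Data("freshly taken photo".utf8)))   // false: not a known image
```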
 
Apple has not been scanning photos for CSAM on its iCloud servers.

Well, I could've sworn that's what I had read from multiple sources, but other cloud services definitely have been. Even if Apple hasn't, they always could have if they wanted to. That's really my point - that these CSAM detection features in iOS 15 aren't any sort of game changer in terms of what Apple COULD do if they wanted to. In fact, the whole point of the new CSAM detection process is to be as non-invasive as possible whilst still being able to detect CSAM uploaded to iCloud.

What is at issue here is whether it is wise to engineer spyware, which is exactly what this is, into an OS that runs on customers' devices. I care far less about what Apple does on their servers compared to what they are opening the door to on a mobile device that I carry everywhere, and which has my most private and sensitive information on it.


Please explain exactly how iOS15 is being magically installed on your device without your knowledge (and forgetting you have auto-update on doesn't count, as that's something under your control). You are misusing that term to try to make things sound more dramatic than they are. Plain and simple. Stop (you and everyone else doing the same thing).

And for the millionth time, NOTHING IS BEING SCANNED on your phone if you turn off iCloud for photos, and even if you don't, no scanning information leaves your phone unless you're uploading illegal images to iCloud, and even THEN Apple can't decrypt that info unless the detected-CSAM threshold (30 images) is met.
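And for anyone wondering what a "threshold of 30" can even mean cryptographically, here's a toy sketch of threshold secret sharing, the general kind of scheme Apple's technical summary describes for the safety vouchers. The prime field, share layout and numbers here are made-up stand-ins, not Apple's actual parameters.

```swift
import Foundation

// Toy Shamir secret sharing over a small prime field, to illustrate the
// threshold idea: fewer than `threshold` shares reveal nothing useful, while
// `threshold` or more reconstruct the secret. Apple's actual parameters and
// voucher format are not assumed here; this is only the general technique.
let p: Int64 = 2_147_483_647   // 2^31 - 1, a Mersenne prime

func mulmod(_ a: Int64, _ b: Int64) -> Int64 { (a * b) % p }   // a, b < 2^31, so a*b fits in Int64

func powmod(_ base: Int64, _ exp: Int64) -> Int64 {
    var result: Int64 = 1, b = base % p, e = exp
    while e > 0 {
        if e & 1 == 1 { result = mulmod(result, b) }
        b = mulmod(b, b)
        e >>= 1
    }
    return result
}

func inverse(_ a: Int64) -> Int64 { powmod(a, p - 2) }   // Fermat's little theorem

// Split `secret` into `n` shares; any `threshold` of them reconstruct it.
func split(secret: Int64, threshold: Int, n: Int) -> [(x: Int64, y: Int64)] {
    let coeffs = [secret] + (1..<threshold).map { _ in Int64.random(in: 1..<p) }
    return (1...n).map { i -> (x: Int64, y: Int64) in
        let x = Int64(i)
        let y = coeffs.reversed().reduce(Int64(0)) { (mulmod($0, x) + $1) % p }   // Horner's rule
        return (x: x, y: y)
    }
}

// Lagrange interpolation at x = 0 recovers the secret from `threshold` shares.
func reconstruct(from shares: [(x: Int64, y: Int64)]) -> Int64 {
    var secret: Int64 = 0
    for (j, sj) in shares.enumerated() {
        var term = sj.y
        for (m, sm) in shares.enumerated() where m != j {
            term = mulmod(term, mulmod(sm.x, inverse((sm.x - sj.x + p) % p)))
        }
        secret = (secret + term) % p
    }
    return secret
}

let shares = split(secret: 42, threshold: 30, n: 100)
print(reconstruct(from: Array(shares.prefix(30))))          // 42: threshold met
print(reconstruct(from: Array(shares.prefix(29))) == 42)    // false (with overwhelming probability)
```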

As for your paranoia comment, a colleague of mine once said that as a general rule the most important thing is not f(x) (how things are now) but f′(x) (how things are changing). Apple's previous advertising strongly suggested they would never even contemplate this kind of spyware, which amounts to search without probable cause. And yet here we are.

Again, it's not spyware, and "probable cause" has nothing to do with this topic since Apple is a private entity and you're voluntarily using their software, which is merely licensed to you (not owned by you).
 
Come on - a database is just a file. Sync it with the Chinese iCloud, done. People who don't even have a basic understanding of how things work should be protected from companies like Apple.

Who will surveil the surveillers?


From Apple:

"
The perceptual CSAM hash database is included, in an encrypted form, as part of the signed operating system. It is never downloaded or updated separately over the Internet or through any other mechanism. This claim is subject to code inspection by security researchers like all other iOS device-side security claims.
(...)
Since no remote updates of the database are possible, and since Apple distributes the same signed operating system image to all users worldwide, it is not possible – inadvertently or through coercion – for Apple to provide targeted users with a different CSAM database.
(...)
Apple will publish a Knowledge Base article containing a root hash of the encrypted CSAM hash database included with each version of every Apple operating system that supports the feature. Additionally, users will be able to inspect the root hash of the encrypted database present on their device, and compare it to the expected root hash in the Knowledge Base article. That the calculation of the root hash shown to the user in Settings is accurate is subject to code inspection by security researchers like all other iOS device-side security claims.
"

You can "surveil the surveillers" by applying to Apple's Security Research Program.
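For what it's worth, here is a minimal sketch of how a root hash can commit to an entire database, so that shipping a different database to some users (say, one country) would produce a different root. It assumes a standard Merkle-tree construction with made-up entries; Apple hasn't published the exact scheme it uses.

```swift
import Foundation
import CryptoKit

// Minimal Merkle-root sketch: the root commits to every entry, so swapping,
// adding or removing any entry changes the root. (Assumed construction;
// Apple has not published the exact one used for the CSAM hash database.)
func merkleRoot(of entries: [Data]) -> Data {
    precondition(!entries.isEmpty, "database must not be empty")
    var level = entries.map { Data(SHA256.hash(data: $0)) }
    while level.count > 1 {
        var next: [Data] = []
        for i in stride(from: 0, to: level.count, by: 2) {
            let left = level[i]
            let right = i + 1 < level.count ? level[i + 1] : left  // duplicate last node on odd levels
            next.append(Data(SHA256.hash(data: left + right)))
        }
        level = next
    }
    return level[0]
}

func hex(_ data: Data) -> String { data.map { String(format: "%02x", $0) }.joined() }

let worldwideDB = ["entry-A", "entry-B", "entry-C"].map { Data($0.utf8) }   // stand-in entries
let tamperedDB  = ["entry-A", "entry-B", "entry-X"].map { Data($0.utf8) }   // one entry swapped

// The device-computed root is what you would compare against the root Apple
// says it will publish in a Knowledge Base article; any tampering is visible.
print(hex(merkleRoot(of: worldwideDB)))
print(hex(merkleRoot(of: tamperedDB)))                              // differs from the line above
print(merkleRoot(of: worldwideDB) == merkleRoot(of: tamperedDB))    // false
```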
 
Apple is all for end-to-end encryption. They are pretty much telling us, especially with iCloud Private Relay in iOS 15, that no one is to be trusted. Your Internet traffic, DNS queries and everything else are now encrypted directly from your phone to a relay server at Apple, and to avoid sitting on any information while being pressured by law enforcement and whatnot, Apple makes sure that traffic is scrambled, re-encrypted and passed along to a Cloudflare relay server that handles the connection with whatever destination you are trying to reach.

The destination no longer has any clue who you are; it can only see references to the Cloudflare relay server. Apple has no idea what you are requesting, and Cloudflare has no idea who you are, as a result of how the traffic between the Apple and Cloudflare servers in the chain is handled. The only place that has any idea of the source of this traffic is the endpoint: your iOS, iPadOS or macOS device.

This is great privacy at play: making sure that no one in the entire chain of traffic is capable of decrypting or intercepting anything. Traffic is sent from device A to server B without anyone, be it your ISP, law enforcement, Apple themselves, Cloudflare as their relay partner, or server B that you are trying to reach, having any means of sniffing anything about you along the chain.
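As a rough sketch of that two-hop idea (my own toy example with made-up keys and identifiers, not Apple's actual protocol), layering the encryption is what keeps either relay from seeing both who you are and where you're going:

```swift
import Foundation
import CryptoKit

// Toy two-hop sketch of the idea behind iCloud Private Relay (not Apple's
// actual protocol): the ingress relay learns who you are but not where you
// are going; the egress relay learns the destination but not who you are.
struct InnerEnvelope: Codable {        // only the egress relay can open this
    let destination: String
    let payload: Data
}
struct OuterEnvelope: Codable {        // only the ingress relay can open this
    let clientID: String
    let sealedInner: Data              // opaque blob to the ingress relay
}

func demo() throws {
    let ingressKey = SymmetricKey(size: .bits256)   // shared: client + ingress relay
    let egressKey  = SymmetricKey(size: .bits256)   // shared: client + egress relay

    // Client: wrap the request in two layers of encryption.
    let inner = InnerEnvelope(destination: "https://example.com", payload: Data("GET /".utf8))
    let sealedInner = try AES.GCM.seal(JSONEncoder().encode(inner), using: egressKey).combined!
    let outer = OuterEnvelope(clientID: "device-123", sealedInner: sealedInner)
    let sealedOuter = try AES.GCM.seal(JSONEncoder().encode(outer), using: ingressKey).combined!

    // Ingress relay (Apple, in the real system): sees the client, forwards the inner blob unopened.
    let outerBox = try AES.GCM.SealedBox(combined: sealedOuter)
    let openedOuter = try JSONDecoder().decode(OuterEnvelope.self,
                                               from: AES.GCM.open(outerBox, using: ingressKey))
    print("ingress sees client:", openedOuter.clientID)          // but not the destination

    // Egress relay (a partner like Cloudflare): sees the destination, not the client.
    let innerBox = try AES.GCM.SealedBox(combined: openedOuter.sealedInner)
    let openedInner = try JSONDecoder().decode(InnerEnvelope.self,
                                               from: AES.GCM.open(innerBox, using: egressKey))
    print("egress sees destination:", openedInner.destination)
}

try? demo()
```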


This on-device CSAM scanning doesn't make much sense in light of this. The scanning happens on-device, pre-encryption. Once you start adding on-device features that sniff around, the user loses all sense of control. We have no tools or utilities that allow us to figure out what Apple is doing under the hood on iOS and iPadOS. The whole reason Apple designed iCloud Private Relay the way they did was to make sure that Apple themselves could not decrypt the data. It's not that they are telling us they can't be trusted; they know that if Apple were capable of decrypting it, some kind of law enforcement would at some point try to pressure them into doing so, even if they themselves don't want to. Once again, great design and truly an awesome privacy feature coming with iOS 15.

But at the same time, Apple is implementing an on-device scanner that scans for CSAM. So Apple designs iCloud Private Relay in such a way that law enforcement can't pressure them into handing over traffic that uses iCloud Private Relay, but at the same time they add an on-device scanner that scans content on our phones directly?

This doesn't make much sense to me. What stops law enforcement from pressuring Apple to start scanning for other kinds of content? It doesn't really matter if iCloud Private Relay makes sure my browsing traffic stays private, or if iMessage is end-to-end encrypted so no one along the chain can sniff my messages, if Apple adds on-device scanning that opens up the possibility of law enforcement pressuring Apple to simply scan for whatever they like directly on my device.


One could argue that there is a difference between law enforcement and the various third parties that compromise your privacy on the Internet. This on-device scanning won't allow my ISP, Facebook, Google or whoever to sniff my content; it will strictly be Apple dealing directly with law enforcement. But from a strict privacy point of view it's still intrusive, and considering the lengths Apple went to when designing iCloud Private Relay to be private all the way, this on-device scanning is completely out of place. It's as if the left hand isn't communicating with the right hand.


And the notion that you can't argue against anything done in the name of fighting child abuse content is just folly. Everything tends to start with a noble cause. No one is arguing against fighting CSAM as a concept; the issue is how this is being done and the can of worms it opens. Five years down the road, when China threatens to kick Apple out of China unless they scan against China's database of anti-CCP material, or Russia threatens to kick Apple out of Russia unless they scan against Russia's database of pro-LGBT material, or even when the US government starts threatening Apple with huge tax bills if they don't start scanning for ISIS propaganda, is when we start facing problems. You have to be extremely naive if you believe this won't ever become a problem.
 
(1) this implies it runs continuously. That is not true. It runs only on photos you upload to the iCloud service.
(2) this implies reports are automatic. That is not true. There's a manual review after the 30-match threshold is met.

You may not like the service. But this overly exaggerated framing of what Apple is doing does a disservice to people who have legitimate concerns about this technology.

I never implied those things. But fine, for the sake of clarity:

Yes, it only scans every version of every photo you upload to iCloud, which is only every photo you ever take on your Apple devices if you actually use iCloud. Rest assured that every photo you don’t take with an Apple device, or subsequently copy to an Apple device, will not be scanned!

Yes, Apple will manually review all reports of illegal activity. Just like they manually review all App Store submissions. And the App Store is flawless so we can surely presume the same level of accuracy for CSAM.
 
Well, I could've sworn that's what I had read from multiple sources, but other cloud services definitely have been. Even if Apple hasn't, they always could have if they wanted to. That's really my point - that these CSAM detection features in iOS 15 aren't any sort of game changer in terms of what Apple COULD do if they wanted to. In fact, the whole point of the new CSAM detection process is to be as non-invasive as possible whilst still being able to detect CSAM uploaded to iCloud.



Please explain exactly how iOS15 is being magically installed on your device without your knowledge (and forgetting you have auto-update on doesn't count, as that's something under your control). You are misusing that term to try to make things sound more dramatic than they are. Plain and simple. Stop (you and everyone else doing the same thing).

And for the millionth time, NOTHING IS BEING SCANNED on your phone if you turn off iCloud for photos, and even if you don't, no scanning information leaves your phone unless you're uploading illegal images to iCloud, and even THEN Apple can't decrypt that info unless the detected-CSAM threshold (30 images) is met.



Again, it's not spyware, and "probable cause" has nothing to do with this topic since Apple is a private entity and you're voluntarily using their software, which is merely licensed to you (not owned by you).


The major issue with iOS and iPadOS is the user's lack of control. We have no tools or utilities at our disposal to really verify anything. How do you, as an end user, verify whether your phone is using this on-device scanning or not? If you don't have the tools, you simply have to trust Apple blindly.

And things like this can still be considered spyware even though Apple is putting out a disclaimer about the change. If Apple put out a disclaimer telling us that from iOS 16 onward they'll be scanning all the content on our phones (passwords, private notes, you name it), it would still be spyware by any definition. You would know about it, and you would have to accept the new terms of service, but it would still be spying on and logging everything you have on your devices and everything you do. Simply disclosing it doesn't change that.

Don't get me wrong. I trust Apple, and I trust that if I disable iCloud Photos on iOS, iPadOS and macOS, this scanner will stay disabled. But the feature and the scanner itself still act as a kind of spyware that compromises my privacy.
 
There's a group of strangers out there that would just like to get your house keys to look at all your images anytime they like and compare them against a database you don't have control over, in order to make judgements about you. There's literally nothing that could go wrong here.
I don't think this analogy is quite right. It's more like you want to store your images in a stranger's house and they would like to compare them against a database before you do.
 
Running Copperhead or Graphene on it, absolutely agree.
Things I need to look into, apparently. It does look like it's going to be a challenge, since there is zero pushback on this out of Google; they may just be waiting to see if Apple is successful getting this onto iPhones…. If they are, it opens lots of doors for them as well, ushering in Big Brother in one giant swoop. I did trust Apple more than Google before this… not anymore, but that doesn't make me trust Google more now without a commitment that they won't do this. But I suppose Android can be made more secure than the others…. I'm not well versed in it, so I just need to study up… I've always been with Apple since the beginning.
 
No. They've been scanning without a warrant since 2019. The fact that you didn't know this says a lot about your stance on this CSAM issue.
Uh... have they published the fact that they've been scanning? Other than a few random articles on the internet, how exactly would a person know this?
 
Please explain how a publicly announced iOS update (including disclosure of the CSAM detection features) is "spyware". Last time I checked, spyware is malicious software installed without your knowledge or consent. None of that applies here. If you choose to upgrade to iOS 15 and be paranoid about that feature (for no rational reason that I've seen), that's on you.
Well, if, as some people suggest, they've been doing it since 2019 and they're just now telling us they're doing it, then they've been spying on us since 2019 without our consent. This is spyware.
 
Uh... have they published the fact that they've been scanning? Other than a few random articles on the internet, how exactly would a person know this?

The updated terms you agreed to when you installed an update. If that's too much legal mumbo jumbo, then you should have relied on those articles on the internet to tell you this.
 
I never implied those things. But fine, for the sake of clarity:

Yes, it only scans every version of every photo you upload to iCloud, which is only every photo you ever take on your Apple devices if you actually use iCloud. Rest assured that every photo you don’t take with an Apple device, or subsequently copy to an Apple device, will not be scanned!

Yes, Apple will manually review all reports of illegal activity. Just like they manually review all App Store submissions. And the App Store is flawless so we can surely presume the same level of accuracy for CSAM.


Not to mention that the major issue most of us have is not related to CSAM or how this works today. It's the precedent it sets and what this might make tomorrow look like.

If this somehow stays exactly the way it currently is in iOS 15 and remains CSAM-only forever, then sure, we don't have much of a problem. The problem is that history shows us that once you open that can of worms, your original idealistic and noble goal is not what this is going to be strictly used for X years down the line.
 