From the trial emails it's clear Apple believes they have a lot of CSAM in iCloud Photos and they don't want it there. So, starting with the premise that they are going to change something, they have two options: decrypt all the photos in iCloud and scan them there, or use the method they presented.

The method as presented is ironically more secure and privacy-oriented than scanning in the cloud. All the arguments against it are 'what ifs'. The negative press may push Apple off the new method, but make no mistake, they are going to do something.
They already *do* decrypt all the files in iCloud Photos and scan them there. iCloud Photos isn’t protected from Apple, unlike other iCloud data, which is part of why the on-device scanning (but only if you have iCloud Photos enabled) is so silly.
 
Perhaps they are being forced to implement this directly by a government organisation and Apple have little choice over the matter. Maybe this is even a compromise which Apple have negotiated behind closed doors as an alternative to something even more invasive. Who knows! If they don’t back down despite continued backlash and possibly loss of sales then it would seem reasonable to think that this is coming from outside of Apple.

This brings up a good question: where is the skin in the game for Apple in all of this? Why are they interested in CSAM at all? Is there some senior executive at Apple who has had personal negative experiences around child pornography and is pushing for this? I'm really curious to know whether any investigative journalism has been done on the history of Apple and CSAM.
 
From the trial emails it's clear Apple believes they have a lot of CSAM in iCloud Photos and they don't want it there. So, starting with the premise that they are going to change something, they have two options: decrypt all the photos in iCloud and scan them there, or use the method they presented.

The method as presented is ironically more secure and privacy-oriented than scanning in the cloud. All the arguments against it are 'what ifs'. The negative press may push Apple off the new method, but make no mistake, they are going to do something.
I'm ignorant of what these trial emails are referring to. Could you point me to some sources about this trial? I'm unaware of it.
 
[...] Apple has always scanned iCloud data for CSAM. It just never got much media attention because no one really cared. OK you’re catching bad guys so go ahead
Is that true? Could you point to a source. I was always under the impression that our own iCloud data is not accessible to anyone without the required electronic credentials.
 
[...]

I would be fine with iCloud scanning, but installing the capability to remotely scan my device is not acceptable. While Apple claims it will be used for CSAM, there’s nothing stopping them from using this capability to scan against a different database. Even the current CSAM database gets updated, so they could update it with pictures of certain dissidents in China. You might say they wouldn’t do that, but that’s where you’re wrong, because it’s Apple’s policy to comply with legal requests from the governments it does business with. They gave the Chinese government full control over the iCloud servers in China…

Seems true!

Also, while there are a lot of fingers pointing at China -- and rightfully so -- we can't ignore the fact that the same thing can and does happen in the US. We only need to think back to the previous presidency to see how insanely authoritarian the US can be; in fact, such types still hold congressional offices! So while we can point fingers at China, we can't ignore the fact that Apple, a US-based company, is also in dangerous territory when it comes to its own (our own) country.
 
1. It's great to see the EFF wasting 15 years of my recurring annual donations, which should go toward fighting actual legal cases on behalf of people who can't defend themselves, on hiring a Cessna and hauling a banner around all day
2. Given how many cars there are, it looks like all the EFF did was annoy the people who live around Apple Park, who instead had to deal with a plane flying overhead all day

This is giving me PETA vibes and I don't like it. They bullied Apple for years just to raise their own brand. EFF appears to be doing the same thing.
 
I don’t understand Apple on this one. They have been pitching themselves as the privacy-friendly company amongst the tech giants and still claim that as one of their advantages, yet they’re working on a feature that scans all your photos? It just doesn’t make sense to me.
Forced by the FBI, NSA, etc. Apple is doing this in return for a publicity pass from the government agencies regarding Apple supporting terrorism, porn, and child endangerment. Apple & the government want a back door that publicly does not appear to be a back door.
 
Promoting one's own moral vanity is much easier than forming an educated opinion. But I know a few of you out there actually care more about child safety. Here's the Rene Ritchie video that changed my mind:

I fully support Apple's technology and it cannot come soon enough. And it will.

That guy is explaining how he *thinks* the technology should work, but he does not understand that technology has unforeseen consequences or gets hacked many, many times every day. There are quite possibly seasoned hackers having a good laugh at him right now.

"Think of the children" is such a trope. On my favourite tech forum, they even have an icon of the Child Catcher from Chitty Chitty Bang Bang (Think of the children) that you can use to tag your posts. Because it's something that is used so often and so inappropriately.

No decent citizen wants to see CSAM shunted around the web. But there are ways and there are ways of detecting crime. This is the wrong way. Offenders don't just get this stuff out of the blue. They get on the dark web onto creepy forums, or they send their credit card details in a certain direction, which the British police infiltrate and use to catch them. It happens a lot, and I have no problem at all with the police hijacking those deviant services, or chasing down payments to miscreants.

But that shouldn't be happening on my phone, as a matter of course, without my permission. In a way that puts way too much trust in external agencies.
 
The decision to implement this was not made recently; it was already included as part of the iOS 14.3 code. It’s not “if” but “when” this will be activated, either publicly or privately.
The decision was finalized long before iOS 14 was released. It’s already out of Apple’s hands now, and authorities higher up have already given the green light for this project. The delay is just smoke and mirrors to trick Apple users into thinking they are influencing the outcome.
Apple cannot back out of this now even if they wanted to. For the first time, iOS users can opt out of a major release; this is unprecedented, and it is the only recourse Apple could provide its users to slow down this forced implementation.
Don’t be surprised if Apple tries to thwart this decision from on high and provide an iOS 14.x IPSW for the newly released iPhone 13 and 14 and provide two years of updates for iOS 14 alongside iOS 15 and 16.
20 years from now the non-disclosure agreements that Cook was forced to sign will reveal he was on our side from the beginning and was trying to stop this by any means possible.
 
I'm ignorant of what these trial emails are referring to. Could you point me to some sources about this trial? I'm unaware of it.

Is that true? Could you point to a source. I was always under the impression that our own iCloud data is not accessible to anyone without the required electronic credentials.


Ah yes, the bait-and-catch response: claim ignorance and ask for sources because you're too lazy to go look for them, and when the member does not produce those sources, you go and complain to the mods that the site rules have been broken.
 
it simply compares the hash of each photo against already identified hashes of indecent images of children!! It doesn't actually look for what it thinks are indecent images, just the hash!
It is scanning your photos for child porn - this "it's not scanning your photos, it just compares the hashes" claim is meaningless. They are using "perceptual hashes" - an image recognition/machine learning technique designed so that "visually similar" images produce the same hash, regardless of cropping, scaling, changes in image quality etc. (a simplified sketch of the idea is below). False positives are inevitable and the risk can only be measured experimentally - you really, really have to trust Apple on the reliability, especially when it comes to "we only send an alert after multiple matches" - which doesn't necessarily help, because your personal photo collection is not a random collection of images; most people will have loads of "visually similar" photos of children/pets/homes/possessions.
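To make that concrete, here is a minimal, hypothetical sketch of one simple kind of perceptual hash (a "dHash") - not Apple's actual NeuralHash - assuming the Pillow library is installed; the filenames are made up:

from PIL import Image

def dhash(path, size=8):
    # Shrink to a (size+1) x size greyscale thumbnail, then record whether each
    # pixel is brighter than its right-hand neighbour -> a 64-bit fingerprint.
    img = Image.open(path).convert("L").resize((size + 1, size), Image.LANCZOS)
    px = list(img.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            left = px[row * (size + 1) + col]
            right = px[row * (size + 1) + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    # Number of differing bits between two fingerprints.
    return bin(a ^ b).count("1")

# Re-saving, rescaling or lightly cropping "cat.jpg" usually flips only a few
# of the 64 bits, so a small Hamming distance counts as a "match" - which is
# also exactly where false positives on merely similar photos come from.
# print(hamming(dhash("cat.jpg"), dhash("cat_rescaled.jpg")))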

It is highly likely that any software that can identify your friends' faces and pets in your photos is also "just comparing hashes".

These are very different from "cryptographic hashes" - the sort of thing you'd use for checking the authenticity of a downloaded file. Those check that files are identical to the original, and are designed so that a tiny change to the file will change the hash, with a negligible (and mathematically computable) risk of false positives. They'd be useless for detecting known porn images, because cropping, rescaling or tweaking the image, even slightly, would lead to a different hash.
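For contrast, here is what the cryptographic kind does, using SHA-256 from Python's standard library; the byte strings are just stand-ins for two nearly identical files:

import hashlib

original = b"the same photo, byte for byte"
tweaked  = b"the same photo, byte for byte!"   # one extra byte at the end

# Even that one-byte change produces a digest that bears no resemblance to the
# first, which is why cryptographic hashes only match bit-for-bit identical files.
print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(tweaked).hexdigest())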

The only mitigation is that, Apple have said, the scanning will only be done on images that you are about to upload to iCloud Photos - and having unencrypted photos stored on someone else's server scanned for known illegal images is probably unavoidable, so we're really talking about the "line in the sand" crossed by doing the scanning on the user's phone instead of on the server.
 
Ah yes, the bait-and-catch response: claim ignorance and ask for sources because you're too lazy to go look for them, and when the member does not produce those sources, you go and complain to the mods that the site rules have been broken.
Er....ok. I wasn't picking a fight at all. Sorry you read it that way.

I am ignorant of these trial emails. I'm assuming court trials?
 
The problem is that this same technology could be used for different purposes, which are not that honorable as trying to catch pedophiles.
That's a weak argument. I can use my iPhone to take pictures of my pet gecko. Someone else can use it to produce CSAM. The same technology, different uses. Should we not have cameras or the internet or drugs or cars or anything because they can be abused? Do we take down the internet because foreign governments use it to oppress their people?

I'm not saying those are all equivalent to Apple's plan to scan for known CSAM. All I'm saying is that is a poor argument against Apple's plan.
 
Not to "whataboutism" this, but did the EFF protest like this when Google started doing it?

That would be like Greenpeace protesting about whaling ships using disposable plastic harpoons.

Pretty sure that the EFF would like to see Google - whose entire business model is based on selling your personal data to the highest bidder - wiped from the face of the Earth. Apple, OTOH, have previously made a big noise about how much they care about privacy, so they're being called out for hypocrisy.
 
Er....ok. I wasn't picking a fight at all. Sorry you read it that way.

I am ignorant of these trial emails. I'm assuming court trials?
Then stop saying 'show me sources' and instead say something like 'I do not know about the trials you refer to; can you explain them to me or point me in the direction where I can read about them for myself?' Just typing 'show me sources' is very confrontational.
 
What part of this is NOT a good idea!!! You do realise, don't you, that your own precious cat, dog or family photo will not actually be looked at!!! - it simply compares the hash of each photo against already identified hashes of indecent images of children!! It doesn't actually look for what it thinks are indecent images, just the hash!
Change the metadata and the hashing will mean nothing. There's an app for that.
 
Meanwhile Google et al are glad Apple has drawn the attention away from what they are doing with your images stored on their services.
We remember! However, what they do isn't nearly as bad as what Apple is proposing. (for now)
 