> From the trial emails it's clear Apple believes they have a lot of CSAM in iCloud Photos and they don't want it there. So, starting with this premise that they are going to change something, they have two options: decrypt all the photos in iCloud and scan them there, or use the method they presented.
> The method as presented is ironically more secure and privacy oriented than scanning in the cloud. All the arguments against are 'what if'. The negative press may push Apple off the new method, but make no mistake, they are going to do something.

They already *do* decrypt all the files in iCloud Photos and scan them there. iCloud Photos isn't protected from Apple, unlike other iCloud data; that's part of why the on-device scanning (which only runs if you have iCloud Photos enabled) is so silly.
Perhaps they are being forced to implement this directly by a government organisation and Apple have little choice in the matter. Maybe this is even a compromise which Apple have negotiated behind closed doors as an alternative to something even more invasive. Who knows! If they don't back down despite continued backlash and possible loss of sales, then it would seem reasonable to think that this is coming from outside of Apple.
> From the trial emails it's clear Apple believes they have a lot of CSAM in iCloud Photos and they don't want it there. So, starting with this premise that they are going to change something, they have two options: decrypt all the photos in iCloud and scan them there, or use the method they presented.
> The method as presented is ironically more secure and privacy oriented than scanning in the cloud. All the arguments against are 'what if'. The negative press may push Apple off the new method, but make no mistake, they are going to do something.

I'm ignorant of what these trial emails are referring to. Could you point me to some sources about this trial? I'm unaware of it.
> [...] Apple has always scanned iCloud data for CSAM. It just never got much media attention because no one really cared. OK, you're catching bad guys, so go ahead [...]

Is that true? Could you point to a source? I was always under the impression that our own iCloud data is not accessible to anyone without the required electronic credentials.
I would be fine with iCloud scanning, but installing the capability to remotely scan my device is not acceptable. While Apple claims it will be used only for CSAM, there's nothing stopping them from using this capability to scan against a different database. Even the current CSAM database gets updated, so they could update it with pictures of certain dissidents in China. You might say they wouldn't do that, but that's where you're wrong, because it's Apple's policy to comply with legal requests from governments they do business in. They gave the Chinese government full control over the iCloud servers in China…
> I don't understand Apple on this one. They have been pitching themselves as the privacy-friendly company amongst tech giants and still claim that as one of their advantages, yet they're working on a feature that scans all your photos? It just doesn't make sense to me.

Forced by the FBI, NSA, etc. Apple is doing this in return for a publicity pass from the government agencies regarding Apple supporting terrorism, porn, and child endangerment. Apple and the government want a back door that publicly does not appear to be a back door.
Promoting one's own moral vanity is much easier than forming an educated opinion. But I know a few of you out there actually care more about child safety. Here's the Rene Ritchie video that changed my mind:
I fully support Apple's technology and it cannot come soon enough. And it will.
> It simply compares the hash for the photos against already identified hashes of indecent images of children!! It doesn't actually look for what it thinks are indecent images, just the hash!

It is scanning your photos for child porn - this "it's not scanning your photos, it just compares the hashes" claim is meaningless. They are using "perceptual hashes" - an image-recognition/machine-learning technique designed so that "visually similar" images will produce the same hash, regardless of cropping, scaling, changes in image quality, etc. False positives are inevitable, and the risk can only be measured experimentally - you really, really have to trust Apple on the reliability, especially when it comes to "we only send an alert after multiple matches" - which doesn't necessarily help, because your personal photo collection is not a random collection of images; most people will have loads of "visually similar" images of children/pets/homes/possessions.
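To make the "perceptual" part concrete: Apple's NeuralHash is a neural-network-based perceptual hash, but a much simpler scheme, a difference hash (dHash), illustrates the same mechanics in a few lines. This is a minimal illustrative sketch, not Apple's algorithm; the function names, the 64-bit size, and the distance/threshold values below are arbitrary assumptions of mine:

```python
from PIL import Image

def dhash(image_path: str, hash_size: int = 8) -> int:
    """Difference hash: shrink to a tiny grayscale thumbnail, then record
    whether each pixel is brighter than its right-hand neighbour."""
    img = (Image.open(image_path)
           .convert("L")  # grayscale
           .resize((hash_size + 1, hash_size), Image.LANCZOS))
    px = list(img.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            i = row * (hash_size + 1) + col
            bits = (bits << 1) | (px[i] > px[i + 1])
    return bits  # 64-bit fingerprint for hash_size=8

def hamming(a: int, b: int) -> int:
    """Bits that differ between two fingerprints; small means 'similar'."""
    return bin(a ^ b).count("1")

def account_flagged(photo_hashes, database_hashes,
                    max_distance: int = 4, threshold: int = 3) -> bool:
    """Mimics the 'alert only after multiple matches' idea: count photos
    whose fingerprint lands near any database entry, then compare the
    count against a threshold. All numbers here are made up."""
    matches = sum(
        any(hamming(h, d) <= max_distance for d in database_hashes)
        for h in photo_hashes
    )
    return matches >= threshold
```

A recompressed, resized, or slightly cropped copy of the same image typically lands within a few bits of the original - that's the point of the technique. But two merely similar-looking photos can land just as close, which is exactly where the false-positive worry above comes from.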
> Ah yes, the bait-and-catch response: claim ignorance and ask for sources, but because you're too lazy to go look for them, when the member does not produce those sources, you go and complain to the mods that the site rules have been broken.

Er... OK. I wasn't picking a fight at all. Sorry you read it that way.
> I'm ignorant of what these trial emails are referring to. Could you point me to some sources about this trial? I'm unaware of it.

Apple exec said iCloud was the 'greatest platform' for CSAM distribution | AppleInsider
> I wonder how many people were actually in the building to see that.

I don't think that matters; I'm sure all the important people have seen it just like we have. Sometimes, to protest something, you have to keep stuffing it in their face, and the press obliges.
> The problem is that this same technology could be used for different purposes, which are not as honorable as trying to catch pedophiles.

That's a weak argument. I can use my iPhone to take pictures of my pet gecko. Someone else can use it to produce CSAM. The same technology, different uses. Should we not have cameras or the internet or drugs or cars or anything because they can be abused? Do we take down the internet because foreign governments use it to oppress their people?
Thank you! I didn't know "trials" was referring to the ongoing battle with Epic Games etc.
Not to "whataboutism" this, but did the EFF protest like this when Google started doing it?
> Er... OK. I wasn't picking a fight at all. Sorry you read it that way.

Then stop saying 'show me sources' and instead say something like 'I do not know about the trials you refer to; can you explain them to me or point me in the direction where I can read them for myself?' Just typing 'show me sources' is very confrontational.
I am ignorant of these trial emails. I'm assuming court trials?
> What part of this is NOT a good idea!!! You do realise, don't you, that your own precious cat, dog or family photo will not actually be looked at!!! It simply compares the hash for the photos against already identified hashes of indecent images of children!! It doesn't actually look for what it thinks are indecent images, just the hash!

Change the metadata and hashing will mean nothing. There's an app for that.
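Worth being precise about which kind of hash "change the metadata" actually defeats: only one computed over the raw file bytes. A hash over the decoded pixels (let alone a perceptual hash like the one sketched earlier in the thread) never sees the metadata at all. A small Python sketch of the difference - the function names here are mine, not from any Apple API:

```python
import hashlib
from PIL import Image

def byte_hash(path: str) -> str:
    """Cryptographic hash over the raw file bytes: editing a single EXIF
    field produces a completely different digest."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def pixel_hash(path: str) -> str:
    """Hash over the decoded pixel data only: metadata edits don't touch
    the pixels, so this digest is unchanged (a perceptual hash is more
    tolerant still, surviving recompression and resizing)."""
    img = Image.open(path).convert("RGB")
    return hashlib.sha256(img.tobytes()).hexdigest()
```

Editing an EXIF tag in place changes byte_hash but leaves pixel_hash identical; defeating a perceptual hash takes pixel-level changes large enough to move the fingerprint, which is the actual cat-and-mouse game here.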
> Meanwhile Google et al are glad Apple has drawn the attention away from what they are doing with your images stored on their services.

We remember! However, what they do isn't nearly as bad as what Apple is proposing. (For now.)