In addition, once this launches I expect a lawsuit challenging the legality of CSAM-scanning software being installed on phones. It also would not be surprising to see a class action lawsuit seeking compensation from Apple for users who did not sign up to have this software installed on their devices. It will be interesting to see how it plays out.
This will not help or in any way affect Apple users outside the US. There's a big world "out there" ;-)
 
Help me understand, because I clearly don’t get why everyone is so angry about this

If I offered an app with the ability to upload images or whatever, and I’m legally and morally compelled to prevent child abuse images, why would I not want to use a free service from Apple that does this in a way that maximizes privacy?

I get that no corporation can be trusted. Facebook, Boeing, every single Wall Street firm, et al… all scum. I get the slippery slope arguments. But even if you don’t trust Apple’s leadership and their intentions, trust how they make money. They sell premium devices which will expand more and more into health and financial transactions. There is a lot more money for Apple there than in selling you a new iPhone or Mac every 3 years. And this strategy needs rock-solid security and privacy. Plus, they need consumer trust. (You would be nuts to give your medical info to Facebook, Google, Amazon, or Verizon.)

I think the truth is far more simple and benign:

1. Apple wants end-to-end encryption on iCloud. They haven’t done it because people lock themselves out of their accounts, losing access to a lifetime of memories and critical data, and because law enforcement bullies tech companies.
2. So Apple solves these problems on their own terms. At WWDC we saw the ability to designate a person who has access to your iCloud account. Yes, it was framed as something for after your death, but it’s a safety mechanism for any scenario where someone can’t get access. And they decided to focus on child safety because it‘s the law, it’s morally correct, and because this is an area where they can have the most impact.

Facebook, Google, Yahoo, Adobe, Amazon, you name it, all scan your uploads for child porn. All of them. Again, it’s the law.
I was also assuming that this is part of a move to client-side encryption; otherwise they could just scan in the cloud.
 
I think that is debatable since Apple owns the software and not the phone.
Yes, and because it's debatable it's a judge who will rule, and I really don't see a judge blocking a feature that will stop some pedophiles (judges aren't IT experts, so they won't see any problem with this feature).
That's why I think it will fail.
 
Yes, and because it's debatable it's a judge who will rule, and I really don't see a judge blocking a feature that will stop some pedophiles (judges aren't IT experts, so they won't see any problem with this feature).
That's why I think it will fail.
A judge will stop it if it violates users' Fourth Amendment rights against unreasonable search and seizure.
 
Quoted directly from Apple:

Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.

Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image.

Sounds like scanning to me.
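Stripped of the cryptography, the flow Apple describes boils down to something like the sketch below. To be clear, this is my own rough illustration, not Apple's implementation: the real system uses a perceptual NeuralHash and private set intersection rather than a plain SHA-256 lookup, and the on-device hash database is blinded so the device can't read it.

```swift
import Foundation
import CryptoKit  // SHA-256 here is only a stand-in for Apple's perceptual NeuralHash

// Hypothetical on-device copy of the known-hash database (in the real
// system this is transformed into an unreadable, blinded set of hashes).
let knownHashes: Set<String> = [
    "placeholder-hash-1",
    "placeholder-hash-2",
]

// Fingerprint the image bytes. Only a fixed-length digest comes out of
// this function, never the image content itself.
func fingerprint(_ imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

// Run the check while preparing an iCloud Photos upload. With private set
// intersection, even this boolean would be hidden inside an encrypted
// safety voucher instead of being visible to the device or to Apple.
func matchesKnownDatabase(_ imageData: Data) -> Bool {
    knownHashes.contains(fingerprint(imageData))
}
```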
It is not scanning, if by "scanning" you mean "evaluating the content of your photo." Apple is comparing the digital fingerprint of the photo to a list of digital fingerprints from the CSAM database. The digital fingerprint is a hash, which means that even if Apple or a third party gets its hands on it, it's useless for recreating the actual photo. And it's done on the phone and not on Apple's servers, so they don't even have it to begin with.

Unless you think they would get them from your phone behind your back, in which case none of this is new anyway. :)
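For what it's worth, here is a tiny sketch of the "fingerprint" idea using SHA-256 as a stand-in (Apple's NeuralHash is a perceptual hash, deliberately tolerant of small edits, but it is likewise not designed to be reversible into the original image): change a single byte and the digest looks completely unrelated, and neither digest can be run backwards into the picture.

```swift
import Foundation
import CryptoKit

// Two byte buffers standing in for images that differ by a single byte.
let original = Data(repeating: 0xAB, count: 1_000)
var edited = original
edited[0] = 0xAC

// Hex-encode a digest so it can be compared by eye.
func hex<S: Sequence>(_ digest: S) -> String where S.Element == UInt8 {
    digest.map { String(format: "%02x", $0) }.joined()
}

// The two digests share no visible structure, and neither can be reversed
// to recover the input bytes, which is why a leaked list of hashes is
// useless for reconstructing anyone's photos.
print(hex(SHA256.hash(data: original)))
print(hex(SHA256.hash(data: edited)))
```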
 
Actually, that's EXACTLY what it does. As pointed out below, they scan your photos on-device, before they go into the cloud, to "protect your privacy." Apple doesn't give a **** if you have CSAM images on your phone (which is why this doesn't work if you keep iCloud Photos off); they do, however, very much care if you store CSAM images on their servers, i.e. iCloud Photos. Thus, they are LITERALLY scanning your photos. Please properly inform yourself.
I think we have to be clear about what everybody means by "scanning your photos." To one person it might mean "having AI evaluate your photo content and see what's in it," which would be awful but is not what the system does. To another it might be what it is: a comparison of a digital hash of the photo (from which no one could recreate the photo itself or derive what its content might be by looking for patterns) against a list of digital hashes of known CSAM.

To me, what is happening is not "scanning photos" the way I think most people consider the term.
 
True, I just hope Apple will release end-to-end, client-side encryption for iCloud Photos (which would explain why this hash scanning is done client-side) before a judge has to say anything.

Indeed. Though I have to question the value of staying so dogmatic about having E2E, since they've essentially created a workaround making their E2E encryption less... protected? (for lack of a better word)
 
I think we have to be clear about what everybody means by "scanning your photos." To one person it might mean "having AI evaluate your photo content and see what's in it," which would be awful but is not what the system does. To another it might be what it is: a comparison of a digital hash of the photo (from which no one could recreate the photo itself or derive what its content might be by looking for patterns) against a list of digital hashes of known CSAM.

To me, what is happening is not "scanning photos" the way I think most people consider the term.
Regardless, to me it is word games to get around the fact that your device is being surveilled by software Apple installed on your phone, even if it is just comparing your photos' hashes to a list. I would not have a problem with this if the software were kept and run on the iCloud side, like other cloud providers do, but I take issue with it being done directly on a device that I paid for.
 
Indeed. Though I have to question the value of staying so dogmatic about having E2E, since they've essentially created a workaround making their E2E encryption less... protected? (for lack of a better word)
True, but having 99 photos encrypted and one false positive (if we accept one false positive) is better than having 100 photos readable by Apple on their servers.
 
Regardless, to me it is word games to get around the fact that your device is being surveilled by software Apple installed on your phone, even if it is just comparing your photos' hashes to a list. I would not have a problem with this if the software were kept and run on the iCloud side, like other cloud providers do, but I take issue with it being done directly on a device that I paid for.
But if it's the first step towards encrypting all your content on their servers, isn't that better?
Personally, I would prefer a scan on my smartphone (which will come back negative) with all my pictures inaccessible to anyone (a CSAM-only scan), over what we have now: all photos unencrypted on their servers.
 
Regardless, to me it is word games to get around the fact that your device is being surveilled by software Apple installed on your phone, even if it is just comparing your photos' hashes to a list. I would not have a problem with this if the software were kept and run on the iCloud side, like other cloud providers do, but I take issue with it being done directly on a device that I paid for.
This for me as well.

I can understand why some people are ok with it. I understand why Apple is trying to do it this way (they can continue to claim they can't access your information). But the trade-off isn't worth it to me. It might sound cliche, but it's my device, and they're now going to use my device as a tool for the NCMEC. A worthy cause, for sure. But I didn't sign up to use my device towards that end.
 
A judge will stop it if it violates users' Fourth Amendment rights against unreasonable search and seizure.
Again: this will only affect US citizens physically present inside US borders. There's a big world "out there," and no US judge will have any say in affairs affecting more than 95% of the inhabitants of this world, or any US citizens outside US borders.
 
This for me as well.

I can understand why some people are ok with it. I understand why Apple is trying to do it this way (they can continue to claim they can't access your information). But the trade-off isn't worth it to me. It might sound cliche, but it's my device, and they're now going to use my device as a tool for the NCMEC. A worthy cause, for sure. But I didn't sign up to use my device towards that end.
Yes, you did sign up: when you checked the box saying "I have read the terms of service and privacy policy."
 
True, but having 99 photos encrypted and one false positive (if we accept one false positive) is better than having 100 photos readable by Apple on their servers.

We might agree to disagree on that. I'm not saying it's right, but we already give up a level of privacy when we upload to another company's servers. I'm ok with that for backup/organization reasons. I trust it won't be abused - and if it is, there will be a mass exodus.

I am not ok with any false positive in the realm of CSAM - even if it takes multiple to get me reviewed by a human at Apple (which is a whole other can of worms).
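For readers wondering what the "it takes multiple" part refers to: as I understand it, that is just a threshold on the number of matched vouchers before anything is surfaced for human review. A toy sketch of the idea follows; the structure and the threshold value are my own invention, not Apple's published design.

```swift
// Toy sketch, not Apple's code: a single matched hash never reaches a
// human reviewer on its own; the account is only escalated once the
// number of matches crosses a threshold.
struct AccountMatchState {
    var matchedCount = 0
    let threshold: Int  // hypothetical value; Apple has not published the real one

    mutating func record(matched: Bool) {
        if matched { matchedCount += 1 }
    }

    var needsHumanReview: Bool { matchedCount >= threshold }
}

var state = AccountMatchState(threshold: 10)
state.record(matched: true)       // one false positive on its own...
print(state.needsHumanReview)     // ...does not trigger review: prints "false"
```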
 
In addition, once this launches I expect a lawsuit challenging the legality of CSAM-scanning software being installed on phones. It also would not be surprising to see a class action lawsuit seeking compensation from Apple for users who did not sign up to have this software installed on their devices. It will be interesting to see how it plays out.
Why would it be illegal to integrate checks into the process of uploading photos to Apple’s servers, especially if they’re targeted at illegal content and done in a privacy-friendly way where no data leaves your device?
 
Again: this will only affect US citizens physically present inside US borders. There's a big world "out there," and no US judge will have any say in affairs affecting more than 95% of the inhabitants of this world, or any US citizens outside US borders.
And I don't really see any government that uses child pornography as an excuse for its mass surveillance programs going against this feature.
 
Apple: as a very long-time Apple user, I'm telling you that you need to realise when you've come up with a really bad idea, and it's no good trying to rope others into it or extend it; that will not make it a good idea.
 
Yes, you did sign up: when you checked the box saying "I have read the terms of service and privacy policy."
Just because it is in the terms of service does not make it legal.
Why would it be illegal to integrate checks into the process of uploading photos to Apple’s servers, especially if they’re targeted at illegal content and done in a privacy-friendly way where no data leaves your device?
Apple does not own the phone; that is the problem.
 