For the server to do it, your photos would have to be unencrypted (with respect to Apple) on the server. This is currently the case. The reason Apple proposes this new technique is so they can encrypt your uploads and not hold the key.
It's far better for privacy if Apple cannot look at the photos you upload.
What ultimate benefit is there to my privacy if Apple encrypts my uploads without having the key, when they're specifically engineering a system that will surveil camera rolls for potentially criminal content BEFORE UPLOADING THEM, and then forward photos from someone's camera roll to apple and then ultimately to law enforcement?

why the **** should I care if my data is encrypted AFTER they've scanned it?

Also can you please show me where Apple ever said that they plan to implement end-to-end encryption for iCloud Photos, and that on-device scanning for CSAM was a requirement before doing so? Because I keep seeing people talk about that as if it's something Apple has announced, and I'm pretty sure that's a complete fabrication.
 
It's not scanning, because it has already identified the picture that is being uploaded. Therefore, it is not introducing a scanning functionality. In fact, it is not scanning AT ALL. iCloud already did the scan to identify which picture to upload; Apple is just calculating a hash during that process.

There is essentially no scanning, no matter how you look at it. iCloud already "scanned" for the image to be uploaded. I guess you want Apple to completely drop iCloud then since it scans for things to upload.
That is not what Apple said. You should go and re-read the process they were proposing. Regardless, you are missing the point. I don't really care about the technicalities, no matter how great Apple or anyone else says they may be. The point is WHY Apple is policing their customers in the first place. Do you expect other companies that make devices to follow suit? Should we put cameras in everyone's bedrooms with this same tech to make sure no child porn is happening there? Surely that wouldn't be hacked. I pay the police/FBI and other agencies to find these lowlifes through my taxes. I am sure all these stupid scanners accomplish is making it harder for law enforcement, because they drive these scumbags further underground.
 
For the server to do it, your photos would have to be unencrypted (with respect to Apple) on the server. This is currently the case. The reason Apple proposes this new technique is so they can encrypt your uploads and not hold the key.
It's far better for privacy if Apple cannot look at the photos you upload.
When did Apple propose this? I have not heard one thing about them planning to encrypt uploads. I've got a better idea. Because Apple is a company and not the police, why don't they encrypt my uploads to iCloud and not scan my stuff at all. Brilliant I know. Imagine... a company that understands that they should be catering to their customers because that's where their money comes from.
 
It is unbelievable to me how much money and mental energy people are spending against what I consider to be a non-issue. All kinds of existing technology can be abused, yet we don't campaign for it to be eliminated. And it still seems that a huge number of people don't even understand that 1. the scanning would only be active if you enable iCloud for Photos (or elect not to turn it off, as the case may be), and 2. Apple wouldn't be able to see anything on your phone with the scanning process. The only time any scanning information gets exported from your phone is if you upload an illegal image to iCloud, and even then Apple can't decrypt that until there are 30+ illegal images uploaded.

sure, close your eyes and do not understand the issue (yourself).

 
So cloud scanning is still OK according to the EFF?

You'd think they wouldn't like that either!

Scanning is scanning.

Whether your photos are on your device... or on a server somewhere... it's still scanning.

Device Scanning: "Don't violate my privacy!"
Cloud Scanning: "Come on in! Look at all my photos!"

:oops:
You don't have to upload your pictures to the cloud. Scanning on the phone is not optional, and it's intrusive.
 
sure, close your eyes and do not understand the issue (yourself).


Nothing new there. I've dealt with all this before on this forum in various threads on this topic. First of all, you can't argue against hypothetical conspiracy theories - people will cling to them to their dying day. They're not based on facts. I simply put ZERO stock in them and see absolutely no reason why Apple's proposed CSAM scanning method would lead to the horrible things they're "warning" about. Apple isn't handing the "keys to the kingdom" to any third party - they still have as much control over the system as they want, and I see no evidence that they're all of a sudden planning on going rogue and turning into some evil company in cahoots with authoritarian governments.

The letter also mischaracterizes the CSAM detection process. Apple has never stated that the system automatically notifies law enforcement of potential CSAM. Once the threshold is reached, the photos in question are manually reviewed to verify they are actually CSAM before being reported to NCMEC, who in turn notifies law enforcement. So there is absolutely no reasonable chance that someone will be reported to the authorities for innocent images (and Apple has stated that the chance of an account even getting to the point of suspension for 30+ LEGAL images (i.e. false positives) is less than 1 in 1 trillion).

The letter also fails to mention that users have the ability to disable any scanning by simply disabling iCloud for Photos. So if someone is paranoid and thinks some government agency is after them for an anti-Biden meme image on their phone, they can simply not use iCloud to upload their photos. And they shouldn't use any other cloud service either, because those services scan their photos too (and not privately, as Apple is proposing), so they also have the technical potential to be in cahoots with abusive governments.

Look, I'm glad there are people and organizations concerned about security, but in this case I believe their concern lacks substance and is based on hypotheticals/slippery-slope fallacies.
 
For the server to do it, your photos would have to be unencrypted (with respect to Apple) on the server. This is currently the case. The reason Apple proposes this new technique is so they can encrypt your uploads and not hold the key.
It's far better for privacy if Apple cannot look at the photos you upload.
Unencrypted on the server is fine by me. I don't care if Apple can look at the photos, it's their servers after all.

It's definitely not that I have photos to hide, it's the law enforcement on my own phone that bothers me.
 
That is why I will not upgrade. We have already seen a lower adoption rate for iOS 15, and guess why. And I hope it's not just because a button has been moved or because people are no longer forced to upgrade. Of course, they could also inject this into iOS 14 at any time, but this is about making a statement to Apple.
I love my Apple devices, but Apple has grown seriously unappealing to me in recent months. Their culture of ignoring customers and their handling of mistakes is very bad for the brand. Why are they doing this?

Very frustrating, and it sucks for people who have been sucked into this ecosystem and have so much invested in it. They know there just aren’t many options out there for those who have enjoyed all Apple has had to offer, and the cohesiveness of everything working together.

I understand it’s a “first world” problem, but life would be a lot different without Apple products for me, although this has certainly had me thinking of ways I could do without them.

It all sounds fantastic when it’s about protecting children, but that is absolutely not where it will end.
 
For the server to do it, your photos would have to be unencrypted (with respect to Apple) on the server. This is currently the case. The reason Apple proposes this new technique is so they can encrypt your uploads and not hold the key.
It's far better for privacy if Apple cannot look at the photos you upload.
Prove this. I can find no evidence of this.
 
You probably should read what this scanning (and yes, that's the proper word) does. There's a task on your device that computes a hash from the picture prior to it being uploaded (hence the term scanning: you can't compute a hash value without scanning the picture). It uses a neural-hash algorithm to describe the picture, then compares it with an on-device database, and if it finds a match, it flags it, then does the upload to iCloud, which does not do any scanning. Then, once you have 30 of these flagged images, it makes up a packet of low-res copies of your images and the hashes they matched, and sends them on for Apple to check manually. If Apple agrees, then they send it to the CSAM people. And that's all according to Apple themselves!

That software that does the neural hash scanning and the database being on our devices is what most of us are objecting to vehemently. If the server was really doing the scanning and comparing as you suggest, I wouldn't have a problem with it.
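The on-device flow described above can be sketched roughly like this. To be clear, this is NOT Apple's actual NeuralHash/PSI implementation: the hash function, the database lookup, and the voucher mechanics are illustrative stand-ins; only the 30-match threshold and the overall shape (hash, compare, flag, review only past the threshold) come from Apple's own description.

```python
# Illustrative sketch only; names and hash function are stand-ins,
# NOT Apple's real NeuralHash or private set intersection protocol.

MATCH_THRESHOLD = 30  # per Apple, human review happens only after 30+ matches


def perceptual_hash(image_bytes: bytes) -> int:
    """Stand-in for a neural perceptual hash of an image (not content-aware)."""
    return hash(image_bytes) & 0xFFFFFFFF


def flag_matches(upload_queue, known_hashes):
    """Hash each image already queued for iCloud upload; flag database hits."""
    flagged = []
    for image in upload_queue:
        h = perceptual_hash(image)
        if h in known_hashes:
            flagged.append((image, h))  # roughly, a "safety voucher"
    return flagged


def ready_for_review(flagged) -> bool:
    """Vouchers become reviewable only once the threshold is crossed."""
    return len(flagged) >= MATCH_THRESHOLD
```

Note that in this sketch, as in Apple's description, only images already in the upload queue are hashed, and nothing leaves the device until the threshold is reached.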
It is not scanning. It already found the picture before this process even started. It's just a hash function on a found file on your phone. NO scanning AT ALL. It's not looking at your Outlook files, not looking at your downloaded music. In no way is it scanning.

1. iCloud identifies which pictures are to be uploaded to Apple's servers <--- if you want to be technical, THIS is the scanning process right here. But CSAM or not, that scanning is already being done.
2. Pictures are identified.
3. A hash is generated for the identified images - if you are a developer, think of all the identified pictures in a list and this is just doing a foreach. As in, the collection was already determined in step 1.
4. When an image is uploaded, the hash results are also uploaded.

I looked at the technical document and watched dozens of videos and dozens of podcasts on this issue. There is no scanning implemented with this. The images are already being uploaded to iCloud; there is nothing for this new feature to scan.
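The "foreach over an already-determined collection" point in step 3 can be made concrete with a short sketch (hypothetical names; SHA-256 stands in for the neural hash purely for illustration):

```python
import hashlib


def hash_upload_set(upload_set: dict) -> dict:
    """Hash only the images iCloud has already identified for upload.

    Nothing outside `upload_set` (mail, music, other files on disk) is
    ever read. That is the point being argued here: no new filesystem-wide
    scan is introduced, just one hash per already-queued image.
    """
    return {name: hashlib.sha256(data).hexdigest()
            for name, data in upload_set.items()}
```

Whether that narrow scope still counts as "scanning" is, of course, exactly what the rest of this thread argues about.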
 
That is not what Apple said. You should go and re-read the process they were proposing. Regardless, you are missing the point. I don't really care about the technicalities, no matter how great Apple or anyone else says they may be. The point is WHY Apple is policing their customers in the first place. Do you expect other companies that make devices to follow suit? Should we put cameras in everyone's bedrooms with this same tech to make sure no child porn is happening there? Surely that wouldn't be hacked. I pay the police/FBI and other agencies to find these lowlifes through my taxes. I am sure all these stupid scanners accomplish is making it harder for law enforcement, because they drive these scumbags further underground.
I have read the technical document and watched dozens of podcasts and videos on this topic. The images are ALREADY SCANNED as part of the iCloud process. This just hashes the identified images. No scanning. It's not looking at your Outlook data, your downloaded music, or other files on your SSD. Only items already in the process of being uploaded - which in fact were "scanned" in the first place!

People are throwing around the term "scanning" as if you downloaded something off Safari and suddenly that gets flagged. It is NOT scanning. iCloud identifies what images need to be uploaded, then it simply gets a hash on those images.
 
It is not scanning. It already found the picture before this process even started. It's just a hash function on a found file on your phone. NO scanning AT ALL. It's not looking at your Outlook files, not looking at your downloaded music. In no way is it scanning.

1. iCloud identifies which pictures are to be uploaded to Apple's servers <--- if you want to be technical, THIS is the scanning process right here. But CSAM or not, that scanning is already being done.
2. Pictures are identified.
3. A hash is generated for the identified images - if you are a developer, think of all the identified pictures in a list and this is just doing a foreach. As in, the collection was already determined in step 1.
4. When an image is uploaded, the hash results are also uploaded.

I looked at the technical document and watched dozens of videos and dozens of podcasts on this issue. There is no scanning implemented with this. The images are already being uploaded to iCloud; there is nothing for this new feature to scan.
Just htf can it recognize pictures without scanning them? That just makes no sense with your argument.

And NO, iCloud doesn't recognize them, your device does, period.
 
Just htf can it recognize pictures without scanning them? That just makes no sense with your argument.

And NO, iCloud doesn't recognize them, your device does, period.
iCloud already picks out the pictures that are being uploaded. It's not scanning every single picture on your device. The whole scanning discussion is massively overblown.
 
First of all, you can't argue against hypothetical conspiracy theories - people will cling to them to their dying day. … Look, I'm glad there are people and organizations concerned about security, but in this case I believe their concern lacks substance and is based on hypotheticals/slippery-slope fallacies.

Just wow, how ignorant some can be! What hypothetical conspiracy theory is behind the fact that children/teens may get abused by their own parents/community (religion) when their parents/community get to know the child is gay/trans…?
 
It is not scanning. It already found the picture before this process even started. Its just a hash function on a found file on your phone. NO scanning AT ALL. Its not looking at your Outlook files, not looking at your downloaded music. In no way is it scanning.
You keep clinging to some subtle linguistic nuance, but that's not the point. OK, let's not call it "scanning". Let's call it "policing". Apple is arrogating the right to check whether I'm a criminal and to report me to the authorities if I am. That's none of their business. Whether I'm a paedophile or not, that's not for Apple to determine. Even if I don't have any CSAM photos on my device, I still do not want Apple to know it, simply because they have no business knowing it.

Apple: "We want to check whether you might be a paedophile."
You: "Sure, go ahead, I've got nothing to hide."
Me: "What's it to you if I am? Mind your own business!"

That's what it's about. It doesn't matter whether you call it "scanning" or not. Apple has no business checking if I'm a good citizen who donates to charity, eats vegetables and drives a Prius. If they really want to check that, then let them do it on their property (i.e. the cloud), not on mine (i.e. the phone), and let them use their own resources. I should not have to waste even one single CPU cycle on my device, not even 0.000001% of the battery, to aid in this check. Because it's a check that doesn't do anything for me, it's done exclusively for the benefit of somebody else.
 
You keep clinging to some subtle linguistic nuance, but that's not the point. OK, let's not call it "scanning". Let's call it "policing". Apple is arrogating the right to check whether I'm a criminal and to report me to the authorities if I am. That's none of their business. Whether I'm a paedophile or not, that's not for Apple to determine. Even if I don't have any CSAM photos on my device, I still do not want Apple to know it, simply because they have no business knowing it.

Apple: "We want to check whether you might be a paedophile."
You: "Sure, go ahead, I've got nothing to hide."
Me: "What's it to you if I am? Mind your own business!"

That's what it's about. It doesn't matter whether you call it "scanning" or not. Apple has no business checking if I'm a good citizen who donates to charity, eats vegetables and drives a Prius. If they really want to check that, then let them do it on their property (i.e. the cloud), not on mine (i.e. the phone), and let them use their own resources. I should not have to waste even one single CPU cycle on my device, not even 0.000001% of the battery, to aid in this check. Because it's a check that doesn't do anything for me, it's done exclusively for the benefit of somebody else.
Then where is the outrage at Google, Microsoft, Dropbox? They already do this "policing". Oh right, because it is just cool to hate on Apple. The minute Apple wants to improve their NCMEC reporting they get called out, meanwhile every other major company that hosts files actively does this policing without anyone saying a word.
 
Then where is the outrage at Google, Microsoft, Dropbox? They already do this "policing".
It's right in my last paragraph. They're doing it on their own servers, using their own resources.

Besides, this finger pointing at the competition, which CSAM supporters keep resorting to when all else fails, is a fallacy. The fact that others are bad is no excuse for Apple to be the same. Should a thief be excused just because others have stolen more than he did?
 
It's right in my last paragraph. They're doing it on their own servers, using their own resources.

Besides, this finger pointing at the competition, which CSAM supporters keep resorting to when all else fails, is a fallacy. The fact that others are bad is no excuse for Apple to be the same. Should a thief be excused just because others have stolen more than he did?
Except this is part of the iCloud pipeline, during the upload process. It's not "scanning", and I am glad you changed the word being used. Without iCloud enabled, nothing happens. When iCloud is enabled, this only happens while something is being uploaded. So the exact same files would be checked if they did it on their servers. The alternative would limit privacy by requiring content to be decryptable for the check to occur. This will allow for far greater privacy.
 
Well, I've got a better idea. How about encrypting the files end to end and forgetting about any kind of check? This will allow for far greater privacy still.
Why can't Apple just put its money where its mouth is and give us 100% privacy with no strings attached? They've been touting their devotion to privacy for years and years. How about delivering it?
And don't tell me: "others are doing it". I know they are. I just want Apple to be better than others.
 
Well, I've got a better idea. How about encrypting the files end to end and forgetting about any kind of check? This will allow for far greater privacy still.
Why can't Apple just put its money where its mouth is and give us 100% privacy with no strings attached?
And don't tell me: "others are doing it". I know they are. I just want Apple to be better than others.
Because Apple has a duty to report any NCMEC-reportable content found on their servers. While they might not be legally obligated to actively monitor the files, they do have a duty to report. And Apple's numbers are ridiculous compared to other big tech: a miserable ~250 reports versus the 5-, 6-, and even 7-digit report counts from other big tech companies. Apple appears to be getting a reputation as a safe haven for this type of content. So Apple chose the best approach while still preventing this material from being on their servers. I definitely prefer this over all my stuff being decryptable in order to check it.

If you don't want Apple doing this, how about fighting to change the law? Speak to your local representative about the NCMEC rules and regulations to drop the duty-to-report criteria for hosting providers.
 
Apple has no duty to search for it.
You completely ignored the rest of my post. Typical.

While they might not be legally obligated to actively monitor the files, they do have a duty to report. And Apple's numbers are ridiculous compared to other big tech: a miserable ~250 reports versus the 5-, 6-, and even 7-digit report counts from other big tech companies. Apple appears to be getting a reputation as a safe haven for this type of content.
 
While they might not be legally obligated to actively monitor the files, they do have a duty to report.
If they find it.
They have no duty to actively search for it on the grounds that they suspect it might be there.
If political correctness activists want to consider Apple a safe haven just because they don't actively search for such stuff, then so be it.
 