Are you? You cannot tell us with any degree of certainty that the database contains ONLY what they say it does, because no one has verified it independently. We live in a world where federal law enforcement is doing its absolute best to end access to encryption, secret courts are handing out -thousands- of secret warrants, and cops take it upon themselves to run facial recognition software on their phones. Are you really, seriously saying that the idea of a court ordering NCMEC to add other images to its database is a step too far to believe? Really?

I believe that if the government really wanted to do someone in, there are likely far easier ways to do so. They would have to first find a way to “taint” the CSAM database with pictures of Winnie the Pooh or photos of the insurrection or whatever, hope the people they are targeting use iPhones, pray that the material subsequently gets flagged enough times by the system, and count on Apple doing such a poor job of screening the flagged images that the information gets passed to the relevant authorities despite the images clearly not containing child pornography of any sort.

It’s not that I want to keep speaking up for Apple, but some of these hypothetical scenarios being raised feel so improbable to me that I really can’t see it happening.
 
Today we're scanning photos, tomorrow we're scanning texts/Messages. Then we're scanning emails. Next we're scanning Signal. Now we're scanning everything.

Plus, lots of comparisons are made to Gmail, OneDrive, etc. Those things aren't personal the same way an iPhone is. I believe a large number of people find it uncomfortable to be associated with CSAM in the first place.

Almost without exception, all emails are scanned.

Personally I have much more confidential and private stuff in iCloud Drive than photos.
 
It has to be expected that servers are searched by whoever is able to do so, including data on the way in and out. But crossing the line to do it on private devices is not acceptable. Installing some unknown search database for porn is even worse. I'd say this will be legally blown away in many countries; iPhones are operated under national laws that differ from what Apple might be able to do on US soil.
 
Bingo! Thank you for actually answering OP’s question. I came here to post the same thing but you beat me to it.

My phone is my property. If Apple wants to build surveillance functionality into their cloud servers that they own, that’s fair game. But Apple should not be installing surveillance functionality on our devices.

It's only scanning if you have turned on iCloud Photo Library.

If the photos are going to be scanned, it doesn't matter that much where it's happening.
 
Point me to when they were scanning on-device, looking for certain images, before this announcement.

Apple was already scanning more generally, trying to find faces, scenes, specific items, etc.
They also scan and index your entire user area with Spotlight.

Try searching for 'sushi' in the Photos app.
 
It's only scanning if you have turned on iCloud Photo Library.

If the photos are going to be scanned, it doesn't matter that much where it's happening.

Apparently you didn’t bother to read my last reply to you from the other thread. For your convenience, I’ll paste my reply here as well:

You are so close to reaching the crux of the matter, but you slip at the end. Turning off iCloud Photo Library does not prevent the surveillance software from being installed on your personal device. You should know this if you read the press release, so you’re either ill-informed, or you’re being intentionally misleading.

So the REAL choice is: accept the new terms and consent to surveillance software being installed on your personal device that conducts warrantless searches, or stop using all Apple products. This is why people are upset.

Imagine if a landlord told a tenant that they must consent to cameras being installed in their home. The landlord *promises* to only turn the cameras on while the tenant uses the Internet. The tenant has no recourse to stop the cameras from being installed. They must make a choice: consent to living with the cameras and *hope* that they aren’t being abused, or find a new place to live.
 
Installing some unknown search database for porn is even worse.
Oh no, scary 1's and 0's. There's no photo data installed on your device. It's literally just hashes that can't be converted into anything else.
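For the curious, here's a minimal sketch of what "just hashes" means in practice, using SHA-256 as a stand-in (Apple's database holds NeuralHash values, a perceptual hash, but neither kind of digest can be run backwards into an image):

```python
import hashlib

# A digest is a fixed-size fingerprint of the input bytes.
image_bytes = b"stand-in for raw photo data"
digest = hashlib.sha256(image_bytes).digest()  # always 32 bytes, whatever the input

# The on-device database is just a collection of such digests; matching is a lookup.
known_hashes = {digest}
print(hashlib.sha256(image_bytes).digest() in known_hashes)           # True
print(hashlib.sha256(b"a different photo").digest() in known_hashes)  # False
# There is no function that recovers image_bytes from digest.
```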
 
Apparently you didn’t bother to read my last reply to you from the other thread. For your convenience, I’ll paste my reply here as well:

You are so close to reaching the crux of the matter, but you slip at the end. Turning off iCloud Photo Library does not prevent the surveillance software from being installed on your personal device. You should know this if you read the press release, so you’re either ill-informed, or you’re being intentionally misleading.

So the REAL choice is: accept the new terms and consent to surveillance software being installed on your personal device that conducts warrantless searches, or stop using all Apple products. This is why people are upset.

Imagine if a landlord told a tenant that they must consent to cameras being installed in their home. The landlord *promises* to only turn the cameras on while the tenant uses the Internet. The tenant has no recourse to stop the cameras from being installed. They must make a choice: consent to living with the cameras and *hope* that they aren’t being abused, or find a new place to live.
It doesn't prevent the code from being there, but turning off iCloud Photos makes it so the code doesn't run.

Also, Apple has long had the ability to put any code on your device. Do you know with 100% certainty that your device isn't doing anything else without your knowledge, or did you just trust Apple all along? I don't get where this SUDDEN distrust came from. If anything, I trust Apple more now because they're being completely open and honest about what's going on vs. just doing it.

I'll be updating to iOS 15, and those random 1's and 0's won't bother me because a) I won't ever be able to even see the codes, b) I won't have anything even remotely close to matching what's in the database because I don't collect pictures like that, and c) I've been using iCloud Photos for years and I haven't been falsely accused of anything.
 
I’ve been contemplating just switching off iCloud photos and storing everything on the device, but the last thing I want is to accidentally lose everything. I hope Apple changes their mind.
 
It doesn't prevent the code from being there, but turning off iCloud Photos makes it so the code doesn't run.

Also, Apple has long had the ability to put any code on your device. Do you know with 100% certainty that your device isn't doing anything else without your knowledge, or did you just trust Apple all along? I don't get where this SUDDEN distrust came from. If anything, I trust Apple more now because they're being completely open and honest about what's going on vs. just doing it.

I'll be updating to iOS 15, and those random 1's and 0's won't bother me because a) I won't ever be able to even see the codes, b) I won't have anything even remotely close to matching what's in the database because I don't collect pictures like that, and c) I've been using iCloud Photos for years and I haven't been falsely accused of anything.
Because promises can’t be trusted when it comes to back doors. I’m literally just using the same argument that Apple did in 2016 when they refused to unlock Syed Farook’s iPhone for the FBI. They only handed over the cloud backups for a reason. The same reason applies here. Warrantless searches should be done in the cloud, not on my personal device. We deserve the presumption of innocence. Get a warrant if you want to search my personal property. End of story.
 
So…just to be clear…you are absolutely okay with them searching your “personal property” the dozens of other ways they do now and have for more than a decade, just not THIS particular way of searching (which, btw, has way more toll gates and privacy measures built into it) for illegal material??

They can search, track, scan, etc. as always…just not this particular way, which is identical to the other searches they do, only with more privacy…got it!
 
So…just to be clear…you are absolutely okay with them searching your “personal property” the dozens of other ways they do now and have for more than a decade, just not THIS particular way of searching (which, btw, has way more toll gates and privacy measures built into it) for illegal material??

They can search, track, scan, etc. as always…just not this particular way, which is identical to the other searches they do, only with more privacy…got it!
It's actually less invasive than your photo library scanning the actual contents of the photos and categorizing them into things like "cats", "cars", etc., which nobody had a problem with.
 
It's actually less invasive than your photo library scanning the actual contents of the photos and categorizing them into things like "cats", "cars", etc., which nobody had a problem with.
Don’t forget the hidden tags Apple has embedded that also flag private parts, rainbows, guns, confederate flags, etc….. /s
 
It's actually less invasive than your photo library scanning the actual contents of the photos and categorizing them into things like "cats", "cars", etc., which nobody had a problem with.
It is not less invasive, for several reasons. The other scanning does not phone home to the government, and it's not searching for criminal material. We should not be subject to warrantless searches just because there are some bad people in the world. Innocent people are presumed innocent for a reason; there is no parallel between looking for a dog in my images and looking for criminal behavior.
 
So just "not using iCloud Photo Library" is the disgusting pervert "privacy switch" here?

Any of you really believe this will stay that way?

If they are honestly trying to curtail CSAM -- and they've built a tool to search through your local device looking for it..

We are to believe this won't morph into searching all your local photos?

C'mon -- don't wizz on me and tell me it's raining..
 
So…just to be clear…you are absolutely okay with them searching your “personal property” the dozens of other ways they do now and have for more than a decade, just not THIS particular way of searching (which, btw, has way more toll gates and privacy measures built into it) for illegal material??

They can search, track, scan, etc. as always…just not this particular way, which is identical to the other searches they do, only with more privacy…got it!

Where are you drawing this conclusion from? I’m not okay with warrantless searches of people who are not suspected of any crime. Cloud servers are not my property; they are infrastructure that belongs to the service provider. I’m consenting to the provider’s terms when I upload content there, including consenting to searches in the cloud.

I am NOT consenting to surveillance software being installed on my personal device.
 
Where are you drawing this conclusion from? I’m not okay with warrantless searches of people who are not suspected of any crime. Cloud servers are not my property; they are infrastructure that belongs to the service provider. I’m consenting to the provider’s terms when I upload content there, including consenting to searches in the cloud.

I am NOT consenting to surveillance software being installed on my personal device.
In order for anything to happen, you would have to do all of the following:

1. Update to iOS 15 when it comes out
2. Enable iCloud Photo Library
3. Have a whole collection of known CSAM photos being uploaded to iCloud

All of these things are optional. Nobody is forcing your phone to report everything it finds to Apple or authorities.
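A hedged sketch of that gating in code, since this point keeps coming up (the function and flag names here are made up for illustration, not Apple's actual implementation):

```python
def handle_new_photo(photo_bytes: bytes, icloud_photos_enabled: bool) -> None:
    # Per Apple's description, the matching path only runs for photos
    # that are about to be uploaded to iCloud Photo Library.
    if not icloud_photos_enabled:
        return  # nothing is computed, matched, or sent

    voucher = build_safety_voucher(photo_bytes)  # hypothetical helper
    upload_to_icloud(photo_bytes, voucher)       # voucher rides along with the upload

def build_safety_voucher(photo_bytes: bytes) -> bytes:
    # Placeholder: derive the perceptual hash, run the blinded match,
    # and encrypt the result so the device itself can't read the outcome.
    return b"opaque voucher"

def upload_to_icloud(photo_bytes: bytes, voucher: bytes) -> None:
    pass  # network upload elided
```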
 
Then what do they think it means? Certainly not some highly technical and pedantic definition of "scan" that excludes reading the data to calculate a hash. The phrase "Photos are not scanned. Hashes are generated...." is classic smoke and mirrors (hashes are generated, yes, by scanning the photos... unless you assign some highly specific meaning to "scan").

Read through the discussions. Scan is not used as shorthand for "reading the data to calculate a hash." Scan is being used as though the photos are being scanned for CSAM like they are for objects and people.

Which would be a great supporting argument if it was combined with end-to-end encryption so that, once the photos are on the server, Apple couldn't decrypt them - but that doesn't seem to be the case, at least with iCloud photos (iMessages, maybe...) Also, note that Apple are promising that matches will be reviewed by humans before reporting - so (at least if that process has any meaning) they do have a mechanism for viewing the photos anyway.

Apple can decrypt iCloud photos on their server. They currently only do so when presented with a judicial request.
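To make the "scan" semantics concrete, here is a toy average hash, a vastly simpler cousin of perceptual hashes like NeuralHash (illustrative only; Apple's is a neural network). Computing it does read pixel data, but the output is a fingerprint for matching known images, not a judgment about what the photo depicts:

```python
import random

def average_hash(pixels):
    """pixels: 64 grayscale values (0-255) from an 8x8 downscaled image."""
    avg = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > avg)  # 1 if this pixel is brighter than average
    return bits

thumbnail = [random.randrange(256) for _ in range(64)]  # stand-in for a real photo
print(f"{average_hash(thumbnail):016x}")  # 64-bit fingerprint, compared by equality
```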
 
Read through the discussions. Scan is not used as shorthand for "reading the data to calculate a hash." Scan is being used as though the photos are being scanned for CSAM like they are for objects and people.



Apple can decrypt iCloud photos on their server. They currently only do so when presented with a warrant.
Are iCloud photos able to be decrypted if iCloud Backup is set to OFF? The way I've always understood things like Photos in iCloud and Messages in iCloud is that if your device has backups turned OFF while the other two are ON, then there really isn't a way to decrypt those, since they aren't accompanied by an iCloud backup that can be decrypted.
 
A case could be made, without the outrage, that even iCloud data shouldn't be scanned, but that's not what the outrage is about. The primary objection of pundits and some users is that the scan occurs on device rather than in iCloud.

So my question, and my confusion, is this:

If the only data available for Apple to scan on device is exactly the same as the iCloud data, why does it matter to users where the scan occurs? Especially since the scan is done by on-device AI and Apple is only contacted when a bulk of hashes per device match the ones on Apple's hash servers.
Users object that the scan isn't occurring on servers, but it would be the exact same data as what is available to scan on the iPhone. All the other data is still locked away from Apple.

Another question I would ask: why the outrage against Apple specifically, for doing photo-hash-only scanning, when in fact Google scans private emails and all other online content for more categories than CSAM? Microsoft scans all online storage for more than CSAM? Amazon scans private data on all its online drives? FB and Twitter scan private DMs and Facebook Messenger chats for CSAM? So why is the outrage fixated on Apple,
when in fact Apple is not only scanning for a lot less, but has said it will not expand the categories it scans for or who it gives the results to? You could always argue that you don't trust Apple, but if that's the case, then no mainstream tech is any better for you.

These questions are separate from why they are scanning to begin with, but I have a question about that as well: if Apple is not allowed to scan anything, how does it stop CSAM, which is the worst of society?
Because most people have made the assumption that once data leaves their phones and resides on someone else's server, it is going to be subject to a different set of rules than what happens on their device.

In other words, there is a completely different sense of privacy when you speak about a personal device, which we hold with us and guard from loss, invasion, hacking, or theft.

Another way to look at it is this: if what you are saying is true, why doesn't Apple, a company that continuously touts privacy as its primary competitive advantage, simply do the scan on its own servers on the way to iCloud?
 
I would like to know how they came up with that number.
What I have been learning over the last few days about hashes suggests that number is likely wrong; collisions are far more common than that.

Worst case someone gets accused falsely and a life is ruined. Can they legally sue Apple?

Apple will do a human review which will catch almost all false positives. Apple will also not be turning the material over to law enforcement agencies but to NCMEC.

If Apple believes it's child pornography after this human review they have to, by law, report it to NCMEC.

Providers like Apple have a high level of immunity in this process.
 
Because most people have made the assumption that once data leaves their phones and resides on someone else's server, it is going to be subject to a different set of rules than what happens on their device.

In other words, there is a completely different sense of privacy when you speak about a personal device, which we hold with us and guard from loss, invasion, hacking, or theft.

Another way to look at it is this: if what you are saying is true, why doesn't Apple, a company that continuously touts privacy as its primary competitive advantage, simply do the scan on its own servers on the way to iCloud?
You trusted Apple with all of your data up until this point, so why do you think suddenly they just threw all privacy out the window and they're now going to use it to spy on you?
 
So in the event of a match, what is the "human" at Apple inspecting? I doubt they have the CSAM pictorial database on hand to do a side-by-side.

You have to have iCloud Photo Library turned on and an unknown number of matches. Apple doesn't at any time know how many matches you have until you reach the threshold.

Then they can read the security vouchers that were created when a match occurred; included in each voucher is a visual derivative of your photo from your device. They will then only have access to the photos that were matched and flagged.

This will also allow them to do end-to-end encryption of iCloud at a later point.
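For anyone wondering how Apple can avoid learning the match count before the threshold is reached: the published design uses threshold secret sharing. Here is a minimal Shamir sketch with toy parameters (not Apple's actual protocol, which also layers private set intersection on top). The key that unlocks the flagged vouchers is split so that any t shares reconstruct it, while fewer reveal nothing:

```python
import random

P = 2**127 - 1  # prime modulus; all arithmetic happens in this field

def make_shares(secret, t, n):
    # Random degree-(t-1) polynomial with the secret as its constant term.
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    poly = lambda x: sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 recovers the constant term (the secret).
    secret = 0
    for xi, yi in shares:
        num = den = 1
        for xj, _ in shares:
            if xj != xi:
                num = num * -xj % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

key = 123456789                       # stand-in for the account's voucher key
shares = make_shares(key, t=5, n=30)  # e.g. one share accompanies each match
assert reconstruct(random.sample(shares, 5)) == key  # threshold met: key recovered
```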
 