You do not need to upload photos. If you have these apps installed on your phone and you grant access to the photo library, they already scan for CSAM. These apps aggressively pre-upload some of the contents of your gallery to speed up the upload process when you make a new post. That's why Apple implemented access to the gallery on a per-photo basis.

According to the FAQ:

Does this mean Apple is going to scan all the photos stored on my iPhone?

No. By design, this feature only applies to photos that the user chooses to upload to iCloud Photos, and even then Apple only learns about accounts that are storing collections of known CSAM images, and only the images that match to known CSAM. The system does not work for users who have iCloud Photos disabled. This feature does not work on your private iPhone photo library on the device.

Does turning off iCloud Photos disable CSAM detection?

Yes. When iCloud Photos is deactivated, no images are processed. CSAM detection is applied only as part of the process for storing images in iCloud Photos.


I also don't see any references to what you mention in the technical implementation document, though I think you might be referring to this:

Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the database of known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines whether there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result. It also encrypts the image’s NeuralHash and a visual derivative. This voucher is uploaded to iCloud Photos along with the image.

which wouldn't happen unless iCloud Photos is enabled, after which the comparison is made prior to the upload.
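To make the flow described above concrete, here is a rough sketch of the match-before-upload idea in Python. Everything in it is a stand-in: a plain SHA-256 instead of NeuralHash, an ordinary set lookup instead of private set intersection (real PSI keeps the match result hidden from the device, which a simple lookup cannot capture), and placeholder hash values. It only illustrates the ordering: iCloud Photos off means no processing at all; otherwise the voucher is created on-device and travels with the upload.

```python
import hashlib
from dataclasses import dataclass
from typing import Optional

# Placeholder database of known hashes (hypothetical values, not real ones).
KNOWN_CSAM_HASHES = {"hash-aaa", "hash-bbb"}

@dataclass
class SafetyVoucher:
    image_hash: str           # stand-in for the encrypted NeuralHash
    matched: bool             # stand-in for the cryptographically hidden match result
    visual_derivative: bytes  # stand-in for the encrypted low-resolution derivative

def make_voucher(image_bytes: bytes) -> SafetyVoucher:
    # Toy "perceptual" hash: a cryptographic digest used purely for illustration.
    digest = hashlib.sha256(image_bytes).hexdigest()
    return SafetyVoucher(
        image_hash=digest,
        matched=digest in KNOWN_CSAM_HASHES,
        visual_derivative=image_bytes[:64],  # crude placeholder, not a real thumbnail
    )

def upload_to_icloud_photos(image_bytes: bytes,
                            icloud_photos_enabled: bool) -> Optional[SafetyVoucher]:
    if not icloud_photos_enabled:
        return None  # per the FAQ: iCloud Photos disabled, nothing is processed
    # The voucher is uploaded alongside the image; only server-side threshold
    # logic can eventually act on accumulated matches.
    return make_voucher(image_bytes)

print(upload_to_icloud_photos(b"holiday photo", icloud_photos_enabled=False))          # None
print(upload_to_icloud_photos(b"holiday photo", icloud_photos_enabled=True).matched)   # False
```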
 
I don't think a lot of people understand what they're asking for here. If Apple abandons on-device scanning of uploads to iCloud (which is the only time this scan would happen), then they will have to implement scanning on their servers, which in my opinion opens up a much bigger can of worms, including having your photos unencrypted on their servers. They have to scan them at some point for legal reasons, to prevent child porn from ending up on their servers, so isn't it better that it happens on your device with as much privacy and as many checks and balances built in as possible?
 
Despite the facts that it scans locally before uploading, that it only uses hashes initially, that child exploitation is a serious problem that must be solved, and that Apple has done a pretty good design job to protect privacy, it still isn't right. This is still digital surveillance of the private population. And worse still, it is being done by a corporation of unelected individuals deciding for us how it should be done, rather than by law enforcement. The term “overreach” is often used in government and it applies here. Apple is neither responsible nor accountable for CSAM detection in law enforcement, and no country's citizens have passed a law to give them this mandate. However secure, private and well intentioned this system may be, they are breaching the privacy of the people without the people's permission.
“A serious problem that must be solved”

Yes and no.

Murder is a serious problem but it cannot be solved. Corrupt politicians are a serious problem that cannot be solved. And on and on and on.

Most problems that arise from human failings can be mitigated but cannot be solved. We already mitigate child pornography in many ways. The question is whether this new way is worth the trade-offs for the gains.
 
This functionality opens a backdoor, and the iPhone is only a hash value away from detecting images like the famous tank picture and informing the Chinese authorities.

“After a Chinese cybersecurity law came into effect in 2017, Apple started storing customer iCloud data—spanning emails, contacts, photos, and geolocation—on computer servers in China and handled by Chinese state employees.”
https://fortune.com/2021/05/18/apple-icloud-data-china/



Furthermore, Apple's arguments are fake. Why on earth should Apple implement a system that searches for specific content on the phone when this is already implemented in the cloud?

The reason is that Apple developed a framework for scanning content and offers this framework to app developers. Maybe Apple wants to keep nudity off the iPhone - not only CSAM. The next step, in iOS 16, has to be making this framework mandatory - or China makes it mandatory if you want to sell apps in China. But China is only an example - just think of whistleblowers, or content that authorities want to block. You cannot block content on the web - but hey, what if you could block it on every single device?


What we've learned: if Tim enters the stage and talks about privacy, what he really means is “blah blah privacy blah blah”. For Tim, “privacy” is just another word for “blah”.


Or maybe Apple has an obsession with people who take pictures of unreleased products and puts fuzzy hashes of what their upcoming products look like into their phones so the Chinese government can make the leakers... disappear
 
This functionality opens a backdoor and the iPhone is only a hash value away from detecting images like the famous tank picture and informing the chinese authorities.

Chinese authorities will infiltrate child safety organisations in two different jurisdictions and plant a hash for that specific photo. And then someone is gonna pass the 30-match threshold, and somehow the manual review process will not notice that those images are not actually child pornography 🤷🏻‍♂️

Furthermore, Apples arguments are fake. Why on earth should Apple implement a system that searches for specific content on the phone when this is already implemented in the cloud?

Because of these four advantages.

According to the FAQ:

Does this mean Apple is going to scan all the photos stored on my iPhone?

No. By design, this feature only applies to photos that the user chooses to upload to iCloud Photos, and even then Apple only learns about accounts that are storing collections of known CSAM images, and only the images that match to known CSAM. The system does not work for users who have iCloud Photos disabled. This feature does not work on your private iPhone photo library on the device.

Does turning off iCloud Photos disable CSAM detection?

Yes. When iCloud Photos is deactivated, no images are processed. CSAM detection is applied only as part of the process for storing images in iCloud Photos.


I also don't see any references to what you mention in the technical implementation document, though I think you might be referring to this:

Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the database of known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines whether there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result. It also encrypts the image’s NeuralHash and a visual derivative. This voucher is uploaded to iCloud Photos along with the image.

which wouldn't happen unless iCloud Photos is enabled, after which the comparison is made prior to the upload.

I was talking about apps like Facebook and Instagram that have access to your photo gallery. Not Apple's implementation of CSAM detection.
 
I don't think a lot of people understand what they're asking for here. If Apple abandons on-device scanning of uploads to iCloud (which is the only time this scan would happen) then they will have to implement scanning on their servers, which opens up a much bigger can of worms in my opinion including having your photos unencrypted on their servers. They have to scan it at some point for legal reasons to prevent child porn from ending up on their servers so isn't it better that it happens on your device with as much privacy and checks and balances built in as possible?
Do they have to? Who says?
If I own a storage facility and someone stores child porn in a box in one of the units, am I liable?

It seems people are just magically assuming Apple has some sort of liability here. Who is going to go after them? Is someone going to get a warrant to search all the iCloud accounts in the world? What judge is going to do that? In the normal world you have to have a suspicion of a crime; you don't just mine through someone's life or data looking for anything they may have done wrong. So why would anyone mine through Apple's data?
 
If I were a German politician, I would not ask Apple to stop this, but I would start to study the existing privacy laws and throw the papers on Mr. Cook's desk.

But, you know, this electronic internetz - where that computer data actually runs is often a kind of secret garden to them. They usually do not really know much about it.
 
This is so obviously a bad move for Apple and privacy, it makes my blood boil. Now I can't help but look at images of Mr. Federighi with pity and wonder how he could possibly have sat in meetings with reasonable, intelligent people and approved this measure, only to be caught completely by surprise by the public reaction. What? You're a top-level manager of the most valuable company in the world and you're permitted to make a blunder so large as to threaten the privacy and security of everyone on the planet?

Withdraw this and resign.
 
Correction, according to your wrong belief: the government has every right to have your home searched. Seriously though, how can you even think this is right, especially when Apple spent years claiming otherwise?
Even the government does not have this right. Only if a judge decides it. This is how systems in a democracy are designed. Separation of legislative, executive and judicial powers.
 
Chinese authorities will infiltrate child security organisations in two different jurisdictions, put an hash for that specific photo. And then someone is gonna pass the 30 threshold limit and somehow the manual review process will not notice that those images are not actually child pornography 🤷🏻‍♂️



Because of these four advantages.



I was talking about apps like Facebook, Instagram that have access to your photos gallery. Not Apple's implementation of CSAM.
You have way too much trust in Apple doing specifically what they say, and not having their arm twisted by other governments who they need to play ball with in order to survive.
 
The most obvious thing in all this: after all these plans have been discussed everywhere, does anyone think that some pedophile will use iCloud or the Photos app to view explicit content on the phone?

Any file manager to which anything can be uploaded from a computer can be used for that.

What is the plan here really? To catch the three pedophiles who do not have internet access and have never seen the CSAM discussions? Seriously?

This. I'm used to politicians proposing simplistic, feel-good solutions to complex problems because most voters lack the critical-thinking skills to understand why said solutions wouldn't work. I'm surprised that Apple is doing it, though. Yes, I watch enough true-crime shows to know that many criminals are stupid and careless ("Hi, I'm Chris Hansen, and you're on Dateline! Did you really think you'd be meeting a 14-year-old girl here today?"), and I suppose some are dumb enough to store their child-porn collections in iCloud, but I assume most of them use other storage solutions and share photos on the dark web. Does Apple have data to the contrary? If so, where did they get it? The only way I can think is if large numbers of pedophiles who have already been caught willingly unlock their Apple devices for law-enforcement authorities, who then find their collections in iCloud Photos. In that case, the methods used to catch them (e.g., sting operations) already worked.

So this sounds like a combination of Apple bowing to political pressure and perhaps naively thinking that most people would be okay with the policy because no sane, civilized person thinks child porn is acceptable. Our culture does seem to view sex crimes as somehow more evil than other types of crimes. Most places in the US have databases of convicted sex offenders so you can find out if any are living in your neighborhood, but I'm unaware of comparable databases for convicted murderers, violent assailants, burglars, fraudsters, and so on. The fact that Apple has refused to unlock the iPhones of suspected terrorists and other criminals points to this double standard. Yes, they claim they have no way to do so, because of their stance on security, but somehow a security company hired by law enforcement found a way to do so in at least one instance.

I'll admit that, initially, Apple's announcement didn't seem like a big deal to me, and I didn't understand the uproar on this forum. The more I've thought about it, though, the more I've come to view it as something that will have little effect on the problem it's intended to address and that potentially will open the door to abuse.
 
You have way too much trust in Apple doing specifically what they say, and not having their arm twisted by other governments who they need to play ball with in order to survive.

That's the thing: this system is much more auditable than any server-side approach (which is the standard in the industry). I would prefer to have no scanning, but between this implementation and scanning on iCloud servers, I prefer this one.
 
I don't think a lot of people understand what they're asking for here. If Apple abandons on-device scanning of uploads to iCloud (which is the only time this scan would happen), then they will have to implement scanning on their servers, which in my opinion opens up a much bigger can of worms, including having your photos unencrypted on their servers. They have to scan them at some point for legal reasons, to prevent child porn from ending up on their servers, so isn't it better that it happens on your device with as much privacy and as many checks and balances built in as possible?

A good point well made. I’m going to reconsider my position on this and do some more thinking.
 
First they came for the socialists, and I did not speak out—because I was not a socialist.
Then they came for the trade unionists, and I did not speak out—because I was not a trade unionist.
Then they came for the Jews, and I did not speak out—because I was not a Jew.
Then they came for me—and there was no one left to speak for me.
I don't know if you knew (you probably did), but you replied to a Christian pastor with a quote by a Christian pastor. That was interesting.
 
Chinese authorities will infiltrate child safety organisations in two different jurisdictions and plant a hash for that specific photo. And then someone is gonna pass the 30-match threshold, and somehow the manual review process will not notice that those images are not actually child pornography 🤷🏻‍♂️



Because of these four advantages.



I was talking about apps like Facebook and Instagram that have access to your photo gallery. Not Apple's implementation of CSAM detection.
Apple just proved those advantages to be fake.
https://www.macrumors.com/2021/08/17/apple-appeals-corellium-copyright-lawsuit/

So what about “Craig Federighi said that security researchers would serve as a check”? Apple, get rid of the system and please stop throwing BS in my direction.
 
That's the thing: this system is much more auditable than any server-side approach (which is the standard in the industry). I would prefer to have no scanning, but between this implementation and scanning on iCloud servers, I prefer this one.
I don't see how it's more auditable. By whom?
I can't look at my phone and see which photos triggered a match and had a version sent to Apple for manual confirmation.

Apple says the reason they have to secure their phones is that people carry their most private and intimate moments and data around with them. Now they are saying that if you use the phone as it is set up by default, with everything turned on... they will fuzzy-hash your images, and if any might be suspicious, they go to a person at Apple to look at. Yes, you have to hit a threshold... I still don't buy that it's not able to be abused. As someone mentioned, you could send someone a bunch of generated images designed to trigger those hashes and suddenly a ton of their photos get checked by an Apple employee.

I don't see a way to reconcile the privacy-first marketing with this turd of a feature that no iPhone user was asking for. Way to destroy a 10-year marketing campaign overnight...
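For what it's worth, here is a minimal sketch of the threshold gate being argued about, assuming the 30-match figure mentioned elsewhere in the thread (the function and numbers are illustrative, not Apple's code). It shows why a handful of crafted collision images would not, by itself, put photos in front of a reviewer, while a sustained campaign could.

```python
# Illustrative threshold gate, assuming the 30-match figure from the thread.
MATCH_THRESHOLD = 30

def should_trigger_manual_review(matched_voucher_count: int,
                                 threshold: int = MATCH_THRESHOLD) -> bool:
    """Manual review only happens once an account has accumulated more
    positive vouchers than the threshold."""
    return matched_voucher_count > threshold

print(should_trigger_manual_review(5))   # False: a few planted matches stay below the gate
print(should_trigger_manual_review(31))  # True: enough matches put the account before a reviewer
```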
 
Höferlin is of course right, and this is the biggest reason so far to STOP BUYING APPLE PRODUCTS.

I just cannot believe that anyone wants software on their phone that looks at stored content and sends the result of that scanning somewhere.

I think the only way to make Apple back off from this is if some European countries pass laws against spyware. The letter is a good first step. Step two is some draft legislation.
 
The disturbing news today is that Apple has been putting secret builds of this on our devices since iOS 14.3, which kinda does away with the plan to just not upgrade to iOS 15; may have to rethink the plan... I mean wow... trust just went from bad to worse.
 
I don't think a lot of people understand what they're asking for here. If Apple abandons on-device scanning of uploads to iCloud (which is the only time this scan would happen), then they will have to implement scanning on their servers, which in my opinion opens up a much bigger can of worms, including having your photos unencrypted on their servers. They have to scan them at some point for legal reasons, to prevent child porn from ending up on their servers, so isn't it better that it happens on your device with as much privacy and as many checks and balances built in as possible?
You are right - in the first step. But in the second step, Apple plans for EVERY single app to implement their scanning framework.
 
This functionality opens a backdoor, and the iPhone is only a hash value away from detecting images like the famous tank picture and informing the Chinese authorities.

“After a Chinese cybersecurity law came into effect in 2017, Apple started storing customer iCloud data—spanning emails, contacts, photos, and geolocation—on computer servers in China and handled by Chinese state employees.”
https://fortune.com/2021/05/18/apple-icloud-data-china/



Furthermore, Apple's arguments are fake. Why on earth should Apple implement a system that searches for specific content on the phone when this is already implemented in the cloud?

The reason is that Apple developed a framework for scanning content and offers this framework to app developers. Maybe Apple wants to keep nudity off the iPhone - not only CSAM. The next step, in iOS 16, has to be making this framework mandatory - or China makes it mandatory if you want to sell apps in China. But China is only an example - just think of whistleblowers, or content that authorities want to block. You cannot block content on the web - but hey, what if you could block it on every single device?


What we've learned: if Tim enters the stage and talks about privacy, what he really means is “blah blah privacy blah blah”. For Tim, “privacy” is just another word for “blah”.

Amazing how many people now think a closed iOS is a bad thing. *sigh*

You still need the cloud for this system to work, which is why no scanning will take place unless iCloud Photos is enabled. All this is documented in the technical implementation document.

I've not seen anything on the framework, so I can't comment on that yet, but I doubt it'll be as nefarious as people make it out to be - unless you're referring to the child protection tools in iMessage - that's completely different to the CSAM -> iCloud Photos system.
 
Apple just proved those advantages to be fake.
https://www.macrumors.com/2021/08/17/apple-appeals-corellium-copyright-lawsuit/

So what about “Craig Federighi said that security researchers would serve as a check”? Apple, get rid of the system and please stop throwing BS in my direction.

There you go:


I don't see how it's more auditable. By whom?
I can't look at my phone and see which photos triggered a match and had a version sent to Apple for manual confirmation.

Apple says the reason they have to secure their phones is that people carry their most private and intimate moments and data around with them. Now they are saying that if you use the phone as it is set up by default, with everything turned on... they will fuzzy-hash your images, and if any might be suspicious, they go to a person at Apple to look at. Yes, you have to hit a threshold... I still don't buy that it's not able to be abused. As someone mentioned, you could send someone a bunch of generated images designed to trigger those hashes and suddenly a ton of their photos get checked by an Apple employee.

I don't see a way to reconcile the privacy-first marketing with this turd of a feature that no iPhone user was asking for. Way to destroy a 10-year marketing campaign overnight...

There you go:

Regarding the bold part: there is no way you can do this in the real world. I've seen some examples on Reddit, and those were done by people who implemented a hash algorithm themselves. If you know how a hash is produced, then you can also reverse-engineer content that matches it.
Without access to Apple's algorithm, you'd need on average 2.1536865546727644e+28 years to find a photo that generates the same match as a CSAM photo (assuming a hash 30 characters in length with only lower-case letters).
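For the curious, here is the back-of-envelope version of that kind of estimate. The guess rate is my own assumption (one million candidate images per second), so the exact number of years depends on it; the point is only the order of magnitude of a blind brute-force search over a 30-character, lowercase-only hash.

```python
# Rough brute-force estimate for a 30-character, lowercase-only hash.
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def expected_bruteforce_years(alphabet_size: int = 26,
                              length: int = 30,
                              guesses_per_second: float = 1e6) -> float:
    keyspace = alphabet_size ** length  # 26**30, roughly 2.8e42 possible hash values
    expected_guesses = keyspace / 2     # on average the target is hit halfway through
    return expected_guesses / guesses_per_second / SECONDS_PER_YEAR

print(f"{expected_bruteforce_years():.2e} years")  # about 4.5e28 years at 1e6 guesses/s
```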
 
According to another site, Reddit has found the hash algorithm in iOS 14.3, and:

"For example, one user, dxoigmn, found that if you knew the resulting hash found in the CSAM database, one could create a fake image that produced the same hash. If true, someone could make fake images that resembled anything but produced a desired CSAM hash match. Theoretically, a nefarious user could then send these images to Apple users to attempt to trigger the algorithm."

If this is true then I hope Apple finds a solution for this before rollout.

But even if this one is not true, we can probably expect a number of other problems to be discovered and perhaps lives ruined after rollout before they are fixed.
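To show why a known perceptual hash is a softer target than a cryptographic one, here is a toy second-preimage demo. It uses a deliberately simple average-hash over an 8x8 brightness grid as a stand-in; NeuralHash is a neural-network hash and the Reddit work reportedly used the extracted model itself, so this only illustrates the general weakness of lossy perceptual hashes, not the actual attack.

```python
from statistics import mean

def average_hash(grid):
    """grid: 8x8 list of brightness values (0-255) -> 64-bit string."""
    pixels = [p for row in grid for p in row]
    avg = mean(pixels)
    return "".join("1" if p > avg else "0" for p in pixels)

def craft_second_preimage(target_hash):
    """Build a different-looking grid with the same average hash:
    bright cells where the target bit is 1, dark cells where it is 0."""
    return [[200 if target_hash[r * 8 + c] == "1" else 20 for c in range(8)]
            for r in range(8)]

original = [[(r * c * 7) % 256 for c in range(8)] for r in range(8)]
target = average_hash(original)
fake = craft_second_preimage(target)

print(average_hash(fake) == target)  # True: a different image, identical hash
```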
If the fake images are sent to the user, and if those are uploaded to iCloud, and if the number exceeds the notification threshold, then Apple will get a notice and their reviewers will be able to look at the visual derivatives of those images to check whether they match the ones in the NCMEC database. Since these images will not match, they will not have a reason to report the images.

Unless the nefarious sender was sending actual child pornography, in which case THEY will have a problem.

This just doesn't seem like it is an actual compromise of the system.
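That reasoning can be summed up in a tiny sketch of the review step, going by the thread's description of the vouchers: reviewers see the visual derivatives, and an account is only reported if a human confirms the content. All naming here is hypothetical, not Apple's actual process.

```python
from typing import Callable, Iterable

def review_account(visual_derivatives: Iterable[bytes],
                   looks_like_known_csam: Callable[[bytes], bool]) -> bool:
    """Report the account only if a human reviewer confirms at least one
    derivative; crafted collision images that merely share a hash value
    would fail this check and no report would be made."""
    return any(looks_like_known_csam(d) for d in visual_derivatives)

# A stand-in reviewer callback that rejects everything (e.g. hash-colliding noise):
print(review_account([b"noise-1", b"noise-2"], lambda d: False))  # False: no report
```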
 
The German politician has no idea how this CSAM detection works and prefers a less private way of child-porn scanning.
No. It is pretty clear he has a good technical understanding. There is or will be software on the phone that has access to cleartext versions of the data, which it can hash.

The good news is that as a member of parliament he has some ability to outlaw this kind of thing; then Apple would not be able to sell phones in either Germany or maybe the whole EU. That would force Apple to change.

The only REAL solution is to open up the phone so people can run whatever software they like. Make the phone more like a Mac. Yes, Apple will argue about malware on the phone, but this is not a big problem with Macs. This could be done by law too.
 