Apple Privacy Chief Explains 'Built-in' Privacy Protections in Child Safety Features Amid User Concerns

It is illegal to even possess those pictures, so any cloud company can be held liable if someone uploads them to its servers. That's why many people understand scans being done server-side. But here, Apple is doing the scans locally on users' iPhones. That's what most people are not comfortable with, since the system itself lacks any transparency.
Incarcerating junkies doesn't solve the drug problem, and putting people who consume this content in prison doesn't stop the abuse. All this does is increase the demand for new content, because new content is less likely to have a hash than old content. So this policy violates most people's rights while likely also increasing abuse rates.

It seems the people who suffer from this practice are the innocent, the ill, and the children while the people who produce the content see increased profit from increased demand for new content.

Who, exactly, is this helping?
 
Incarcerating junkies doesn't solve the drug problem, and putting people who consume this content in prison doesn't stop the abuse. All this does is increase the demand for new content, because new content is less likely to have a hash than old content. So this policy violates most people's rights while likely also increasing abuse rates.

Who, exactly, is this helping?
Yup. Makes you wonder what the actual purpose is of setting up this completely opaque system, hard-coded into the OS, that can scan users' iPhones.
 
The scans are still being done whether it's a US iPhone or not, since the hashes are coded into iOS itself.
As for other countries, NCMEC does have international counterparts, so I guess they also have their own databases of hashes. This is assuming that Apple only accepts hashes from these bodies, and that they're not compromised in any way, shape, or form.

The tricky part is that, since this involves judgments of morality, some countries have different standards than others, and thus what is considered illegal can differ.
Yes, exactly. There'll presumably be divergence unless this database is international.
 
Neuenschwander was then asked if Apple could be forced to comply with laws outside the United States that may force it to add things that are not CSAM to the database to check for them on-device, to which he explained that there are a "number of protections built-in" to the service.
The hash list is built into the operating system, we have one global operating system and don't have the ability to target updates to individual users and so hash lists will be shared by all users when the system is enabled. And secondly, the system requires the threshold of images to be exceeded so trying to seek out even a single image from a person's device or set of people's devices won't work because the system simply does not provide any knowledge to Apple for single photos stored in our service. And then, thirdly, the system has built into it a stage of manual review where, if an account is flagged with a collection of illegal CSAM material, an Apple team will review that to make sure that it is a correct match of illegal CSAM material prior to making any referral to any external entity. And so the hypothetical requires jumping over a lot of hoops, including having Apple change its internal process to refer material that is not illegal, like known CSAM and that we don't believe that there's a basis on which people will be able to make that request in the U.S. And the last point that I would just add is that it does still preserve user choice, if a user does not like this kind of functionality, they can choose not to use iCloud Photos and if iCloud Photos is not enabled, no part of the system is functional.
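
For readers trying to picture the threshold mechanism Neuenschwander describes, here is a minimal sketch of that flow. All names (CSAMFlagger, MatchVoucher, process) are illustrative only; the real system relies on NeuralHash, private set intersection, and threshold secret sharing, none of which is reproduced here. This only shows the shape of the "no knowledge below the threshold, human review above it" gate.

```swift
import Foundation

// Hypothetical sketch of the threshold + manual-review flow described in the quote.
struct MatchVoucher {
    let accountID: String
    let photoID: String
}

final class CSAMFlagger {
    private let knownHashes: Set<String>   // shipped with the OS, same list for every user
    private let threshold: Int             // matches required before anything is surfaced
    private var vouchers: [String: [MatchVoucher]] = [:]

    init(knownHashes: Set<String>, threshold: Int = 30) {
        self.knownHashes = knownHashes
        self.threshold = threshold
    }

    /// Called for each photo as it is uploaded to the cloud library.
    /// Returns an account's vouchers only once the threshold is exceeded;
    /// below the threshold nothing about individual matches is revealed.
    func process(photoID: String, perceptualHash: String, accountID: String) -> [MatchVoucher]? {
        guard knownHashes.contains(perceptualHash) else { return nil }
        vouchers[accountID, default: []].append(MatchVoucher(accountID: accountID, photoID: photoID))
        let matches = vouchers[accountID]!
        return matches.count > threshold ? matches : nil
    }
}

// Anything returned here would still go to human review before any referral,
// mirroring the third safeguard in the quote above.
```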
This is pretty much a non-answer. Sure, as-is the system only looks for CSAM content over a threshold, but why can't Apple be compelled to extend the system? If a government passes a law requiring Apple to look for certain things, how are they going to fight it?

The hash list is built into the operating system, we have one global operating system and don't have the ability to target updates to individual users and so hash lists will be shared by all users when the system is enabled.
But they clearly have the ability to turn the system on or off per country, so it wouldn't be that hard to add a per-country hash list as well. If the hash list truly is built into the OS, then are we going to see even more iOS updates going forward to keep the list up to date? Or does the OS have a way to download an updated list? I suspect the list will be able to be updated outside an OS update, in which case it really isn't built into the OS.
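
To illustrate the commenter's point, here is a purely hypothetical sketch of how a hash list could be keyed by region or refreshed outside a full OS update. None of these types, file names, or endpoints are real Apple APIs, and there is no indication Apple actually does this today.

```swift
import Foundation

// Hypothetical: a hash-list manifest keyed by region, loadable from the OS
// image or fetched over the network.
struct HashListManifest: Decodable {
    let version: Int
    let region: String
    let hashes: [String]
}

enum HashListSource {
    /// The list that ships inside the OS image (what "built into the OS" implies).
    static func bundled(for region: String) -> Set<String> {
        let filename = "known_hashes_\(region)"   // illustrative per-region file
        guard let url = Bundle.main.url(forResource: filename, withExtension: "json"),
              let data = try? Data(contentsOf: url),
              let manifest = try? JSONDecoder().decode(HashListManifest.self, from: data)
        else { return [] }
        return Set(manifest.hashes)
    }

    /// The scenario the commenter worries about: pulling a newer list over the
    /// network, which would mean the list is not really tied to an OS update.
    static func fetchUpdate(from url: URL, completion: @escaping (Set<String>?) -> Void) {
        URLSession.shared.dataTask(with: url) { data, _, _ in
            guard let data = data,
                  let manifest = try? JSONDecoder().decode(HashListManifest.self, from: data)
            else { return completion(nil) }
            completion(Set(manifest.hashes))
        }.resume()
    }
}
```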

This on-device scanning was a genie that should never have been let out of the bottle.
 
Yes, exactly. There'll presumably be divergence unless this database is international.
The hashes will be international: whoever they're from, Apple codes them directly into iOS. So all iPhones will have all the hashes and will scan for them. The difference might be what gets matched on the server side, depending on the region.
 
I am not saying this is wrong or right just yet. It won't even affect me, since I am not American. But I do have concerns. Any such technology can always be abused, and when it can be abused, it is only a matter of time.

But it ALWAYS could have been abused even before this, because they were scanning in the cloud already. So this new change makes no difference in that regard. It simply makes what they were already doing more private. That's a good thing.
 
I don't get this. If one is a criminal, then all that person needs to do is switch off iCloud, right? So why is this a good thing in Apple's eyes when the criminals will avoid it (as usual) and we consumers might be facing other privacy issues down the road? As Apple said themselves when the San Bernardino shooting happened - there is no such thing as a good back door.
This is just wrong.
 
At the same time we're going to leave privacy undisturbed for everyone not engaged in the illegal activity.
Okay, but what's considered illegal? In Hungary, being gay is equated with being a pedophile. So if Apple's definition of pedophilia must change with each country's local laws, then in Hungary, pictures depicting anything that seems to "promote homosexuality to minors" are illegal. There are many questionable laws, and with the wrong intentions anything can be made "technically illegal".

Apple can't decide what pedophilia is. US law does. And in other countries, the local laws do. And in many countries pedophilia is being redefined so that it can be used as a political weapon to turn people against "liberals" and people who go against traditional Christian family values, in hopes of gaining the sympathy and votes of racist, homophobic voters who unfortunately make up a good portion of the uneducated, bigoted population.
 
Incarcerating junkies doesn't solve the drug problem, and putting people who consume this content in prison doesn't stop the abuse. All this does is increase the demand for new content, because new content is less likely to have a hash than old content. So this policy violates most people's rights while likely also increasing abuse rates.

It seems the people who suffer from this practice are the innocent, the ill, and the children while the people who produce the content see increased profit from increased demand for new content.

Who, exactly, is this helping?
Ah yes, the unforeseen consequences of such a righteous endeavor that should not be opposed because the ends clearly justify the means, and how dare you even ask questions because think of just one that might be helped, and you are just so selfish.
 
But it ALWAYS could have been abused even before this, because they were scanning in the cloud already. So this new change makes no difference in that regard. It simply makes what they were already doing more private. That's a good thing.

They weren't "already" using our devices to do it.

I don't understand why those that are ok with this can't understand that concept. Maybe it's a lack of appreciation of what true ownership means?

I'm still not sure whether I'm OK with this or not, as, to be honest, I do fully understand what they're doing, how, and why. But they're using my device to do what they can just as easily do in the cloud. Yes, it's less private in the cloud. But I (and anybody else using iCloud) have accepted that already.

And if we're being 100% honest, if they're scanning in the cloud already, isn't this a step backwards from a standpoint of helping out the NCMEC? Now people with CSAM are aware of a way they can keep their filth more hidden.
 
I have my doubts about the current system as-is though.

Hypothetical: we know of instances where 17-year-olds have been filmed because they lied about their age, and the law is clear that this constitutes child pornography.

With the ability to share videos rapidly and the "revenge porn" trend, there exists the possibility that an individual downloads the above, thinking the people in the video are of age (it is difficult to tell an 18-year-old legal adult from a 17-year-old "child").

Now, are these photos/videos in the CSAM database? Could some poor coomer be in hot water despite not knowing the people in his private collection are underage?

We don’t know.
 
Now, why to do it is because, as you said, this is something that will provide that detection capability while preserving user privacy. We're motivated by the need to do more for child safety across the digital ecosystem, and all three of our features, I think, take very positive steps in that direction. At the same time we're going to leave privacy undisturbed for everyone not engaged in the illegal activity.

Rehashing the timeworn "if you ain't doing nothing illegal you ain't got nothing to worry about!" argument.
 
They weren't "already" using our devices to do it.

I don't understand why those that are ok with this can't understand that concept. Maybe it's a lack of appreciation of what true ownership means?

I'm still not sure whether I'm OK with this or not, as, to be honest, I do fully understand what they're doing, how, and why. But they're using my device to do what they can just as easily do in the cloud. Yes, it's less private in the cloud. But I (and anybody else using iCloud) have accepted that already.

And if we're being 100% honest, if they're scanning in the cloud already, isn't this a step backwards from a standpoint of helping out the NCMEC? Now people with CSAM are aware of a way they can keep their filth more hidden.

If you understand it is more private, then what's the issue? And if you don't use iCloud for photos, nothing is being scanned anyway (not that it matters, since no scan data would leave your phone if you're not uploading to iCloud). And, no, it's not a step backwards. Apple's goal here is not to prevent perverted people from having CSAM on their iPhones (THAT would be the overreach that everyone's acting like THIS is), but rather to prevent them from using their servers to store and distribute it. So this does the same thing but in a more private manner.

And let's get something straight. You own the physical device; you don't own iOS.
 
Okay, but what's considered illegal? In Hungary, being gay is equated with being a pedophile. So if Apple's definition of pedophilia must change with each country's local laws, then in Hungary, pictures depicting anything that seems to "promote homosexuality to minors" are illegal. There are many questionable laws, and with the wrong intentions anything can be made "technically illegal".

Apple can't decide what pedophilia is. US law does. And in other countries, the local laws do. And in many countries pedophilia is being redefined so that it can be used as a political weapon to turn people against "liberals" and people who go against traditional Christian family values, in hopes of gaining the sympathy and votes of racist, homophobic voters who unfortunately make up a good portion of the uneducated, bigoted population.
You definitely make a valid point. And clearly Apple has to remain the gatekeeper here. Which is kind of scary: leaving it to a company to take the right moral course when governments and conservatives fail to do so.

However, Apple hasn’t changed the game today. It could have been scanning iCloud photos for years to comply with local laws. It will always have to stand up against this kind of thing. That’s why it’s a good thing they do the checks client-side, so no data leaves your device and iCloud libraries remain encrypted.
 
And let's get something straight. You own the physical device; you don't own iOS.

I know. And Apple has historically done a good job of respecting the fact that we own the physical device while they own and control the OS. There is precedent that they respect that relationship. This goes against that precedent, and tips that balancing act in the wrong direction. Ownership of the physical device should not be dismissed just because they have full control and ownership of the OS.

If you understand it is more private, then what's the issue?

We've already accepted the less private nature of iCloud - that's their servers, not mine. Apple also re-confirmed today that they're going to be doing CSAM scans of iCloud photos anyway. So... if they're doing that part anyway, please tell me what's the point of doing the hash matching and safety voucher process on a device that's uploading to iCloud? I truly don't see any benefit to it now that they've confirmed that.
 
We've already accepted the less private nature of iCloud - that's their servers, not mine. Apple also re-confirmed today that they're going to be doing CSAM scans of iCloud photos anyway. So... if they're doing that part anyway, please tell me what's the point of doing the hash matching and safety voucher process on a device that's uploading to iCloud? I truly don't see any benefit to it now that they've confirmed that.

All checks will happen client-side, even for photos already in the cloud. I assume they will be temporarily downloaded, similar to how pattern matching is done for Faces, objects, memories, etc. This way all pics in the cloud can remain encrypted and no data leaves your device. A much more privacy-oriented approach than doing all checks server-side (sketched below).

But I admit: Apple's communication department is making a mess of everything lately. Not their engineering department.
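
A rough sketch of the client-side flow described above, under the same assumption that photos (including ones pulled back down from iCloud) are hashed on-device and only a voucher travels with the upload. The names and the plain SHA-256 digest are stand-ins; Apple's actual pipeline uses NeuralHash and encrypted safety vouchers, which this does not replicate.

```swift
import Foundation
import CryptoKit

// Illustrative stand-in for an on-device safety-voucher step.
struct SafetyVoucher {
    let photoID: String
    let matchedKnownHash: Bool   // in the real design even this bit is hidden from the server below the threshold
}

struct ClientSideChecker {
    let knownHashes: Set<String>

    /// Runs on the device, before (or instead of) any server-side inspection,
    /// so the photo bytes themselves never need to be readable by the server.
    func voucher(for photoData: Data, photoID: String) -> SafetyVoucher {
        let digest = SHA256.hash(data: photoData)
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        return SafetyVoucher(photoID: photoID, matchedKnownHash: knownHashes.contains(hex))
    }
}

// The voucher, not the photo contents, is what accompanies the upload; photos
// already in iCloud would be pulled down and run through the same routine.
```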
 
I know. And Apple has historically done a good job of respecting the fact that we own the physical device while they own and control the OS. There is precedent that they respect that relationship. This goes against that precedent, and tips that balancing act in the wrong direction. Ownership of the physical device should not be dismissed just because they have full control and ownership of the OS.

I don't understand your thinking here at all. Apple isn't trying to take your physical iPhone/iPad away from you. This has nothing to do with ownership of your physical device.

We've already accepted the less private nature of iCloud - that's their servers, not mine. Apple also re-confirmed today that they're going to be doing CSAM scans of iCloud photos anyway. So... if they're doing that part anyway, please tell me what's the point of doing the hash matching and safety voucher process on a device that's uploading to iCloud? I truly don't see any benefit to it now that they've confirmed that.

So now iCloud is going to be more private in this regard. The article you linked (which I assume is true, though they don't cite their source for the claim "Apple has confirmed that photos already uploaded to iCloud will also be checked against CSAM hashes when the feature goes live this fall") indicates that the in-the-cloud scan is for existing iCloud photos, not new ones. That of course makes sense. It's not saying your photos are going to be double-scanned: once on the device and again in the cloud. The whole POINT of this new method is to move scanning of any new photos to the device where it's much more secure.
 
The hashes will be international: whoever they're from, Apple codes them directly into iOS. So all iPhones will have all the hashes and will scan for them. The difference might be what gets matched on the server side, depending on the region.
Yeah, probably a non-replaceable *.so or *.dylib file, suuuurrrrreeee Apple! So "coded into iOS".
 
We've already accepted the less private nature of iCloud - that's their servers, not mine. Apple also re-confirmed today that they're going to be doing CSAM scans of iCloud photos anyway. So... if they're doing that part anyway, please tell me what's the point of doing the hash matching and safety voucher process on a device that's uploading to iCloud? I truly don't see any benefit to it now that they've confirmed that.
"going to be doing CSAM scans of iCloud photos anyway." Well, that's not quite what they said... they said 'previously uploaded .. iCloud photos'

It certainly seems like scanning all iCloud photos themselves going forward is not their goal. I still suspect this is an expensive processing operation that they can easily offload to the source devices, which are already running an ML/AI pipeline on every photo, making it inexpensive, processing-wise, to do there before encryption (although I guess downloaded images don't go through that pipeline today, so maybe those would need additional processing). Does this mean all iPhone images now get an EXIF 'HashCode'?
 