
MacRumors

macrumors bot
Original poster
Apr 12, 2001


In an interview with TechCrunch, Apple's Head of Privacy, Erik Neuenschwander, has responded to some of the concerns users have raised about the company's plans for new child safety features that will scan Messages and iCloud Photos libraries.


When asked why Apple is only choosing to implement child safety features that scan for Child Sexual Abuse Material (CSAM), Neuenschwander explained that Apple has "now got the technology that can balance strong child safety and user privacy," giving the company "a new ability to identify accounts which are starting collections of known CSAM."

Neuenschwander was asked if, in retrospect, announcing the Communication Safety features in Messages and the CSAM detection system in iCloud Photos together was the right decision, to which he responded:
Well, while they are [two] systems they are also of a piece along with our increased interventions that will be coming in Siri and search. As important as it is to identify collections of known CSAM where they are stored in Apple's iCloud Photos service, it's also important to try to get upstream of that already horrible situation.

When asked if Apple was trying to demonstrate to governments and agencies around the world that it is possible to scan for illicit content while preserving user privacy, Neuenschwander explained:
Now, why to do it is because, as you said, this is something that will provide that detection capability while preserving user privacy. We're motivated by the need to do more for child safety across the digital ecosystem, and all three of our features, I think, take very positive steps in that direction. At the same time we're going to leave privacy undisturbed for everyone not engaged in the illegal activity.

He was also asked whether Apple had created a framework that could be used by law enforcement to scan for other kinds of content in users' libraries, and whether it undermines Apple's commitment to end-to-end encryption:
It doesn't change that one iota. The device is still encrypted, we still don't hold the key, and the system is designed to function on on-device data... The alternative of just processing by going through and trying to evaluate users data on a server is actually more amenable to changes [without user knowledge], and less protective of user privacy... It's those sorts of systems that I think are more troubling when it comes to the privacy properties — or how they could be changed without any user insight or knowledge to do things other than what they were designed to do.
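
To make the on-device design he is describing a little more concrete, here is a rough Swift sketch of matching against a local hash list rather than evaluating photo content on a server. The hash function, the empty hash list, and the voucher type below are placeholders invented for illustration; they are not Apple's actual NeuralHash or safety voucher implementation.

```swift
import Foundation
import CryptoKit

// Placeholder for Apple's perceptual NeuralHash; here we simply hash the raw
// bytes so the sketch is runnable. The real algorithm is not reproduced.
func perceptualHash(of photoData: Data) -> String {
    SHA256.hash(data: photoData).map { String(format: "%02x", $0) }.joined()
}

// Stand-in for the hash list shipped inside the OS image -- one global list
// for every user, never fetched from or evaluated on a server.
func loadBundledHashList() -> Set<String> {
    return []   // empty in this sketch; baked into the OS on a real device
}

// A "safety voucher" accompanies an iCloud Photos upload; the photo itself is
// never scanned in plaintext on the server side.
struct SafetyVoucher {
    let matchedKnownHash: Bool
    let encryptedPayload: Data   // opaque to Apple below the match threshold
}

let knownHashes = loadBundledHashList()

func makeVoucher(for photoData: Data) -> SafetyVoucher {
    let hash = perceptualHash(of: photoData)            // computed on-device
    return SafetyVoucher(
        matchedKnownHash: knownHashes.contains(hash),   // local set membership check
        encryptedPayload: Data()                        // placeholder payload
    )
}
```

The point of this arrangement, in Neuenschwander's framing, is that evaluation happens against a single, OS-wide list on the device itself, rather than against server-side rules that could be changed without the user's knowledge.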

Neuenschwander was then asked whether Apple could be forced to comply with laws outside the United States that might require it to add things that are not CSAM to the database checked on-device, to which he explained that there are a "number of protections built-in" to the service.
The hash list is built into the operating system, we have one global operating system and don't have the ability to target updates to individual users and so hash lists will be shared by all users when the system is enabled.

And secondly, the system requires the threshold of images to be exceeded so trying to seek out even a single image from a person's device or set of people's devices won't work because the system simply does not provide any knowledge to Apple for single photos stored in our service.

And then, thirdly, the system has built into it a stage of manual review where, if an account is flagged with a collection of illegal CSAM material, an Apple team will review that to make sure that it is a correct match of illegal CSAM material prior to making any referral to any external entity.

And so the hypothetical requires jumping over a lot of hoops, including having Apple change its internal process to refer material that is not illegal, like known CSAM and that we don't believe that there's a basis on which people will be able to make that request in the U.S.

And the last point that I would just add is that it does still preserve user choice, if a user does not like this kind of functionality, they can choose not to use iCloud Photos and if iCloud Photos is not enabled, no part of the system is functional.

Neuenschwander continued that for users who are "not into this illegal behavior, Apple gains no additional knowledge about any user's cloud library," and "it leaves privacy completely undisturbed."
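
Taken together, the protections he lists amount to a layered gate. The sketch below is hypothetical, with an assumed threshold value chosen purely for illustration rather than anything Apple has published, and shows how the threshold, the manual-review step, and the iCloud Photos opt-out would each have to be satisfied before any referral could occur.

```swift
import Foundation

// Hypothetical account-side gating modelled on the protections described above.
// The type, property names, and threshold value are illustrative only.
struct AccountReviewGate {
    let matchThreshold = 30          // assumed number, not an Apple-published figure
    var iCloudPhotosEnabled: Bool
    var matchedVoucherCount = 0

    // If iCloud Photos is disabled, no part of the system is functional.
    mutating func recordUpload(voucherMatched: Bool) {
        guard iCloudPhotosEnabled else { return }
        if voucherMatched { matchedVoucherCount += 1 }
    }

    // A single matched photo (or any count below the threshold) reveals
    // nothing to Apple; only past the threshold is review even possible.
    var eligibleForHumanReview: Bool {
        iCloudPhotosEnabled && matchedVoucherCount >= matchThreshold
    }

    // Manual review by an Apple team is the last gate before any referral
    // to an external entity.
    func refer(ifReviewerConfirmedCSAM confirmed: Bool) -> Bool {
        eligibleForHumanReview && confirmed
    }
}

// Example: with iCloud Photos switched off, nothing is ever counted or flagged.
var gate = AccountReviewGate(iCloudPhotosEnabled: false)
gate.recordUpload(voucherMatched: true)
print(gate.eligibleForHumanReview)   // false
```

In this framing, the threshold and the human review sit in front of any referral to an external entity, and turning off iCloud Photos short-circuits the entire pipeline.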

See TechCrunch's full interview with Neuenschwander for more information.

Article Link: Apple Privacy Chief Explains 'Built-in' Privacy Protections in Child Safety Features Amid User Concerns
 

jimbobb24

macrumors 68040
Jun 6, 2005
Shorter: “If you’re not breaking the law, you have nothing to fear.”

I sure am glad governments never change laws, never have poorly defined laws or arbitrary enforcement, and never issue executive orders/mandates, etc., that might change my status as a law-abiding citizen at any moment.

Obviously this power could never be abused. Thank goodness. Go get those bad people with the pictures while the rest of us rest easy knowing they are not after us.
 

Lounge vibes 05

macrumors 68040
May 30, 2016
OK but this doesn’t change anything.
People who are overly paranoid about this feature are still going to be overly paranoid about this feature.
But I think the funniest thing is that the place where I’ve seen the most paranoia about this feature is Facebook.
If you use Facebook, this feature shouldn’t even slightly concern you, because your privacy is already gone.
 

Mebsat

macrumors regular
May 19, 2003
Florida
As he states, it is clear that Apple will tolerate a single CSAM file within an iCloud Photos account. They designed it to do so. So what is the point of this? That fact alone gives law enforcement a battering ram to demand access to iCloud Photos. This feature does not preclude CSAM being stored in iCloud Photos. All Apple can claim is that there is less CSAM in iCloud Photos.

If PR approved this disaster, firings must commence.
 

Lounge vibes 05

macrumors 68040
May 30, 2016
Regardless as to how Apple tries to Spin It, the chances of an iOS 15 Boycott are now real !
Highly doubt that.
99.999% of people will have their phones update to iOS 15 without them knowing, and never ever think about it ever again.
And even if you don’t update to iOS 15, who’s to say that iOS 14.8 won’t add this feature already?
Let’s be serious for a moment, millions and millions of people buy iPhones every year, this isn’t going to change that.
 

Jonas07

macrumors regular
Nov 2, 2020
Geneva, Switzerland
jimbobb24 said:
Shorter: “If you’re not breaking the law, you have nothing to fear.”

I sure am glad governments never change laws, never have poorly defined laws or arbitrary enforcement, and never issue executive orders/mandates, etc., that might change my status as a law-abiding citizen at any moment.

Obviously this power could never be abused. Thank goodness. Go get those bad people with the pictures while the rest of us rest easy knowing they are not after us.

It must be fun living in an imaginary world.
 

JosephAW

macrumors 603
May 14, 2012
Again, I don’t think Apple has a choice in the matter; it has already been decided for them by unelected government officials.
Apple waited until the last minute to make this public, before developers and beta testers discovered this “feature” on their own.

I don’t know if Apple realizes this, but by implementing CSAM detection it will be targeting a certain religious group whose religious practices and laws condone such behaviors, especially in the state of CA, signed and approved by the governor. Will they receive a religious exemption when positives are found? We will see.
 

Lounge vibes 05

macrumors 68040
May 30, 2016
If Apple had led with this interview, I think a lot of people’s concerns would have been laid to rest from the very start.
Nah.
Read these comments from after the interview.
People just love things to be outraged about.
People like to think they’re important.
Apple’s going to track my photos, the government’s going to track me with vaccines, Tesla’s going to track me while I’m in my car, while the truth is that these companies don’t care about you.
If they can sell your data to advertisers for money, they will.
If they can’t, then what’s even the point? What does Apple have to gain from this feature? Learning more about you? They don’t care about you, as long as you’re spending money on them. That’s all they care about.
Do you really think some executives at Apple Park want to search through all of your thousands of photos of your life that they couldn’t care less about? Of course they don’t.
 

tylersdad

macrumors regular
Jul 26, 2010
jimbobb24 said:
Shorter: “If you’re not breaking the law, you have nothing to fear.”

I sure am glad governments never change laws, never have poorly defined laws or arbitrary enforcement, and never issue executive orders/mandates, etc., that might change my status as a law-abiding citizen at any moment.

Obviously this power could never be abused. Thank goodness. Go get those bad people with the pictures while the rest of us rest easy knowing they are not after us.
Even if nobody else picked up on the sarcasm in your post, I did.
 
Enough with this already, please. Apple, STOP trying to brainwash us every day now. Apple, STOP feeding us the bull. I have a compromise for you. Since you did not even bother to ask what the consumer thinks about this CSAM feature, and you are bringing it this fall...

At this point, I am willing to pay $99 per year to keep my privacy to myself. It’s a win/win. Go ahead and start charging consumers to keep their privacy to themselves. We are talking billions of dollars in PROFIT.

$99 Per Year Subscription

"No iCloud Photo/Messages Scanning, Keep Your Data to Yourself and 100% Privacy"

That's Apple. That’s iPhone.



I will wait for your response...
 

Cosmosent

macrumors 68020
Apr 20, 2016
La Jolla, CA
Lounge vibes 05 said:
Highly doubt that.
99.999% of people will have their phones update to iOS 15 without them knowing, and never ever think about it ever again.
And even if you don’t update to iOS 15, who’s to say that iOS 14.8 won’t add this feature already?
Let’s be serious for a moment, millions and millions of people buy iPhones every year, this isn’t going to change that.

The vast majority of the General Public does NOT yet know about this issue.

It ONLY started hitting the wires last Thursday.
 

LeadingHeat

macrumors 65816
Oct 3, 2015
He definitely seems to skirt most of those questions, which is interesting. I still trust Apple to do the right thing for privacy, and I’m glad it’s checking hashes and not doing image scanning… but we’re definitely in troubled waters with the mob mentality going around tech forums and the general media right now. Going to be an interesting next few weeks/months.
 

jweinraub

macrumors 6502
Jun 26, 2007
Sol III
Bunch of ******** to hide the fact they will scan your photos and messages; you have to be stupid to believe it will only be for children between 0-12yo.
This is what I don’t get. How is it scanning this content? Is it just general nudity? A link to a porn site? People keep mentioning there isn’t anything to fear because it is just a hash, not a naked picture of your child running through the sprinkler. But while they are two very different things, it is the tattling feature that is more concerning. People that do store illegal content in the cloud should get what comes to them, because it is just plain stupid to trust anyone. But the iMessage scanning is, to me, and definitely should be to others, more alarming... that is what can be weaponised much more easily, I gather.
 