
MacRumors

macrumors bot
Original poster
Apr 12, 2001
63,490
30,731


The Electronic Frontier Foundation has said it is "pleased" with Apple's decision to delay the launch of its controversial child safety features, but now it wants Apple to go further and completely abandon the rollout.


Apple on Friday said it was delaying the planned features to "take additional time over the coming months to collect input and make improvements," following negative feedback from a wide range of individuals and organizations, including security researchers, politicians, policy groups, and even some Apple employees.

The planned features include scanning users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.

In its response to the announced delay, the EFF said it was "pleased Apple is now listening to the concerns" of users, but "the company must go further than just listening, and drop its plans to put a backdoor into its encryption entirely."

The statement by the digital rights group reiterated its previous criticisms of the intended features, which it has called "a decrease in privacy for all iCloud Photos users, not an improvement," and warned that Apple's move to scan messages and iCloud Photos could be legally required by authoritarian governments to encompass additional materials.

It also highlighted the negative reaction to Apple's announced plans by noting a number of petitions that have been organized in opposition to the intended move.
The responses to Apple's plans have been damning: over 90 organizations across the globe have urged the company not to implement them, for fear that they would lead to the censoring of protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children. This week, EFF's petition to Apple demanding they abandon their plans reached 25,000 signatures. This is in addition to other petitions by groups such as Fight for the Future and OpenMedia, totalling well over 50,000 signatures. The enormous coalition that has spoken out will continue to demand that user phones—both their messages and their photos—be protected, and that the company maintain its promise to provide real privacy to its users.
The suite of Child Safety Features was originally set to debut in the United States with an update to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey. It's not clear when Apple plans to roll out the "critically important" features or how it intends to "improve" them in light of so much criticism, but the company still appears determined to roll them out in some form.

Article Link: EFF Pressures Apple to Completely Abandon Controversial Child Safety Features
 

Porco

macrumors 68040
Mar 28, 2005
3,315
6,909
I don’t know how Apple are seen as the bad guy for trying to improve reporting and protection here.

Literally in the article you responded to:

“for fear that they [the plans] would lead to the censoring of protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children.”
 

ducknalddon

macrumors 6502
Aug 31, 2018
285
485
In its response to the announced delay, the EFF said it was "pleased Apple is now listening to the concerns" of users, but "the company must go further than just listening, and drop its plans to put a backdoor into its encryption entirely."

Since when are they putting a backdoor in their encryption? If the EFF wants to be taken seriously, they should get their facts right.
 

Nuvi

macrumors 65816
Feb 7, 2008
1,099
810
I don’t know how Apple are seen as the bad guy for trying to improve reporting and protection here.

The EFF don’t seem to be proposing any alternative solution.

You honestly believe private companies should start monitoring their users? You want Apple Police, Microsoft Police, Google Police, etc. gathering info for some other private organisation so they can use it for their own purposes? Shouldn’t we leave hunting down criminals to governments and law enforcement agencies, and not to shady groups that are not governed by the same laws as law enforcement?
 

laptech

macrumors 68040
Apr 26, 2013
3,572
3,944
Earth
In one of the articles MR has reported on this matter (I'm not able to find it at present), I remember reading that Apple said images are encrypted, so it would take a lot of computing power and programming to proactively scan images on iCloud servers. It is much easier, simpler, and quicker to scan for image hash values on a user's device, where there are only a few image files to scan rather than millions. Having to scan the servers on a daily basis would slow them down.
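For what it's worth, that matching step is easy to picture. A minimal sketch assuming a plain SHA-256 digest; Apple's actual design uses a perceptual hash (NeuralHash) plus private set intersection, so knownHashes, digest, and matchesKnownList below are illustrative names only:

```swift
import Foundation
import CryptoKit

// Illustrative only: Apple's real pipeline uses a perceptual hash
// (NeuralHash) and private set intersection, not a raw SHA-256 digest.
// `knownHashes` stands in for the hash database shipped to the device.
let knownHashes: Set<String> = [
    "0000000000000000000000000000000000000000000000000000000000000000" // placeholder
]

// Hash one image's raw bytes into a hex string.
func digest(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// The on-device check: hash the local file and test set membership,
// rather than decrypting and scanning millions of images server-side.
func matchesKnownList(_ imageData: Data) -> Bool {
    knownHashes.contains(digest(of: imageData))
}
```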

As for other tech companies scanning their cloud storage servers, I do not know if they encrypt images in the same manner that Apple does.
 

InGen

Suspended
Jun 22, 2020
275
935
When you receive this much backlash over a feature intended to protect kids from sexual abuse material and prevent adults from distributing said material, you know you’re doing something right.

I think many people are truly unaware of the staggering prevalence of child abuse in society. If people knew how common and widely distributed the material is, they might throw some support behind this.

Meanwhile, your government is actively tracking your location everywhere you go; QR code check-ins show what places you visit and how long you stay. CCTV exists on every corner and every traffic light, monitoring your movement patterns through facial recognition & number plates. Every time you tap and buy something you reveal more of yourself. Surprisingly, none of this makes people revolt in protest, when it should, and yet the idea of Apple implementing a child-protection feature has everyone crying “encryption!”
 

benh911f

macrumors 6502
Mar 11, 2009
427
447
“warned that Apple's move to scan messages and iCloud Photos could be legally required by authoritarian governments to encompass additional materials.”

They keep saying “authoritarian” governments in these articles. I can’t think of any government anymore that’s NOT authoritarian.
 

topgunn

macrumors 68000
Nov 5, 2004
1,556
2,060
Houston
Apple has been doing on-device scanning of photos for years, where they actually look at the content of the image so as to classify it as having donuts, mountains, or Jimmy. Now they propose to scan image hashes, not the image itself, for known child pornography, and this is pushing people over the edge.

I suppose the difference is they don’t notify authorities if you have reached an arbitrary threshold of donut pictures. But if the fear is that this technology could be abused by bad-acting nation states, the framework already exists and has for some time.
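For reference, the content-level classification I mean is exposed through Apple's Vision framework. A rough sketch using VNClassifyImageRequest (a real Vision API); error handling is trimmed and the 0.5 confidence cutoff is an arbitrary choice for illustration:

```swift
import Foundation
import Vision

// The kind of on-device scene classification Photos has done for
// years: tagging an image as containing "donut", "mountain", etc.
// This inspects image content, unlike hash matching against a list.
func labels(forImageAt url: URL) throws -> [String] {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(url: url)
    try handler.perform([request])
    let observations = request.results as? [VNClassificationObservation] ?? []
    return observations
        .filter { $0.confidence > 0.5 }   // keep only confident labels
        .map { $0.identifier }
}
```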
 

mintier

macrumors newbie
Apr 26, 2012
15
8
I understand the objections here: the system as planned relies on us trusting Apple. Personally, I believe that Apple has, at least more than any competitor, earned that trust: rather than claim that they're throwing it away, I think they're spending it wisely.

If we can assume that Apple will act in good faith then this seems like a well balanced proposal. If we can't assume that Apple will act in good faith then this proposal isn't the problem.
 

ForkHandles

macrumors 6502
Jun 8, 2012
457
1,098
It’s interesting to note that Apple solves problems. That’s at its very core.

How do they know that children are being exposed to explicit images and that they’ve got lots of CSAM images on their servers?
They’ve already looked, and for them to propose this action I suspect the issue is HUGE; otherwise they’d do nothing!

I understand a lot of forum members have concerns about privacy issues, but rather than bleating about privacy, how about suggesting better alternatives to keep our children safe!
 

Infinite Vortex

macrumors 6502a
Mar 6, 2015
541
1,107
Personally I think Apple are going to do this regardless. They may change how they represent it and how much they tell the user but they clearly want this to happen.

The only reason I think they've put it on pause is iPhone 13 sales. Once the iPhone 13 is out and iOS 15 is on enough devices, they'll just push it out. It's not like anyone at that point, once on iOS 15, is going to say no to software updates from then on.
 

rme

macrumors 6502
Jul 19, 2008
292
436
It’s interesting to note that Apple solves problems. That’s at its very core.

How do they know that children are being exposed to explicit images and that they’ve got lots of CSAM images on their servers?
They’ve already looked, and for them to propose this action I suspect the issue is HUGE; otherwise they’d do nothing!

I understand a lot of forum members have concerns about privacy issues, but rather than bleating about privacy, how about suggesting better alternatives to keep our children safe!
Inform the public about signs of child abuse? Encourage people to report it?
 

michaelsviews

macrumors 65816
Sep 25, 2007
1,476
467
New England
You honestly believe private companies should start monitoring their users? You want Apple Police, Microsoft Police, Google Police, etc. gathering info for some other private organisation so they can use it for their own purposes? Shouldn’t we leave hunting down criminals to governments and law enforcement agencies, and not to shady groups that are not governed by the same laws as law enforcement?
So you don't want Apple to use CSAM scanning? But is it OK with you that you can never delete emails, posts, tweets, and more from Apple, Google, Twitter, and others?
 

McG2k1

macrumors 6502
Jun 22, 2011
341
536
Originally this was supposed to just compare the hash of photos to the hash of existing known child porn as it went into or out of encryption. I don’t really have a problem with that. Active AI scanning of actual images seems like a great idea, but its encryption back-door requirements and near-instant potential for misuse by governments worldwide are shocking.

If you don’t understand this, just ask yourself if you think someplace inside Apple there will be a server full of child porn waiting to be compared to images on your phone… or will those images and services be provided by governments, who would then need direct, unencrypted, instant access to your device? Scary.
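As I understand Apple's published design, the comparison happens on the device against a blinded hash database shipped with the OS, and individual results stay unreadable until an account crosses a match threshold, at which point human review kicks in. A toy counter for the threshold idea only; MatchCounter and its numbers are made up:

```swift
// Purely illustrative: in Apple's design the per-image results are
// cryptographically hidden (threshold secret sharing) until enough
// matches accumulate; this plain counter just shows the threshold idea.
struct MatchCounter {
    let threshold: Int          // Apple later cited roughly 30 matches
    private(set) var matches = 0

    // Record one scan result; returns true once review would trigger.
    mutating func record(isMatch: Bool) -> Bool {
        if isMatch { matches += 1 }
        return matches >= threshold
    }
}
```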
 

yannxou

macrumors member
Jun 7, 2012
69
34
Barcelona
The thing is that, in a way, that tech is already there, just without the 'reporting' feature or the comparison against a database of 'matching' things. iOS has this indexing mechanism that allows you to search your photos for things like 'chair', 'wedding', 'summer', etc.

Is there information about what use Apple is making of this indexed data? Is it stored only on the user's device or also in the cloud? Is it already being shared for 'statistical analysis'?

Maybe the photos are encrypted, but what about the metadata? Under some potential laws, could governments require Apple to deliver information about users' photo metadata?
 

Saturn007

macrumors 65816
Jul 18, 2010
1,449
1,311
Ah, the slippery slope fallacy.

It's no fallacy. It's how freedoms and privacy are eroded, and how authoritarian, fascist governments come to power.

When you receive this much backlash over a feature intended to protect kids from sexual abuse material and prevent adults from distributing said material, you know you’re doing something right.

When Apple receives this much backlash over a feature intended to protect kids from sexual abuse material and prevent adults from distributing said material, you know it's doing something wrong.

There. Fixed it for you!
 

threesixty360

macrumors 6502a
May 2, 2007
699
1,362
When you receive this much backlash over a feature intended to protect kids from sexual abuse material and prevent adults from distributing said material, you know you’re doing something right.

I think many people are truly unaware of the staggering prevalence of child abuse in society. If people knew how common and widely distributed the material is, they might throw some support behind this.

Meanwhile, your government is actively tracking your location everywhere you go; QR code check-ins show what places you visit and how long you stay. CCTV exists on every corner and every traffic light, monitoring your movement patterns through facial recognition & number plates. Every time you tap and buy something you reveal more of yourself. Surprisingly, none of this makes people revolt in protest, when it should, and yet the idea of Apple implementing a child-protection feature has everyone crying “encryption!”
Absolutely!
It's mind-blowing how people are not prepared to do EVERYTHING possible to stop this kind of abuse. People need to get it into their heads that if they are online in any way, there is no such thing as complete privacy. Someone will be able to get into whatever you think you are protecting. Your location, everything... it's all tracked. You're fighting for an illusion.
 

AndiG

macrumors 6502a
Nov 14, 2008
979
1,860
Germany
The bad thing about this "delay" is that the framework will already be installed, just not activated.

CSAM scanning is really bad because it demonstrates that Apple doesn't care about privacy. Money talks. I'm sure that neither Tim nor Craig stopped the official rollout; the marketing department did.

Another example of the danger of Apple being a single point of failure is showing up right now:
"Russia threatens to fine Apple, Google unless they remove Navalny app, Ifax reports"
https://www.reuters.com/technology/...y-remove-navalny-app-ifax-reports-2021-09-02/

So it is absolutely clear that regimes like Russia, China (you name it) will abuse CSAM scanning to remove anything from phones they don't want. Or even worse, what about people who have a picture of Navalny on their phone? Find ’em, remove ’em! One wrong picture is enough to mark you as a suspicious person.
Photos of women not covered as the religion demands? Find the owners of those phones and send ’em to jail.

Apple once fought Orwell's 1984, only to worship surveillance now…
 