
MacRumors

macrumors bot
Original poster
Apr 12, 2001
63,537
30,847


Apple has quietly nixed all mentions of CSAM from its Child Safety webpage, suggesting its controversial plan to detect child sexual abuse images on iPhones and iPads may hang in the balance following significant criticism of its methods.


Apple in August announced a planned suite of new child safety features, including scanning users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.

Following their announcement, the features were criticized by a wide range of individuals and organizations, including security researchers, the privacy whistleblower Edward Snowden, the Electronic Frontier Foundation (EFF), Facebook's former security chief, politicians, policy groups, university researchers, and even some Apple employees.

The majority of criticism was leveled at Apple's planned on-device CSAM detection, which was lambasted by researchers for relying on dangerous technology that bordered on surveillance, and derided for being ineffective at identifying images of child sexual abuse.

Apple initially attempted to dispel misunderstandings and reassure users by releasing detailed information, sharing FAQs and other new documents, and making company executives available for interviews to allay concerns.

However, despite Apple's efforts, the controversy didn't go away. Apple eventually went ahead with the rollout of Communication Safety for Messages, which went live earlier this week with the release of iOS 15.2, but it decided to delay the rollout of CSAM detection following the torrent of criticism it clearly hadn't anticipated.

Apple said its decision to delay was "based on feedback from customers, advocacy groups, researchers and others... we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."

The above statement was added to Apple's Child Safety page, but it has now been removed, along with all mentions of CSAM, raising the possibility that Apple has kicked the plan into the long grass or abandoned it altogether. We've reached out to Apple for comment and will update this article if we hear back.

Update: Apple spokesperson Shane Bauer told The Verge that although the CSAM detection feature is no longer mentioned on its website, Apple's plans have not changed since September, meaning the feature is still slated to arrive in the future.

"Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features," Apple said in September.

Article Link: Apple Removes All References to Controversial CSAM Scanning Feature From Its Child Safety Webpage [Updated]
 
Last edited:

DaveFlash

macrumors member
Nov 6, 2010
53
90
Better. This was bound to fail from the start: you'd only need one bad actor feeding Apple's system wrong hashes and everyone becomes a potential suspect for whatever that actor's government wants to silence, be it criticism, dissent, protestors in Hong Kong, LGBT minorities in certain regions, you name it. Also, as an EU citizen I'm glad, as the system Apple proposed wouldn't have been allowed here anyway because of the strong protections in our GDPR privacy laws.
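To make the concern concrete, here is a minimal Swift sketch of a naive blocklist matcher. The names and the plain SHA-256 comparison are purely illustrative, not Apple's actual NeuralHash and private-set-intersection design, but the point it shows is the same: the matcher cannot tell why a hash is on the list, so whoever supplies the list decides what gets flagged.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only: a matcher that flags any photo whose hash appears
// in a provider-supplied blocklist. Apple's proposal used a perceptual hash
// (NeuralHash) plus private set intersection, not a raw file hash like this.
struct HashMatcher {
    let blocklist: Set<String>  // hashes supplied by whoever maintains the list

    func isFlagged(_ imageData: Data) -> Bool {
        let digest = SHA256.hash(data: imageData)
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        return blocklist.contains(hex)
    }
}

// The worry raised above: if a bad actor slips the hash of a protest flyer or
// any other lawful image into the list, the same mechanism flags it exactly
// as it would flag CSAM. (Placeholder strings stand in for real hashes.)
let poisonedList: Set<String> = [
    "hash-of-known-csam-image",       // what the list is supposed to contain
    "hash-of-perfectly-legal-image"   // what a bad actor could add
]
let matcher = HashMatcher(blocklist: poisonedList)
// matcher.isFlagged(photoData) would now return true for the injected content too.
```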
 

Jim Lahey

macrumors 68020
Apr 8, 2014
2,482
5,091
If I ever find out this has been surreptitiously added without my knowledge then I will sell every Apple device I own and never buy another. Anyone who doesn’t have an issue with this has no clue about the concept of mission creep. If these systems are allowed to exist then it’s only a matter of time before the Feds batter your door in for having a [insert future dictator here] meme in your iCloud library. The road to hell is paved with good intentions.
 

Morgenland

macrumors 65816
May 28, 2009
1,476
2,204
Europe



Well, so far so good.
However, neither those responsible nor their public clowns, the people who originally initiated this espionage attempt, have ever been made public.

CSAM? LOL!
In its own interest, Apple should demonstrate more assertively that it has removed these forces internally in the interest of privacy. That could restore the trust Apple so desperately needs for its future projects. Strictly speaking, the 'CSAM' initiative was a business-damaging incident for Apple. Apple needs to defend itself better and more consistently, at least as far as the consultants and other people involved in it are concerned.

The fact that Apple does not position itself more clearly around this potential USP (compared to Android) worries me.
 
Last edited:

Royksöpp

macrumors 68020
Nov 4, 2013
2,249
3,750
Smart move. It’s best for them to pull the plug before it blows up in their faces.
 

oldoneeye

macrumors regular
Sep 23, 2014
134
418
I am skeptical. My guess is that they will either turn it on (or they have already turned it on) without further fanfare -- "quietly" like the way they do many things -- or they're still working on it and it will come soon.
I applaud your skepticism, but I think they'd be foolish to risk so much reputational damage if it ever leaked that they apparently listened and then snuck it through without any communication.
 
Last edited:

BulkSlash

macrumors 6502
Aug 20, 2013
267
697
This is good news, I think. Apple made a big song and dance about how iOS is a single image distributed worldwide and how they would allow security researchers to monitor changes to the CSAM hash database for transparency. For them to remove all references to CSAM and then go ahead and quietly implement it without any of that promised oversight would destroy the last few shreds of credibility their claims of caring about privacy have left. So hopefully this means it's dead and buried.
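A rough sketch of what that promised oversight could have looked like in practice, assuming Apple published a verifiable hash of the on-device database as it indicated it would. The function name, file path, and published value below are invented placeholders, not real Apple artifacts.

```swift
import Foundation
import CryptoKit

// Hypothetical verification step: recompute the hash of the database shipped
// on a device and compare it with a value the vendor publishes, so a
// researcher can confirm every device carries the same, unmodified list.
// Path and published value are placeholders for illustration only.
func databaseMatchesPublishedRoot(fileURL: URL, publishedRootHash: String) throws -> Bool {
    let data = try Data(contentsOf: fileURL)
    let digest = SHA256.hash(data: data)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return hex == publishedRootHash.lowercased()
}

// Usage (values invented for the example):
// let ok = try databaseMatchesPublishedRoot(
//     fileURL: URL(fileURLWithPath: "/path/to/csam-hash-db.bin"),
//     publishedRootHash: "root-hash-from-a-published-support-article")
```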
 

Mayo86

macrumors regular
Nov 21, 2016
104
304
Canada
All hail the "screeching voices of the minority".

I am having a hard time grasping the necessity of implementing this would-have-been feature. You publicly announce you are scanning users' iCloud Photos for known sexual abuse images. Now, MONTHS later, the probability that pedophiles would use iCloud to store their filth has decreased substantially.

Now there are a few outcomes from this:

1. Pedophiles are now nervous about storing their filth in iCloud Photos, thereby sparing Apple any potential legal damage anyway. Whether the "feature" will ever be released in some form or another remains to be seen, as this may have been a scare tactic aimed at pedophiles from the get-go.

2. A large amount of distrust in Apple, including from myself on so many personal levels, for the huge misjudgment of believing this would be acceptable. Don't tell me "I have nothing to hide, so scan away" - you're missing the point entirely.

3. Implementation done without users' knowledge that gets dug up years from now, after we've clicked "I agree" on vague Terms and Conditions. Lawsuit, anyone?


I am watching this topic closely. And so is my wallet.

"Screeching voices" my a**.
 