
MacRumors

macrumors bot
Original poster
Apr 12, 2001
54,475
16,528


Amid the ongoing controversy around Apple's plans to implement new child safety features that would involve scanning messages and users' photo libraries, Facebook's former security chief, Alex Stamos, has weighed into the debate with criticisms of multiple parties involved and suggestions for the future.


In an extensive Twitter thread, Stamos said that there are "no easy answers" in the debate around child protection versus personal privacy.

Stamos expressed his frustration with the way in which Apple handled the announcement of the new features and criticized the company for not engaging in wider industry discussions around the safety and privacy aspects of end-to-end encryption in recent years.
Apple was invited but declined to participate in these discussions, and with this announcement they just busted into the balancing debate and pushed everybody into the furthest corners with no public consultation or debate.

Likewise, Stamos said that he was disappointed with various NGOs, such as the Electronic Frontier Foundation (EFF) and National Center for Missing & Exploited Children (NCMEC), for leaving little room for discussion in their public statements. The NCMEC, for example, called Apple employees who questioned the privacy implications of the new features "the screeching voices of the minority." "Apple's public move has pushed them to advocate for their equities to the extreme," Stamos explained.

Stamos urged security researchers and campaigners who were surprised at Apple's announcement to pay closer attention to the global regulatory environment, and speculated that the UK's Online Safety Bill and the EU's Digital Services Act were instrumental in Apple's move to implement the new child safety features.
One of the basic problems with Apple's approach is that they seem desperate to avoid building a real trust and safety function for their communications products. There is no mechanism to report spam, death threats, hate speech, NCII, or any other kinds of abuse on iMessage.
He also said that Apple does not have sufficient functions for trust and safety, and encouraged Apple to create a reporting system in iMessage, roll out client-side ML to prompt users to report something abusive, and staff a child safety team to investigate the worst reports.
Instead, we get an ML system that is only targeted at (under) 13 year-olds (not the largest group of sextortion/grooming targets in my experience), that gives kids a choice they aren't equipped to make, and notifies parents instead of Apple T&S.
Stamos said that he did not understand why Apple is scanning for CSAM locally unless iCloud backup encryption is in the works, and warned that Apple may have "poisoned" opinion against client-side classifiers.
I also don't understand why Apple is pushing the CSAM scanning for iCloud into the device, unless it is in preparation for real encryption of iCloud backups. A reasonable target should be scanning shared iCloud albums, which could be implemented server-side.

In any case, coming out of the gate with non-consensual scanning of local photos, and creating client-side ML that won't provide a lot of real harm prevention, means that Apple might have just poisoned the well against any use of client-side classifiers to protect users.
Nevertheless, Stamos highlighted that by scanning for images with known CSAM matches, Facebook caught 4.5 million users posting child abuse images, and that this is likely only a proportion of the overall number of offenders.
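The known-match scanning Stamos refers to can be sketched in simplified form. Real systems (Microsoft's PhotoDNA, Apple's NeuralHash) use perceptual hashes that survive resizing and re-encoding; the exact cryptographic hash below is a deliberate simplification, and the hash database is hypothetical:

```python
import hashlib

# Hypothetical database of digests for known flagged images. Production
# systems use perceptual hashes; SHA-256 only matches byte-identical files.
KNOWN_HASHES = {
    # SHA-256 of the bytes b"foo", standing in for a flagged image
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def matches_known(image_bytes: bytes, known: set) -> bool:
    """Return True if the image's SHA-256 digest is in the known-hash set."""
    return hashlib.sha256(image_bytes).hexdigest() in known
```

Note that with an exact hash, any altered copy of a flagged image slips past the check, which is why deployed systems trade cryptographic exactness for perceptual robustness.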

Article Link: Facebook's Former Security Chief Discusses Controversy Around Apple's Planned Child Safety Features
 

JosephAW

macrumors 601
May 14, 2012
4,236
5,123
Obviously Stamos doesn’t actually use iMessage. Yes, there is a mechanism to report spam within iMessage; many times I have marked a particular iMessage as spam, and when Apple receives enough reports they block that particular iCloud account.
 

MJaP

macrumors regular
Mar 14, 2015
167
474
Wow, it's like a mini cancel-culture starting to form here... "he's from Facebook so his views should be mocked with a snide comment and disregarded"... you learn by listening, not by shutting down conversations.
 

ersan191

macrumors 68000
Oct 26, 2013
1,553
2,899
iMessage isn’t a social media platform…. It doesn’t need a report function.

“I also don't understand why Apple is pushing the CSAM scanning for iCloud into the device, unless it is in preparation for real encryption of iCloud backups.”

Bingo, in my opinion.
 

thefourthpope

Contributor
Sep 8, 2007
1,162
441
DelMarVa
Regardless, this guy seems to know his stuff. I guess it’s one thing to be good at your job, and another to know when to toe the line when it comes to a company like Facebook.
Not being a user, I’ve never spent time on how Facebook operates internally. But it’s not hard to imagine someone like this, being good at and thoughtful towards their job, still not being able to push the company in the right direction.
 

I7guy

macrumors Penryn
Nov 30, 2013
27,393
15,744
Gotta be in it to win it
So this guy is advocating for more transparency (which would have been a good thing) and hooks into iMessage for reporting content. In my opinion, while I can see the benefits, the pitfalls of abuse of the system are there too.
 

thefourthpope

Contributor
Sep 8, 2007
1,162
441
DelMarVa
So this guy is advocating for more transparency (which would have been a good thing) and hooks into iMessage for reporting content. In my opinion, while I can see the benefits, the pitfalls of abuse of the system are there too.
Indeed. That’s the “no easy answers” part.
 

PBG4 Dude

macrumors 68040
Jul 6, 2007
3,702
3,521
iMessage isn’t a social media platform…. It doesn’t need a report function.

“I also don't understand why Apple is pushing the CSAM scanning for iCloud into the device, unless it is in preparation for real encryption of iCloud backups.”

Bingo, in my opinion.
Too bad the way Apple has announced their new system has been all stick and no carrot. “Yes please, I’d like to sign up to have my phone snitch on me.”
 

DoctorTech

macrumors 6502a
Jan 6, 2014
733
1,940
Indianapolis, IN
"Screeching voices of the minority" - wow...
I am not an expert on the work of NCMEC but they clearly have a noble goal. That may lead them to view anyone who disagrees with any position they take as being "evil" for not supporting their ideas of HOW to achieve that noble goal. Using language such as "screeching voices of the minority" is a prime example of how we can't have a calm discussion about serious issues. Their position seems to be, "I am smart, I am good, therefore anyone who disagrees with me is automatically dumb and/or evil..."

I have never questioned Apple's intentions in this matter. I believe from the bottom of my heart Apple is rolling out these changes with good intentions. I do have concerns that other entities will seize upon this precedent to attempt to force Apple to scan for other "harmful content" in the future such as images of protest flyers, drugs, firearms, "hate speech" (however that is defined in the future), etc. I am just tired of the people on here who accuse anyone concerned about privacy of "supporting child exploitation".
 

xyz667

macrumors newbie
Aug 9, 2021
4
7
“I also don't understand why Apple is pushing the CSAM scanning for iCloud into the device, unless it is in preparation for real encryption of iCloud backups.”

Bingo, in my opinion.
According to Reuters they gave up on E2EE iCloud years ago:
 