
MacRumors

macrumors bot
Original poster
Apr 12, 2001
59,694
23,866


More than a dozen prominent cybersecurity experts hit out at Apple on Thursday for relying on "dangerous technology" in its controversial plan to detect child sexual abuse images on iPhones (via The New York Times).


The damning criticism came in a new 46-page study in which researchers examined plans by Apple and the European Union to monitor people's phones for illicit material, calling the efforts ineffective and dangerous strategies that would embolden government surveillance.

Announced in August, the planned features include client-side (i.e. on-device) scanning of users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.

According to the researchers, documents released by the European Union suggest that the bloc's governing body is seeking a similar program that would scan encrypted phones both for child sexual abuse material and for signs of organized crime and terrorist-related imagery.

"It should be a national-security priority to resist attempts to spy on and influence law-abiding citizens," said the researchers, who added they were publishing their findings now to inform the European Union of the dangers of its plan.
"The expansion of the surveillance powers of the state really is passing a red line," said Ross Anderson, a professor of security engineering at the University of Cambridge and a member of the group.

Aside from surveillance concerns, the researchers said, their findings indicated that the technology was not effective at identifying images of child sexual abuse. Within days of Apple's announcement, they said, people had pointed out ways to avoid detection by editing the images slightly.
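For readers wondering how a slight edit could defeat this kind of matching, the toy sketch below may help. It is emphatically not Apple's NeuralHash or its matching protocol, and every name in it is hypothetical; it only illustrates the general idea the researchers point to: client-side matching compares a compact perceptual hash of each photo against a database of known-bad hashes, and a small, targeted edit can flip enough bits to push an image outside the match threshold.

```swift
// Toy illustration only: NOT Apple's NeuralHash or matching protocol.
// It shows why a tiny, targeted edit can change a perceptual hash.

/// A simple "difference hash" over a small grayscale grid: each bit records
/// whether a pixel is brighter than its right-hand neighbour.
func differenceHash(_ pixels: [[Double]]) -> [Bool] {
    var bits: [Bool] = []
    for row in pixels {
        for i in 0..<(row.count - 1) {
            bits.append(row[i] > row[i + 1])
        }
    }
    return bits
}

/// Number of differing bits between two hashes (Hamming distance).
func hammingDistance(_ a: [Bool], _ b: [Bool]) -> Int {
    return zip(a, b).filter { $0.0 != $0.1 }.count
}

// A 4x4 "photo" and a copy with a barely visible edit.
let original: [[Double]] = [
    [0.10, 0.20, 0.30, 0.40],
    [0.50, 0.45, 0.40, 0.35],
    [0.80, 0.10, 0.70, 0.20],
    [0.25, 0.26, 0.27, 0.28],
]
var edited = original.map { row in row.map { min(1.0, $0 + 0.02) } } // uniform brightening changes no bits
edited[1][2] += 0.08 // nudging a single pixel flips one comparison bit

let h1 = differenceHash(original)
let h2 = differenceHash(edited)

// A client-side scanner would flag a photo only if its hash falls within a
// small distance of an entry in a known-CSAM hash database; flipping even a
// bit or two can push an image outside that threshold.
print("Bits flipped by a near-invisible edit:", hammingDistance(h1, h2))
```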

"It's allowing scanning of a personal private device without any probable cause for anything illegitimate being done," added another member of the group, Susan Landau, a professor of cybersecurity and policy at Tufts University. "It's extraordinarily dangerous. It's dangerous for business, national security, for public safety and for privacy."
The cybersecurity researchers said they had begun their study before Apple's announcement, and were publishing their findings now to inform the European Union of the dangers of its own similar plans.

Apple has faced significant criticism from privacy advocates, security researchers, cryptography experts, academics, politicians, and even employees within the company for its decision to deploy the technology in a future update to iOS 15 and iPadOS 15.

Apple initially endeavored to dispel misunderstandings and reassure users by releasing detailed information, FAQs, new supporting documents, interviews with company executives, and more to allay concerns.

However, when it became clear that this wasn't having the intended effect, Apple acknowledged the negative feedback and announced in September a delay to the rollout of the features to give the company time to make "improvements" to the CSAM system, although it's not clear what those would involve or how they would address concerns.

Apple has also said it would refuse demands by authoritarian governments to expand the image-detection system beyond pictures of children flagged by recognized databases of child sex abuse material, although it has not said that it would pull out of a market rather than obeying a court order.

Article Link: Apple's Proposed Phone-Scanning Child Safety Features 'Invasive, Ineffective, and Dangerous,' Say Cybersecurity Researchers in New Study
 

_Spinn_

macrumors 68040
Nov 6, 2020
3,947
8,546
Wisconsin
Apple has also said it would refuse demands by authoritarian governments to expand the image-detection system beyond pictures of children flagged by recognized databases of child sex abuse material, although it has not said that it would pull out of a market rather than obeying a court order.
I just don’t see how Apple thinks this is even feasible. How do they expect to ignore the laws of a local government?

The whole idea of scanning content locally on someone’s phone is a terrible idea that will eventually be abused.
 

LV426

macrumors 68000
Jan 22, 2013
1,659
1,857
“Apple has also said it would refuse demands by authoritarian governments to expand the image-detection system”

Apple cannot refuse such demands if they are written into a nation’s law, so this is a worthless promise. The UK government has the power (since 2016) to compel Apple – amongst others – to provide technical means of obtaining the information they want. But, worse than that, Apple are not permitted to divulge the fact that any such compulsion order has been made. They must, by law, keep those measures secret. It’s all very very Big Brother.
 

iGobbleoff

macrumors 6502
May 2, 2011
350
467
A lot of confusion/scaremongering here: they are not scanning phones. They are looking at photos that are on Apple's servers. How can anyone object to this unless you have something to hide? Why would you object when it helps children?
Because it’s the beginning of a slippery slope of scanning your device for anything else that some group can think of. Phones are now nothing but trackers for big tech and governments to abuse.
 

H2SO4

macrumors 603
Nov 4, 2008
5,228
6,455
A lot of confusion/scaremongering here: they are not scanning phones. They are looking at photos that are on Apple's servers. How can anyone object to this unless you have something to hide? Why would you object when it helps children?
Are you sure?
Announced in August, the planned features include client-side (i.e. on-device) scanning of users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.
 

Shirasaki

macrumors G5
May 16, 2015
14,150
8,837
A lot of confusion/scaremongering here: they are not scanning phones. They are looking at photos that are on Apple's servers. How can anyone object to this unless you have something to hide? Why would you object when it helps children?
If the government asked you to permanently drive under 30 km/h because "it is safe for children", even on the highway, would you do it?

Would you like to stop watching and reading anything but kids' books and kids' TV shows? The world is scary and bad news hurts kids, plus technical stuff is useless for kids anyway.

Everything can be banned nowadays because "it's harmful for kids otherwise". So do we reduce ourselves to mindless robots only serving the machine?
 

Shirasaki

macrumors G5
May 16, 2015
14,150
8,837
So another point here is that the solution is not only privacy-invasive, but also ineffective.
After all, no person in his or her right mind would upload illegal pictures to cloud services. Supporters of the CSAM plan tend to ignore this.
Yeah. Any pedo who uploads photos to any cloud service deserves to be caught. Professional criminals even build their own cell services, by the way. Apple's approach will not help catch one more pedo than we do today, but assuming every single person is guilty until proven otherwise may also open another can of worms where law enforcement may be authorised to prosecute anyone without consequences, just because of one photo.
 

JaydenA

macrumors newbie
Sep 22, 2021
15
19
I think the telling part of how Apple's anti-CSAM plans will increasingly be leveraged by countries around the world is in this part of the article:

"According to the researchers, documents released by the European Union suggest that the bloc's governing body are seeking a similar program that would scan encrypted phones for both child sexual abuse as well as signs of organized crime and terrorist-related imagery."

Soon enough, with Apple enabling the tech, each country will come and ask for more and more things to be detected and reported.

Apple won't stop selling into the EU, so it will need to add detection and reporting of photos of guns and bombs, and it won't stop selling into Australia, so it will need to report any cartoons of Bart Simpson naked.
 

laptech

macrumors 68020
Apr 26, 2013
2,082
2,625
Earth
The problem with Apple over this issue is not what they are saying but what they are not saying. It appears that everyone who has been critical of Apple's plan to introduce CSAM detection agrees that there need to be ways to prevent child pornography, BUT the system Apple has introduced can be manipulated in other ways, and it is this that has got critics of the system worried.

Critics of the system are being very careful about the wording they use to convey their disgust with the system, and Apple, for their part, are being very careful in their responses. Basically it comes down to this: if the system were to go live and a government then used its country's laws to try to force Apple to make the CSAM system search for other material too, would Apple comply, or would they say no? The consequences of saying no to a country's law could mean they are forced out of the country and thus prevented from selling their products there.

This quote from Apple is where I believe the problem lies:
Apple has also said it would refuse demands by authoritarian governments to expand the image-detection system beyond pictures of children flagged by recognized databases of child sex abuse material, although it has not said that it would pull out of a market rather than obeying a court order.

People are being critical of CSAM detection being installed on Apple devices because Apple is not being forthcoming about what it would do if a country's government used its laws or court orders to make the system scan for other types of images, hence the example in the quote. Critics of the system want definite answers from Apple, but Apple is not giving them.
 

threesixty360

macrumors 6502a
May 2, 2007
606
1,096
I will put another angle on this. The technology to do this is already here; that's why the researchers were already studying the issue, because it is also being proposed by the EU. If that's the case, then countries like China will already be asking their OEMs to put this type of technology in place.

Once the genie is out of the bottle its out.

The issue will now be the oversight we have for such technology. So it's going to come down to who you trust: Apple, or Huawei, OnePlus, Samsung, Xiaomi, etc.
The difference will be that the other OEMs won't produce a white paper about it. You'll just find it's already there in your next phone. Or you won't even realise it is.
 

mrsebsin

macrumors newbie
Feb 27, 2021
25
159
Apple is not getting one more cent from my family and me until this crap is officially ended.

We have supported and used Apple products for 30+ years. We’ve never had a reason to completely abandon them until now and the only way they will listen is if we speak with our wallet.

We’ve already canceled all services we had including Apple TV+, iCloud, Apple Arcade, and Apple Card. I’ve gotten rid of both our Apple TVs. We went with Sony XM4 headphones instead of AirPods Max.

My son has already jumped the Apple ship completely and my wife and I are not far behind. I will not purchase any new Apple hardware. If we absolutely need to replace our iPhones, iPads, or MacBook, it will either be used or outside of the Apple ecosystem.

Tim Apple and co have lost the f-n plot. Not one more cent.
 

jamajuel

macrumors newbie
Dec 11, 2020
6
11
If the government asked you to permanently drive under 30 km/h because "it is safe for children", even on the highway, would you do it?

This is pretty nonsensical. There is a clear consensus that you should not drive faster than 30 km/h in dense urban areas, but not on highways. So clearly it is not black and white, and neither are these measures.

Would you like to stop watching and reading anything but kids' books and kids' TV shows? The world is scary and bad news hurts kids, plus technical stuff is useless for kids anyway.

No. But again: you do want to protect kids from watching Scream at age 5. That seems like a pretty broad consensus. Not black and white.

Everything can be banned nowadays because "it's harmful for kids otherwise". So do we reduce ourselves to mindless robots only serving the machine?

Registered child abuse images/videos are not "everything". Surely this is agreeable; if not, I think we have very, very different thresholds of what's acceptable in society (and I doubt you have children).

I get the slippery slope argument, but maybe a bit more nuance would be helpful. All the while, my phone is happily scanning and analyzing my entire photo library looking for beaches, cats, and dogs. If a nefarious government wants to get that data, the only thing Apple can do is not sell phones in their jurisdiction.
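For what it's worth, the kind of on-device analysis described here is something any app can do with Apple's Vision framework. The snippet below is a minimal, generic sketch (not the Photos app's actual pipeline, and the function name and threshold are my own) showing a local image being classified entirely on the device.

```swift
import Foundation
import Vision

// Minimal sketch: classify a local photo ("beach", "cat", "dog", ...)
// entirely on the device using Apple's Vision framework.
func sceneLabels(forImageAt url: URL) throws -> [String] {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(url: url, options: [:])
    try handler.perform([request])
    let observations = request.results as? [VNClassificationObservation] ?? []
    // Keep only reasonably confident labels.
    return observations
        .filter { $0.confidence > 0.3 }
        .map { $0.identifier }
}
```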

Edit: btw, none of the above means I am "all for" Apple's planned measures, but I absolutely do believe something can and should be done to curb the proliferation of this horrible stuff.
 