
MacRumors

macrumors bot
Original poster
Apr 12, 2001
65,677
34,295


An international coalition of more than 90 policy and rights groups published an open letter on Thursday urging Apple to abandon its plans to "build surveillance capabilities into iPhones, iPads, and other products" – a reference to the company's intention to scan users' iCloud photo libraries for images of child sex abuse (via Reuters).

"Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material (CSAM), we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children," the groups wrote in the letter.
Some signatories of the letter, organized by the U.S.-based nonprofit Center for Democracy & Technology (CDT), are concerned that Apple's on-device CSAM scanning system could be subverted in nations with different legal systems to search for political or other sensitive content.
"Once this backdoor feature is built in, governments could compel Apple to extend notification to other accounts, and to detect images that are objectionable for reasons other than being sexually explicit," reads the letter.
The letter also calls on Apple to abandon planned changes to iMessage in family accounts, which would try to identify and blur nudity in children's messages, letting them view it only if parents are notified. The signatories claim that not only could the step endanger children in intolerant homes or those seeking educational material, it would also break end-to-end encryption for iMessage.

Some signatories come from countries in which there are already heated legal battles over digital encryption and privacy rights, such as Brazil, where WhatsApp has been repeatedly blocked for failing to decrypt messages in criminal probes. Other signers are based in India, Mexico, Germany, Argentina, Ghana and Tanzania. Groups that have also signed include the American Civil Liberties Union, Electronic Frontier Foundation, Access Now, Privacy International, and the Tor Project.

Apple's plan to detect known CSAM images stored in iCloud Photos has been particularly controversial and has prompted concerns from security researchers, academics, privacy groups, and others about the system potentially being abused by governments as a form of mass surveillance. The company has tried to address concerns by publishing additional documents and a FAQ page explaining how the image-detection system will work and arguing that the risk of false detections is low.

Apple has also said it would refuse demands to expand the image-detection system beyond pictures of children flagged by recognized databases of child sex abuse material, although as Reuters points out, it has not said that it would pull out of a market rather than obeying a court order.
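For readers unfamiliar with how hash-based detection works in general, here is a toy sketch of blocklist matching with a reporting threshold. It is not Apple's actual protocol (which uses a perceptual NeuralHash and cryptographic private set intersection); every name and value below is invented for illustration.

```python
# Toy sketch: count uploads whose hash appears in a known-bad blocklist,
# and flag an account only when matches reach a threshold.
# NOT Apple's implementation; all names and values are invented.
import hashlib

# Hypothetical blocklist of known-bad image hashes.
KNOWN_HASHES = {hashlib.sha256(b"known-image-%d" % i).hexdigest() for i in range(3)}
MATCH_THRESHOLD = 2  # invented value; nothing is reported below the threshold

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash; a real one tolerates resizing/re-encoding.
    return hashlib.sha256(image_bytes).hexdigest()

def match_count(uploads: list) -> int:
    """Count uploads whose hash appears in the blocklist."""
    return sum(1 for img in uploads if image_hash(img) in KNOWN_HASHES)

def account_flagged(uploads: list) -> bool:
    """Flag only when the number of matches reaches the threshold."""
    return match_count(uploads) >= MATCH_THRESHOLD

uploads = [b"vacation photo", b"known-image-0", b"known-image-1"]
assert account_flagged(uploads)           # two matches meet the threshold
assert not account_flagged([b"cat pic"])  # ordinary photos match nothing
```

The low-false-detection argument in Apple's FAQ corresponds to the threshold in this sketch: a single accidental hash collision would not, on its own, flag an account.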

Article Link: Global Coalition of Policy Groups Urges Apple to Abandon 'Plan to Build Surveillance Capabilities into iPhones'
 

cyanite

macrumors 6502
Sep 28, 2015
358
472
Again? Wasn't this article already posted?

Anyway, it's misleading from the start: it's not a backdoor in any technical sense, cloud-side scanning (which is worse for privacy) is already taking place at other photo library providers, and "scan users' photo libraries" conveniently omits that only pictures being uploaded to the cloud service are scanned.

Perhaps the signatories should read the relevant technical documents and FAQs:

 

alexiaa

macrumors member
Sep 19, 2020
53
123
Europe
Ah, here goes the "abusive parents" argument again. They conveniently leave out that it applies only to children under 13, and that Apple and the parents never learn the actual contents of the message (the parents might, if they take the child's phone). A 12-year-old is, IMO, way too young to be complaining about parents "snooping" on their internet activity; they deserve some extra protection. And it's not like they can't choose not to view the photo (there's even an explicit warning), in which case the parents will never know.
 

giggles

macrumors 65816
Dec 15, 2012
1,050
1,285
Remember that summer when an additional step in the iCloud Photos upload pipeline was treated by some people like it's "spyware", a "backdoor", a brand-new slippery capability, etc.?

And when said step, fully air-gapped from the internet and cryptographically impenetrable, was labeled "on-device scanning" even though its results are only ever verified on the server?

Hope we'll forget about this drama, based on technical misrepresentation of what is happening, sooner rather than later.
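Apple's published technical summary describes a threshold secret-sharing scheme: the server can decrypt match metadata only once an account exceeds a match threshold. As a rough illustration of that idea only (not Apple's implementation; every parameter here is invented), a Shamir-style sketch:

```python
# Toy Shamir-style threshold sharing: a demo secret is recoverable only
# once THRESHOLD shares (one per matching upload, in the analogy) are held.
# All parameters are invented; this is not Apple's implementation.
import random

P = 2**61 - 1   # prime modulus for the toy field
THRESHOLD = 3   # shares needed before anything can be reconstructed

def make_shares(secret: int, n: int, t: int = THRESHOLD):
    # Random degree-(t-1) polynomial with the secret as constant term.
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 recovers the constant term.
    total = 0
    for xi, yi in shares:
        num = den = 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total

secret = 123456789
shares = make_shares(secret, n=5)
assert reconstruct(shares[:THRESHOLD]) == secret  # enough shares: recovered
# With fewer than THRESHOLD shares, reconstruction yields an unrelated value.
```

The design point being argued above is visible here: below the threshold, the holder of the shares learns nothing useful, which is why the results only become meaningful on the server side once enough matches accumulate.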
 

mw360

macrumors 68020
Aug 15, 2010
2,067
2,476
If governments could compel Apple to change these features to do something awful, they could compel Apple to change any features to do something awful. And there are plenty of features that know far more about our data than a double-blind hash matching algorithm, and they aren't under this intense level of suspicion and analysis.
 

aroom

macrumors regular
Nov 26, 2014
145
278
Again? Wasn't this article already posted?

Anyway, it's misleading from the start: it's not a backdoor in any technical sense, cloud-side scanning (which is worse for privacy) is already taking place at other photo library providers, and "scan users' photo libraries" conveniently omits that only pictures being uploaded to the cloud service are scanned.

Perhaps the signatories should read the relevant technical documents and FAQs:

There is so much noise around this announcement, so many approximate statements and scenarios that are false or purely speculative. It's a total bummer, because the real issues should be addressed, but right now the discussion is too emotional and opinion-based.

I guess time will help decant all this noise and let us focus on how to improve privacy. Let's say that, right now, using any cloud service based in the US is not a great idea.
 

dragje

macrumors 6502a
May 16, 2012
874
681
Amsterdam, The Netherlands
Apple has also said it would refuse demands to expand the image-detection system beyond pictures of children flagged by recognized databases of child sex abuse material, although as Reuters points out, it has not said that it would pull out of a market rather than obeying a court order.

Exactly what Reuters rightfully points out. Even if Apple's intentions are 100% good, this system does create a backdoor: under the laws of any given country, Apple could be forced by court order to look for images of protestors, or political symbols, to filter out political protestors for purposes that are not good.

I'm surprised to see Apple doing this, because they seem to be the front runner of this whole privacy mantra. It contradicts everything Apple stands for.

I also find it hard to believe that Apple would pull all of their iPhones out of China if the Chinese government ordered Apple to search for things like those mentioned above.
 

Grey Area

macrumors 6502
Jan 14, 2008
433
1,030
Again? Wasn't this article already posted?

Anyway, it's misleading from the start: it's not a backdoor in any technical sense, cloud-side scanning (which is worse for privacy) is already taking place at other photo library providers, and "scan users' photo libraries" conveniently omits that only pictures being uploaded to the cloud service are scanned.

Perhaps the signatories should read the relevant technical documents and FAQs:

The open letter was published today, so no, this article was not posted earlier.

Maybe something similar was, and if so, great: more and more organizations are protesting. This will not just go away quietly. I am also glad that these protests come despite the matter involving CSAM, a touchy topic normally well suited to pushing through whatever measures. That so many have the courage to speak out against Apple on this indicates that Apple crossed a serious line, and that "think of the children" is wearing thin as an alibi.

The technical documents do not address the core objections in any satisfying way. Many people, including experts, have read these documents and still oppose the new system.
 

haruhiko

macrumors 604
Sep 29, 2009
6,685
6,235
If governments could compel Apple to change these features to do something awful, they could compel Apple to change any features to do something awful. And there are plenty of features that know far more about our data than a double-blind hash matching algorithm, and they aren't under this intense level of suspicion and analysis.
Apple could claim that they didn’t have the ability but now they no longer can.
 

giggles

macrumors 65816
Dec 15, 2012
1,050
1,285
The technical documents do not address the core objections in any satisfying way. Many people, including experts, have read these documents and still oppose the new system.

Doesn’t prove much at this stage. There is a lot of noise and politics around this at the moment. Experts are not saints (see: some covid policy experts who sold out to right-wing “just a flu” theories last year).
 

yellow8

Suspended
Mar 14, 2017
540
1,061
I think there is still clarification and analysis to be done on this matter.

- As a parent, I'm very aware of the need to protect children
- Privacy is one of Apple's selling points; they wouldn't launch something like this if it really compromised privacy

I'd love 100% clarity on this. Let's wait.
 

cyanite

macrumors 6502
Sep 28, 2015
358
472
Obviously the start of something very sinister here. I just didn't expect Apple to be the ones leading the way :/
How is that "obvious"? You're just speculating about the future. Just because something seems obvious to you doesn't mean it's true or will be.

Let's say that right now, using any cloud services based in the US in not a great idea.
Why not, though? Many millions do, and I have yet to hear of any dire consequences coming from that. People are spreading a lot of FUD, but in reality cloud-side scanning has happened for years without any publicly known major issues. Sure, you may be in a position where a different threat scenario applies, but that's not the case for the majority of people.

😱 w00t unbelievable, these “Screeching Voices of the Minority.”

But I’m sure there are still reasons to side with Apple. Apple is never wrong, Daddy Tim just want our best💰💵💴💸💶💷💳💎.
Your comment just looks like a teenage rant. Do you have any actual arguments?

"build surveillance capabilities into iPhones, iPads, and other products"

That's exactly what this new "feature" is, Tim!
How exactly is this surveillance? Can you lay that out in detail? In particular, how is this different from the cloud-side scanning everyone else does now?

Maybe something similar was, and if so, great - more and more organizations are protesting.
Well, many of the protests are clearly uninformed: even after claiming to have read the documentation, they still misrepresent how the system works. I don't mind informed protests or actual arguments. But this is almost exclusively speculation about the future, not analysis of the present.
 

cyanite

macrumors 6502
Sep 28, 2015
358
472
I think there is still clarification and analysis to be done on this matter.

- As a parent, I'm very aware of the need to protect children
- Privacy is one of Apple's selling points; they wouldn't launch something like this if it really compromised privacy

I'd love 100% clarity on this. Let's wait.
There IS 100% clarity if you just bother to read the documentation, which I already linked in an earlier comment. The system is described in detail.
 

justperry

macrumors G5
Aug 10, 2007
12,627
9,931
I'm a rolling stone.

it has not said that it would pull out of a market rather than obeying a court order.
Apple stayed in China, there you have your answer.


I'm pretty positive Apple will backtrack; there's a lot of bad noise right now, and it will only get worse.

We should make our voices heard, for instance by not buying their products for a week. 100% guaranteed that will work.
 

Wildkraut

Suspended
Nov 8, 2015
3,583
7,675
Germany
I'll go so far as to say that this CSAM mass surveillance would even have an overall negative impact regarding child porn.
Pedos would start to abuse children even more often, just to generate "new, unknown" photos that escape the CSAM scans, instead of sharing and reusing the existing ones.

It must be punished hard, lifelong jail no question, but not with mass surveillance like this.
Now think twice: what's worse, old pictures of already abused children, or new pictures of newly abused children?
In my opinion this will push "new" child abuse even more, which would be a sad outcome.
 

giggles

macrumors 65816
Dec 15, 2012
1,050
1,285
I'll go so far as to say that this CSAM mass surveillance would even have an overall negative impact regarding child porn.
Pedos would start to abuse children even more often, just to generate "new, unknown" photos that escape the CSAM scans, instead of sharing and reusing the existing ones.

It must be punished hard, lifelong jail no question, but not with mass surveillance like this.
Now think twice: what's worse, old pictures of already abused children, or new pictures of newly abused children?
In my opinion this will push "new" child abuse even more, which would be a sad outcome.

Organizations that actually deal with this problem (as opposed to armchair CSAM experts) seem to think otherwise.
Let’s not forget the “screeching voices” quote, while quoted by Apple in their internal memo, comes from NCMEC‘s mouth.
 

aroom

macrumors regular
Nov 26, 2014
145
278
Why not, though?
Well, I guess it's due to US laws.

If you want privacy, US laws are not that convenient, and not only regarding CSAM. For example, one of the selling points of a certain Swiss-based cloud service is being "NSA non-compatible".


My point being: the discussion about privacy is important, but it shouldn't focus only on this client-side, on-device mechanism. How would server-side scanning be better in any way?
 

Grey Area

macrumors 6502
Jan 14, 2008
433
1,030
Doesn’t prove much at this stage. There is much noise and politics around this at the moment. Experts are not saints (see: some covid policy experts that sold out to right wing “just a flu” theories last year).
The previous poster implied that people would not sign such protest letters if they had read Apple's documentation. However, people have read and understood* the documents and still protest. Experts alone protesting would not prove much, but the objections are easy to understand, and the documents do not invalidate them.

Claiming that protestors just lack understanding plays into Apple's people-are-confused ruse. People know and understand what Apple intends to do; that is why they protest. That is also why adding more documentation without addressing the core objections comes off as a blatant diversionary tactic: people do not care whether the gun operator has good references, or whether the safety only fails one in a trillion times - they simply do not want that gun pointed at them.

*As far as the documents allow - Apple glosses over many critical aspects of the system. The system is objectionable under any interpretation, though. More details might make it even worse.
 