
MacRumors

macrumors bot
Original poster
Apr 12, 2001


Respected university researchers are sounding the alarm over the technology behind Apple's plans to scan iPhone users' photo libraries for CSAM, or child sexual abuse material, calling it "dangerous."


Jonathan Mayer, an assistant professor of computer science and public affairs at Princeton University, and Anunay Kulshrestha, a researcher at the Princeton University Center for Information Technology Policy, penned an op-ed for The Washington Post outlining their experience building image detection technology.

The researchers started a project two years ago to identify CSAM in end-to-end encrypted online services. They note that, given their field, they "know the value of end-to-end encryption, which protects data from third-party access." That same concern, they say, is why they are horrified by CSAM "proliferating on encrypted platforms."

Mayer and Kulshrestha said they wanted to find a middle ground: a system that online platforms could use to find CSAM while still protecting end-to-end encryption. Experts in the field doubted such a system was feasible, but the pair managed to build a working prototype, and in the process they noticed a significant problem.
We sought to explore a possible middle ground, where online services could identify harmful content while otherwise preserving end-to-end encryption. The concept was straightforward: If someone shared material that matched a database of known harmful content, the service would be alerted. If a person shared innocent content, the service would learn nothing. People couldn't read the database or learn whether content matched, since that information could reveal law enforcement methods and help criminals evade detection.

Knowledgeable observers argued a system like ours was far from feasible. After many false starts, we built a working prototype. But we encountered a glaring problem.
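
For readers wondering what "matching a database of known harmful content" boils down to at its simplest, here is a minimal, purely illustrative Swift sketch. The `knownHashes` blocklist and `matchesKnownContent` helper are hypothetical, and exact SHA-256 matching stands in for the perceptual hashing (NeuralHash) and private set intersection that the researchers' prototype and Apple's actual design rely on; none of that cryptographic machinery is reproduced here.

```swift
// Illustrative sketch only. A real deployment would use a perceptual hash so
// near-duplicate images still match, and a private set intersection protocol
// so the client can neither read the database nor learn whether it matched.
import CryptoKit
import Foundation

// Hypothetical blocklist of digests of known prohibited images.
let knownHashes: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
]

/// Returns true if the image data's SHA-256 digest appears in the blocklist.
/// Exact cryptographic hashing is used here only to keep the sketch short.
func matchesKnownContent(_ imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}

// Usage: the service is alerted only on a match; a non-match reveals nothing
// about the photo. Unlike the researchers' design, this toy blocklist is
// readable by the client, which is exactly what the real protocols prevent.
let sample = Data("example image bytes".utf8)
print(matchesKnownContent(sample) ? "match -> alert service" : "no match -> learn nothing")
```

Even this toy shows where the researchers' concern lives: the hard part is not the matching itself but keeping the database unreadable and non-matches unlearnable, and whoever controls that opaque database controls what gets flagged.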
Since Apple's announcement of the feature, the company has been bombarded with concerns that the system behind detecting CSAM could be used to detect other forms of photos at the request of oppressive governments. Apple has strongly pushed back against such a possibility, saying it will refuse any such request from governments.

Nonetheless, concerns over the future implications of the technology being used for CSAM detection are widespread. Mayer and Kulshrestha said they were "disturbed" by the ways governments could use such a system to detect content other than CSAM.
A foreign government could, for example, compel a service to out people sharing disfavored political speech. That's no hypothetical: WeChat, the popular Chinese messaging app, already uses content matching to identify dissident material. India enacted rules this year that could require pre-screening content critical of government policy. Russia recently fined Google, Facebook and Twitter for not removing pro-democracy protest materials.

We spotted other shortcomings. The content-matching process could have false positives, and malicious users could game the system to subject innocent users to scrutiny.

We were so disturbed that we took a step we hadn't seen before in computer science literature: We warned against our own system design, urging further research on how to mitigate the serious downsides....
Apple has continued to address user concerns over its plans, publishing additional documents and an FAQ page. Apple continues to believe that its CSAM detection system, which will run on the user's device, aligns with its long-standing privacy values.

Article Link: University Researchers Who Built a CSAM Scanning System Urge Apple to Not Use the 'Dangerous' Technology
 

citysnaps

macrumors G3
Oct 10, 2011
Since Apple's announcement of the feature, the company has been bombarded with concerns that the system behind detecting CSAM could be used to detect other forms of photos at the request of oppressive governments. Apple has strongly pushed back against such a possibility, saying it will refuse any such request from governments.

Could. And in the end that's what it boils down to. Could.

and...

"Apple has strongly pushed back against such a possibility, saying it will refuse any such request from governments."

Some will trust Apple's assertion. Others won't.

Those who don't should VOTE WITH THEIR WALLETS. Of those who don't trust Apple, I suspect a very tiny number will step up and follow through. That requires courage.

.
 

oldoneeye

macrumors regular
Sep 23, 2014
Could. And in the end that's what it boils down to. Could.

and...

"Apple has strongly pushed back against such a possibility, saying it will refuse any such request from governments."

Some will trust Apple's assertion. Others won't.

Those who don't should VOTE WITH THEIR WALLETS. Of those who don't I suspect a very tiny number will step up and follow through. That requires courage.

.
Maybe monitor the uptake of iOS 15 as an early indicator
 

Schismz

macrumors 6502
Sep 4, 2010
Apple totally has it covered, don't worry! They obviously understand security exceedingly well. Just look at SIP, cryptographically sealed volumes, all the protections that their "walled garden" affords us. It will totally keep everybody safe, and nothing could possibly go wrong.

...except for Pegasus, which appears to turn their "security" into Swiss cheese, and demonstrates that reality is not compatible with all these narratives and marketing points.
 

PBG4 Dude

macrumors 68040
Jul 6, 2007
It's time for Apple to admit it's made a mistake. It's seriously bizarre they ever thought this was a good idea.
The conspiracy theorist in me wonders if this is Apple’s latest warrant canary, especially since the NeuralHash model was found in iOS 14.3, which Apple confirmed is “an earlier version of the current model”.
 

ececlv

macrumors regular
Sep 26, 2014
It's time for Apple to admit it's made a mistake. It's seriously bizarre they ever thought this was a good idea.
I bet they will have an announcement next week. They don't want to make the situation worse. They can't just cancel because half the users will be upset “for the kids”. So they have to come up with a coherent plan to get themselves out of this problem.
 