
MacRumors

macrumors bot
Original poster
Apr 12, 2001
54,690
16,872


Security research firm Corellium this week announced it is launching a new initiative that will "support independent public research into the security and privacy of mobile applications," and one of the initiative's first projects will be Apple's recently announced CSAM detection plans.


Since its announcement earlier this month, Apple's plan to scan iPhone users' photo libraries for CSAM, or child sexual abuse material, has received considerable backlash and criticism. The majority of concerns revolve around how the technology used to detect CSAM could be used to scan for other types of photos in a user's library, possibly at the request of an oppressive government.

Apple will check for CSAM in a user's photo library by comparing the hashes of a user's pictures against a database of known CSAM images. The company has firmly pushed back against the idea that it will allow governments to add images to or remove images from that database, refuting the possibility that content other than CSAM could get flagged if found in a user's iCloud Photo Library.
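In other words, the check compares fingerprints of images against a list of known fingerprints rather than looking at photo content itself. Apple's actual system uses NeuralHash, a perceptual hash, matched against a blinded database via private set intersection; in the simplified Swift sketch below, a plain SHA-256 digest and an in-memory set stand in for both, purely to illustrate the idea of fingerprint matching.

Code:
import Foundation
import CryptoKit

// Illustration only: a plain digest and an in-memory set stand in for
// NeuralHash and Apple's blinded hash database. The entry below is a
// made-up placeholder, not a real hash.
let knownHashes: Set<String> = [
    "placeholder0000000000000000000000000000000000000000000000000000"
]

// Fingerprint an image's bytes; only this digest is ever compared.
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

func matchesKnownDatabase(_ imageData: Data) -> Bool {
    knownHashes.contains(fingerprint(of: imageData))
}

let photo = Data("example image bytes".utf8)
print(matchesKnownDatabase(photo) ? "match" : "no match")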

In an interview with The Wall Street Journal, Apple's senior vice president of software engineering, Craig Federighi, said that the on-device nature of Apple's CSAM detection method, compared to others such as Google that perform the process in the cloud, allows security researchers to verify the company's claim that the database of known CSAM hashes is not improperly altered.
Security researchers are constantly able to introspect what's happening in Apple's software, so if any changes were made that were to expand the scope of this in some way—in a way that we had committed to not doing—there's verifiability, they can spot that that's happening.
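Apple's accompanying threat model review document says a root hash of the encrypted CSAM hash database shipped with each OS release will be published, so the copy present on a device can be compared against it. As a simplified illustration of what such an auditing check could look like (the file path and published value below are placeholders, not Apple's actual mechanism):

Code:
import Foundation
import CryptoKit

// Simplified illustration of an auditability check: hash the on-device
// encrypted database and compare it to a published value. The path and
// expected digest are placeholders, not real values or Apple's actual API.
func rootHash(ofDatabaseAt url: URL) throws -> String {
    let data = try Data(contentsOf: url)
    return SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

let databaseURL = URL(fileURLWithPath: "/path/to/encrypted-hash-database") // placeholder
let publishedRootHash = "value-published-by-the-vendor"                    // placeholder

if let localHash = try? rootHash(ofDatabaseAt: databaseURL) {
    print(localHash == publishedRootHash ? "matches published root hash" : "differs from published root hash")
}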
Corellium's new initiative, called the "Corellium Open Security Initiative," aims to put Federighi's claim to the test. As part of the initiative, Corellium will award security researchers a $5,000 grant and free access to the Corellium platform for an entire year to allow for research.

Corellium believes that this new initiative will allow security researchers, hobbyists, and others to validate Apple's claims about its CSAM detection method. The security research firm, which only recently settled its long-running legal dispute with Apple, says it applauds Apple's "commitment to holding itself accountable by third-party researchers."
We hope that other mobile software vendors will follow Apple's example in promoting independent verification of security and privacy claims. To encourage this important research, for this initial pilot of our Security Initiative, we will be accepting proposals for research projects designed to validate any security and privacy claims for any mobile software vendor, whether in the operating system or third-party applications.
Security researchers and others interested in being part of the initiative have until October 15, 2021, to apply. More details can be found on Corellium's website.

Article Link: Corellium Launching New Initiative to Hold Apple Accountable Over CSAM Detection Security and Privacy Claims
 

exmophie

macrumors newbie
Oct 23, 2019
17
93
How will a research firm trigger a CSAM flag on an image? Feed it known images from the FBI child porn database? How would they know when it's reporting a scanned image hash out to Apple without Apple cooperating with them? And will Apple play nice with a company it just sued for IP infringement over simulating iOS, and lost?
 

adib

macrumors 6502
Jun 11, 2010
493
380
Singapore
For the first few months of iOS 15, I'm confident that the database just contains CSAM image fingerprints. However, as time passes (and as Corellium's interest wanes), other authorities will push their agendas and force Apple to include "extra hashes" that are not CSAM....
 

cyanite

macrumors regular
Sep 28, 2015
105
61
Bring it on!

This article text is still a bit misleading when it says
Apple's plan to scan iPhone users' photo libraries for CSAM
since that doesn't make it clear that it's only the iCloud Photo Library (with pictures scanned as they're uploaded), and not the on-device photo library when that feature isn't used.
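Conceptually, the gating works something like the simplified Swift sketch below (invented names, not Apple's code): the match runs only as part of the iCloud Photos upload path, so photos that are never queued for upload never go through it.

Code:
import Foundation

// Conceptual sketch only; the types and names are invented for illustration.
struct Photo {
    let data: Data
    let queuedForICloudUpload: Bool
}

func runHashMatch(on data: Data) {
    // placeholder for the fingerprint comparison step
}

func prepareForUpload(_ photo: Photo) {
    // iCloud Photos off, or photo not syncing: nothing is checked.
    guard photo.queuedForICloudUpload else { return }
    // Only photos headed to iCloud reach the matching (and voucher) step.
    runHashMatch(on: photo.data)
}

prepareForUpload(Photo(data: Data(), queuedForICloudUpload: false)) // no check runs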


I bet not.

This is desperately needed. I hope Corellium is able to review what exactly Apple's intentions are, and why all of a sudden Apple wants to perform CSAM detection via iCloud.
Uhm, what? They are not performing it "via iCloud". Also, how would anyone "review" intentions? Intentions can only be speculated about. And I don't agree that it's "desperately" needed. The loudest people will not change their minds regardless.

How will a research firm trigger a CSAM flag on an image?
Read the documentation: https://www.apple.com/child-safety/...del_Review_of_Apple_Child_Safety_Features.pdf
 

cyanite

macrumors regular
Sep 28, 2015
105
61
For the first few months of iOS 15, I'm confident that the database just contains CSAM image fingerprints. However, as time passes (and as Corellium's interest wanes), other authorities will push their agendas and force Apple to include "extra hashes" that are not CSAM....
Sure, and then it'll be discovered in an audit and be a huge scandal for Apple. Do you think Apple wants that? I bet this doesn't happen. Besides, it's US-only for the time being, so let's just wait and see.
 

Substance90

macrumors 6502
Oct 13, 2011
486
730
The fact that the analysis is done on device is even worse. That means that your privacy is invaded even with all network connections turned off.

EDIT: Let me elaborate for the downvoters: if the photos are scanned only when uploaded to some cloud, you don't even have to cut your network connection. You just keep your photos on your device and you're safe. If the scanning is done on device, your privacy is not guaranteed whether you keep your photos offline or even cut your network connection entirely.
 
Last edited:

brucewayne

macrumors regular
Nov 8, 2005
249
407
Apple has been able to stave off warrant requests in the past by claiming "they don't have the key."

The current administration (as well as governments around the world) has been pushing for the ability to access your messages. CSAM detection gives Apple a chance to "create" its own backdoor under noble pretenses (who is going to argue against stopping child abuse?) while creating an opening for governments to eventually exploit. It won't matter what Corellium finds now.

And when it happens, Tim Cook will get up on stage and, in his soothing southern drawl, claim to be the good guy because they had the best of intentions. They won't even lose any customers over it, because most people are oblivious to privacy (Amazon has sold 100 million Alexa-powered products), and the people who do care will have nowhere to go once the precedent is set and Google, Amazon, and Microsoft have joined in.
 
Last edited:

star-affinity

macrumors 68000
Nov 14, 2007
1,525
788
For the first few months of iOS 15, I'm confident that the database just contains CSAM image fingerprints. However, as time passes (and as Corellium's interest wanes), other authorities will push their agendas and force Apple to include "extra hashes" that are not CSAM....
So you don't think the below applies in this case?


I guess we'll have to wait and see, and hopefully Apple will be open about what they add to that hash list. If it can also be monitored by external initiatives such as Corellium's, I think that's good.
 
  • Like
Reactions: Kengineer and IGI2

bobtem

macrumors 6502
Jun 5, 2017
285
304
As far as the CSAM stuff goes, I guess I don't really have a strong opinion on it. My gripe with "Privacy" is that the user experience has become so fractured with alerts, verifications, confirmations, and double-asks. That's not really smart security; it's just throwing up more hurdles.

You want to scan my photos? I don't really care; just make the overall user experience less disruptive.
 

brucewayne

macrumors regular
Nov 8, 2005
249
407
So you don't think the below applies in this case?


I guess we'll have to wait and see, and hopefully Apple will be open about what they add to that hash list. If it can also be monitored by external initiatives such as Corellium's, I think that's good.
I think 20 years of increasing government intrusion is enough to conclude that if A happens, Z won't be far behind.

Liberty once lost is lost forever.
 

kstotlani

macrumors 6502a
Oct 27, 2006
713
1,151
Apple Bureau of Investigation. I know it's hard to argue against protecting children, but why is it Apple that's doing the policing? I haven't been following everything, but one YouTuber mentioned that this isn't very different from the facial recognition that Apple already does on your phone. Is it?
 

giggles

macrumors 6502a
Dec 15, 2012
820
628
Apple Bureau of Investigation. I know it's hard to argue against protecting children, but why is it Apple that's doing the policing? I haven't been following everything, but one YouTuber mentioned that this isn't very different from the facial recognition that Apple already does on your phone. Is it?

Companies and banks do “the policing” all the time by reporting illegal activity they’re legally mandated to report.
 

hagar

macrumors 65816
Jan 19, 2008
1,236
2,689
The fact that the analysis is done on device is even worse. That means that your privacy is invaded even with all network connections turned off.
It's exactly the opposite. Apple has chosen the hard way (client-side) to protect your privacy, as no data leaves your device.
If the check is done server-side, your photos need to be decrypted, leaving them potentially vulnerable and accessible to Apple (and maybe other parties).
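Roughly, the contrast is something like the simplified Swift sketch below (invented names, not Apple's actual design): a server-side check has to see the photo bytes, while a client-side check only ever sends out a fingerprint derived on the device. In this sketch a report goes out only on a match; Apple's real design instead attaches an encrypted safety voucher to every uploaded photo and only lets the server learn anything once an account crosses a match threshold, which isn't modeled here.

Code:
import Foundation
import CryptoKit

struct Photo { let data: Data }

// Server-side design: the photo content itself must be readable by the service.
func serverSideCheck(_ photo: Photo, upload: (Data) -> Void) {
    upload(photo.data) // decrypted bytes are exposed to the server
}

// Client-side design: only a fingerprint derived on the device leaves it.
// (In Apple's design this would be an encrypted safety voucher, and the
// device would not learn the match result.)
func clientSideCheck(_ photo: Photo, knownHashes: Set<String>, report: (String) -> Void) {
    let digest = SHA256.hash(data: photo.data)
        .map { String(format: "%02x", $0) }
        .joined()
    if knownHashes.contains(digest) {
        report(digest)
    }
}

let sample = Photo(data: Data("example".utf8))
clientSideCheck(sample, knownHashes: []) { print("would report a voucher for \($0)") }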
 

hagar

macrumors 65816
Jan 19, 2008
1,236
2,689
Apple Bureau of Investigation. I know it's hard to argue against protecting children, but why is it Apple that's doing the policing? I haven't been following everything, but one YouTuber mentioned that this isn't very different from the facial recognition that Apple already does on your phone. Is it?
That has nothing to do with it. Apple doesn't scan the photos or look at what's in them. A naked picture of your partner or child will not be scanned, shared, or viewed by Apple.
 

giggles

macrumors 6502a
Dec 15, 2012
820
628
The fact that the analysis is done on device is even worse. That means that your privacy is invaded even with all network connections turned off.
So in your opinion, a cryptographically air-gapped device is less private than servers that are potentially accessible (like, physically accessible) to thousands of employees and government officials.

Interesting theory.
 

PlayUltimate

macrumors 6502a
Jul 29, 2016
614
871
Boulder, CO
So in your opinion, a cryptographically air-gapped device is less private than servers that are potentially accessible (like, physically accessible) to thousands of employees and government officials.

Interesting theory.
Apple has always granted law enforcement access to iCloud storage. This scanning for CSAM does two things: 1) it protects the iPhone user from uploading inappropriate images, and 2) it protects Apple from hosting inappropriate images that would be subject to law enforcement scrutiny. I know the optics of the announcement look poor, but it does seem to be a net benefit.
 
  • Like
Reactions: Kengineer