
MacRumors

macrumors bot
Original poster
Apr 12, 2001
65,002
33,190


Developer Asuhariet Yvgar this morning said that he had reverse-engineered the NeuralHash algorithm that Apple is using to detect Child Sexual Abuse Materials (CSAM) in iCloud Photos, posting evidence on GitHub and details on Reddit.


Yvgar said that he reverse-engineered the NeuralHash algorithm from iOS 14.3, where the code was hidden, and he rebuilt it in Python. After he uploaded his findings, another user was able to create a collision, an issue where two non-matching images share the same hash. Security researchers have warned about this possibility because the potential for collisions could allow the CSAM system to be exploited.
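To make concrete what a collision means here, a comparison with a generic perceptual hash might look like the sketch below (pHash from the open-source imagehash package is used as a stand-in for NeuralHash, and the file names are hypothetical):

```python
# Minimal sketch of a hash collision check using a generic perceptual hash
# (pHash from the open-source imagehash package), not Apple's NeuralHash.
# File names are hypothetical placeholders.
from PIL import Image
import imagehash

hash_a = imagehash.phash(Image.open("image_a.png"))
hash_b = imagehash.phash(Image.open("image_b.png"))

print(hash_a, hash_b)
if hash_a == hash_b:
    # Two visually different images producing the same hash is a collision.
    print("Collision: different images, identical perceptual hash")
```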

In a statement to Motherboard, Apple said that the version of NeuralHash that Yvgar reverse-engineered is not the same as the final implementation that will be used with the CSAM system. Apple also said that it made the algorithm publicly available for security researchers to verify, but there is a second private server-side algorithm that verifies a CSAM match after the threshold is exceeded, along with human verification.
Apple, however, told Motherboard in an email that the version analyzed by users on GitHub is a generic version, and not the final version that will be used for iCloud Photos CSAM detection. Apple said that it also made the algorithm public.

"The NeuralHash algorithm [... is] included as part of the code of the signed operating system [and] security researchers can verify that it behaves as described," one of Apple's pieces of documentation reads. Apple also said that after a user passes the 30 match threshold, a second non-public algorithm that runs on Apple's servers will check the results.
Matthew Green, who teaches cryptography at Johns Hopkins University and who has been a vocal critic of Apple's CSAM system, told Motherboard that if collisions "exist for this function," then he expects "they'll exist in the system Apple eventually activates."

"Of course, it's possible that they will re-spin the hash function before they deploy," he said. "But as a proof of concept, this is definitely valid," he said of the information shared on GitHub.

Because of the human element, though, another researcher, Nicholas Weaver, told Motherboard that the most anyone could do by manipulating non-CSAM images into CSAM hash collisions is "annoy Apple's response team with garbage images until they implement a filter" to get rid of false positives. Actually fooling Apple's system would also require access to the hashes provided by NCMEC, as well as the production of over 30 colliding images, and even then the end result would not fool the human oversight.

Apple is using its NeuralHash system to match a database of image hashes provided by agencies like the National Center for Missing & Exploited Children (NCMEC) against images on user devices to search for CSAM. The system is designed to produce exact matches, and Apple says there is a one in one trillion chance per year that an iCloud account could be accidentally flagged.

Apple is planning to implement the NeuralHash CSAM system in iOS and iPadOS 15 as part of a suite of child safety features, and it has been a hugely controversial decision with Apple receiving criticism from customers and privacy advocates. Apple has been attempting to reassure customers and security researchers about the implementation of the system with additional documentation and executive interviews.

Article Link: Apple Says NeuralHash Tech Impacted by 'Hash Collisions' Is Not the Version Used for CSAM Detection
 

nawk

macrumors member
Jan 2, 2002
89
458
Wait - I’m confused by this. Does this mean that everyone’s iCloud is going to be scanned without users’ authorization in the name of child welfare??

While I am sure people may agree with this, it seems like one step away from doctors/dentists submitting DNA samples of *every* patient because it is in the public interest.

This program seems just a small moral slip away from being an invasion of privacy on a monumental scale. Given what Snowden revealed, the US government has a huge thirst for data collection like this. It’s a short hop to scanning for compromising photos of your political rivals, yes?
 

LDN

macrumors regular
Jul 21, 2014
122
301
Make it all public. It’s for the children, right? What do you have to hide, Apple?
Yeah, this is increasingly sounding like it's not meant for children. Not forgetting the fact that actual pedos, who are likely very small in number, will either just turn off iCloud Photos or get Android phones. Normal people are left stuck with the bill. Apple are going to have to pull this whole thing.
 

Altivec88

macrumors regular
Jun 16, 2016
213
812
Apple... You don't get it. This isn't even out yet and people are already trying to hack into it. I don't care if that wasn't the final version. Do you really think your final version won't be hacked? It's a ticking time bomb with huge ramifications.
 

miniyou64

macrumors 6502a
Jul 8, 2008
751
2,694
People giving Apple the benefit of the doubt here are making a tremendous amount of assumptions. This kind of tech never remains only for its intended use. No matter which way you spin it (for the children!), this is invasive. Someone on Twitter mentioned: what happens if someone AirDrops you a bunch of illicit photos and they sync to iCloud in a matter of seconds? Boom, you’re flagged. There are 1,000,000 ways for this system to go wrong, be exploited, or, worse, ruin innocent people’s lives. And if you do end up being one of those people, you will have exactly zero recourse to prove your innocence. It’s over for you. This entire thing is very stupid on Apple’s part.
 
Last edited:
Apple... You don't get it. This isn't even out yet and people are already trying to hack into it. I don't care if that wasn't the final version. Do you really think your final version won't be hacked? It's a ticking time bomb with huge ramifications.
Too much risk is involved. No idea why Apple is even allowing this in the first place.

The fact that anyone can reverse-engineer it and find the code inside the platform is scary stuff!
 
Last edited:

Dremmel

macrumors regular
May 25, 2017
200
316
First they came for the paedophiles, and I did not speak out—
Because I was not a paedophile.
Then they came for the socialists, and I did not speak out—
Because I was not a socialist.
Then they came for the trade unionists, and I did not speak out—
Because I was not a trade unionist.
Then they came for the Jews, and I did not speak out—
Because I was not a Jew.
Then they came for me—and there was no one left to speak for me.
 
Apple is planning to implement the NeuralHash CSAM system in iOS and iPadOS 15 as part of a suite of child safety features

Clearly, Apple is using child safety features as an excuse to spy on our iPhones.

If you disagree with my statement, WAKE UP! Before it’s too late. Apple is playing YOU.
 
Last edited:

canyonblue737

macrumors 68020
Jan 10, 2005
2,219
2,760
Wait - I’m confused by this. Does this mean that everyone’s iCloud is going to be scanned without users’ authorization in the name of child welfare??

yes. every photo on your iphone will be converted to a "hash" by your phone before uploading to icloud. the hash will be sent along with the photo. the hashes (not the actual images) will be scanned to see if they match a known child abuse photo hash, and if you have more than approximately 30 such images in your icloud collection, a human being at Apple will be able to view just those images to verify they are indeed images that match known child abuse imagery. if so, law enforcement will be notified.

like you said, the fear is apple could someday use the same technology to search for other forms of imagery and could be coerced to do so by foreign governments. they promise they won't, but the backdoor is there now.
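in very rough code terms, the on-device step works something like this (totally illustrative sketch, not apple's actual implementation -- a generic perceptual hash and a plain set lookup stand in for NeuralHash and the blinded-hash voucher scheme):

```python
# Very rough illustration of the on-device step described above. A generic
# perceptual hash (pHash from the open-source imagehash package) stands in
# for NeuralHash, and a plain set lookup stands in for Apple's blinded-hash
# safety-voucher scheme. Everything here is hypothetical, not Apple's code.
import imagehash
from PIL import Image

KNOWN_HASHES = set()  # in reality: blinded hashes derived from the NCMEC database

def hash_photo_before_upload(photo_path):
    photo_hash = str(imagehash.phash(Image.open(photo_path)))  # stand-in for NeuralHash
    is_candidate_match = photo_hash in KNOWN_HASHES  # real system: encrypted voucher, not a lookup
    return photo_hash, is_candidate_match
```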
 
I’ve had more time to simmer on this whole CSAM/Apple situation since the story broke 10 days ago and I just can’t bring myself to feel like this is ok. I usually give Apple the benefit of the doubt on these types of things but I just can’t get past the hashing being done on-device.
Wouldn’t be surprised if people start protesting in front of Apple HQ, with banners reading “STOP INVADING OUR PRIVACY” and “RESPECT OUR PRIVACY”.
 
Last edited:

Nick05

macrumors regular
Aug 5, 2011
108
149
Florida
As this whole scanning thing continues to get more publicity, it’s going to be difficult for customers not to have heard about it by the iPhone 13 release, and all the security researchers calling out potentially dangerous flaws are surely going to make the masses doubt it. This is beginning to look really bad for Apple.
 

theluggage

macrumors 604
Jul 29, 2011
7,857
8,157
With hashing you will always have a chance of collisions because it is inherently lossy by design.
...and that's with "cryptographic" hashes, which are designed to give different hashes for even slightly different sources, with a low chance of collisions. This system is using perceptual hashes, an image recognition technique that gives the same hash for visually similar images (to detect images that have been cropped, scaled, changed in quality, etc.). See https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf

- maybe the technique is extremely effective, but the cryptographic hash vs. perceptual hash distinction is rather important in principle.
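A quick way to see that difference in practice (a minimal sketch assuming Pillow and the open-source imagehash package, with pHash as a generic stand-in rather than NeuralHash, and a hypothetical local "photo.jpg"):

```python
# Cryptographic vs. perceptual hashing of the same image after resizing.
import hashlib
import imagehash
from PIL import Image

img = Image.open("photo.jpg")
resized = img.resize((img.width // 2, img.height // 2))

# Cryptographic hash: any change to the underlying bytes yields a completely different digest.
print(hashlib.sha256(img.tobytes()).hexdigest())
print(hashlib.sha256(resized.tobytes()).hexdigest())

# Perceptual hash: visually similar images yield the same (or a very close) hash.
print(imagehash.phash(img), imagehash.phash(resized))
```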
 