
Analog Kid

macrumors G3
Mar 4, 2003
9,061
11,864
OK. Let’s assume that all of us on this forum are just working from different assumptions and none of us are correct. Very possible.

However, one thing is fairly clear: Apple saying that they are killing an expensive and invasive technology in the name of privacy is telling. And doing so on the same day they implement iCloud encryption further sets the tone that Apple is locking the door to user data and throwing away the keys.

Maybe I’m wrong. Maybe we’re all wrong. Either way, I don’t believe CSAM detection would have been fruitful as, once implemented, any criminal with half a brain would simply turn off iCloud sync.

Which was the whole point-- prevent people from putting criminal material in an encrypted cache and sharing the password.

Apple saying they're throwing in the towel doesn't mean they suddenly agree with you. It means Apple decided it wasn't worth all the negative PR they were getting and their attempts at educating the public over the past year have only led to more waves of FUD.

So now the question is what happens when one of those regimes you mention makes end-to-end encryption of certain data illegal and uses CSAM as the stalking horse. Apple tried to address it in a way that undercuts the false pretext and prevents a fishing expedition, but it blew back in their face. Now it's likely that they'll either be forced to play ball, or exit those markets and leave them to hollow companies who won't even make an effort to hold a line.
 

Fat_Guy

macrumors 65816
Feb 10, 2021
1,017
1,081
Look, forget about the CSAM fiasco.


USB C is coming next year to the iPhone! 👍👍👍👍👍👍👍👍
 

Grey Area

macrumors 6502
Jan 14, 2008
426
1,008
Just a point of clarification — the CSAM detection was against a set of known, hashed CSAM material, and was not designed to use machine learning models to identify what might be said material.

Not really. If by "scan" you mean create a hash (not scanning), then sure. That isn't "looking at your photos' content." That is adding up all the pixels into a formula (say, SHA-256) and matching the result against the hashes of known shared images.
No, this was not hashing like SHA-256. A cryptographic hashing algorithm like SHA-256 aims at creating wildly different hash values for two pieces of input data, even when those have only minor differences (e.g. one image and one copy with a few pixels altered). Also, the hash values are non-reversible, i.e. you cannot recreate the original data from the hash.
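To illustrate that avalanche behavior, here is a quick Python sketch (toy byte strings standing in for image files, nothing to do with Apple's actual code):

```python
import hashlib

# Two byte strings that differ in a single byte (stand-ins for two image files).
original = b"\x00\x01\x02\x03" * 1024
altered  = b"\x00\x01\x02\x04" + b"\x00\x01\x02\x03" * 1023

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(altered).hexdigest())
# The two digests share essentially no structure: a cryptographic hash is built
# so that a tiny change flips roughly half of the output bits, and the input
# cannot be reconstructed from the digest.
```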

What Apple was planning to do was semantic hashing, which is very different: it is using a machine learning classifier trained to analyze the content of the images, and two images showing similar things are then supposed to get similar hash values. If this algorithm determines that an image on your phone shows things similar to those in a known CSAM picture, this would count as a match.
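For contrast, a similarity-based match works on descriptors rather than raw bytes. A minimal sketch of just the comparison step, with made-up 4-dimensional descriptors and a made-up threshold (Apple never published its real tolerances):

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine of the angle between two descriptor vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical descriptors produced by some embedding network.
photo     = np.array([0.90, 0.10, 0.40, 0.20])
known     = np.array([0.88, 0.12, 0.41, 0.19])  # near-duplicate of the photo
unrelated = np.array([0.10, 0.90, 0.05, 0.70])  # completely different image

THRESHOLD = 0.95  # invented value purely for illustration

print(cosine_similarity(photo, known) >= THRESHOLD)      # True  -> "match"
print(cosine_similarity(photo, unrelated) >= THRESHOLD)  # False -> no match
```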

I do not think Apple ever provided any specifics on how generous this similarity measurement would be. They said the system was intended to catch minor alterations of the known CSAM pictures, so maybe it had rather tight tolerances. On the other hand, if that was the intention, Apple could have chosen to exclude photos the user took with the camera and only scan downloaded pictures, but from what I understand they did plan to scan user photos.

(Also, semantic hashes are reversible, which is why Apple planned to encrypt the known CSAM hashes stored on the phones and to keep them out of reach from the user.)
 

Analog Kid

macrumors G3
Mar 4, 2003
9,061
11,864
No, this was not hashing like SHA-256. A cryptographic hashing algorithm like SHA-256 aims at creating wildly different hash values for two pieces of input data, even when those have only minor differences (e.g. one image and one copy with a few pixels altered). Also, the hash values are non-reversible, i.e. you cannot recreate the original data from the hash.

What Apple was planning to do was semantic hashing, which is very different: it is using a machine learning classifier trained to analyze the content of the images, and two images showing similar things are then supposed to get similar hash values. If this algorithm determines that an image on your phone shows things similar to those in a known CSAM picture, this would count as a match.

I do not think Apple ever provided any specifics on how generous this similarity measurement would be. They said the system was intended to catch minor alterations of the known CSAM pictures, so maybe it had rather tight tolerances. On the other hand, if that was the intention, Apple could have chosen to exclude photos the user took with the camera and only scan downloaded pictures, but from what I understand they did plan to scan user photos.

(Also, semantic hashes are reversible, which is why Apple planned to encrypt the known CSAM hashes stored on the phones and to keep them out of reach from the user.)
It's worth reading the document.

It is a perceptual hash, not a semantic hash. It isn’t seeking images with the same meaning (semantics); it’s looking for images that look like the same image. It is looking to create a common hash value for variations of a specific image: cropped, rotated, color shifted, and probably a lot of other things.
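To make the contrast with SHA-256 concrete, here is a toy "average hash" in Python (far simpler than the neural-network hash Apple described, but it shows the principle that minor edits barely move a perceptual hash):

```python
import numpy as np

def average_hash(pixels):
    """64-bit hash: 1 where a pixel is brighter than the image's mean, else 0."""
    return (pixels > pixels.mean()).astype(np.uint8).ravel()

def hamming_distance(h1, h2):
    return int(np.sum(h1 != h2))

rng = np.random.default_rng(0)
image      = rng.integers(0, 256, size=(8, 8))   # stand-in for a tiny photo
brightened = np.clip(image + 10, 0, 255)         # mild brightness/color shift

print(hamming_distance(average_hash(image), average_hash(brightened)))
# Prints 0 or close to it: the edited copy hashes to (nearly) the same bits,
# so it can still be matched against a list of known images. SHA-256 of the
# same two inputs would share nothing.
```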

In particular, this is not an image classifier. It is not inferring CSAM/not CSAM. It is an image detector, determining if your image is already in a known database. The downside of this is that it is not looking for, and will not detect, abusive imagery you may have created yourself; it is looking to see if you have a copy of an image that is already in circulation.

If this algorithm determines that an image on your phone shows similar things to a known CSAM image, it would not count as a match. If this algorithm determines that an image on your phone is a manipulated version of a known CSAM image, it would count as a match.

It won't be perfect. It will miss some CSAM depending on the manipulations. Hashing implies information reduction so there is always the possibility of a false positive. A false positive does not imply that your image is "similar" to CSAM, it just means it hashed to the same value and is most likely completely innocuous. This is why it requires 30 positive matches to generate an alert, to give most people 30 possible false matches (not to give everyone a pass for 29 criminal images).

Apple has said there is a 1 in a trillion chance that an account would be falsely flagged. If there are a billion iCloud accounts, there's a 1 in a thousand chance someone would get referred to Apple for manual verification. I suspect that number is based on a numerical analysis with some assumptions and probably underestimates the risk-- but still the risk to anyone should be exceedingly low. Your match results are encrypted so that Apple can't tell anything about them, including whether you have 0 or 29 matches, unless you exceed the 30 match threshold.
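Spelling out that back-of-the-envelope arithmetic (taking Apple's 1-in-a-trillion figure at face value and assuming a round billion independent accounts, both simplifications):

```python
p_false_flag = 1e-12   # Apple's stated odds of falsely flagging a given account
accounts = 1e9         # rough assumption for the number of iCloud accounts

# Chance that at least one account anywhere is falsely flagged.
p_any = 1 - (1 - p_false_flag) ** accounts
print(p_any)  # ~0.001, i.e. roughly 1 in a thousand, as described above
```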

Excluding pictures taken with the camera leaves open the possibility of altering the metadata of downloaded images to avoid the scan, and still would mean that if you share your picture with someone else it'll get scanned on their phone.

The hashes are not reversible: see hash. If you have a hash you can't create the image from it. There is less data in the hash than in the source image, and that data can't be guessed. The hashes are spoofable. With effort, it would be possible to construct an image that has the same hash as another different image. Most likely the image would look like crap, but it would tickle the hashing function in just the right way.

The hashes are probably encrypted to prevent people from creating spoof images and bogging down the system, and to prevent people from being able to pre-test their own image library for matches.
 
  • Like
Reactions: MecPro

v0lume4

macrumors 68020
Jul 28, 2012
2,485
5,158
Great news but they lost all my trust with this. I will continue to use local backups on my Mac and keep my photos out of iCloud for good.
Ultimately, we shouldn't trust any cloud storage when it comes to our privacy. Plus, local backups are fun to manage. Or is that just me? 🙃
 

PBMB

macrumors 6502
Mar 19, 2015
324
126
Would not have mattered to me either way.
Same here. I have never uploaded photos to iCloud because I have not liked the idea from the beginning. And in fact, I mostly use my (standalone) digital camera to take photos. Sometimes I inspect them on a Mac, or store a small number of them, but that's it.
 

PBMB

macrumors 6502
Mar 19, 2015
324
126
Ultimately, we shouldn't trust any cloud storage when it comes to our privacy. Plus, local backups are fun to manage. Or is that just me? 🙃
No. You are not alone; see my reply above. But that does not mean I don't care about the subject discussed here. Apple did the right thing to abandon this potentially very dangerous technology.
 
  • Like
Reactions: v0lume4

MuppetGate

macrumors 6502a
Jan 20, 2012
655
1,088
Apple folded, and it was all for money. The anime hentai owners squawked and Apple heard their cries from their parents' basements. What a disgrace. The victims will ultimately not receive justice because of Apple's greed. Every other tech company with cloud storage is using this. Google, FB, MS, etc.

Pedophiles are gonna love iCloud. Quite the humanitarian huh Timmy?

The difference is that Google and MS are footing the bill for their CSAM detection. Apple’s scheme would pass the cost on to their customers by using their phones’ battery and processor to handle the scan.
 
Last edited:

SirAnthonyHopkins

macrumors 6502a
Sep 29, 2020
946
1,887
How sad, that I would have believed “privacy” before they brought up CSAM, but now that they say they’re not doing it, I don’t believe them at all.
If you don't believe them, find proof and raise a class action. Do you realise how bad it would be for Apple to say they're not doing something this invasive and then just go ahead and do it anyway?
 

No5tromo

macrumors 6502
Feb 17, 2012
412
1,041
What were they even thinking in the first place? Proactively scanning all of our photos to find child pornography and then having actual people double-checking potentially false-positive private photos of your family? And who's to say that the person looking at our personal photos isn't a perv or something? Are they totally insane? I am all for measures that protect children, but you can't just arbitrarily do stuff like that. This is as intrusive as if the police randomly and forcefully broke into our houses on a regular basis and started proactively searching everything that we own in hopes of finding something illegal. That's not how it works. It's a good thing that they won't go forward with it, but the sheer fact that they even considered it after trying to convince us how much they care about privacy is bad enough.
 
  • Like
Reactions: huge_apple_fangirl

Grey Area

macrumors 6502
Jan 14, 2008
426
1,008
It's worth reading the document.

It is a perceptual hash, not a semantic hash. It isn’t seeking images with the same meaning (semantics); it’s looking for images that look like the same image. It is looking to create a common hash value for variations of a specific image: cropped, rotated, color shifted, and probably a lot of other things.

Apple was most definitely going for semantics:
"The embedding network represents images as real-valued vectors and ensures that perceptually and
semantically similar images have close descriptors in the sense of angular distance or cosine similarity.
Perceptually and semantically different images have descriptors farther apart, which results in larger
angular distances."



In particular, this is not an image classifier. It is not inferring CSAM/not CSAM. It is an image detector, determining if your image is already in a known database.
The difference is not that clear-cut. The system extracts features from the image, and based on these features a neural network produces an image descriptor. If this descriptor is sufficiently similar to the image descriptor of a known CSAM image, the image will be flagged. Now yes, I understand that this type of system relies on existing images and is not capable of finding entirely new types of CSAM. But NCMEC was to provide its five million image hashes; that is a lot of images for a single subject matter, and if you then go for similarity matching rather than exact matching, you have, for all intents and purposes, a CSAM classifier.
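That "similarity matching rather than exact matching" step is essentially a thresholded nearest-neighbour lookup. A hypothetical sketch (invented descriptors and threshold; the real system also encrypts the known-hash database and hides match results, as discussed above):

```python
import numpy as np

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def is_flagged(descriptor, known_descriptors, threshold=0.95):
    """Flag an image if its descriptor is close enough to any known entry."""
    return any(cosine_similarity(descriptor, k) >= threshold
               for k in known_descriptors)

# With millions of known entries, how loose that threshold is decides whether
# this behaves like "detect this exact picture" or "classify this subject".
```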

The hashes are not reversible: see hash. If you have a hash you can't create the image from it. There is less data in the hash than in the source image, and that data can't be guessed.

They are not perfectly reversible, that is true. But you can recreate a recognizable approximation of the original image. See: https://towardsdatascience.com/black-box-attacks-on-perceptual-image-hashes-with-gans-cc1be11f277

This has been shown to work with PhotoDNA, a perceptual image hashing system widely used for CSAM detection. Maybe Apple's system would have been immune to this, but they spent quite some effort to prevent people from even trying, so I have my doubts.
 

playtech1

macrumors 6502a
Oct 10, 2014
679
861
Apple may have been well-intentioned, but it is fundamentally creepy for the maker of my phone or laptop to scan its contents.

The type of content when introduced may be unobjectionable, but it would so obviously be the thin end of the wedge once the principle of compulsory private document scanning has been accepted.
 

Mercury7

macrumors 6502a
Oct 31, 2007
738
556
I would not worry too much about what Apple does going forward…. We have privacy advocates who will let us know; just keep your eyes open…. Which I’m sure just about everyone here does anyway. Apple is well aware of the repercussions if it tried to sneak something in under the radar.
 

TheToolGuide

macrumors regular
Aug 11, 2021
118
87
Apple sure has come a long way from the "You're holding it wrong" era. Good to see they actually consider user feedback (e.g. Stage Manager on non-M1 devices, etc.).
This wasn’t exactly user feedback. The system was never put into place for people to even see how it worked, and almost no one on here who didn’t like it understood how it worked; they were often just quoting headlines and not the technology behind them.

That being said, even though I was in support of the CSAM-detection idea, I also recognize that people need to feel comfortable with it and I need to accept the outcome. Apple likely could have done a much better job rolling this out and helping people better understand how it was designed to work. Not everyone was going to like it, but in the end I feel it was a loss, while others will feel like it's a win.
 
  • Disagree
Reactions: MuppetGate

TheToolGuide

macrumors regular
Aug 11, 2021
118
87
That's the job of law enforcement and the government. Not Apple's.
This is everyone’s job. Often people are not good at identifying and/or recognizing when it is happening around them, despite the steps pedophiles take to hide their actions. Every day there are instances when someone should’ve spoken up but didn’t. We are flawed as well.

It is the government’s job to pass laws that better protect against and prohibit this deplorable behavior, along with promoting programs that provide education and treatment to prevent people from succumbing to this mental illness. It is law enforcement’s job to investigate and uphold the law. While law enforcement has its own flaws, it still needs tools and technology to capture these people and stop them. Was this the right way? Many people felt it was not.

There were good intentions from Apple in trying to keep this kind of material off their servers, and likely legal concerns they were trying to address at the same time. I don’t want to suppress good intentions from anyone, company or otherwise, who takes on the nearly insurmountable task of catching and inhibiting these people.

There were many so-called security experts and companies who campaigned against this purely for monetary reasons and less because of privacy and backdoor issues.

I hope that people take away from this that pedophilia is still a problem and we need better tools to stop this. Not ‘Yay, we stopped a thing I didn’t fully understand because I read a headline. Sorry kids.’
 

Fat_Guy

macrumors 65816
Feb 10, 2021
1,017
1,081
Apple probably figured since the customer buys the phone there would be legal implications for them scanning on the device.



Let’s see how things go with their “rent a phone” subscription service. Then the phone is theirs and everything in it so they can scan away….




Always that - but the upside is that you can get a free yearly upgrade for something like USB C (even though I got USB C on a new phone I bought for 150 bucks…) for a small extra fee.
 
  • Like
Reactions: MuppetGate

TheToolGuide

macrumors regular
Aug 11, 2021
118
87
Why?

Just because Apple doesn't scan for content, doesn't mean law enforcement cannot subpoena/compel access to the content that may be stored on Apple's servers once there is proper legal cause to do so.
I was for the idea of CSAM detection. More importantly, nothing has really changed. This was trying to take a more proactive approach to inhibiting and stopping this type of behavior. Law enforcement can only do so much.

The people have spoken and said this was too far, Apple. I was comfortable with it, but many weren’t. Sadly, I don’t think enough is being done on this particular issue, and no one is really coming together to say ‘okay, we won’t do that, but what will we do instead of just doing the same old thing?’

Of course, this is often the case for any issue society has to deal with where people can’t agree. Likely no solution is going to make everyone happy. I just wish we could come up with something that makes more of an impact.
 

bobcomer

macrumors 601
May 18, 2015
4,949
3,694
I don't get why people think this whole CSAM exercise was just a means of Apple getting a backdoor into people's phones - why would they need to go to those lengths when they control the OS for pity's sake!
As one of the opponents of this whole idea, I can say it wasn't just about getting a backdoor; it was about using your device and its capabilities to spy on you. The backdoor was just what we worried would come next. Change the hash database and it could look for anything of yours.
 