What happens when the CCP wants Apple to scan for Winnie the Pooh image hashes in Hong Kong?

This has already been addressed. The hash database is global; there are no regional variations distributed. Every Apple user will have the exact same set of hashes, and those hashes must come from two separate sources. So two governments would need to agree to participate in the corruption and abuse of the system. Unlikely they would get away with that.
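To make the two-source requirement concrete, here is a toy Python sketch of how an intersection-only database could work. All names and data are invented for illustration; Apple has not published implementation code.

```python
# Hypothetical sketch: only hashes vouched for by BOTH independent
# child-safety organizations ever ship to devices. Names/data made up.

def build_shipped_database(source_a: set, source_b: set) -> set:
    """Distribute only the hashes present in both independent sources."""
    return source_a & source_b

ncmec = {"hashA", "hashB", "hashC"}          # source 1 (illustrative)
other_org = {"hashB", "hashC", "hashD"}      # source 2 (illustrative)

shipped = build_shipped_database(ncmec, other_org)

# A hash inserted unilaterally by one source/government never ships:
assert "hashA" not in shipped and "hashD" not in shipped
print(shipped)  # hashB and hashC only
```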
 
  • Like
Reactions: ikir
I’ve had more time to simmer on this whole CSAM/Apple situation since the story broke 10 days ago and I just can’t bring myself to feel like this is ok. I usually give Apple the benefit of the doubt on these types of things but I just can’t get past the hashing being done on-device.

You don't believe it's possible, technically? Hashing is an extremely simple computation compared to what iOS devices are already doing with photos and video in real-time. The sheer processing power that we carry around in our pockets is far more than the average consumer truly appreciates. This hashing algorithm will simply be yet another stop in the processing pipeline.

Doing this on-device is deliberate, as it's the safest way to ensure that privacy is maintained. What would the alternative be?
 
  • Like
  • Disagree
Reactions: oldoneeye and ikir
I don’t really understand the point of this. Anyone who knows anything about hashing knows that if you change a file in any way that alters its data, it will produce a completely different hash. Even one single bit of data. So technically pedos can just edit their images slightly to avoid detection. In fact someone will probably write a script to randomly alter the data in a way that doesn’t corrupt the image. Such a tool might even exist. I remember reading about a tool that could add encrypted messages to the metadata of images.

And even if Apple is hashing based only on the pixel data, changing a few of the pixels would have the exact same effect.
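For cryptographic hashes, that avalanche effect is exactly right - flip one bit and the digest is unrecognizable. A quick demo with SHA-256:

```python
import hashlib

data = bytearray(b"the exact same image bytes, byte for byte")
before = hashlib.sha256(data).hexdigest()

data[0] ^= 0x01  # flip a single bit of the input
after = hashlib.sha256(data).hexdigest()

print(before)
print(after)
# The two digests share no visible structure; on average about half
# of all output bits differ after a one-bit input change.
```

As the replies below point out, though, Apple's system doesn't use this kind of hash.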
 
  • Like
Reactions: danskin
This has already been addressed. The hash database is global; there are no regional variations distributed. Every Apple user will have the exact same set of hashes, and those hashes must come from two separate sources. So two governments would need to agree to participate in the corruption and abuse of the system. Unlikely they would get away with that.
As long as you believe this. Who says it will stay this way? Especially in China, I would not bet on it.
 
  • Like
Reactions: PC_tech
I don’t really understand the point of this. Anyone who knows anything about hashing knows that if you change a file in any way that alters its data, it will produce a completely different hash. Even one single bit of data. So technically pedos can just edit their images slightly to avoid detection. In fact someone will probably write a script to randomly alter the data in a way that doesn’t corrupt the image. Such a tool might even exist. I remember reading about a tool that could add encrypted messages to the metadata of images.

And even if Apple is hashing based only on the pixel data, changing a few of the pixels would have the exact same effect.
There is an article about this linked somewhere in this thread. Apple uses a hashing algorithm that accounts for small changes in images. Basically, it only hashes the general appearance of the image, not the raw bytes.
I work in IT and have used such algorithms before. Unless Apple has found the holy grail of image hashing, the hashing is quite error-prone and will give positive matches for images that are somewhat similar but definitely not the same picture. So expect quite a few false positives, and a lot of work for Apple's reviewers.

Who is volunteering for this kind of work anyway, checking other people's photos for CP?
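For a feel of how appearance-based hashing works, here is a minimal "average hash" (aHash) in Python, assuming Pillow is installed. NeuralHash is a far more sophisticated, ML-based hash; this only illustrates the general idea of hashing what an image looks like rather than its raw bytes.

```python
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """64-bit fingerprint of an image's rough appearance."""
    img = Image.open(path).convert("L").resize((size, size))  # tiny grayscale
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:                      # one bit per pixel:
        bits = (bits << 1) | (p > mean)   # brighter than average or not
    return bits
```

Re-encoding, resizing or lightly editing a photo barely moves this value - which is precisely why two merely similar (but different) photos can land on the same hash.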
 
I’ve had more time to simmer on this whole CSAM/Apple situation since the story broke 10 days ago and I just can’t bring myself to feel like this is ok. I usually give Apple the benefit of the doubt on these types of things but I just can’t get past the hashing being done on-device.
Me neither. It's so bad that I'm literally selling my iPhone after using one for more than a decade and getting a dumb phone. I've been tempted from time to time, but it wasn't until now that I realized there is no middle ground. Either you buy into the system and give up your privacy, or you outright reject it and go it alone.

I'm going it alone.
 
  • Haha
  • Like
Reactions: oldoneeye and ikir
Wait - I’m confused by this. Does this mean that everyone’s iCloud is going to be scanned without users’ authorization in the name of child welfare??

While I am sure people may agree with this, it seems like one step away from doctors/dentists submitting DNA samples of *every* patient because it is in the public interest.

This program seems just a small moral slip away from being an invasion of privacy on a monumental scale. Given what Snowden revealed, the US government has a huge thirst for data collection like this. It’s a short hop to scanning for compromising photos of your political rivals, yes?
Matching hashes is not scanning photos. Nobody sees your photos. Google and Facebook really do scan, so Apple can guarantee privacy while fighting CSAM images.
 
Dear MR Mods:

Please consider moving this political topic to the politics section. It is VERY difficult to discuss security, crime, and criminology without getting ding'd, warned, penalized, or otherwise punished for political discussion. That's because these topics are, at their very root, political in nature.

Thank you for your consideration of this request.

Your Flight Plan



IKR?

How can a discussion about this technology, with Apple saying they won’t let rogue governments affect them, not also involve talking politics some of the time?

I got a note from a moderator about not talking politics outside of the politics forum. *shrug*
 
This whole mess makes it really hard to defend Apple. They are the ones who chose to build their business model around claiming to be privacy-first. At least with Google you know what you’re getting. Apple seems so proud of having created this CSAM scanning technology, but if I’m going to be surveilled by my cloud provider, it may as well be Google. At least with Google Photos the invasive scanning gives me the benefit of amazing search results in my image library. Meanwhile, Apple can’t tell the difference between a photo of my cat and a photo of my car.
I wonder what the backstory is to this CSAM thing that Apple is now rolling out.

From my perspective, it seems to have come out of nowhere, but of course that’s not the case. It has a history and was in the works in the background for a while.

It’s a disappointing move. We’ve gone with iPhones in part because of privacy, but this seems totally backwards.
 
What will happen is that someone will reverse-engineer the hash and manipulate images so the system flags pictures of Donald Trump (an obvious choice, as many around the world seem to dislike the man). Apple's system will think they're CSAM images, and when they're checked by a human, the human will see they're just pictures of Donald Trump, because the hackers' intention is to annoy Apple with false collision results.
 
CSAM hash comparisons on Apple's cloud - maybe that's OK; it's their server. Doing this on my device is absolutely not OK.

With this, Apple have crossed a line.
 
  • Like
Reactions: Mebsat and JetLaw
So two governments would need to agree to participate in the corruption and abuse of the system. Unlikely they would get away with that.

You mean, like the Five Eyes countries agreed to spy on their allies...

And I could easily imagine countries like Russia and Belarus, or China and North Korea, and a few others working together on this.
 
  • Like
Reactions: PC_tech and JetLaw
Matching hashes is not scanning photos. Nobody sees your photos. Google and Facebook really do scan, so Apple can guarantee privacy while fighting CSAM images.
You need to scan the pictures in order to produce a hash in the first place. And unlike traditional checksum hashes, Apple's NeuralHash compares images based on visual and semantic similarity. That means the CSAM scanning even involves some degree of content analysis.

That is a quite thorough scan - and in most cases completely unnecessary for Apple's purported mission, since photos just taken with the camera cannot be the pictures from the NCMEC database.
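Apple's published technical summary describes NeuralHash as a neural embedding of the image followed by locality-sensitive hashing. Here is a toy stand-in (random numbers in place of a trained network, made-up dimensions) just to show why such a hash tracks what the image depicts rather than its bytes:

```python
import numpy as np

# Toy "NeuralHash-like" pipeline: an embedding of the image content is
# binarized with fixed random hyperplanes (locality-sensitive hashing).
rng = np.random.default_rng(0)
hyperplanes = rng.standard_normal((96, 128))  # fixed for the whole system

def lsh_hash(embedding: np.ndarray) -> np.ndarray:
    """96-bit hash: which side of each hyperplane the embedding falls on."""
    return (hyperplanes @ embedding > 0).astype(np.uint8)

# Embeddings of visually similar content point in similar directions,
# so most hyperplane signs - hence most hash bits - agree.
emb = rng.standard_normal(128)
similar = emb + 0.05 * rng.standard_normal(128)  # slight perturbation
print(int((lsh_hash(emb) != lsh_hash(similar)).sum()), "bits differ of 96")
```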
 
  • Like
Reactions: oldoneeye
There is an article about this linked somewhere in this thread. Apple uses a hashing algorithm that accounts for small changes in images. Basically, it only hashes the general appearance of the image, not the raw bytes.
I work in IT and have used such algorithms before. Unless Apple has found the holy grail of image hashing, the hashing is quite error-prone and will give positive matches for images that are somewhat similar but definitely not the same picture. So expect quite a few false positives, and a lot of work for Apple's reviewers.

Who is volunteering for this kind of work anyway, checking other people's photos for CP?
But I thought that was the whole point of this: to avoid actual people looking through your images? If they can’t fully automate this (which they can’t) then they shouldn’t really do it.

Also, I don’t think what you’re referring to is hashing. Hashing is exact and relies on an exactly identical stream of bits - it cannot be applied with fuzzy logic. What you’re referring to is pattern matching which is NOT what Apple is claiming this does. Apple is claiming this is looking for an exact match (unless I’ve misread the articles).
 
IKR?

How can a discussion about this technology, with Apple saying they won’t let rogue governments affect them, not also involve talking politics some of the time?

I got a note from a moderator about not talking politics outside of the politics forum. *shrug*
Ah, the good old “free speech unless we don’t like what you’re saying” approach. Shameful really. Unless people are being abusive to others, it should damn well be “free speech” full stop.
 
Also, I don’t think what you’re referring to is hashing. Hashing is exact and relies on an exactly identical stream of bits - it cannot be applied with fuzzy logic. What you’re referring to is pattern matching which is NOT what Apple is claiming this does. Apple is claiming this is looking for an exact match (unless I’ve misread the articles).
Apple is calling it hashing, and what we used at my company is also hashing, not pattern matching. But it does not take the raw stream of bytes but a somewhat "abstracted" stream of pixels as its input. This allows for detecting the same image even if it has been saved to a different file format, reduced in quality, or slightly cropped or edited.
If I read Apple's documentation correctly, they don't say "exact match" as in "it's a jpeg image with exactly these bytes".
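Matching on such hashes is then typically a distance check rather than byte-for-byte equality. A generic sketch; the threshold here is invented, and Apple has not published the parameters its matching actually uses:

```python
def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

def looks_like_same_image(h1: int, h2: int, max_bits: int = 6) -> bool:
    # Small distance => probably the same picture re-saved/cropped/resized.
    return hamming_distance(h1, h2) <= max_bits

h_original = 0x9F3B62C10A5E77D4
h_recompressed = h_original ^ 0b1010010   # 3 bits flipped by re-encoding
print(looks_like_same_image(h_original, h_recompressed))  # True
```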
 
Apple is calling it hashing, and what we used at my company is also hashing, not pattern matching. But it does not take the raw stream of bytes but a somewhat "abstracted" stream of pixels as its input. This allows for detecting the same image even if it has been saved to a different file format, reduced in quality, or slightly cropped or edited.
If I read Apple's documentation correctly, they don't say "exact match" as in "it's a jpeg image with exactly these bytes".
In which case, this is a very dangerous game Apple is playing, because false positives will be very, very common, and if they then require human intervention to make a decision, lots of non-criminal and perfectly legal pictures will be viewed, violating user privacy.

Consider also legitimate pictures parents have taken of their kids that just happen to show the kids not wearing clothes. I would think many parents wouldn’t think twice before taking what they believe are private pictures of their 2-year-olds running around after a bath, only to have such pictures inadvertently flagged by Apple’s algorithm. Unlikely? Maybe. Possible? Absolutely.
 
Maybe you should have considered such issues before deciding to carry, all day every day, a multi-sensor tracking device that provides many more actual mechanisms for depriving you of your liberty than this does. Do you have a Facebook account? Instagram? Twitter? Yeeeaaah…
No, No and No.

I have an always-on VPN connection on all my devices, I have Little Snitch running on my Mac, I have GasMask on my Mac, I have Pi-hole running, and so on... just saying.
 
  • Like
Reactions: Violet_Antelope
Wouldn’t it be easier for a hacker to, say, take over your social media account and just start posting bad stuff? The scenarios people are coming up with that make them freak out about this are ludicrous compared to countless other scenarios we already accept without question.
Well, that depends really. Account security on social networks is about the same as on Apple (both support strong passwords and two-factor authentication). Both encrypt their data.

The main difference is that if a hacker broke into your social network, the most they could likely do is impersonate you, or, if you're silly enough to make in-app purchases via Facebook, buy stuff using the card on record (note that they could only buy stuff on Facebook, as your card details would be obscured on-site).

On Apple they could trigger the kiddie porn system by uploading to iCloud Photos if you don't pay them, they could use a linked card to make purchases (potentially), and they would have access to all your previous purchases (apps, music, movies, etc.).

Regarding the legal situation, it would be harder to prove you are the legal owner of a social media account, because verification doesn't take place on any of the networks (they all still run anonymously). So if your account was hijacked, you could disown the account and state you no longer held ownership over it.

Apple IDs are something else: you usually fill out a fair bit of data over time, you often attach payment details, and it's connected to physical hardware you own. It's tied to you, unlike some random web account. So I wouldn't want to test the legal system if someone hijacked it, as the owner is held responsible for its activity (as Apple's own terms of service state).

Just my two cents.
 
Matching hashes is not scanning photos. Nobody sees your photos. Google and Facebook really do scan, so Apple can guarantee privacy while fighting CSAM images.
Also, I don’t think what you’re referring to is hashing. Hashing is exact and relies on an exactly identical stream of bits - it cannot be applied with fuzzy logic. What you’re referring to is pattern matching which is NOT what Apple is claiming this does. Apple is claiming this is looking for an exact match (unless I’ve misread the articles).
Apple is calling it hashing, and what we used at my company is also hashing, not pattern matching. But it does not take the raw stream of bytes but a somewhat "abstracted" stream of pixels as its input. This allows for detecting the same image even if it has been saved to a different file format, reduced in quality, or slightly cropped or edited.
It's a "perceptual hash" generated using machine learning/training techniques and designed to produce the same result for "visually similar" images - which is quite different from the cryptographic hashes, based on well-defined algorithms, designed to produce a different hash for even a small change to a file. Official Apple summary here:


The main purpose of the hash is to ensure that identical and visually similar images result in the same hash, and images that are different from one another result in different hashes. For example, an image that has been slightly cropped or resized should be considered identical to its original and have the same hash.
All these arguments that "it's not scanning your photos - it is matching hashes" as if there were some gross, fundamental difference are misleading. "Hashing" covers a wide range of techniques and applications. It's quite likely, for instance, that face tagging uses some sort of hash to look up matched faces in a database. Maybe the particular system that Apple has implemented is very safe, but the devil is in the details - saying "it's only a hash" proves nothing.
 
  • Like
Reactions: Nick05
But I thought that was the whole point of this: to avoid actual people looking through your images? If they can’t fully automate this (which they can’t) then they shouldn’t really do it.

Also, I don’t think what you’re referring to is hashing. Hashing is exact and relies on an exactly identical stream of bits - it cannot be applied with fuzzy logic. What you’re referring to is pattern matching which is NOT what Apple is claiming this does. Apple is claiming this is looking for an exact match (unless I’ve misread the articles).
This is incorrect. There are two types of hash functions: “cryptographic hash functions,” where changing a single bit in the source produces a massively different hash, and “perceptual hash functions” (which is what Apple is using). The latter produce similar hashes for “similar” inputs, where that “similarity” is deduced algorithmically and, as has been pointed out by many others, is subject to collisions on a not-so-infrequent basis.
 
I hope the people who must verify any matching hashing images are paid very well. They are going to see some pretty disturbing stuff daily.
 
If you're uploading your photos to Apple, you've likely already lost those protections. Whilst Apple have been forceful about not putting a backdoor into their devices, they've been more than happy to spill any iCloud data they have available which, as noted above, already includes iCloud Photos data.
No... there's a difference. The 4th Amendment protects us from unwarranted search and seizure (like Apple looking at my phone for illegal activity without probable cause). When I upload something to iCloud, it is understood that scanning will probably happen (it hasn't been done in Photos yet; only in iCloud email). But that's on THEIR device, not mine.

DO. NOT. PUT. THIS. ON. MY. DEVICE.
 
  • Like
Reactions: PC_tech and Nick05
Of course Apple will say that!
If you really think that this new policy will be limited to finding kiddie porn, you must be very simple-minded.
They are opening Pandora's box,
literally creating a HUGE backdoor on their devices.
You will have the LEAST possible level of privacy and security on iOS devices.
No other company is doing this on your device;
they only monitor the content on their servers.
Apple is the only one doing this ON YOUR DEVICE.
 
Wait - I’m confused by this. Does this mean that everyone’s iCloud is going to be scanned without users’ authorization in the name of child welfare??

While I am sure people may agree with this, it seems like one step away from doctors/dentists submitting DNA samples of *every* patient because it is in the public interest.

This program seems just a small moral slip away from being an invasion of privacy on a monumental scale. Given what Snowden revealed, the US government has a huge thirst for data collection like this. It’s a short hop to scanning for compromising photos of your political rivals, yes?
Not just iCloud - the ACTUAL device will be searched constantly.
Apple claims their alarm is triggered only when matching content is uploaded to iCloud.
That's why it's so unique, excessive and invasive.
Other services like Google, Dropbox and Microsoft also monitor content uploaded to THEIR servers, but none of them searches the CONSUMER'S DEVICE.
 