Could governments force Apple to add non-CSAM images to the hash list?
Apple will refuse any such demands. Apple's CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government's request to expand it. Furthermore, Apple conducts human review before making a report to NCMEC. In a case where the system flags photos that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.
Apple will refuse such demands to add to the hash list, but they can build another system to scan, even your non-iCloud photos.

And you wouldn't know.
 
  • Like
Reactions: hans1972
I have read the white papers. And watched videos. And listened to podcasts.

Apple's claim here is that ANY MODIFICATION to the image will still flag it. But have you seen how much people can modify an image in Photoshop? They must be overstating this.

You’re looking at the problem with human eyes and not computer eyes.
“Similarity” in human terms (even if said human is looking at the bare pixels) has got nothing to do with what these PhotoDNA-style hashes are doing. You can’t fool them the way you’re thinking in human terms.
Not sure what else to say.
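To make the "computer eyes" point concrete, here is a toy perceptual hash (a plain average hash) sketched in Python. It is only an illustration, not Apple's NeuralHash or Microsoft's PhotoDNA, and the file names are placeholders, but it shows the principle: a re-encoded, mildly resized or recolored copy lands a few bits away from the original, while an unrelated photo lands far away.

```python
# Toy perceptual hash (aHash), purely illustrative. NeuralHash/PhotoDNA are far
# more sophisticated, but the comparison principle is the same: similar-looking
# images produce hashes that differ in only a few bits.
from PIL import Image

def average_hash(path, hash_size=8):
    """Shrink to hash_size x hash_size grayscale, then set each bit to whether
    that pixel is brighter than the image's mean brightness."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming_distance(h1, h2):
    """Number of differing bits; a small distance means 'effectively the same image'."""
    return sum(a != b for a, b in zip(h1, h2))

# Placeholder file names: a photo and a recompressed/resized copy of it.
original = average_hash("photo.jpg")
edited = average_hash("photo_recompressed.jpg")
print("distance:", hamming_distance(original, edited))  # near 0 for a derivative copy
```

The match test is a bit-distance threshold, not a human judgment of whether two pictures "look alike."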
 
You’re looking at the problem with human eyes and not computer eyes.
“Similarity” in human terms (even if said human is looking at the bare pixels) has got nothing to do with what these PhotoDNA-style hashes are doing. You can’t fool them the way you’re thinking in human terms.
Not sure what else to say.

That still doesn’t explain how a severely modified image can produce a match per Apple's claims. I can make a subject look 90 years old and the Hulk in Photoshop. But that’s still a match? How? If a picture that differs that drastically is still a match, how is it impossible for an adult in a bathtub to get flagged?

Are you saying it’s impossible for us to understand since we are humans? Then why am I getting all this hate when ALL I am asking is for some damn help understanding.
 
  • Like
Reactions: Euronimus Sanchez
Exactly.

Everybody is hating on Apple when just about everybody else has been doing it too - and possibly for longer. Apple is an easy target, apparently.
Have those other companies been championing themselves as the purveyors of privacy for the masses? Selling it as a bedrock principle for their products and services? Could it possibly be that Apple has set expectations for themselves that when broken cause backlash? We can't possibly be expected to suspend context.

Oh, and as my preteens keep using the "everyone else is doing it" excuse I'll be sure to remember the advice from many here.
 
  • Like
Reactions: mr_jomo
I have read the white papers. And watched videos. And listened to podcasts.

Apple's claim here is that ANY MODIFICATION to the image will still flag it. But have you seen how much people can modify an image in Photoshop? They must be overstating this.
Okay...again...in order to modify an innocent image enough to look like an image in the child pornography database, YOU WOULD NEED SAID ILLEGAL IMAGE.

The odds of winning the Powerball lottery are about 1 in 300 million on any given drawing on average. The odds of you having in your possession more than one photo that is nearly identical to one in the database and getting flagged for review are one in one trillion....You are over 3,000 times more likely to win a major lottery than get flagged for review (and even then, assuming you don't possess actual child porn, nothing will happen!!)
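For anyone who wants to sanity-check that ratio, here is the arithmetic, taking the two probabilities quoted above as given rather than measuring them:

```python
# Quick check of the comparison above; both inputs are the figures cited in the
# post (Powerball odds and Apple's stated per-account false-flag rate).
p_powerball = 1 / 300_000_000            # roughly 1 in 300 million per drawing
p_false_flag = 1 / 1_000_000_000_000     # Apple's claimed 1 in a trillion per account

print(p_powerball / p_false_flag)  # ~3333, i.e. "over 3,000 times more likely"
```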
 
  • Like
Reactions: oneteam
Apple will refuse such demands to add to the hash list, but they can build another system to scan, even your non-iCloud photos.

And you wouldn't know.
Apple already stated prior to this FAQ that they can tailor the system, customizing it on a per-country basis. And that's Apple talking of its own accord, literally advertising the system to governments worldwide. Nobody asked for this, there was no legal pressure or anything on Apple. Apple just suddenly stated that they're doing this and "hey world governments, we can customize this to your needs."

I'm really really curious what Tim Cook got during that Sun Valley summer camp.
 
  • Like
Reactions: 09872738
That still doesn’t explain how a severely modified image can produce a match per Apple's claims. I can make a subject look 90 years old and the Hulk in Photoshop. But that’s still a match? How? If a picture that differs that drastically is still a match, how is it impossible for an adult in a bathtub to get flagged?

Are you saying it’s impossible for us to understand since we are humans? Then why am I getting all this hate when ALL I am asking is for some damn help understanding.
Just stop already...you have no idea what you are talking about....it's okay to not understand how.
 
  • Like
Reactions: oneteam
I have heard this story before... "We are putting this new technology / law / procedure in place but it will ONLY be used for one narrow, extremely limited, noble, and crucial purpose... for now".

A few years ago many states started passing laws that authorized collecting DNA samples from people convicted of felonies such as rape and murder. Then the laws were changed to collect DNA from people arrested for felonies (some of whom later had the charges dropped due to lack of evidence). Laws were then changed again to collect DNA from anyone arrested for anything "because non-violent offenders today may become violent offenders tomorrow." Keep in mind, in most states, the DNA data is never purged from the database even if the charges are dropped or the jury acquits the suspect. I know, police would never arrest anyone unless they were guilty (even if the prosecutor drops the charges) and since they are guilty they shouldn't have any rights so this is probably a bad example of the "nose under the tent" phenomenon /s.


There was also a time when cell tower data was only used for violent felonies to find out what devices (and presumably the owners of those devices) were in the vicinity of the crime scene around the time the crime was taking place. The bar has been repeatedly lowered so now cell tower data is routinely used for non-violent crimes and it has led to innocent people being arrested for riding their bike past a home while it was being burglarized.


But I am SURE this time will be different and Apple's new "features" will NEVER be used for anything other than what they explained in this FAQ. No three-letter government agency would EVER serve Apple with a National Security letter and a gag order demanding they use this technology to scan for anything else.
 
People do realise that companies such as Google, Adobe, Facebook et al. already use some form of automated technology to scan for and detect CSAM? Adobe does it with Creative Cloud:


That's just one example.

They scan shared documents and provide tools to help organizations fight CSAM. Apple is taking this to a whole other level.
 
Wow you have put a lot of time into this thread, many many replies. Curious what's important about this topic for you? How do you want the system to work? Is there something you all want us to do or believe? I'm not getting it so figured I would ask.
I’m tired of the noise that makes a very important discussion (about what can be done vs what should be done) fade against a backdrop of a low signal-to-noise ratio and buzzwords.
 
Now that you've had time to process it, what does it mean for you as an Apple customer? Initially when you were angry, what actions were you thinking of taking? And now that you've had more thoughts, what are your next steps?

I don’t make irrational decisions - so initially after the news, I didn’t have any plans to change how I am as an Apple customer Lol.
 
  • Like
Reactions: majus and mr_jomo
Just stop already...you have no idea what you are talking about....it's okay to not understand how.

Okay so just no discussion at all? I can’t learn I should just stop? Right? Great attitudes these forums lately. Please explain it to me. I read the white paper, watched videos, listened to podcasts. What else do you expect me to do? I’m asking people more knowledgeable for help understanding. But I guess asking for help is not good in today’s world.
 
Bit of a red herring argument.


If they are liable for what is on their servers, CSAM is the LEAST of their worries in terms of volume.


Sharing nudes of others without their permission, pics of drugs, illegal guns, rape/sexual assault, etc etc. Those are all illegal; maybe just morally "under' CSAM.

I would bet there are 100+ times the photos of other illegal activities than CSAM.


I also get the bad-press aspect, but this is a moral decision by Apple, and it shouldn't be dictating to its users which crimes are more or less moral than others. And that is not an excuse to invade the privacy that they tout as a feature of their devices.

If that is the slippery slope then how long before celeb nudes are hashed, or sexual assault hashes? What makes one crime more or less moral than another to check for?

They went how many years of iCloud without such a policy? A decade now? How many times have they been sued for hosting CSAM? I've never heard of it once. So why now then? THAT is the million-dollar question here.

It’s not illegal to have images of illegal activity - it is illegal to possess child pornography.
 
  • Like
Reactions: MozMan68
Okay...again...in order to modify an innocent image enough to look like an image in the child pornography database, YOU WOULD NEED SAID ILLEGAL IMAGE.

The odds of winning the Powerball lottery are about 1 in 300 million on any given drawing on average. The odds of you having in your possession more than one photo that is nearly identical to one in the database and getting flagged for review are one in one trillion....You are over 3,000 times more likely to win a major lottery than get flagged for review (and even then, assuming you don't possess actual child porn, nothing will happen!!)

Not necessarily. The original picture could be on someone’s PC with Photoshop, and the edited version is what “gets out there”. Then that version might get put in the database and another heavily modified version gets made.
 
What an ignorant response.

It’s sad (and disgusting) that so many MacRumors readers support child exploitation. I hope none of you have young siblings or children.
I wanted to capture the first input from “the mob” where they shift from a technical attack to a personal one.
 
Please ALSO realize that they are performing the scan on THEIR servers. Apple intends to perform the scan on my device. One is scanning their equipment (perfectly fine), one is scanning MY equipment, which is absolutely not fine.
It pre-scans on your device just as a courtesy to both you and them, then it wraps the results of the scan into an impossible-to-crack cryptographic vault that can only be opened if 2 things happen:
1) you upload the pics to their servers (so the scan being on-device is MOOT, like the sound of a tree falling in a forest if nobody is listening)
2) you score multiple matching kiddie p0rn hashes

No human in this universe can look into those security vouchers if these 2 things don’t both happen.
Not sure about parallel universes, metaverses and multiverses.
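For anyone who wants the gist of that "vault" in code form, here is a deliberately over-simplified sketch. Apple's actual design uses private set intersection and threshold secret sharing; the counter and names below are stand-ins I made up to show one property only: nothing is available to a reviewer until the number of matching vouchers crosses a threshold.

```python
# Over-simplified illustration of the threshold idea, not Apple's real protocol.
# A plain counter stands in for the cryptography (PSI + threshold secret sharing);
# the point is that below the threshold nothing is decryptable or reviewable.
THRESHOLD = 30  # Apple has reportedly described a threshold on the order of 30 matches

def review_account(voucher_hashes, known_csam_hashes):
    """Each uploaded photo carries a voucher with its perceptual hash; only when
    the number of database matches reaches the threshold would the (hypothetical)
    human-review step modeled here ever be possible."""
    matches = sum(1 for h in voucher_hashes if h in known_csam_hashes)
    if matches < THRESHOLD:
        return "below threshold: vouchers stay sealed, nothing reported"
    return f"{matches} matches: vouchers unlocked for human review"

# A few coincidental collisions stay far below the threshold.
uploads = [f"hash{i}" for i in range(1_000)]
database = {"hash1", "hash2", "hash3"}
print(review_account(uploads, database))
```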
 
Step 1. Apple does something new, but in a different way than the rest of the industry.

Step 2. The internet cries it’s evil and that they will vote with their wallets and leave the ecosystem, completely ignoring that Apple does it in a better and more privacy-oriented way.

Step 3. Apple explains a bit better what it did.

Step 4. Everybody shuts up and the industry follows Apple’s lead.
ad 1. Yes, (technical) parts of the process are new. That in itself does not eliminate all the risks involved.

ad 2. "Better" way does not mean that no misuse can take place (by authoritarian gov´s) to get rid of unwanted
people like critics, journalists, opposition etc.

ad 3. Yes. Does Apple also explain how they want to fight requests and oppression from non-democratic countries? How do they plan to react to market bans in case of non-compliance, e.g. with "expanded neural hash database entries" supplied for inclusion?

ad 4. Oh, sure. What a nightmare then! Be aware, we are not talking about missing headphone jacks, ARM-based laptops, missing USB or FireWire or optical drives (or single-button mice, hehe).
 
I cannot upvote this enough.

If this paranoia had any logic, people wouldn't leave their home for fear of having police watching them... For the speed of their car to be monitored by cameras... For towns, cities and stores to have CCTV... It's laughable.
Clearly you don't understand the concept of people having a "reasonable expectation of privacy".
 