Which is partly why Apple decided its users should pay the cost of doing the checks.

For my own modest photo library, checking 30,000 photos against a database of 200,000 will require 6 billion checks.

And both numbers will only grow in the future.
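
(A quick back-of-the-envelope, if anyone wants to check: under a naive pairwise model the numbers multiply exactly as described; a real matcher would likely use a set lookup per photo instead, so the pairwise count is a worst case. The figures are the post’s own.)

```python
# Back-of-the-envelope comparison count under a naive pairwise model.
library_size = 30_000    # photos in one library, per the post
database_size = 200_000  # entries in the CSAM hash database, per the post

checks = library_size * database_size
print(f"{checks:,}")  # 6,000,000,000 -- six billion checks
```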

Yeah, thanks for totally ignoring the fact that Apple sucks at their gatekeeping job. We haven’t gotten to how bad Apple is at fixing known security flaws.

Soon, billions, not millions, of comparisons will be made on users’ phones. What can go wrong?

You think the error rate of humans reviewing App Store apps is comparable to the error rate of these hashing functions? It’s orders of magnitude different. Billions of comparisons are probably nothing to sweat about for the process comparing those hashes. The number I was pointing you at is what comes AFTER the first filter: those relatively few cases (compared to the iPhone installed base) that will escalate to human review because of multiple CSAM offences. If you think you can get multiple (not one) unlucky matches, you’re either putting unreasonably little faith in those hashes or you think you can win the lottery 5 times in a year.
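
For anyone who wants the two-stage idea spelled out, here’s a minimal sketch; the threshold value and every name in it are hypothetical, since Apple hadn’t published the real number:

```python
# Minimal sketch of the two-stage filter described above.
# THRESHOLD and all names here are hypothetical, for illustration only.
THRESHOLD = 30  # assumed number of matches before anything escalates

def matches_database(photo_hash: bytes, db_hashes: set[bytes]) -> bool:
    # Stage 1: cheap automatic hash comparison, one per photo.
    return photo_hash in db_hashes

def escalates_to_human_review(photo_hashes: list[bytes],
                              db_hashes: set[bytes]) -> bool:
    # Stage 2: only accounts with multiple matches ever reach a person.
    matches = sum(matches_database(h, db_hashes) for h in photo_hashes)
    return matches >= THRESHOLD
```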
 
850 million iCloud users. For now Apple is only accusing its US customers of being perverts but let’s consider the inevitable application to worldwide users.

Let’s assume an average photo library of 10,000 photos and the current CSAM database is 200,000.

So an error rate of 1 in a trillion actually means Apple making 2 million false accusations.

Apple refuses to say how many accusations trigger a police call. Maybe 2. Maybe 200.

I hope anyone falsely accused sues Apple for all they’re worth.

In the past, America literally fought a war of independence over less abuse than what Tim Cook is inflicting on free people.

These calculations are a dumb take.
It’s almost fascinating, the level of failure at estimation and compound probability this drama is uncovering.
 
You think the error rate of humans reviewing App Store apps is comparable to the error rate of these hashing functions? It’s orders of magnitude different. Billions of comparisons are probably nothing to sweat about for the process comparing those hashes. The number I was pointing you at is what comes AFTER the first filter: those relatively few cases (compared to the iPhone installed base) that will escalate to human review because of multiple CSAM offences. If you think you can get multiple (not one) unlucky matches, you’re either putting unreasonably little faith in those hashes or you think you can win the lottery 5 times in a year.

People usually only buy one lottery ticket.

If people bought 10,000 to 100,000 tickets (like the number of photos many people have), then yes, someone would have a decent chance of winning the lottery 5 times a year.
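
A quick binomial model makes both intuitions checkable; note that the per-photo false-match rate p below is a pure assumption (Apple has published no such figure), and the answer swings entirely on it:

```python
# Binomial sanity check: odds of one user hitting >= 5 false matches,
# and the expected number of such users across the installed base.
from math import comb

def prob_at_least(k: int, n: int, p: float, terms: int = 20) -> float:
    """Approximate P(X >= k) for X ~ Binomial(n, p); the tail terms
    decay fast when n*p is small, so a handful of terms is enough."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i)
               for i in range(k, k + terms))

photos = 100_000       # "lottery tickets" per user, per the post above
users = 850_000_000    # iCloud accounts, per the thread

for p in (1e-6, 1e-9):  # ASSUMED per-photo false-match rates
    per_user = prob_at_least(5, photos, p)
    print(f"p={p:g}: per-user {per_user:.3g}, "
          f"expected such users {per_user * users:.3g}")
# p=1e-6 yields dozens of unlucky users; p=1e-9 yields effectively none.
```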
 
But you can add hashes for media relating to the protests in HK that allow China to contextually identify the protesters in other ways. You can also do it in the Messaging System, as Apple explained. “Apple, monitor hashes that match with x symbol or feature that is a common denominator in the photos, i.e. facial tracking and expression recognition (anger, happiness, sadness, all learned by the AI), or signposts or locational data based on specific building frames, etc.” There’s ALWAYS a way. And it’s probably already happening. If the AI can identify based on a database, it can also identify based on expressions that can be curated from other protests to “teach” the AI what anger and inciting information look like. You know, like from the show Person of Interest.

What system are you talking about, exactly?

I am talking about the "CSAM Detection" system and nothing else.

Let's say the Chinese have thousands of photos of protests in Hong Kong. They get the NeuralHash software from Apple and create a unique hash for each photo. Apple receives those thousands of hashes. The list of hashes is obfuscated when being distributed to each iPhone. No iPhone has the same set of hashes.

I (for the sake of this argument) am a Hong Kong citizen. I was at many of those protests and took many pictures. I use iCloud Photo Library.

The probability of the hashes of my photos from those same protests matching the hashes provided by the Chinese government is extremely low.

Which makes the CSAM Detection system ineffective for such use, and probably a total waste of resources.

As you describe, there are much easier ways to do it, using AI to identify objects and faces in pictures.
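
To make that concrete: NeuralHash’s internals aren’t public beyond Apple’s technical summary, but even a toy perceptual hash (an average hash, sketched below; this is NOT NeuralHash) illustrates why a re-compressed copy of the government’s reference photo matches while a different photo of the same protest does not:

```python
# Toy 2x2 "average hash" -- NOT NeuralHash, just a stand-in to show
# how perceptual hashes behave: near-duplicates of the SAME image
# match; other photos of the same event do not.

def average_hash(pixels: list[list[int]]) -> int:
    """One bit per pixel: is it brighter than the image's mean?"""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

gov_reference = [[200, 30], [180, 40]]   # the government's target photo
recompressed  = [[201, 31], [181, 40]]   # slightly altered copy of it
my_own_photo  = [[90, 210], [60, 150]]   # my different photo, same protest

ref = average_hash(gov_reference)
print(hamming(ref, average_hash(recompressed)))  # 0 -> near-duplicate matches
print(hamming(ref, average_hash(my_own_photo)))  # 4 of 4 bits differ -> no match
```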
 
That bean counter is the reason Apple became the 2nd largest company in the world. Some people ain't happy with the direction Apple is headed. Dollars to donuts, none of them are stockholders.

Many of the world’s greatest villains made a minority of people with special financial interests happy.
 
850 million iCloud users. For now Apple is only accusing its US customers of being perverts but let’s consider the inevitable application to worldwide users. Let’s assume an average photo library of 10,000 photos and the current CSAM database is 200,000. So an error rate of 1 in a trillion actually means Apple making 2 million false accusations.

Apple refuses to say how many accusations trigger a police call. Maybe 2. Maybe 200.

The 1 in a trillion isn't for pictures but for accounts.

When you reach the threshold, Apple will perform a manual review of the flagged photos. You can even file an appeal with Apple if you feel they have made a mistake. If Apple concurs with the system, they will send a report to NCMEC. NCMEC will then decide if law enforcement agencies get the information.

Google, Microsoft and Facebook have had similar systems for a decade. If there were a huge problem with false accusations, we would have heard about it.
 
According to the graphic in the article, the database will be on our phones so the scan can happen on the phone. I do not want that database on my phone. I have turned iCloud Photos off but that still leaves me with this database on my phone... not right. Do I own my phone or does Apple?
 
The 1 in a trillion isn't for pictures but for accounts.

When you reach the threshold, Apple will perform a manual review of the flagged photos. You can even file an appeal with Apple if you feel they have made a mistake. If Apple concurs with the system, they will send a report to NCMEC. NCMEC will then decide if law enforcement agencies get the information.

Google, Microsoft and Facebook have had similar systems for a decade. If there were a huge problem with false accusations, we would have heard about it.

Yep.


A family put through 7 months of hell just because of a date error. What could possibly go wrong allowing them access to another 850 million user accounts?
 
What is child porn? Can you define it?

Is taking a picture of your child playing in a bathtub child porn?

Is taking a picture of your child’s private parts to send it to the doctor (because there’s a problem) child porn?

This system is not looking for child porn in general, only specific, known pictures which have been determined by law enforcement agencies to be illegal.

So your two scenarios will be fine under this system.
 
According to the graphic in the article, the database will be on our phones so the scan can happen on the phone. I do not want that database on my phone. I have turned iCloud Photos off but that still leaves me with this database on my phone... not right. Do I own my phone or does Apple?
You own the phone but not the software. Scanning will still be done on your phone even with iCloud Photos off. Apple just won't be alerted.
 
Yep.


A family put through 7 months of hell just because of a date error. What could possibly go wrong allowing them access to another 850 million user accounts?

It's one case due to incompetence by the police, not Facebook.

Apple's system will not cause this kind of problem since they can identify the owner of the account much better, having the phone number and credit card info on file.
 
It's one case due to incompetence by the police, not Facebook.

7 months to fix a problem reading a date.

Apple themselves took 6 months to fix a problem reading an iCloud user’s last name.

These are proven grossly incompetent organisations and your plan is to give them more responsibility and more complicated tools.
 
You own the phone but not the software. Scanning will still be done on your phone even with iCloud Photos off. Apple just won't be alerted.
Yeah. Apple won’t be alerted. That feature (Apple/LEO alerted) will be rolled out with iOS 16 / macOS 13 “Stool Pigeon” next year.
 
Winning the lottery is very unlikely. And yet someone wins the lottery every week.

If you’re happy for Apple to falsely accuse an innocent person of being a paedophile every week then that’s on you.

Some people aren't good at reading comprehension and most people aren't good at calculating probabilities. Combining those two things together with not having a lot of information about the system makes me question their conclusion.

In this particular case I know the use of the numbers was completely wrong.
 
Yep.


A family put through 7 months of hell just because of a date error. What could possibly go wrong allowing them access to another 850 million user accounts?
This is exactly what’s wrong with this whole idea. The system is sealed. The database of disallowed content is sealed. The number of your violations is sealed. And when they decide to alert law enforcement, they won’t even tell you why your house is being searched.

This is totally unfair and opaque.
 
How are you so sure that the scanned pictures on the user's phone will generate the same hash as the photo scanned by the government?

If they used this system to catch scanned text, it would probably lead to a high number of false positives.

In fact, in iOS 15 Apple is introducing a way to extract the text from images. Much better technology to misuse.
Because that’s the way NeuralHash is supposed to work. Among the popular target images un-favorited by the CCP are pictures of Winnie the Pooh and the Tank Man from the Tiananmen protests.
 
Will consider expanding system. There we go. It’s only a matter of time until this is scanning for masks and MAGA hats. Or whatever the equivalent tribal coagulant is in the next cycle. I’d love to see the breakdown of people who think this is a good idea vs the people who think vaccine passports are a good idea.
 
Coming to a police state near you. Given what's happened over the years (esp. the past year), any thinking person will have realized by now, consciously or subconsciously, that the state is out to harm you for the benefit of our rulers...

And for anyone who supports the police state, do you realize how the Inner Party treats the Outer Party, vs Proles? Let's be clear, the 0.1% who buy and own politicians are the Inner Party. Pretending to be trendy and being first in line to cower in submission to the state doesn't get you into the Inner Party, just marks you as a useful idiot ready to be expended for someone else's benefit
 
Feel free to provide better estimates.

1 in a trillion is already the final yearly probability per account, ballparked by Apple
- after taking into account the error rate of their matching system
- after taking into account the average number of pics per account
- after taking into account that you need to get multiple yellow cards before raising a red flag (we don’t know exactly how many yellow cards, but it’s included in the “1 in a trillion“ ballpark quote by Apple, so in a way we kinda know what we need to know)

1 trillion is 10^12
there are 850 x 10^6 iCloud accounts (let‘s assume they are all active, with a usage profile that fits Apple‘s ballpark, etc. and let’s assume worldwide rollout of the system)

That’s 850 million dice rolls per year of a die with 1 trillion sides.

Over an infinite time horizon (people always forget this part), that would gravitate around 850 x 10^(-6) unlucky false positives (not yet accusations, mind you) per year, or 1 false positive every 1176 years (again, on an infinite time scale; maybe you get lucky for the first couple millennia). (This assumes the iCloud user population stays at 850M till the end of time, but c’mon.)

THEN that single false positive every 1176 years would need to be converted into a full-blown “accusation” by means of a human review. The human reviewer would be reviewing both actual positives (we don’t know how many; we know Apple reported 256 of them to the authorities in 2020, whereas Facebook reported 20M CSAM incidents in 2020) and this one unlucky false positive every 1176 years on average (let’s call him Unlucky Brian). And if you know how hashes work, you’d expect Brian’s photo to be something completely unrelated to child p0rn, maybe a bucket of Lego or a goldfish in a bowl; it would be an extraordinary coincidence if the faulty hash match just HAPPENED to be of a picture somehow related to children. Why would it be? It’s like saying two men with a prime number of letters in their first names must be brothers. If Brian’s photo is clearly innocuous, the human reviewer won’t “accuse” him of anything and Brian will never even know about it.

To recap, stop fantasizing about automatic accusations and automatic calls to the cops in any significant number.
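
Reproducing that arithmetic, with the post’s own inputs:

```python
# The post's own figures, reproduced.
accounts = 850_000_000      # iCloud accounts
p_per_account = 1e-12       # Apple's ballpark yearly false-flag odds

expected_per_year = accounts * p_per_account
print(expected_per_year)      # 0.00085 false positives per year, total
print(1 / expected_per_year)  # ~1176 years between false positives
```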
 
This is a horrendous idea with so many ways this tech could go wrong.

Limiting it to the U.S. is not a solution and it’s obtuse of Apple to think so. Apple needs to stop now. Get rid of the feature, both the iCloud and Messages versions. No one wants this.
So let me try to understand this. You are against any and all measures to protect a child?! Why are some vocal critics so afraid of this? Are they hiding something?? How could anyone not support something that protects an innocent child?? Apple, once again, is doing what is right!
 