
Analog Kid

macrumors G3
Mar 4, 2003
8,927
11,516
You're still missing the point. It doesn't matter whether the probability is 0.5 or 1.0E-20, you simply can't say:

probability of 1 event = p​
probability of 2 events = p x p​

without adding:
assuming that the events are uncorrelated/independent

Without that assumption (which you will see stated in any reputable explanation of conditional probability) your mathematics is wrong. You really can't use coin tosses/dice throws/radioactive decay/whatever - which are all known good approximations to independent events - as a model/analogy/example of false matches unless you know that those false matches are independent. The whole Sally Clark debacle was because someone applied the "independent events" model to a poorly understood situation where the events turned out not to be independent.

You're not reading what I'm writing or the caveats I'm bracketing it with even when I directed you to those specific points answering your objection. The question I was answering came from someone else who indicated they were struggling to understand the basic details. Sure, you can wow them with your unique knowledge of covariances and correlated events-- it has literally no impact on the conclusion but only serves to make a basic concept needlessly complicated.

Thanks. That's actually helpful and informative (and I haven't seen it explained before).

The paper describing the system has been linked many times in these threads. It's worth reading

 
Last edited:

Motorola68000

macrumors 6502
Sep 12, 2022
282
261
So many here are arguing about CSAM, but it had nothing to do with CSAM per se; it was about CLIENT SIDE SCANNING. You fall into a trap if you only discuss an application that opens a door, and then only concentrate on that part of the door that would be opened. It's irrelevant, as the problem is client side scanning, which has been made quite clear by many in the industry. You will always find governments especially coming up with something reprehensible they wish to stop, and child abuse is one such, but do not think for one second they really give a damn, as all they are interested in is access via client side scanning. So the arguments here about hashes etc. are totally irrelevant; it is the method of scanning on our own devices which represented such a 'slippery slope'. When many of us stated this we were met with derision, yet Apple now confirms the slippery slope
 
  • Like
Reactions: Pummers and VulchR

laptech

macrumors 68040
Apr 26, 2013
3,591
3,992
Earth
Members' arguments for not allowing CSAM scanning seem to stem from the assumption that it will allow governments to use it for other means. Just what exactly makes you think Apple is going to allow government authorities to use the technology behind CSAM scanning for other things? Apple refused to help the FBI, a government authority, and even refused to help when a court order told them to.
 

laptech

macrumors 68040
Apr 26, 2013
3,591
3,992
Earth
So many here are arguing about CSAM, but it had nothing to do with CSAM per se; it was about CLIENT SIDE SCANNING. You fall into a trap if you only discuss an application that opens a door, and then only concentrate on that part of the door that would be opened. It's irrelevant, as the problem is client side scanning, which has been made quite clear by many in the industry. You will always find governments especially coming up with something reprehensible they wish to stop, and child abuse is one such, but do not think for one second they really give a damn, as all they are interested in is access via client side scanning. So the arguments here about hashes etc. are totally irrelevant; it is the method of scanning on our own devices which represented such a 'slippery slope'. When many of us stated this we were met with derision, yet Apple now confirms the slippery slope
If governments want to stop other things then the issue becomes nothing more than privacy/security over profit because Apple would have to give in to government demands. The technology and how it is used has never been and will never be the problem; the problem will be whether Apple can be trusted to never allow a government to get what it wants. In my opinion the answer is no, especially if a government says 'you let us scan for x and if you don't we will ban the use of iPhones in the country'. If China did that, do you honestly think that Apple would allow its iPhones to be banned in one of its most profitable areas? Hell no, they would allow China's CCP to have whatever it wants. Like I said, the issue is not with the technology/software, the issue is with Apple and whether they capitulate to government demands.
 

Grey Area

macrumors 6502
Jan 14, 2008
423
1,004
Members' arguments for not allowing CSAM scanning seem to stem from the assumption that it will allow governments to use it for other means. Just what exactly makes you think Apple is going to allow government authorities to use the technology behind CSAM scanning for other things? Apple refused to help the FBI, a government authority, and even refused to help when a court order told them to.

2020:
Government: "We want to introduce a law that requires all phones to be equipped with snitch software that scans user data and informs us about things we do not like!"
Phone makers: "Sorry, we'd like to help, but that is technically impossible and a privacy nightmare."
Government: "Damn, ok."

2024:
Government: "We want to introduce a law that requires all phones to be equipped with snitch software that scans user data and informs us about things we do not like! Apple has made software that can do this while protecting privacy!"
Phone makers: "Damn, ok."
 
  • Like
Reactions: theluggage

laptech

macrumors 68040
Apr 26, 2013
3,591
3,992
Earth
2020:
Government: "We want to introduce a law that requires all phones to be equipped with snitch software that scans user data and informs us about things we do not like!"
Phone makers: "Sorry, we'd like to help, but that is technically impossible and a privacy nightmare."
Government: "Damn, ok."

2024:
Government: "We want to introduce a law that requires all phones to be equipped with snitch software that scans user data and informs us about things we do not like! Apple has made software that can do this while protecting privacy!"
Phone makers: "Damn, ok."
Your point is irrelevant because Apple has already said they will pull out of the UK if the UK introduces the Online Safety Bill into law. All this posturing by Apple saying implementing CSAM will cause problems is just hogwash in my opinion because any government requests would still require the cooperation of Apple and if Apple refuses to cooperate then there is no problem. It only becomes a problem if governments say they will ban Apple products from entering the country if Apple does not do what they are asked to do.

China already has laws that allow them to snoop/spy on their citizens. The question is, is Apple making excuses not to introduce CSAM scanning because they know Chinese law would allow the technology to be used beyond its intended purpose, and they would thus face a clash with Chinese authorities wanting the technology to be used for other purposes, with a potential threat of 'do as we ask or face a ban in China'. Therefore, to prevent such a scenario from happening, Apple makes excuses and doesn't build the technology. If it's not made, it cannot be abused.
 

VulchR

macrumors 68040
Jun 8, 2009
3,394
14,273
Scotland
Members' arguments for not allowing CSAM scanning seem to stem from the assumption that it will allow governments to use it for other means. Just what exactly makes you think Apple is going to allow government authorities to use the technology behind CSAM scanning for other things? Apple refused to help the FBI, a government authority, and even refused to help when a court order told them to.
Do you not understand that Apple's technical papers provide a blueprint for governments to concoct a similar system, where the government, not Apple, controls the hashes and the file types scanned? AI-enabled chips are going to become commonplace on mobile devices. Do you really want to set a precedent whereby global surveillance is routine on mobile devices, without a warrant, probable cause, or judicial review, just because a cabal of wholly naive engineers thought it would be a good idea?
 

laptech

macrumors 68040
Apr 26, 2013
3,591
3,992
Earth
Do you not understand that Apple's technical papers provide a blueprint for governments to concoct a similar system, where the government, not Apple, controls the hashes and the file types scanned? AI-enabled chips are going to become commonplace on mobile devices. Do you really want to set a precedent whereby global surveillance is routine on mobile devices, without a warrant, probable cause, or judicial review, just because a cabal of wholly naive engineers thought it would be a good idea?
So basically the argument for not introducing the technology behind CSAM scanning is that if Apple creates the beast and then releases the beast, it will not be able to control the beast; therefore, to prevent such a thing from ever happening, don't create the beast.

I don't buy it. Apple told the FBI and the US justice system where to stick it. Apple is telling the UK where to stick it if they introduce the Online Safety Bill but yet you and others are telling me Apple will not be able to do the same to others!!!, lol...utter rubbish.
 

VulchR

macrumors 68040
Jun 8, 2009
3,394
14,273
Scotland
So basically the argument for not introducing the technology behind CSAM scanning is that if Apple creates the beast and then releases the beast, it will not be able to control the beast; therefore, to prevent such a thing from ever happening, don't create the beast.

I don't buy it. Apple told the FBI and the US justice system where to stick it. Apple is telling the UK where to stick it if they introduce the Online Safety Bill but yet you and others are telling me Apple will not be able to do the same to others!!!, lol...utter rubbish.
I'll try one more time. I agree that Apple were always unlikely to volunteer to give governments access to the system they were proposing (however, it is possible that Apple would have had legal requirements in some countries to use government approved sources for hashes - what then? So far as I know, Apple is not in the habit of breaking the law. Apple declines to help law enforcement when they come to Apple without a proper search warrant.).

No, for me the issue was the technical papers describing the system that Apple posted publicly (in lieu of Apple reading the room and admitting they were wrong). While the details of the CSAM-monitoring system were hidden, it won't take much for governments to use the online information provided by Apple as a template for making their own surveillance systems. They could then require those systems be installed on all phones being sold in their country. Maybe this is inevitable, but the governments that do this can now say 'Look, we're just installing for public safety the same kind of system Apple proposed to detect CSAM images', giving this an air of respectability.

This was a monumental blunder by Apple.
 
Last edited:

laptech

macrumors 68040
Apr 26, 2013
3,591
3,992
Earth
I'll try one more time. I agree that Apple were always unlikely to volunteer to give governments access to the system they were proposing (however, it is possible that Apple would have had legal requirements in some countries to use government approved sources for hashes - what then? So far as I know, Apple is not in the habit of breaking the law. Apple declines to help law enforcement when they come to Apple without a proper search warrant.).

No, for me the issue was the technical papers describing the system that Apple posted publicly (in lieu of Apple reading the room and admitting they were wrong). While the details of the CSAM-monitoring system were hidden, it won't take much for governments to use the online information provided by Apple as a template for making their own surveillance systems. They could then require those systems be installed on all phones being sold in their country. Maybe this is inevitable, but the governments that do this can now say 'Look, we're just installing for public safety the same kind of system Apple proposed to detect CSAM images', giving this an air of respectability.

This was a monumental blunder by Apple.
Apple was given a court order that told them they were to work with the FBI, but they still refused...a COURT ORDER!!!

As for your point about governments and templates, governments do not need to wait for Apple to create a surveillance system for CSAM; governments can hire software designers/developers to create their own surveillance system and force it to be installed on all mobile phones in their country.
 

Grey Area

macrumors 6502
Jan 14, 2008
423
1,004
Apple was given a court order that told them they were to work with the FBI, but they still refused...a COURT ORDER!!!
Apple appealed within the deadline set by the judge, and there would have been another hearing, had the FBI not dropped the case after they found a different solution. Thousands of people appeal court orders every year, some win, others eventually have to comply. It is not like Apple was holed up in Cupertino, fighting off the sheriff and his men. They followed the rules. I see no reason to believe Apple would not have complied if the case had ended up at the Supreme Court and the ruling had gone in favor of the FBI.
 
  • Love
Reactions: VulchR

laptech

macrumors 68040
Apr 26, 2013
3,591
3,992
Earth
Apple appealed within the deadline set by the judge, and there would have been another hearing, had the FBI not dropped the case after they found a different solution. Thousands of people appeal court orders every year, some win, others eventually have to comply. It is not like Apple was holed up in Cupertino, fighting off the sheriff and his men. They followed the rules. I see no reason to believe Apple would not have complied if the case had ended up at the Supreme Court and the ruling had gone in favor of the FBI.
You're missing the point: Apple refused to help a government authority when a court required it to do so, by appealing. The FBI knew that Apple would keep on appealing and appealing, which would have drained the FBI of funds and time because they had an ongoing case to investigate. Apple had no intention of helping the FBI and put up every roadblock they could, so what makes people think that Apple is going to bow down to the demands of other government authorities? Because that is basically what the majority of members here are saying: that if Apple built scanning tech for CSAM, they would be required to use that tech to scan for other things.

Are people saying it is ok for Apple to defy US government authorities such as the FBI and tie up the US justice system to avoid helping a US government authority, but it is not ok to defy other countries when they request other things to be scanned, and thus Apple should not create the system in the first place?

The fault here is not the surveillance system to check for CSAM; the problem here is with Apple, and who they would allow to use the system and how.
 

theluggage

macrumors 604
Jul 29, 2011
7,519
7,430
You're not reading what I'm writing or the caveats I'm bracketing it with even when I directed you to those specific points answering your objection.
I have read what you posted and explained why that doesn't fix the problem. I've also linked to accounts and real-life examples of the Prosecutor's Fallacy, which occurred precisely because people who should have known better applied simple high-school conditional probability in cases where it didn't work - including specifically people who were convicted on the "1 strike = p, 2 strikes = p^2" argument.

I don't see where your caveats address the issue of independence. That's not some esoteric detail, it's the fundamental assumption behind calculations such as:
Make everyone flip 30 coins and you have a false positive rate of 0.5^30, or about 1 in a billion.
...which is only valid for independent events ...and most math textbooks will go out of their way to stress that it was a fair coin in this kind of example. Otherwise, you're into Bayes' theorem and all that...
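For reference, the arithmetic behind that "about 1 in a billion" figure is easy to check - a quick, purely illustrative Python snippet, valid only under the assumption that the 30 events really are independent:

p_single = 0.5                 # assumed per-event false positive rate from the coin example
p_thirty = p_single ** 30      # probability of 30 in a row, valid only for independent events
print(p_thirty)                # ~9.3e-10
print(1 / p_thirty)            # ~1.07e9, i.e. roughly "1 in a billion"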

Remember, the question you were answering was:
So if a false positive is so incredibly rare, probably less so than all false arrests combined, why not set the threshold to one or two then involve the authorities?
...so whether requiring 30 matches really does reduce the false-positive risk to something infinitesimal is very relevant. If the false matches are independent with probability p then, yes, the probability of 30 false matches is p^30, just like tossing coins (or fair dice, or a million-sided-hyperdie, as long as it is fair). But if one false match against a user somehow affects the probability of finding a second false match - e.g. because some common feature in their images is triggering a false match and they have lots of similar images in their collection - then the probability of 30 matches could be much less than p^30.

To be fair, I'm not sure Apple were explicitly claiming the power-of-30 thing, just that the threshold reduced the false positive risk to something very low - they may have some more sophisticated basis - but I can't find it in that technical report, and plenty of people in this thread are basing their arguments on p^30.

The paper describing the system has been linked many times in these threads. It's worth reading
Yes, it is - I was mistaken about the derived images/human checking thing because I'd missed/forgotten the vital point about how the ability to decrypt the derived image was the test for 30 matching keys. My bad.
 
  • Like
Reactions: VulchR

theluggage

macrumors 604
Jul 29, 2011
7,519
7,430
I don't buy it. Apple told the FBI and the US justice system where to stick it. Apple is telling the UK where to stick it if they introduce the Online Safety Bill but yet you and others are telling me Apple will not be able to do the same to others!!!, lol...utter rubbish.

People seem to be forgetting that Apple themselves have back-tracked on their own CSAM scheme, citing exactly the sort of concerns that are being discussed in this thread. See the original article in this thread for quotes from Apple talking about slippery slopes, opening the door to bulk surveillance etc.

So, no, even Apple don't seem to be confident that they can defend against these pressures. Posters here are still vigorously defending points that Apple themselves have already conceded.

It may or may not be a coincidence that this has come just at the time when Apple are playing hardball with the UK government over a closely related issue. It would be hard for them to press that case while simultaneously singing the praises of how their own end-run around end-to-end encryption was so secure and private.

Also, I think people are getting bogged down in the semantics of "on-device scanning" - I think the real line in the sand is that they would have been screening privately stored cloud data that was supposedly protected by end-to-end encryption and being used to store personal data. That's rather different from screening images on social media or file sharing sites.
 
  • Like
Reactions: Pummers

rick3000

macrumors 6502a
May 6, 2008
646
269
West Coast
Almost every law, rule, or policy to "protect children" is eventually used to erode privacy, but "protecting children" is hard for people to protest against, which is why it is used to push the erosion of privacy. Based on Apple mentioning the slippery slope, they seem to have acknowledged the main criticism of the original CSAM idea, that it would be a perfect tool for authoritarians. If I recall correctly, the entire proposed system used hashes to crosscheck known images against those stored locally on devices (pretty sure they already scan anything uploaded to iCloud, just like every other cloud provider).
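Purely as an illustration of the hash-crosscheck idea, here is a minimal Python sketch using an exact SHA-256 match against a made-up blocklist. Apple's actual design used NeuralHash, a perceptual hash that tolerates resizing and re-encoding, plus threshold secret sharing; none of that is modelled here.

import hashlib

# Hypothetical blocklist of known-image hashes (placeholder values, not real data)
KNOWN_HASHES = {"placeholder_hash_1", "placeholder_hash_2"}

def image_hash(data: bytes) -> str:
    # Exact cryptographic hash; a perceptual hash would also match visually similar images
    return hashlib.sha256(data).hexdigest()

def matches_blocklist(data: bytes) -> bool:
    return image_hash(data) in KNOWN_HASHES

print(matches_blocklist(b"some local photo bytes"))   # False for anything not on the list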

The problem is that a government could force Apple to scan for additional hashes, and issue a gag order preventing them from discussing the inclusion of the new hashes publicly, or just include additional hashes on whatever list is being used without Apple's knowledge. Apple cannot be forced to build a way to scan for hashes, but once that system exists, such as to scan for CSAM, they could be forced to use it to scan for other things, such as political imagery, speech, etc.

There are a lot of ways to catch criminals without compromising the privacy of 99.9999999999% of people, and they are probably more effective too. Most law enforcement would be easier if we gave up all of our rights, but at least in the US that's not how it is supposed to work.
 
  • Like
Reactions: VulchR

theorist9

macrumors 68040
May 28, 2015
3,703
2,804
I agree with your general point that very small probabilities add up when applied to a large population, but it can't be stressed too much that the math above is just plain wrong unless the probabilities refer to independent events.

Coin tosses, die throws, crypto-grade random number generators all produce independent events: I.e. even if you have just (by some molecular chance) thrown 10 double sixes in a row, the chances of throwing an 11th double-six are still only 1/36.

False matches from perceptual image hashing against any individual's photo collection would likely not be independent - if (say) the pattern on your wallpaper generates a false match then the chances of you having a second picture including that wallpaper are just about 1.

If Apple's NeuralHash (unlike other perceptual hashing systems) isn't prone to that sort of error then it's for Apple to prove. What they can't do is make the possibility go away by saying "oh, but we'll look for 30 matches before acting".
Nope, the math I presented is perfectly correct, irrespective of whether "False matches from perceptual image hashing against any individual's photo collection" are independent. It would only be incorrect if the probabilities between individuals weren't independent, but I haven't seen anyone here arguing that.

Yes, if the events within each individual aren't independent, the starting probability will be higher than the 0.5^30 value I used. But that doesn't make my math wrong. It just means you need to insert a different starting probability into the same math.

Specifically, I don't understand this well enough to know if the within-person probabilities are or aren't independent, which is why I only did the calculation for the portion after that. I made that explicit when I said: "Let's assume ... that the probability any single test is wrong is 0.5^30."

If you think you have a better probability for an individual's chance of a false positive, you can insert that into my math, and you will get a correct result for the population (again, assuming there are no between-person dependencies).
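As a concrete (and purely illustrative) version of that population-level step, here's a short Python sketch using the assumed 0.5^30 per-account figure and an invented account count - swap in your own numbers if you disagree with either:

p_account = 0.5 ** 30            # assumed probability that any one account is falsely flagged
n_accounts = 1_000_000_000       # invented population size, for illustration only

expected_false_flags = n_accounts * p_account
p_at_least_one = 1 - (1 - p_account) ** n_accounts   # valid if accounts are independent of each other

print(expected_false_flags)      # ~0.93 expected false flags across the whole population
print(p_at_least_one)            # ~0.61 chance of at least one false flag somewhere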

Yeah, it's a pet peeve of mine when I present some math on this site, in which I say something like "assuming x = a, then we obtain y = b", and someone who disagrees with the value I used for x says my math is wrong (or, worse, "just plain wrong"!), when in fact my math was perfectly correct. Instead, they should say "I agree with your math, but I would not use a starting value of a for x because...."

Of course, if I actually did screw up the math, that's a different story ;).
 
Last edited:
  • Like
Reactions: Analog Kid

Analog Kid

macrumors G3
Mar 4, 2003
8,927
11,516
But if one false match against a user somehow affects the probability of finding a second false match - e.g. because some common feature in their images is triggering a false match and they have lots of similar images in their collection - then the probability of 30 matches could be much less than p^30.

No, the probability of 30 false matches with high correlation between events would be much more than p^30.


Ok, look, your attempts at pedantry are badly misplaced here-- and not only because you lack the same discipline you're demanding from others in your own responses.

It's misplaced because the answer to the question has nothing to do with whether the probability of 30 consecutive tails is exactly p^30, the question is if the probability of 30 consecutive tails is greater than p^1. You're going to have a hard time convincing me that 30 of 30 in a row is more likely than one of one in a row and if you can't then all of this is nothing more than a distraction from that simple point.

"If it's so good, why not set the threshold at one or two". The answer is because a threshold of 30 is part of what makes it so good.

It was meant to be an example for someone trying to understand the basics of chained probabilities, by creating a scenario that someone who doesn't understand statistics, hashes and neural nets can relate to. In this case, comparing very unfair coins (always land tails) with coins they're familiar with and might use to pick sides in a game or settle an impasse, because people naturally assume a coin is fair.

It doesn't need to be stated, it's an academic detail. Most people don't tell their friend "let's flip a coin and loser buys the next round" and then say "but it's important that we specify it to be a fair coin, preflipped more than 20,000 times to ensure it is fair to within one percent with three sigma confidence, flipped from an unknown starting state without inducing precession and allowed to fall to a hard and flat but mechanically isolated surface and fully settle. Subsection c) of this agreement will cover the action of each party in the event that the coin settles on an edge".

That's not how people think and, most importantly, it doesn't impact the answer to the question in any meaningful way.

I get that the whole argument in your mind hinges on whether I use the word "coin" or "fair coin" but that is really the least interesting facet of this discussion. You're ignoring the audience and the way I phrased the example. I don't understand why "always lands tails" wasn't the clue that this was a hypothetical example-- that's a much less realistic assertion than the fair coin assumption.

I have read what you posted [...]

...which is only valid for independent events ...and most math textbooks will go out of their way to stress that it was a fair coin in this kind of example.

obviously not:
Each flip of a fair coin has a 50% false positive rate

presumably because it would stand in the way of some other message you want to give and you think mine is the back you want to stand on to deliver it. So let's look at that message.

I've also linked to accounts and real-life examples of the Prosecutor's Fallacy, which occurred precisely because people who should have known better applied simple high-school conditional probability in cases where it didn't work - including specifically people who were convicted on the "1 strike = p, 2 strikes = p^2" argument.

That not only doesn't hold in my little example, it doesn't hold in the Apple example either. Nobody in either case is going to jail because of a chain of false positives. There is a human review before the authorities ever get notified and if the false positive gets through that review even the greenest of public defenders are going to show that the images their client is about to hang for are not, in fact, CSAM.

The hashing is just a way of automating the front end process to reduce the load on human reviewers-- which it does quite well and likely at a risk of false negatives due to unmitigated evasions rather than false positives.

I'm not sure Apple were explicitly claiming the power-of-30 thing, just that the threshold reduced the false positive risk to something very low - they may have some more sophisticated basis - but I can't find it in that technical report,
They don't make any such claims about the performance of a single hash, as I said and you ignored:
The math for the Apple scheme gets far more complicated because there's more images, and more things to match, and it's not just a binary coin flip. I don't think I can make any estimates of the underlying probabilities from what's published

plenty of people in this thread are basing their arguments on p^30.
Who? I haven't seen anything that looks like math anywhere except my attempt to explain this one narrow detail and the few responses it's gotten.
 

roger6106

macrumors regular
Jun 19, 2007
123
30
“Anything stored on-device would not be scanned at all.”

Literally from the white paper above
I added an edit for clarity. Checking the database happens on device, but the only images that get checked are those that are getting added to iCloud. If you choose to store files locally only, they will not get checked at all.
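To illustrate the gating being described - names and structure are invented here, this is not Apple's implementation - the check only ever runs for photos queued for iCloud upload:

# Toy model of "on-device check, but only for photos headed to iCloud"
BLOCKLIST = {"hash_of_known_image"}          # placeholder hash values

def toy_hash(photo: bytes) -> str:
    return "hash_of_" + photo.decode(errors="ignore")   # stand-in for a real perceptual hash

def on_device_check(photo: bytes, queued_for_icloud: bool) -> bool:
    if not queued_for_icloud:
        return False                          # local-only photos are never checked
    return toy_hash(photo) in BLOCKLIST

print(on_device_check(b"known_image", queued_for_icloud=True))    # True
print(on_device_check(b"known_image", queued_for_icloud=False))   # False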
 

theluggage

macrumors 604
Jul 29, 2011
7,519
7,430
No, the probability of 30 false matches with high correlation between events would be much more than p^30.
That was exactly the reasoning used in the Sally Clark case. Even there, nobody claimed the chance of two random deaths was exactly 1 in 73 million (the square of the chance of one death) - they just held that figure up as an example of how overwhelmingly unlikely a coincidence it was.

obviously not:
"Each flip of a fair coin has a 50% false positive rate"

This is not about whether your example was internally consistent - it is about whether your example was applicable to the real situation. You can safely extend your example to scenarios involving fair dice, radioactive decay etc. even though the probabilities are vastly different because those all generate independent events. Apply it to correlated events, though, and you have "garbage in, garbage out". That's not academic pedantry, it's the fundamental assumption behind the formula you are using for chaining probabilities.

It's misplaced because the answer to the question has nothing to do with whether the probability of 30 consecutive tails is exactly p^30, the question is if the probability of 30 consecutive tails is greater than p^1.
In your example you specifically calculated p^30 and even compared that to Apple's claimed rate (...and I'm not claiming that Apple calculated that from p^30 - it's posters here who are coming up with that calculation).

The question is whether the probability of 30 consecutive false matches is orders of magnitude smaller than p^1 - which is what you need to avoid the (general) base rate fallacy/prosecutor's fallacy. Yes, the chance of 30 false matches is going to be less than the chance of a single match - but unless you know that each further match is independent of the first one you can't just keep multiplying by p. Your example assumes that by making your 'false positive' event a toss of a fair coin, which is widely accepted as independent from previous tosses.

To put it another way, in your specific example, 'p' is the same whether you're talking about the chances of 30 people in the general population throwing a single tail or a specific person throwing their 30th sequential tail. In the real world case, p is the probability of a random photo from the entire population of photos generating a single false match - but the next 29 matches involve the probability of someone's personal collection of photos - mostly featuring the same people, houses, objects or types of subject - containing a second, third,... false match. You can't assume that those are the same p without knowing more about how false matches arise. If the matches are triggered by some characteristic present in one of your photos, the probability of two matching photos in that collection could be closer to one. So the true 30-false-match probability will be less than p but might be significantly more than the p^30 estimate you're giving.
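To make that concrete, here's a toy Monte Carlo sketch with invented numbers (not a model of NeuralHash or of anyone's real photo library): in the independent case the 30-match rate sits in p^30 territory, but if even a small fraction of libraries contain one recurring feature that fools the hash, the 30-match rate jumps by many orders of magnitude.

import numpy as np

rng = np.random.default_rng(1)

P_MATCH = 1e-3     # invented per-photo false-match probability
P_FEATURE = 5e-4   # invented chance a library contains a recurring feature that fools the hash
PHOTOS = 2000      # photos per library
THRESHOLD = 30
TRIALS = 200_000

# Independent case: matches across a library are Binomial(PHOTOS, P_MATCH)
indep = rng.binomial(PHOTOS, P_MATCH, size=TRIALS)

# Correlated case: a few libraries share a feature that matches ~10% of their photos
has_feature = rng.random(TRIALS) < P_FEATURE
corr = np.where(has_feature,
                rng.binomial(PHOTOS, 0.10, size=TRIALS),
                rng.binomial(PHOTOS, P_MATCH, size=TRIALS))

print("independent:", np.mean(indep >= THRESHOLD))   # effectively 0
print("correlated: ", np.mean(corr >= THRESHOLD))    # ~5e-4, vastly more than p^30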

Nobody in either case is going to jail because of a chain of false positives.
...but at that point you're beyond mathematics and falling back on your trust in human judgement. It's good to know that Apple physically can't look at (what people are assuming to be) thumbnails until there are 30 hash matches - but after that you're back to Apple reporting the user at their discretion. Still depends a lot on what the nature of any false matches is (if they turn out to be 30 landscapes and vases of flowers, d'uh, but what if they're 30 small, hard-to-make out pictures of kids?) and what level of certainty the checkers are instructed to require... and once the incident is reported, the next step is a police raid to get the original images (and those things never go south). By the time your public defender gets involved, the accused's life will already have been turned upside down.
 

theluggage

macrumors 604
Jul 29, 2011
7,519
7,430
Yes, if the events within each individual aren't independent, the starting probability will be higher than the 0.5^30 value I used. But that doesn't make my math wrong. It just means you need to insert a different starting probability into the same math.
Actually, I've re-read the post and you're right. It's the 0.5^30 figure that's the problem.
As long as you're calculating the probability of 1 match from the total population, not subsequent matches from an individual, you're good. Sorry.

The distraction is that people are getting the ^30 figure by taking some figure for a single match in the whole population and then swivelling to talk about 30 matches from an individual's collection. (I suppose it's even possible that Apple got the number 30 from some tricky Bayesian stuff...)
 
  • Like
Reactions: theorist9

Razorpit

macrumors 65816
Feb 2, 2021
1,098
2,311
Unfortunately a lot more people are involved than you would think. You'll find reports from people who have accessed the "dark web" describing thousands of communities with thousands of members, all circulating all kinds of awful material.

Worth a watch if you want to erode your remaining faith in humanity:
The dark web on Letterkenny looked so much more promising... 😞
 

Analog Kid

macrumors G3
Mar 4, 2003
8,927
11,516
This is not about whether your example was internally consistent - it is about whether your example was applicable to the real situation.

No it's not. It's about whether my example answered the question that was asked.

Yes, the chance of 30 false matches is going to be less than the chance of a single match

And we agree that my example answered the question.

Anything more is a change of subject, which you're welcome to do but not by trying to claim I said anything that is incorrect.
 

SactoGuy18

macrumors 601
Sep 11, 2006
4,373
1,535
Sacramento, CA USA
Let's all face it. Apple canceled CSAM scanning because of the fear that some state-actor hacker working for an intelligence agency will hack the system so it could be used to search for images of a political nature.
 
  • Like
Reactions: VulchR