Which is partly why Apple decided its users should pay the cost of doing the checks.
For my own modest photo library, checking 30,000 photos against a database of 200,000 hashes works out to 6 billion checks.
And both numbers will only grow in the future.
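Back of the envelope (assuming a naive one-by-one comparison, which may not be how Apple's matching actually works, so take it purely as an illustration of the scale):

    # Naive pairwise arithmetic with the figures above (assumed numbers).
    library_photos = 30_000
    database_hashes = 200_000
    print(f"{library_photos * database_hashes:,} checks")            # 6,000,000,000

    # The count grows with the product: double both numbers and the
    # naive check count quadruples.
    print(f"{2 * library_photos * 2 * database_hashes:,} checks")    # 24,000,000,000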
Yeah, thanks for totally ignoring the fact that Apple sucks at its gatekeeping job. We haven't even gotten to how bad Apple is at fixing known security flaws.
Soon, billions, not millions, of comparisons will be made on users' phones. What can go wrong?
You think the error rate of humans reviewing App Store apps is comparable to the error rate of these hashing functions? It's orders of magnitude different. Billions of comparisons are probably nothing to sweat about for the process comparing those hashes. The number I was pointing you at is what comes AFTER the first filter: those relatively few cases (compared to the iPhone installed base) that escalate to human review because of multiple CSAM matches. If you think you can get multiple (not one) unlucky matches, you're either putting unreasonably little faith in those hashes or you think you can win the lottery 5 times in a year.
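To put rough numbers on that lottery analogy (the per-photo false-match rate and the library size below are assumptions picked for illustration, not Apple's published figures), a quick Poisson estimate shows how fast the odds of several independent false matches collapse:

    # Illustrative sketch only: the per-photo false-match rate and the
    # library size are assumed values, not Apple's published figures.
    from math import exp, factorial

    photos = 30_000               # photos in one library (assumed)
    p_false = 1e-6                # assumed per-photo false-match probability
    expected = photos * p_false   # expected false matches for that library

    def prob_at_least(k, lam, terms=200):
        # Poisson tail: probability of k or more independent false matches.
        return sum(exp(-lam) * lam**i / factorial(i) for i in range(k, k + terms))

    for k in (1, 2, 5, 10):
        print(f"P(>= {k:>2} false matches) ~ {prob_at_least(k, expected):.2g}")

With those assumed numbers, even one stray match is unlikely, and requiring several before anything reaches a human reviewer makes the combined probability vanish.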