30 images seems high as a threshold (I know it could be set that way to account for false positives). And now that all this info is public, those losers will know how to get around it.
People who are into pornography of any type are usually addicted to it and collect hundreds or thousands of images, not just a dozen or so, so 30 seems reasonable to me. Cloud services have already been scanning for CSAM on their servers, so no one has "gotten around" this before, and they won't now. They still won't be able to store CSAM collections on iCloud without being detected. They could have 100,000 CSAM images on their phone, and as long as they don't use iCloud for photos, Apple will never know about it, precisely BECAUSE this is not "mass surveillance," as some people are twisting it to be.
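To make the threshold point concrete, here's a minimal sketch of threshold-based matching. All the names here are hypothetical, and it uses a plain cryptographic hash where Apple's actual system uses a perceptual hash (NeuralHash) plus a private set intersection protocol so neither side learns about sub-threshold matches. The only point it illustrates is the one above: nothing gets flagged until the match count among *uploaded* photos crosses the threshold, and photos that never leave the device are never checked at all.

```python
import hashlib

# Hypothetical stand-ins, not Apple's real implementation.
KNOWN_CSAM_HASHES: set[str] = set()  # database of known-image hashes
MATCH_THRESHOLD = 30                 # flag only at/after this many matches

def image_hash(image_bytes: bytes) -> str:
    # Placeholder: a real perceptual hash tolerates re-encoding and
    # resizing; a cryptographic hash like SHA-256 does not.
    return hashlib.sha256(image_bytes).hexdigest()

def scan_icloud_uploads(uploaded_images: list[bytes]) -> bool:
    """Return True only if matches among uploaded photos cross the threshold.

    Photos that stay on-device are never passed to this function,
    so they are never compared against the database at all.
    """
    matches = sum(1 for img in uploaded_images
                  if image_hash(img) in KNOWN_CSAM_HASHES)
    return matches >= MATCH_THRESHOLD
```

So a phone full of photos that never sync to iCloud produces zero calls into this check, which is exactly why "mass surveillance" is the wrong description.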