So I have no specific facts for you, but I would be suspicious of the numbers quoted for CSAM images on the dark web. What exactly defines CSAM? In a broad sense it could be any image of a naked person under 18, viewed by someone 18 or older. I'm dubious about treating viewing as a crime, but hey, that's the law. So who are the primary producers of these photos? Most are probably kids themselves, innocents, or doting parents, again innocents. Now obviously there are a few really despicable people in the world who create harder-core content, including parents. Those people should be locked up. But what Apple is doing won't do anything about the worst producers of CSAM. It's just a lot of technical jargon to create a backdoor, with CSAM as the excuse. No children will be saved, but the NSA will finally get the backdoor into iOS it has been craving for years.

Again, correct me if I'm wrong (but only with facts, if possible): aren't the images they are scanning for known CSAM images? It's not scanning for just any photos, it's matching photos against a database of known photos. A bit of a distinction there, I think.
FROM APPLE:
The system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.
Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image.
--------
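To make that distinction concrete, here is a minimal sketch of the "match against a database of known hashes" idea. This is not Apple's implementation: the real system uses a perceptual hash (NeuralHash) plus private set intersection so the match result isn't revealed on-device, whereas this toy uses a plain SHA-256 digest and an ordinary local set. The knownHashes entries and the file path are made up for illustration.

import Foundation
import CryptoKit

// Toy stand-in for the blinded hash database shipped to the device.
// Placeholder hex strings only, not real hashes.
let knownHashes: Set<String> = [
    "0f4e1a...example...",
    "9ab1c3...example...",
]

/// Hash an image's raw bytes and report whether it matches the known set.
/// Note: a cryptographic hash only matches byte-identical files; a perceptual
/// hash like NeuralHash is designed to survive resizing and re-encoding.
func matchesKnownImage(_ imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}

// Usage: check a photo before it would be uploaded.
if let data = FileManager.default.contents(atPath: "/tmp/photo.jpg") {
    print(matchesKnownImage(data) ? "matches a known hash" : "no match")
}

The point the sketch preserves is that nothing "looks at" the photo's content in a general sense; it computes a fingerprint and tests membership against a fixed list of known material.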
Again, I'm not arguing for or against this system. In fact, I'm more against it than for it. And this was always the outcome: they were always going to postpone it after the initial reaction and concerns from privacy groups. But it will be back in some form, because... "Online child exploitation is prevalent on the dark web – with over 25 million images of child abuse being investigated by NCMEC annually."
This is why Apple and the other tech giants won't give up on this.