With respect - people focusing on the specific case of scanning for CSAM are missing the real story here.
What's really happening here is that Apple is putting a tool onto users' phones whose purpose is to compare users' private data against black-box third-party databases.
What it gets used for is simply a matter of Apple policy, and of whatever local jurisdictions might force Apple to use it for.
Installing a tool like this on user devices is a huge mistake.
I think there is room for some gray area here.
How much of a black box is the data source? Apple says that the CSAM hash list will be coded right onto the phone. But that database surely grows over time: what has Apple built to enable updates to that list? Does it query NCMEC specifically, hard-coded? Does it query Apple? Or is there no mechanism for live updates, and an updated list has to ship with an iOS update?
Querying Apple, as a third party, would be the least good. Querying NCMEC, as the first party (and a government-overseen one, if you believe that to be a good thing), is better. Hard-coding the list into iOS with no live-update mechanism would be best, I think: that means the list is a static object that watchdog groups can analyze and confirm is what it says it is.
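To make that distinction concrete, here's a rough sketch in Swift. All names are hypothetical, and the real system uses a perceptual hash (NeuralHash) plus a private set intersection protocol rather than a plain SHA-256 set lookup; the point is only to show why a list baked into the OS is auditable in a way a server-fetched one isn't.

    import Foundation
    import CryptoKit

    // Hypothetical sketch only: a plain hash-set lookup standing in for the
    // real matching protocol, to illustrate the auditability difference.

    // Option A: a list baked into the OS image at build time. It can only
    // change with an iOS update, so watchdog groups can extract it from the
    // firmware and verify it matches what NCMEC has published.
    let bundledHashList: Set<String> = [
        // ...entries fixed at build time...
    ]

    // Option B: a list fetched live from a server (Apple's or NCMEC's).
    // This is the hard-to-audit case: the device can silently receive a
    // different list tomorrow with no visible change to the OS.
    func fetchRemoteHashList(from url: URL) async throws -> Set<String> {
        let (data, _) = try await URLSession.shared.data(from: url)
        return Set(try JSONDecoder().decode([String].self, from: data))
    }

    // Toy membership check against whichever list is in effect.
    func matches(_ imageData: Data, against hashList: Set<String>) -> Bool {
        let digest = SHA256.hash(data: imageData)
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        return hashList.contains(hex)
    }

With Option A, anyone holding the firmware image can diff the list between releases; with Option B, you're back to trusting policy about what gets pushed to the device.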
I believe there is a balance Apple is trying to achieve here: put in some kind of mechanism to look for child pornography, but do it in as customer-private a way as possible. Last year Facebook reported some 20 million cases of CSAM. Google reported several hundred thousand, I think. Apple reported 265. There is a reasonable way to look at this and say, "the cost of not reporting CSAM is worth it against protecting consumer privacy." I believe it is also reasonable to look at this and say, "Apple not reporting CSAM is unconscionable: Apple is protecting those engaging in child exploitation." I think the way they've described this implementation is as privacy-preserving as possible.
This is a tool that has awful potential uses. It turns this realm of consumer privacy from a technological hurdle ("You can't do this because the phone's design makes it impossible") to a policy hurdle ("You can't do this because we say you can't"). That's a step back. But if there is a company I trust to at least value and prioritize consumer privacy, it's Apple. They may not succeed at it, but they are incentivized to defend that line.
In exchange we get the good of Apple--one of the richest, most powerful companies in the world, with a user base of many millions--joining the fight against child exploitation.