The CSAM photo hashing is not the main issue here. The chance that one of the photos of my kids matches a CSAM entry is small. It could happen, but it's very unlikely. Factor in that your library actually needs to match multiple entries before anything is flagged, and it becomes pretty much unthinkable, practically impossible. Honestly, I trust that the mathematicians at Apple did their homework. We're not talking "Bob's Freelance Ltd." with student helpers here.
Abuse by governments wanting to detect same-sex couples, anti-government material, or track people of interest who might be in the background of your photos? We haven't seen that happen on cloud services hosting private photos. At least it hasn't been reported, to my knowledge. It's been done using Facebook, but, yeah, that's a chance you take when using a service like that.
The thing is, this sets a precedent for Apple adding things to the OS itself, for the greater good of society. Willingly. It'll be way easier for governments to enforce country-specific "greater good features" no matter what Apple says publicly. To me, that's the issue with this. They've lost their usual "we'd never do that" argument and, most importantly, they've crossed that line.
And for what? Saving the cost of the computing power needed to do this remotely, on their own servers? Enabling end-to-end encrypted photo libraries? Come on.
The initial feature is noble, and almost everyone on planet Earth can get on board with it. But it won't stop with on-device CSAM scanning, and they're extremely naive if they think it will.