While I appreciate that protecting our children is important, I have a few thoughts on this whole situation.
First, "Innocent until proven guilty". How many of you have taken a picture of your kids playing in the bathtub? Or in their new swimsuits? So, given the exact same picture... one is taken by the parents of their kid in a swimsuit at the pool, the other is taken by a pedo of a kid in a swimsuit by the pool. No AI is going to be able to differentiate between those.
Second, "persecute the many to guarantee prosecution of the few". This is not how our world should be. I should have an expectation that if I take a picture or two of my consenting wife for our own enjoyment, that it's not triggering some "Skin-o-meter" at Apple. I should have a reasonable expectation that the phone that I paid $1200 for isn't "spying" on me.
Third, isn't the CSAM scanning looking for fingerprints (hashes) of already-known child porn/exploitation images? Who would store those in iCloud? Who would have those in their camera roll?
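As I understand it, the matching is against fingerprints of images already catalogued by NCMEC, not some judgment call about what a brand-new photo looks like. Here's a rough sketch of the idea; I'm using a plain SHA-256 as a crude stand-in for Apple's perceptual NeuralHash (which is built to survive resizing and re-encoding), and the fingerprint list here is obviously made up:

```python
import hashlib
from pathlib import Path

# Toy stand-in: exact SHA-256 digests instead of Apple's perceptual NeuralHash.
# The hash below is invented for illustration; the real list comes from NCMEC
# and is never visible to the device owner.
KNOWN_CSAM_FINGERPRINTS = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(photo: Path) -> str:
    """Hash the raw file bytes (a crude proxy for a perceptual hash)."""
    return hashlib.sha256(photo.read_bytes()).hexdigest()

def matches_known_database(photo: Path) -> bool:
    """True only if this photo is already in the known-image list.
    A brand-new photo of your own kids can never match, whatever it shows."""
    return fingerprint(photo) in KNOWN_CSAM_FINGERPRINTS

if __name__ == "__main__":
    sample = Path("IMG_0001.jpg")  # hypothetical filename, for illustration only
    if sample.exists():
        print(matches_known_database(sample))
```

Which is exactly my point: the only people this could ever flag are ones keeping copies of images already in that database in their iCloud library.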
I think that overall, this is not only a bad idea, but it is going to put additional load on our end-user devices without catching a single person who is either only taking their own pictures or simply not storing known child porn in iCloud.
It's not up to Apple to be the "thought police". While it probably started as a noble idea from good people, they cross a line when they start persecuting innocent people in hopes of catching someone doing something wrong.
If law enforcement has reason to look at my photos / texts / etc. and Apple responds by providing those files, I have no issue with that. For Apple to proactively monitor my actions and notify the police because its AI thinks it found something, well, Big Brother is watching.
Apple should be reactive, not proactive in this situation.