People no longer value their privacy as they once did. Then again, people didn't use to worship corporations the way some now worship Apple.
Personally, I see this as a fairly uncontroversial feature, because it is straightforward, has a clear intended audience, and offers a clear-cut benefit. It is pitched at parents who would appreciate more support in managing their children's devices, and children under the age of 12 rarely have the money to buy their own smartphones to begin with; they are typically using hand-me-downs from their parents. For a family already on iPhones, it's a pretty compelling reason to stay within the Apple ecosystem.
When I prepare a presentation, I typically start by putting myself in the shoes of the audience and asking myself, "Why am I here, and what's in it for me?" Alerting parents when children under the age of 12 open potentially explicit photos is something that, in my opinion, should become common practice across the industry. It's a "why did it take companies this long to figure this out?" kind of thing.
That said, I do think that Apple's PR has done a horrendous job this year, especially by bundling this iMessage feature together with their far more controversial CSAM-detection feature, and I understand why nobody really has any reason to support the latter. Even if I have no child porn on my Apple devices, and even if I have nothing to lose from having my photos scanned, it still brings me back to my initial question: what's in it for me? At best, there is no downside, which is no way to sell a feature.
My take on the CSAM-detection measures proposed by Apple is that they would actually be far less invasive than what companies like Facebook are already doing, which is scanning every single photo you upload to their service, not just for pornography but data-mining it for everything it is worth. Nor is Apple combing through your entire photo library. Instead, Apple compares the photos you upload to iCloud against a database of hashes of known child pornography. Apple is not looking through your vacation photos for images of your 5-year-old daughter in a swimsuit, and it still won't know what your photos contain, either before or after they are uploaded to iCloud. In short, Apple appears to have come up with a way to find CSAM images uploaded to one's iCloud Photos without dissecting and probing photo libraries in a way that voids privacy, assuming it works as advertised.
Human intervention is positioned as a type of fail-safe designed to catch the very few false positives (false allegations of someone possessing CSAM), rather than being the first line of defence.
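To make that distinction concrete, here is a minimal sketch, in Python, of what matching against a hash database with a review threshold looks like. To be clear, this is my own illustration, not Apple's implementation: the real system reportedly uses NeuralHash (a perceptual hash that survives resizing and re-encoding) together with on-device private set intersection, not the plain SHA-256 lookup below, and every name and database entry here is made up.

```python
import hashlib
from pathlib import Path

# Hypothetical stand-in for the database of known-CSAM hashes that, in
# Apple's system, would be supplied by NCMEC. These entries are fake.
KNOWN_CSAM_HASHES = {
    "d2a84f4b8b650937ec8f73cd8be2c74add5a911ba64df27458ed8229da804a26",
}

# Apple reportedly set the threshold at roughly 30 matches before any
# human reviewer sees anything; a single false positive flags nothing.
MATCH_THRESHOLD = 30


def fingerprint(photo: Path) -> str:
    """Fingerprint one photo. A plain cryptographic hash is used here
    for simplicity; a real system needs a perceptual hash so that
    re-encoded or resized copies of a known image still match."""
    return hashlib.sha256(photo.read_bytes()).hexdigest()


def needs_human_review(upload_queue: list[Path]) -> bool:
    """Count how many uploads match the known database. The contents of
    non-matching photos are never inspected or learned."""
    matches = sum(1 for photo in upload_queue
                  if fingerprint(photo) in KNOWN_CSAM_HASHES)
    return matches >= MATCH_THRESHOLD
```

The point of the sketch is that matching against a fixed database of known images is a fundamentally different operation from classifying the content of every photo, which is what the Facebook-style scanning described above amounts to.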
As for concerns about governments forcing Apple to use this feature to spy on their citizens, my response is that, given the numerous hoops they would have to jump through, there are likely already easier and more practical ways of doing so than retrofitting Apple's CSAM detection system or getting Apple to build a different version of it. So while not impossible, I am willing to go out on a limb and say that a lot of these "what if" scenarios are simply slippery-slope arguments. The story of a vengeful ex stealing your iPhone and planting child porn on it just so you get flagged sounds more like a plot out of a Hollywood movie.
I don't see it as an invasion of my privacy, and I don't think there's much opportunity for abuse, but it still doesn't answer the initial question: what's in it for me as the end user? What's the benefit to me of being cleared of possessing child porn on my iPhone when I was never under any suspicion of having any to begin with? Any pedophile aware of the feature will simply not use iCloud to store such material, and it feels more like a way for Apple to get law enforcement off their backs than a way to protect users (and to protect us from what, exactly?).
I guess my point at the end of it all is that this is a discussion that deserves to be a lot more nuanced than simply being for or against privacy, especially when Apple's proposed implementation looks to be a lot less invasive than what other companies are already doing.