Apple has been relentless in its commitment to protecting user privacy, and I don't see this as any indication to the contrary. There are many reasons why some here might want to take a step back, breathe, and reconsider their reactions:
First, I'm quite confident that there are countless "features" Apple never discloses publicly. This should go without saying, given Apple's notoriety for secrecy. The fact that they chose to announce this one is, in itself, quite telling: Apple does not want to tarnish its pro-privacy image in any way. Keep that in mind while thinking through what these two distinct features actually do.
Second, the peer-reviewed papers describing some of the cryptographic techniques in use are available for your perusal. The implementation described, if we trust that it reflects their actual techniques, in no way diminishes user privacy. The only caveat is the iMessage alert sent to parents of children with parental controls enabled on their devices, but the politics of surveilling your own children is outside the scope of the general argument this thread has belabored.
The second feature, in which your iCloud Photos library is "scanned," does not diminish privacy either. First of all, no actual "scan" of the image takes place: the image is simply run through a neural network that outputs a hash. For those unfamiliar, this trained neural network is essentially a function f(x) = h: it takes an input x and outputs an unintelligible hash as a string of digits. It is that hash, not the image itself, that is compared against hashes of known child abuse content.
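To make the f(x) = h idea concrete, here is a minimal sketch in Python. A cryptographic hash (SHA-256) stands in for the trained neural network purely for illustration; the function and variable names are my own, not Apple's.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for f(x) = h: Apple's real f is a trained neural network,
    # but any deterministic one-way function illustrates the idea.
    # The output reveals nothing about the image's content.
    return hashlib.sha256(image_bytes).hexdigest()

# Different images yield unrelated hashes...
h1 = image_hash(b"vacation photo pixels")
h2 = image_hash(b"birthday photo pixels")
# ...while the same image always yields the same hash.
h3 = image_hash(b"vacation photo pixels")

print(h1 == h2)  # False: different inputs, different hashes
print(h1 == h3)  # True: same input, same hash
```

The point of the comparison step is that only these opaque strings are ever inspected, never the pixels themselves.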
The only real room for complaint is if Apple were to update iOS to include hashes of anything other than child abuse content. This would not be some server-side change that a government or other bad actor could slip in; Apple would have to deliberately change the database of hashes shipped in iOS, and even then, it would only flag images you upload to iCloud that match those hashes. In short, this is not blanket surveillance. If an analogy helps, it is like a customs agent checking your items as you cross into another country, but even less invasive than that: a customs agent can open your suitcase and rummage around, whereas this system looks only at a hash of a single item you are deliberately uploading to iCloud, not at everything on your phone, and not even at the files themselves.
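The shipped-with-the-OS point can be sketched as well. In this toy model (all names and the placeholder entry are hypothetical, not Apple's actual format), the database is a constant baked into the code, so changing it means shipping a new build, and the check runs only at upload time:

```python
import hashlib

# Hypothetical sketch: the set of known-content hashes ships inside the
# OS build itself, so changing it requires a signed OS update from Apple,
# not a quiet server-side tweak.
KNOWN_HASHES = {
    "placeholder-entry",  # real entries would come from child-safety organizations
}

def hash_item(data: bytes) -> str:
    # Same stand-in hash function as before; the real system uses a
    # trained neural network.
    return hashlib.sha256(data).hexdigest()

def check_on_upload(data: bytes) -> bool:
    # Runs only when the user uploads this item to iCloud Photos;
    # files that never leave the device are never checked at all.
    return hash_item(data) in KNOWN_HASHES
```

Nothing outside `check_on_upload` ever touches the file, which is the sense in which this is narrower than a customs inspection.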
Simple: don't upload child abuse material to iCloud and you won't be flagged for human review. Stop worrying that this enables some backdoor for governments to spy on you; it doesn't.