Assume a worst-case scenario: <insert bad actor here> injects some hash of <actual picture of something> that gets sent to Apple. Apple ships iOS 15.x, which includes this hash in the database. Now, if you happen to upload a near-exact match of <actual picture of something>, your account is flagged. Apple then reviews this photo; if it isn't child abuse material, they know that the hash of <actual picture of something> points to a photo that doesn't belong in their database. They can then ship iOS 15.x+1 with that hash removed, so future owners of <actual picture of something> are not flagged for child abuse review.
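For concreteness, here's a minimal sketch of the matching step being described: a perceptual hash of an uploaded photo is compared against a database of known hashes, and a near-exact match flags the account for human review. The hash length, distance threshold, and values below are invented for illustration only; Apple's actual system uses NeuralHash with private set intersection and a multi-match threshold, none of which is modeled here.

```python
# Illustrative sketch only: the names, hash length, and threshold are made up
# to show the matching logic under discussion, not Apple's real implementation.

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two perceptual hashes."""
    return bin(a ^ b).count("1")

def is_near_match(photo_hash: int, database: set[int], max_distance: int = 4) -> bool:
    """Flag the photo if its hash is within max_distance bits of any database entry."""
    return any(hamming_distance(photo_hash, known) <= max_distance for known in database)

# Hypothetical scenario from the comment: a planted hash ends up in the database.
planted_hash = 0x5A5A5A5A5A5A5A5A5A5A5A5A   # stands in for the hash of <actual picture of something>
database = {planted_hash}

# A near-exact copy of that picture hashes to something only a couple of bits away...
near_copy_hash = planted_hash ^ 0b11
print(is_near_match(near_copy_hash, database))   # True -> account flagged for human review
```

The point of the sketch is that the flagging step is purely mechanical; everything after it (review, and removing a bad hash in a later update) depends on whoever controls the database and the review process.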
In other words, if you trusted Apple to protect your privacy up until now, the question is simply whether you keep trusting them going forward. If Apple is a bad actor, it really doesn't matter whether they do this or not: they could implement a backdoor without publicly announcing it whenever they see fit. However, if Apple is still pro-privacy, then you can be assured that this in no way allows an outside bad actor to manipulate the system.
Sorry, but that's a bit naive, I think. China won't allow Apple in California to review the images of state-designated terrorists in Taiwan. They'll very likely have a backdoor into the on-device software Apple uses to push new hashes. What do you think intelligence services around the world are like? They will absolutely drool over this kind of tech. Every photo is going to be analyzed for bad actors and objects.
I'm a software engineer, and I'm very aware of how certain loopholes can "find their way" into software. If it leaks, they'll apologize. But good luck figuring out what the encrypted data going over your 5G connection really is. Gee, a few hundred bytes of data from the Apple server. Nothing to worry about! Unless there's something to worry about.
What about the US government? They are known to do crazy things. The Gulf of Tonkin incident is a famous one, "weapons of mass destruction in Iraq" a more recent one. The world's governments are full of power-hungry and untrustworthy bad actors.
Today, they're looking for missing children. Tomorrow, missing elderly people with dementia. Next week, terrorists. Next month, wanted criminals. Next year, well gosh, China demands that Apple send positive hits directly to them; here's 100MB of hashes to check. And the next day a few thousand Uyghurs are taken away, never to be heard from again.
It's a slippery slope argument because it's a big scary slippery slope.
We don't know who is in charge or who is responsible, and nobody is making sure we can trust those people. And while that's the biggest issue, in many of the countries where Apple sells its products, you can't even expect the government to allow an open and honest panel of reviewers to review the reviewers.
China, Russia, Kazakhstan, Iran, Venezuela, Algeria, Congo, Belarus, Saudi Arabia, the UAE, and so many more.
This is ridiculously scary.