That's a fair point. But that's placing a LOT of trust in them, and basically making Apple the police in that situation. They hold all the power in that moment of deciding whether to shut you down and report you.
What advantage would there be for Apple to waste their time reporting innocent images to NCMEC? It won't go anywhere. I simply cannot understand where you're coming from here. Sorry.
#1 - A middle school girl, we'll say 13 years old, sends a photo of herself topless to her 14-year-old boyfriend. The boyfriend is an idiot and sends it to a few friends. Because of iCloud's photo backup, the photo gets saved, analyzed, and reported to this manual review group. Then suddenly they're sitting in front of a topless photo of a 13-year-old girl. That's child porn. What do they do? And the boys who simply received a text message... do they get reported / shut down?
No, that photo would not be flagged, because it's not a "known" child abuse image. The scanning only matches against hashes of images NCMEC has already catalogued, so a brand-new photo has nothing to match. Same thing goes for your 2nd situation with the video.
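To illustrate what "known" means here, a rough sketch in Python. This is purely illustrative: Apple's actual system uses a perceptual hash (NeuralHash) and encrypted on-device matching, not a plain SHA-256 set lookup, and the database entry below is made up.

    import hashlib

    # Hypothetical stand-in for NCMEC's database of hashes of *already known*
    # abuse images. Only images previously catalogued appear here.
    KNOWN_CSAM_HASHES = {
        "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
    }

    def image_hash(image_bytes: bytes) -> str:
        """Stand-in hash function; the real system hashes visual features
        so that near-duplicates of a catalogued image still match."""
        return hashlib.sha256(image_bytes).hexdigest()

    def is_flagged(image_bytes: bytes) -> bool:
        """An image is flagged only if its hash matches a known entry.
        A brand-new photo, whatever it depicts, has nothing to match."""
        return image_hash(image_bytes) in KNOWN_CSAM_HASHES

    # A freshly taken photo produces a hash that isn't in the database,
    # so it is never flagged, regardless of its content.
    print(is_flagged(b"a brand-new photo nobody has catalogued"))  # False

The point of the sketch is just that the check is a lookup against previously identified material, not an analysis of what's in the picture.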
#3 - An adult is looking at pornographic pictures and saves one that turns out to be of an underage girl; she's 17, but he thought she looked 21. He's now in possession of what could be considered child pornography, and for all we know it could be an image that matches something in the NCMEC database. Does his account deserve to get shut down and reported to NCMEC?
If that image is a known child abuse image, then of course it will be flagged. If people are downloading porn indiscriminately, that's the risk they run. But keep in mind they've made it pretty clear that just one flagged image is not going to trigger an investigation. They said a certain number of images have to be both flagged AND uploaded to iCloud before they review an account. That makes sense, because most people who purposely download child porn aren't going to download just a few images. They normally have hundreds or thousands.
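Roughly what that threshold logic amounts to, sketched in Python. The cutoff value here is made up for illustration; the only claim from Apple I'm relying on is that a single match doesn't get reviewed.

    # Individual matches are not reviewed; only when an account crosses some
    # number of matched *and* uploaded images does it go to human review.
    REVIEW_THRESHOLD = 30  # hypothetical value, for illustration only

    def should_escalate_for_review(matched_upload_count: int) -> bool:
        """One stray match stays below the threshold; a collection of
        hundreds or thousands of known images blows well past it."""
        return matched_upload_count >= REVIEW_THRESHOLD

    print(should_escalate_for_review(1))    # False: a single flagged image
    print(should_escalate_for_review(500))  # True: a large collection

So the guy in scenario #3 with one accidental download is in a very different position from someone hoarding a collection.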