I keep hearing people say what I bolded in your post above, and I'm trying to figure out why you guys are so confused about this. Look at the exact part of the article you quoted: "Apple then manually reviews each report to confirm there is a match." So unless you have confirmed child porn images on your device that you then upload to iCloud, your life will not be ruined, because nothing will come of false positives. Why would Apple report an innocent image after review? It makes no sense. I don't understand your concern here at all.
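To make the "match" part concrete: the system compares hashes of uploaded photos against a database of hashes of already-known images, so only images already in that database can even produce a report. Here's a minimal sketch of that idea; the function names and data are made up, and I'm using SHA-256 as a stand-in, whereas Apple's actual system uses a perceptual hash (NeuralHash) plus cryptographic thresholding, which this doesn't attempt to model:

```python
import hashlib

# Hypothetical stand-in for a database of hashes of known images.
# Real systems hash the known images, not descriptions of them;
# the byte strings here are placeholders for illustration only.
KNOWN_IMAGE_HASHES = {
    hashlib.sha256(b"bytes-of-a-known-database-image").hexdigest(),
}

def scan_upload(image_bytes: bytes) -> str:
    """Hash an uploaded image and check it against the known-hash set."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    if digest in KNOWN_IMAGE_HASHES:
        # Only a hit against the known database gets queued for review.
        return "flag for manual review"
    # A brand-new photo has no entry in the database, so nothing happens.
    return "no match"

print(scan_upload(b"bytes-of-a-known-database-image"))  # → flag for manual review
print(scan_upload(b"bytes-of-a-newly-taken-photo"))     # → no match
```

The point of the sketch: matching is membership in a fixed set of known hashes, not a judgment about the photo's content, which is why the manual review step exists at all.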
That's a fair point. But that's placing a LOT of trust in them, and it basically makes Apple the police in that situation. They hold all the power in that moment, deciding whether or not to shut you down and report you. I'm sure there are checks and balances in place to help prevent the innocent from being falsely flagged. But there's a gray area close to the line, and you're naive if you think they'll get it right 100% of the time. And if they're wrong, it doesn't matter at that point: your account is shut down and you've been reported. Let me give you a couple of scenarios... and I have middle-school-aged kids, so if you think this can't happen, you're being unrealistic.
#1 - A middle school girl, we'll say 13 years old, sends a photo of herself topless to her 14-year-old boyfriend. The boyfriend is an idiot and sends it to a few friends. Because of iCloud's photo backup, the photo gets saved, analyzed, and forwarded to this manual review group. Suddenly they're sitting in front of a topless photo of a 13-year-old girl. That's child porn. What do they do? And the boys who simply received a text message... do they get reported and shut down?
#2 - A 16-year-old boy takes a video of himself and his 15-year-old girlfriend... Because kids are idiots and don't lock their phones, a friend finds it and sends it to 10 other friends, because he's also an idiot. Suddenly you have 12 teenagers with what is technically deemed child pornography on their phones. Should they all be shut down and reported to NCMEC? (What actually happened in this situation was that the police got involved, scared the living hell out of all the kids, and taught them a very solid lesson on the dangers of photos/videos, revenge porn, etc. I believe the kid who actually spread the message ended up in some legitimate trouble for it, too.)
#3 - An adult is looking at pornographic pictures and saves one that turns out to be of an underage girl: she's 17, but he thought she looked 21. He's now in possession of what could be considered child pornography, and for all we know it could be an image that matches something in the NCMEC database. Does his account deserve to be shut down and reported to NCMEC?
Personally, I'm against porn in general, of all types and ages, because I think it's unhealthy and one of the primary funding sources for sex trafficking. But even I, who would be considered by most to be a prude in this area, still see overreach here and think this is a slippery slope and an extremely dangerous precedent.