First, the process apparently only matches known examples of content already deemed illegal, so it won't stop new material from being generated on somebody's iPhone. As I said in my first post, I appreciate Apple's sentiment. However, this is a search without probable cause, and therefore I think this idea lacks balance, to put it mildly. We could stop the vast majority of sexual assaults by locking up all men; we could protect the rights of all men by making sure they are never locked up. The point is to strike a balance between these two extremes. I believe Apple has got the balance way wrong: they should have either probable cause or my permission before scanning private photographs; they are not properly estimating the likelihood of false positives; I am uncomfortable with the human review they propose after a picture matches a hash; and their system could be altered ever so slightly for purposes far less noble than detecting child porn. If an autocratic government gets hold of this, you can kiss goodbye to any political opposition and say hello to all sorts of systematised oppression.
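To make the false-positive worry concrete, here is a toy sketch of threshold-based perceptual-hash matching. This is emphatically not Apple's actual NeuralHash pipeline (which uses a neural network plus cryptographic private set intersection); every name, hash value, and the threshold below are invented purely for illustration of why "close enough" matching can flag the wrong picture:

```python
# Toy sketch: threshold-based perceptual-hash matching (illustrative only;
# NOT Apple's NeuralHash -- all values and names here are made up).

def hamming(a: int, b: int) -> int:
    """Count the bits that differ between two 64-bit hashes."""
    return bin(a ^ b).count("1")

def matches(photo_hash: int, database: list[int], threshold: int = 10) -> bool:
    """Flag a photo if its hash is 'close enough' to any known-bad hash."""
    return any(hamming(photo_hash, h) <= threshold for h in database)

# Hypothetical database of hashes of known illegal images.
known_bad = [0x0F0F0F0F0F0F0F0F]

# An unrelated photo whose hash happens to land within the threshold
# gets flagged anyway -- a false positive.
innocent_but_close = 0x0F0F0F0F0F0F1F0F  # differs in only 1 bit
print(matches(innocent_but_close, known_bad))  # True
```

The point of the sketch: because perceptual hashes deliberately tolerate small differences (so that cropping or recompression still matches), any hash that drifts inside the threshold is flagged, whether or not the underlying picture has anything to do with the database.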
It's simple: the pictures I take are mine (indeed, in the UK the taker of a picture owns the copyright by default). They are none of Apple's business unless they first suspect a crime, or that I have otherwise violated their T&Cs. Apple should have probable cause before doing anything so invasive as scanning my photos. Seriously, what's next? Scanning text? Tracking web traffic? Monitoring audio calls? Law enforcement is already doing that, and we don't need Apple piling on.
I am sorry to hear about your friend's sister. My own family has been affected by sexual abuse, but nothing Apple proposes would have changed that; better law enforcement would have. If Apple wants to help, let it donate funds to law enforcement.