You don't seem to get it. If a threshold is reached, a person at Apple will review the images. Moreover, how is any perceptual algorithm going to classify an image as child porn without assessing the amount of skin exposed as a feature? Thus, a human reviewer might end up looking at sensitive photographs of you, your partner, or some innocent photo of your kid swimming, flagged as a false positive. And how, exactly, is Apple going to keep pedophiles and pervs out of that job of reviewing your photographs?

The way I heard it described on a podcast was like this:
Apple is not doing a visual scan of your photos. They're not looking at the actual contents of your photos.
They are, instead, comparing hashes of your photos against hashes of *known* CSAM images. These are photos that have already been identified and labeled as child porn.
So there's no danger of Apple flagging a photo of your child in the bathtub or whatever.
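To make the "hash comparison, not content scanning" point concrete, here is a rough Python sketch of the general idea. Everything in it is a placeholder I made up for illustration: the hash function, the threshold value, and the database contents are not Apple's actual implementation, which reportedly uses a perceptual hash (NeuralHash) rather than a cryptographic one.

```python
# Minimal sketch of hash-based matching as described above: nothing inspects
# the semantic content of a photo; it only checks whether the photo's hash
# appears in a database of hashes of already-known, already-labeled images.
import hashlib
from pathlib import Path

# Stand-in for the database of hashes of known CSAM images (placeholder values).
KNOWN_HASHES = {
    "3f2a9c...",  # hypothetical entry
}

# Hypothetical threshold: human review would only happen after this many matches.
MATCH_THRESHOLD = 30


def photo_hash(path: Path) -> str:
    """Placeholder hash. A real system would use a *perceptual* hash so that
    resized or re-encoded copies of the same image still match; a cryptographic
    hash like SHA-256 only matches byte-identical files."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def count_matches(photo_paths: list[Path]) -> int:
    """Count how many photos match the known-hash database."""
    return sum(1 for p in photo_paths if photo_hash(p) in KNOWN_HASHES)


def flag_for_review(photo_paths: list[Path]) -> bool:
    """Only if the match count crosses the threshold would anything be
    surfaced for human review, per the scheme described above."""
    return count_matches(photo_paths) >= MATCH_THRESHOLD
```

The point of the sketch is that a bathtub photo that isn't already in the known-image database simply never matches, no matter what it depicts.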
With all that said... no one knows what else Apple could do in the future. Perhaps they could start scanning the actual contents of your photos. So I can see why people are freaked out.
But as others have said... all of the big companies are doing similar things. So I dunno.
Apple in 2022: "We have reviewed a photo on your iCloud for matching against the CSAM system. We inform you that the photo doesn't match any known child abuse images. Nice bikini."