Apple should fully disclose how they are “hashing” these images, what constitutes a “match”, and the score or threshold at which they hand your information over to some “nonprofit organization” who will then build a criminal complaint against you.
If Apple isn’t lying here (i.e., they never see the actual images on your phone), then they are creating a “fingerprint” of sorts and comparing the print of your image to a print of known child pornography. Is the system really going to be so unsophisticated that it only detects 1:1 matches for previously identified material? If so, then they need to be crystal clear about that. Or… is this system going to be ML-based and detect certain body positions, parts, postures, exposed skin, facial expressions, etc?
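To make the “fingerprint” idea concrete: this is a toy difference-hash (dHash) with a Hamming-distance threshold, purely for illustration. Apple has not disclosed its actual algorithm, and NeuralHash is certainly more sophisticated than this sketch; the point is only that any such scheme reduces an image to bits and then compares bit distance against some threshold someone chose.

```python
def dhash(pixels):
    """Toy perceptual 'fingerprint': for each row of an (already
    downsampled) 8x9 grayscale grid, emit 1 if a pixel is brighter
    than its right neighbor, else 0. Similar images yield similar bits."""
    bits = []
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits.append(1 if left > right else 0)
    return bits

def hamming(a, b):
    """Number of fingerprint bits that differ between two hashes."""
    return sum(x != y for x, y in zip(a, b))

# Hypothetical images as 8x9 grids of 0-31 grayscale values.
img_a = [[(r * 9 + c) % 32 for c in range(9)] for r in range(8)]
img_b = [row[:] for row in img_a]
img_b[0][0] += 5                      # img_a with one pixel brightened
img_c = [list(reversed(row)) for row in img_a]  # a very different image

THRESHOLD = 10  # arbitrary: below this distance, call it a "match"
near = hamming(dhash(img_a), dhash(img_b))  # small: near-duplicate
far = hamming(dhash(img_a), dhash(img_c))   # large: unrelated image
print(near < THRESHOLD, far < THRESHOLD)
```

Note that everything hinges on that threshold: set it loose enough to catch recompressed or cropped copies and you inevitably admit false positives on innocent images, which is exactly why the matching criteria deserve full disclosure.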
I am highly skeptical this is going to be a binary flag for each image (good/bad). Your entire library is going to be scored and, based on that score, it will be handed over to law enforcement who will use that score as gospel to secure a search warrant.
—
Let me make this point (as apolitically as possible): at least in the USA, cultural norms and morals have been changing very rapidly over the last 10-20 years, and technology has played a large role in that. What was once cherished is now looked down upon; what was once reprehensible is now embraced as sacred and unquestionable. Right and wrong are evolving so quickly that a socially accepted, non-serious tweet from 10-15 years ago will cost you future opportunities. There is no room for social or historical context, only bitter argument: you wrote that with yesterday’s standards and I’m judging it with today’s standards. So what happens when our technological overlords change their standards? How can I know that my thoughts, attitudes, and behaviors today will be acceptable under some future, unknown standard?
I’ll leave you with this, and I tell it to my children every time they use the computer:
The internet never forgets, and it seldom forgives.