I think there is real potential for this system to go badly wrong when innocent photos end up viewed in a different context. For example…
Let’s say you are a parent and you take completely innocent photos of your child.
Some parents take photos where the child may be nude (doesn’t everyone have the classic embarrassing “baby butt” shot in their childhood photo album?) but nobody is offended, because everyone knows there is no ill intent behind the image. It’s just a child in a state of undress.
So, you have a photo like this of your kid, or your kid in the bath, or your kid at the pool/beach, etc. You post it to social media, and nobody thinks anything of it, because to anyone with a properly working brain there is nothing to think about.
But THEN, some creeper trolls public social media accounts like Facebook and Instagram for pictures other people post of their children, sees a few that excite them for reasons of their own, saves the good ones to their computer and shares them online on some sicko forum, or trades them with other perverts, etc.
Now, when one of them gets caught, or their website gets raided, etc., all of their files get flagged as CSAM because of the context in which these people were distributing and viewing them, completely unbeknownst to you, the child’s parent, who still has the original photo on their phone or in their iCloud. And the hashes match because it’s the same file (and since these systems typically use perceptual hashes rather than simple checksums, even a resized or re-saved copy can match). Do you see how this goes wrong?
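To make the matching step concrete, here’s a rough Python sketch of what the flagging logic boils down to in this scenario. It’s a simplification under stated assumptions: the `flagged_hashes` set, `scan_library` function, and photo directory are all hypothetical, and I’m using plain SHA-256 exact matching, whereas real deployments (e.g. PhotoDNA, Apple’s NeuralHash) use perceptual hashes that also match re-encoded or resized copies, which makes this scenario harder to dodge, not easier.

```python
import hashlib
from pathlib import Path

# Hypothetical set of digests derived from files seized in an investigation.
# In this sketch the "database" holds exact SHA-256 digests; real systems use
# perceptual hashes so that edited copies of the same image still match.
flagged_hashes = {
    "0" * 64,  # placeholder digest, not a real entry
}

def sha256_of_file(path: Path) -> str:
    """Return the hex SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def scan_library(photo_dir: Path) -> list[Path]:
    """Flag any photo whose digest appears in the flagged set.

    The parent's untouched original produces the exact same digest as the
    copy the creeper downloaded, so it matches even though the parent did
    nothing wrong -- that is the failure mode described above.
    """
    return [
        p for p in photo_dir.glob("*.jpg")
        if sha256_of_file(p) in flagged_hashes
    ]

if __name__ == "__main__":
    matches = scan_library(Path("~/Pictures").expanduser())
    print(f"{len(matches)} photo(s) matched the flagged hash set")
```

The point of the sketch is that the match is purely mechanical: nothing in it knows, or can know, how the image came to be in the flagged set in the first place.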
I do not know nearly enough about the process by which material is determined to be CSAM, but this scenario doesn’t seem implausible to me.