Ugh, people, please read.
CSAM scanning is something completely different from this.
This is literally just an opt-in feature that, if you choose to enable it for children under the age of 13, prevents them from sending or receiving sexually explicit photos.
And it's all done with machine learning, not hash matching.
So…
A: A one hundred percent opt-in feature, only for people under the age of 13.
B: Not hash matching.
C: 99.99999% of people will never use it.
I am 100% against Apple's implementation of CSAM scanning that they talked about a couple of months ago. This is something different, and it's something I'm not against.