Getting fed up with the fallacies being trotted out here.
First, this move is not legally required. Yes, Apple is not allowed to keep illegal images on its servers once alerted to their presence, but it is not required to actively scan for them.
Second, this will not necessarily prevent new instances of child abuse. Scanning detects evidence of past acts of abuse. Indeed, the link between viewing child pornography and committing sexual offences against children is weaker than some seem to imagine (see https://doi.org/10.1186/1471-244X-9-43). I think we all want people to stop viewing and producing these images, but Apple's actions target the former, not the latter.
Third, will people posting here please stop claiming that objections to this scheme are due to ignorance? This forum is visited by many technologically well-educated people, and the EFF are no novices to the field.
Fourth, the fact that Apple could have abused its position in the past by conducting surveillance, but didn't, is irrelevant. Apple is not only telling us it is going to conduct surveillance on our phones soon, but seems proud of doing so.
Fifth, yes, this might reduce the online presence of pedophiles, and that might protect some children at some point and secure justice for others. However, there are other children at risk in authoritarian countries. Apple's scheme can be used to detect anything in any type of file. That might mean the faces of minorities (e.g., Uyghurs), gay people (yes, there has been an academic publication claiming to detect sexual orientation from the shape of faces), flags (e.g., BLM and rainbow flags, or even Confederate flags), memes, words, and even sounds (want to catch all the people recording a demonstration? Simply play a series of loud sounds near the crowd and scan for them). The 'hash' in Apple's scheme is simply a perceptual summary of content, and such a summary can be computed for anything, as the sketch below illustrates.
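To make that last point concrete, here is a minimal sketch of a classic perceptual hash (dHash) in Python, using Pillow. To be clear, this is not Apple's NeuralHash, which is built on a neural network; it is a deliberately simple stand-in that shows the underlying property: the fingerprint is content-agnostic, and matching against a blocklist reduces to a distance check. The file name and match threshold below are purely illustrative.

    from PIL import Image

    def dhash(image_path: str, hash_size: int = 8) -> int:
        """Toy 64-bit difference hash: visually similar images
        produce identical or nearly identical bit patterns."""
        # Shrink to (hash_size+1) x hash_size greyscale pixels, throwing
        # away the detail that re-encoding or resizing would change.
        img = (Image.open(image_path)
               .convert("L")
               .resize((hash_size + 1, hash_size), Image.LANCZOS))
        pixels = list(img.getdata())  # row-major, width = hash_size + 1

        # Each bit records whether a pixel is brighter than its
        # right-hand neighbour; the bit string is the fingerprint.
        bits = 0
        for row in range(hash_size):
            for col in range(hash_size):
                left = pixels[row * (hash_size + 1) + col]
                right = pixels[row * (hash_size + 1) + col + 1]
                bits = (bits << 1) | int(left > right)
        return bits

    def hamming(a: int, b: int) -> int:
        """Number of differing bits; a small distance means a 'match'."""
        return bin(a ^ b).count("1")

    # Matching against a supplied blocklist is then trivial:
    # if hamming(dhash("any_photo.jpg"), blocklisted_hash) <= 5: report it.
    # Nothing here knows or cares whether the target is CSAM, a rainbow
    # flag, a meme, or a face; the machinery is entirely content-agnostic.

Swap the toy dhash for a neural embedding and, in principle, the same pipeline flags faces, flags, memes, or audio fingerprints just as readily; the only thing that changes is the blocklist.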
This was a monumentally bad idea. It is the first step in extending AI-based surveillance to our mobile devices. Even if Apple abandons the idea, others will pursue it for far less worthwhile goals than detecting CSAM. Indeed, I honestly think the damage may already have been done, and it may be irreparable.