I believe you are missing the point. Apple shouldn't be actively scanning anybody's mobile device - their private property - without their consent.
You technically give your consent when you buy or use Apple devices; you agree to it when you accept the terms. And I'm sure Apple would argue there's already plenty of scanning going on, scanning to build search indexes, for instance. Are they supposed to shut down Spotlight for everyone because you don't like scanning?
I'm not saying it's right, though. But there's plenty of legalese behind what Apple does, that's for sure.
Let Apple wait for a proper subpoena if there is evidence of criminal activity.
I think you misunderstand: you don't wait for a subpoena. Under the law, if you become aware of CSAM you have to make it available to investigators and section it off from other data to protect it while the investigation happens. It doesn't take a subpoena for CSAM. That's how the law works.
On the other side of it, though, Apple technically doesn't need to scan anything; the law doesn't require anyone to undertake their own investigation. What it does require is that if you become aware of such content, you report it.
And the issue is more complex than you make it out to be.
I wasn't saying this is the full extent of the issue. I took issue with someone claiming that forged hashes are a thing. They are not. No one but NCMEC can add, remove, or change hashes in the database, and the system can confirm in microseconds whether a given hash is in that database or not. The CSAM issue of course has more to it than that, but I took issue with a person spreading actual misinformation and corrected them.
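To be concrete about what that lookup amounts to: conceptually, the check is just asking whether an image's hash appears in a fixed list that only NCMEC controls. Here's a deliberately simplified Python sketch; the hash values are invented, and Apple's actual design wraps this lookup in cryptography (blinded hashes and private set intersection) so the device never sees the raw database.

```python
# Conceptual sketch only, not Apple's actual protocol. Hash values are made up.

# Hypothetical database of known-image hashes (only NCMEC can change its contents).
KNOWN_HASHES = {
    0x9F3A6C21D4E8B705,
    0x1B2C3D4E5F607182,
}

def is_known_image(image_hash: int) -> bool:
    """Membership test against the fixed database: a match means this exact
    known image, not anything newly created."""
    return image_hash in KNOWN_HASHES

print(is_known_image(0x9F3A6C21D4E8B705))  # True  -> in the database
print(is_known_image(0x0BADC0FFEE000000))  # False -> not in the database
```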
Algorithms for modifying images to evade CSAM detection have already been made public, and the only way to catch these altered images is to loosen the matching threshold, which will result in more false positives. And the fact that some human being would be reviewing my family's private pictures in the event of a false positive does not fill me with confidence.
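To make that trade-off concrete, here's a rough Python sketch of perceptual-hash matching by Hamming distance (the 64-bit hashes and thresholds are invented, and this is a generic distance check, not NeuralHash or Apple's pipeline): a strict threshold misses a slightly perturbed copy of a known image, while a threshold loose enough to catch it also starts flagging a different image whose hash just happens to land nearby.

```python
# Generic perceptual-hash matching by Hamming distance. All values are
# invented for illustration; this is not NeuralHash or Apple's pipeline.

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")

KNOWN_HASH = 0x9F3A6C21D4E8B705            # invented hash of a known image
ALTERED_COPY = KNOWN_HASH ^ 0x11111        # 5 bits flipped: a slightly perturbed copy
UNRELATED_IMAGE = KNOWN_HASH ^ 0x1111111   # 7 bits flipped: a different image whose
                                           # hash happens to land nearby

for threshold in (2, 8):
    catches_altered = hamming_distance(KNOWN_HASH, ALTERED_COPY) <= threshold
    flags_unrelated = hamming_distance(KNOWN_HASH, UNRELATED_IMAGE) <= threshold
    print(f"threshold={threshold}: altered copy caught={catches_altered}, "
          f"unrelated image flagged={flags_unrelated}")

# threshold=2 misses the perturbed copy; threshold=8 catches it, but now the
# unrelated image is flagged too: a false positive.
```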
And that's exactly why it's being evaluated again. However, someone can already review your private pictures right now.
In fact, quite the opposite. Apple's CSAM scanning idea was based on laudable motivations, but only engineers myopically focused on technology rather than the real-world outcomes of their work could think so superficially about the potential issues with what they proposed.
And that's why it's good for Apple to take a second look. But I am so sick of hearing nonsense spewed about the issue, like the comment I replied to. I've read time and time again that people will upload family photos that happen to have their nude kids in them and get arrested. Those people misunderstand that the matching is against a database of known CSAM images, not anything newly created.