I wouldn't trust Nicholas Weaver as far as I can throw him.
This whole thing is his brainchild: he wrote up this idea, almost verbatim, back in 2019 for Lawfare, a national security blog.
Since then, he's been zealously defending this move by Apple, using his credentials to lend credibility to the whole mess without actually engaging with his peers in the field on the issues and concerns that have been raised.
His "rebuttals" can typically be summed up as intellectually dishonest at best and downright lacking in substance to withstand a simple sneeze at worst.
Take his actual argument for example:
This is plainly false. What has actually been accomplished is a pre-image attack, meaning the researchers managed to construct a colliding image starting from nothing but a target hash. The resulting image collides but looks like random noise.
All an adversary needs to do is use that noise as a mask over legal pornographic material, and the targeted user is screwed. Apple's human reviewer isn't going to do an entire CSI-style analysis to verify whether the people depicted in the photo (or their body parts) were underage at the time the photo was taken.
They'll ask themselves one question: could this be CSAM? If the answer is yes, the account gets blocked and a report is filed.
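To make the mechanics concrete, here's a toy sketch of that attack class in PyTorch. The small random conv net below stands in for NeuralHash (which is itself a neural network, so the same gradient trick applies), and names like forge are made up for illustration; this is not Apple's system, just the general technique:

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Stand-in "perceptual hash": a fixed random conv net whose sign pattern
# is treated as the hash bits. NeuralHash is likewise a differentiable net.
conv = torch.nn.Conv2d(3, 16, kernel_size=8, stride=8)
for p in conv.parameters():
    p.requires_grad_(False)

def soft_hash(img):
    # Continuous surrogate for the hash bits (tanh instead of hard sign),
    # so gradients can flow through it during the attack.
    return torch.tanh(conv(img).flatten())

def hash_bits(img):
    # The discrete hash: the sign pattern of the conv responses.
    return conv(img).flatten() > 0

def forge(cover, target_soft, steps=2000):
    # Nudge `cover` until its hash approaches the target, while a second
    # loss term keeps the result looking like the original cover image.
    x = cover.clone().requires_grad_(True)
    opt = torch.optim.Adam([x], lr=0.01)
    for _ in range(steps):
        opt.zero_grad()
        loss = F.mse_loss(soft_hash(x), target_soft) + 0.1 * F.mse_loss(x, cover)
        loss.backward()
        opt.step()
        with torch.no_grad():
            x.clamp_(0, 1)  # keep pixel values valid
    return x.detach()

target = torch.rand(1, 3, 64, 64)  # image whose hash we want to collide with
cover = torch.rand(1, 3, 64, 64)   # innocuous "legal" starting image
forged = forge(cover, soft_hash(target).detach())

# Real attacks tune the trade-off between matching bits and cover fidelity;
# this toy version just reports how close it got.
match = (hash_bits(forged) == hash_bits(target)).float().mean()
print(f"fraction of hash bits matching: {match:.2%}")
```

Starting the optimization from a cover image rather than from noise is exactly the "mask over legal material" move: the output stays visually close to the cover while its hash is dragged toward the target.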
As for his statement that "it would require the production of over 30 colliding images", that's just intellectually dishonest. They don't have to be 30 unique images; it could be 30 copies of the same one. And even if unique images were required, generating a colliding image is trivial in both effort and time, as has been demonstrated, and applying that collision to legal porn is even less of a feat.
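For what it's worth, the "over 30 images" bar is low even on its own terms. Continuing the toy sketch above (and reusing its conv, soft_hash and forge definitions), thirty "unique" collisions is just the same loop run from thirty different innocuous starting images:

```python
# Thirty distinct colliding images: re-run the forgery from thirty
# different covers. Each result looks different but targets the same hash.
target_soft = soft_hash(target).detach()
variants = [forge(torch.rand(1, 3, 64, 64), target_soft) for _ in range(30)]
```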