There’s no scandal.
At the moment it’s a design and corporate-legal issue.
In the end, CSAM protection will work like this:
Parents and guardians will enable CSAM protection on their children’s devices.
Everyone else can choose not to use it. If they upload any illegal images to iCloud/iMessage servers, the images will be scanned on the server side, the evidence will be quarantined so that Apple is legally in the clear, and the user will be reported to the authorities.
Simple.
Dropbox, Google, and Microsoft have been detecting such material on their servers for at least half a decade using PhotoDNA.
en.wikipedia.org
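For what it's worth, the server-side flow described above is conceptually simple. Here's a minimal, hypothetical sketch of it: hash each uploaded image, compare against a blocklist of known hashes, and quarantine/report on a match. Real systems use a perceptual hash (PhotoDNA) supplied by a clearinghouse such as NCMEC; SHA-256 stands in here only to keep the example self-contained, and names like `KNOWN_HASHES` and `quarantine_and_report` are illustrative, not any vendor's actual API.

```python
import hashlib
from pathlib import Path

# Hashes of known illegal images, supplied by a clearinghouse in real
# deployments (placeholder value here).
KNOWN_HASHES: set[str] = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}


def image_hash(path: Path) -> str:
    """Hex digest of the file contents (stand-in for a perceptual hash)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def quarantine_and_report(path: Path, digest: str) -> None:
    """Placeholder: move the file to quarantine and notify the authorities."""
    print(f"quarantined {path} (hash {digest[:12]}...) and filed a report")


def scan_upload(path: Path) -> bool:
    """Scan one uploaded file; return True if it matched a known hash."""
    digest = image_hash(path)
    if digest in KNOWN_HASHES:
        quarantine_and_report(path, digest)
        return True
    return False
```

The only real difference from what Apple proposed is where the hashing happens (on-device vs. on the server); the matching and reporting steps are the same idea.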