Apple would be liable because they are expressly saying there is no need for them to carry out CSAM detection, and offering reasons why not. Various vocal groups are saying Apple needs to implement CSAM detection on iCloud, and Apple has essentially said no. So if police later find CSAM on iCloud in the course of investigating and apprehending criminals, Apple could be held liable for knowing such a thing happens and refusing to implement something that would have prevented the unlawful material from appearing on iCloud in the first place.
So far as I am aware, Apple and other companies are required only to alert the authorities if they discover illegal images on their servers; they are not obliged to search for them. However, the iCloud user license prohibits putting illegal content on their servers, so they do screen for that. That's fine by me - their servers, their rules.
This could have been a good thing, but a bunch of complainers who didn't even realize that Apple was already scanning their iCloud email for child porn suddenly went "but my privacy!"
The issue was about installing on-device surveillance without the permission of the device's owner. And such a system could be used not only to detect CSAM images: authoritarian regimes could use it to scan for faces of dissidents, forbidden flags and symbols, political slogans, religious texts and pictures, and even the faces of people from particular ethnic groups. To make matters worse, when this was pointed out, Apple published a technical paper outlining the system, handing authoritarian regimes a blueprint for doing this even if it never ships as part of iOS. We'll see whether a scheme like this ends up embedded in authoritarian regimes' surveillance of their citizens. It seems many failed to understand the ramifications.
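To make that concrete, here is a rough Python sketch (my own illustration, not Apple's actual design; the photo IDs and hashes are made up). The matching step of any hash-blocklist scanner is content-agnostic: whoever supplies the blocklist decides what gets flagged, and the code cannot tell CSAM hashes from hashes of protest photos or religious imagery.

```python
# Hedged sketch of a generic hash-blocklist scanner, not Apple's protocol.

def scan_library(photo_hashes, blocklist):
    """Return IDs of photos whose perceptual hash appears in the blocklist.

    photo_hashes: maps a photo ID to its (hypothetical) perceptual hash.
    blocklist: a set of hashes supplied by some authority. Nothing here
    knows or cares what the blocklisted hashes actually depict.
    """
    return [pid for pid, h in photo_hashes.items() if h in blocklist]

# The same code flags whatever the list's author chose to include.
library = {"IMG_0001": "a3f9", "IMG_0002": "77c2", "IMG_0003": "b410"}
print(scan_library(library, {"77c2"}))  # -> ['IMG_0002']
```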
Getting back to the news article:
First, those of us who objected were told we didn't understand the technical aspects of the system. We did.
Then we were told the system's performance was excellent. Then it became clear there would be false positives (hence the need for human review on Apple's end). Then it was established that the system could be circumvented by minor modifications to images (a toy sketch at the end of this post illustrates why).
And now we are told 'sorry, bad idea because the system might be misused', which completely misses the point that the intended use of the system was to search without a warrant, without probable cause, without judicial review, and without explicit permission from the owner of the device to do so.
This is what happens when you let engineers run amok without ethical, legal and social review. Not Apple's finest hour.
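Here is the toy sketch mentioned above, in Python (my own simplification; Apple's NeuralHash is a neural network, not a difference hash, but the basic trade-off is the same). A perceptual hash deliberately maps similar images to similar bit strings, so a small targeted edit can break a match, and because the hash throws away almost all of the image, unrelated pictures can also collide.

```python
# Toy "difference hash" (dHash): each bit records whether a pixel is
# brighter than its right-hand neighbour. Real perceptual hashes are more
# robust, but share the same basic trade-off.

def dhash(pixels):
    """Hash a 2D grid of grayscale values into a list of bits."""
    bits = []
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits.append(1 if left > right else 0)
    return bits

def hamming(a, b):
    """Count differing bits between two equal-length hashes."""
    return sum(x != y for x, y in zip(a, b))

original = [
    [10, 30, 30, 40, 50],
    [50, 40, 30, 20, 10],
    [10, 30, 30, 40, 50],
    [50, 40, 30, 20, 10],
]

# Nudging one pixel by a single brightness level flips a hash bit,
# so the edited image no longer matches the original's hash exactly.
tweaked = [row[:] for row in original]
tweaked[0][1] = 31

print(hamming(dhash(original), dhash(tweaked)))  # -> 1
```

And since a 16-bit hash like this one obviously cannot distinguish every possible image, completely unrelated photos can share a value, which is exactly where the false positives come from.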