I think that Apple should stop implementing these kinds of techniques. No matter how you look at it, it is still a backdoor, and it creates an opening for authorities to insist that Apple "scan" for other things on a person's phone. Apple might then find itself in a position where it can't refuse, because
a. it is technically possible for it to comply, and
b. it can be forced to accept such requests, because it has to follow the laws of each specific country it operates in.
Even though Apple's intentions are noble, it is opening Pandora's box with this.
I can't get over the feeling that Apple is doing this because of some pending porn legislation somewhere (GB or EU), some government coercion over market access somewhere else (China), or some broad decryption lawsuits or threats of monopoly breakup elsewhere (US, from the DOJ/FBI, or the new sideloading bill in Congress), and is trying either to comply, to get ahead of it, or to appease.
Fact is, by doing this, Apple is demonstrating a proof of concept.
Fact is that Apple won't be able to refuse when some government somewhere, expanding on this proof of concept, enacts a law requiring scans for other images, symbols, words, or text strings, without Apple's own screening and with results delivered directly to that government, or else Apple risks indictments, civil suits, market closures, test cases, breakup, or other regulatory threats and actions.
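To make the proof-of-concept point concrete, here is a minimal, hypothetical sketch of the matching step in an on-device scanner (this is not Apple's NeuralHash pipeline; the names PerceptualHash, scanLibrary, and blockedHashes are my own illustrations). The point is that the matcher is content-agnostic: it has no idea what the blocked set represents, so repointing it at a different database is a policy decision, not an engineering one.

    // Hypothetical sketch in Swift, not Apple's implementation.
    typealias PerceptualHash = String  // stand-in for a real perceptual hash value

    // Content-agnostic matcher: it flags whatever appears in the blocked set,
    // whether that set encodes CSAM, banned symbols, or arbitrary text strings.
    func scanLibrary(_ itemHashes: [PerceptualHash],
                     against blockedHashes: Set<PerceptualHash>) -> [PerceptualHash] {
        return itemHashes.filter { blockedHashes.contains($0) }
    }

    // Swapping in a different database changes the policy without changing the code.
    let flagged = scanLibrary(["a1b2", "c3d4", "e5f6"], against: ["c3d4"])
    print(flagged)  // ["c3d4"]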
Fact is that Apple could be made to comply without being allowed to publicize its objections. For reference, just recall the National Security Letters that forbade, and still forbid, companies from even acknowledging that they have been served such surveillance orders (and that is in the USA, which is theoretically more transparent than repressive countries, not to mention has a written Bill of Rights that many countries lack).
Fact is that Apple has encouraged us to think of, and use, our Apple devices as secure extensions of our brains, presumably subject to the same protections as our brains under its "Privacy As A Human Right" stance (BS), and further that, as a US company, its principles are presumably informed by the traditions and protections of the Bill of Rights, especially the 4th and 5th Amendments on search and seizure and self-incrimination; not forgetting, specifically, that a person is presumed innocent and that no search shall ensue without a judicial warrant supported by probable cause.
Fact is that Apple has just (voluntarily?) become a willing extra-judicial adjunct of state security and law enforcement, with its plan to willfully perform warrantless searches while thumbing its nose at the protections enshrined in the Bill of Rights.
Fact is that Apple has already announced its willingness (actually its intention) to take further steps down the slippery slope by expanding to other countries and to third-party apps after starting with its own Photos app and iCloud services.
Fact is that millennia of human existence have given us some fundamental, immutable lessons:
a) the state will try to overrun the rights of the individual on a pretext,
b) mission creep is a real thing,
c) moral zealotry is a dangerous thing,
d) just because you can do it doesn't mean you should,
e) appeasement doesn't appease,
f) if you have stated principles, they must be inviolate, or else they are not principles,
g) doing the wrong thing for the right reason is still doing the wrong thing, and
h) the road to hell is paved with good intentions.
I think most people would agree that CSAM is a scourge, but Apple is now so far on the wrong side of its own rhetoric, its stated principles, the Bill of Rights, millennia of learning, and plain good sense that one really wonders how Apple got itself tangled up in this issue, and how it could be at turns so naïve and so arrogant as to think that a legal push won't now come to a statutory shove, one Apple won't be able to "vehemently refuse".