Matter of time, yes. And that time is probably a few thousand years.

While I know you’re likely not interested in how it works, in case anyone else who finds this thread is interested…
There are images of illegal activities that have been captured, and a mathematical hash of each of these images has been created. All image-hosting companies (including Apple) scan their repositories, not using the actual images or a machine-learning algorithm (which could cause false positives), but specifically using those hashes to see if any images match. The CSAM algorithm is not looking for “salmon-colored (skin-colored)” tones. It mathematically computes each image’s hash and determines whether that hash exactly matches the hash of one of the known illegal images.
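To make that exact-match idea concrete, here’s a minimal Python sketch (the hash value and file path are made up for illustration). Note one simplification: this sketch uses a plain cryptographic hash, while real systems such as Microsoft’s PhotoDNA or Apple’s NeuralHash use perceptual hashes so that resized or re-compressed copies of the same image still match. The compare-against-a-known-list logic is the same either way:

```python
import hashlib

# Hashes of known illegal images, as distributed to providers.
# (This value is a made-up placeholder, not a real entry.)
KNOWN_HASHES = {
    "9f86d081884c7d659a2faa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08d",
}

def image_hash(path: str) -> str:
    """Compute a hash of the raw image bytes.

    Real systems use a perceptual hash here instead of SHA-256,
    so near-duplicates of a known image still produce a match.
    """
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def matches_known_image(path: str) -> bool:
    """Flag the file only if its hash is in the known-hash set."""
    return image_hash(path) in KNOWN_HASHES

print(matches_known_image("vacation_photo.jpg"))  # hypothetical file; prints False
```

The key point the sketch shows: the scan never “looks at” your photos in any visual sense. It reduces each file to a number and checks that number against a fixed list, so an ordinary photo that isn’t on the list simply can’t match.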