First of all, this system does nothing to prevent transmission of CSAM. Assuming that iCloud Photo Library is turned on—a big "if," considering the attention this is getting—illicit photos are still uploaded to iCloud, they just include a "voucher" that could lead Apple to eventually investigate. If Photo Library is turned off, child pornographers can transmit—via iMessage, AirDrop, WhatsApp, Telegram, Signal, Dropbox, etc.—all the CSAM they want.
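To make the gating explicit: as publicly described, the on-device check runs only when a photo is queued for iCloud Photo Library upload. Here's a toy sketch of that control flow — the hash database, function name, and voucher shape are all hypothetical, not Apple's actual implementation:

```python
# Toy model of upload-gated scanning. "deadbeef" stands in for a
# hypothetical perceptual-hash database entry; nothing here reflects
# Apple's real NeuralHash or safety-voucher format.
KNOWN_HASHES = {"deadbeef"}

def upload_to_icloud(photo_hash: str, icloud_photos_enabled: bool):
    if not icloud_photos_enabled:
        # Photo Library off: no scan, no voucher. AirDrop, iMessage,
        # Signal, Dropbox, etc. never pass through this code path.
        return None
    # Photo Library on: the photo uploads either way; a voucher rides
    # along recording whether it matched the database.
    return {"matched": photo_hash in KNOWN_HASHES}
```

The point of the sketch: turning the feature off doesn't stop any transmission, and turning it on doesn't block the upload either — it only annotates it.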
Users also have no control under this system. Other than the "choice" of disabling iCloud Photo Library—really a Hobson's choice for innocent iOS users—we'll have no idea if or when one of our photos is tagged with a voucher, no idea if or when new databases are added to our phones, and no idea if or when Apple decides it's kosher to peruse our private photos.
Your third option is such a ridiculous straw man that it hardly warrants a response. "No protections at all for children" other than billions in law-enforcement resources, multiple high-profile advocacy organizations, and 100% social opprobrium.
And there's another option you're missing: Apple can gtfo of all our photo libraries and continue complying with lawfully obtained warrants for suspected child pornographers, as they do now.