There is a difference between software bugs, complying with lawful, court-signed warrants, and saying "oops, we didn't think people thought Incognito meant we weren't tracking them". I know there are people who need to baste in the cynicism that everyone is equally bad, so I'm not going to take this line of reasoning much further, but just to address the items you listed so that they don't stand uncontested:
the CSAM-detection disaster
Most governments can get direct access to your account if they accuse you of possessing CSAM, and then they can go fishing. Apple created a secure, private means of proving a user does not have CSAM in their iCloud account, obviating the need to open your account to a government warrant. People freaked out because they didn't understand how it worked, or the protections put in place. Was it perfect? No. Was it better than the status quo? Likely, yes. But we let perfect defeat better.
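The core of Apple's published design was a threshold: on-device matches produced encrypted "safety vouchers" that the server could only open once the number of matches crossed a limit (Apple's threat-model review cited roughly 30). Here is a toy sketch of just that reporting logic -- the real scheme used NeuralHash plus threshold secret sharing and private set intersection, and the function name here is purely illustrative:

```python
# Toy illustration of the threshold idea in Apple's published design.
# The actual cryptography (threshold secret sharing, PSI) is elided;
# this only shows why a handful of matches reveals nothing.

THRESHOLD = 30  # approximate figure from Apple's threat-model document

def vouchers_decryptable(match_count, threshold=THRESHOLD):
    # Below the threshold, the server cannot decrypt any voucher and
    # learns nothing about which (if any) images matched.
    return match_count >= threshold

vouchers_decryptable(5)   # False: a few false positives stay invisible
vouchers_decryptable(30)  # True: only now can human review occur
```

The point of the threshold was that occasional false positives from perceptual hashing never reach a human; only a collection of matches does.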
The argument that if Apple can scan for one thing, they can scan for another was always a stupid argument. First, they would only add fingerprints to the scan list if they were approved by multiple child protection authorities in different jurisdictions-- so one government can't sneak new searches in. Second, if a government has the power to force Apple to modify their CSAM scanner, it has the power to force Apple to do the same thing without the CSAM scanner.
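That first safeguard is just a set intersection: a fingerprint entered the on-device list only if independent child-safety organizations in separate jurisdictions all vouched for it. A minimal sketch, assuming hypothetical data (the jurisdiction names and hashes below are illustrative, not Apple's actual sources):

```python
# Hypothetical sketch of the multi-jurisdiction safeguard: a fingerprint
# ships in the blocklist only if >= 2 distinct jurisdictions' child-safety
# organizations independently include it.
from collections import Counter

def build_blocklist(org_lists, min_jurisdictions=2):
    """org_lists maps jurisdiction -> set of image fingerprints."""
    counts = Counter()
    for hashes in org_lists.values():
        for h in hashes:
            counts[h] += 1
    # A hash vouched for by only one government never enters the scan,
    # so no single government can unilaterally insert a search target.
    return {h for h, n in counts.items() if n >= min_jurisdictions}

lists = {
    "US": {"a1", "b2", "c3"},
    "EU": {"b2", "c3", "d4"},
    "UK": {"c3", "e5"},
}
build_blocklist(lists)  # {"b2", "c3"} -- "a1", "d4", "e5" are excluded
```

"a1" appears on only one list, so even if one government slipped a political image into its own database, it would never reach devices.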
At least Apple made an effort at a privacy-protecting solution, and published the full details of the approach they proposed well in advance of actually deploying anything. They then responded to public criticism and feedback from concerned organizations and did not deploy that technology.
So I don't see at all how this is somehow equivalent to Google secretly storing your browsing history while telling you that it isn't visible.
But to ignore Snowden leaks on big tech's intimate relationship with government surveillance
Snowden lost relevance a long time ago. That said, aside from the debacle over secure CSAM scanning, Snowden has praised Apple's pro-privacy stance. Yes, the Apple logo showed up in a slide on PRISM, but Apple has said they don't give any government agency direct access to their servers and will only share information when presented with a court order (i.e., in compliance with the law).
Unlike denials by other companies, Apple's claims are supported by the fact that it took the government 5 years longer than Microsoft and 3 years longer than Google to convince Apple to participate, and by the leaked PRISM presentation stating that 98 percent of PRISM production is from Yahoo [sic], Google, and Microsoft-- meaning 2 percent was collected through the other 6 participants combined, of which Apple was one.
Tim Cook's obvious eagerness to expand Apple's ad business
I haven't seen Cook talk much about trying to expand ads specifically, but certainly services more broadly. Again, Apple provides controls to maintain privacy when using their ad network: blocking personal identifiers entirely if you choose, or rotating them with a click if you choose otherwise. In a world with ads, Apple's solution looks to be the most privacy-preserving.
This is a real problem. I don't know if they haven't patched these because they just haven't been given enough priority, or because they help some other service run, but I don't see any evidence that this is an intentional effort to scrape personal data. It could serve that purpose, but it would be an exception to all of Apple's other efforts and wouldn't benefit Apple so much as network providers (Apple is the target endpoint for these packets either way).
unaddressed vulnerability to software like Cellebrite
Which unaddressed (but addressable) vulnerability? My understanding is that this is just a cat-and-mouse game between the developer of a complex software product and a sophisticated provider of what is essentially hacking software for intelligence agencies. I'm not aware of any vulnerability that Apple is intentionally leaving open.
To quote the Cellebrite website:
Every year, Apple releases new iPhone devices and iOS versions with improved security measures, making it challenging for forensic examiners to access these devices. For example, new security mechanisms such as iCloud advanced data protection or lockdown mode increase the difficulty of extracting critical data from the devices. Exploit methods such as Checkm8 are no longer viable for newer devices.
Accessing and extracting the latest iOS versions and iPhone devices is a challenging task that requires a deep commitment to research and development, technical knowledge, and specialized tools.
That doesn't sound like Apple is leaving vulnerabilities unaddressed, it sounds like an arms race.