Private should never be in the same sentence as Google or Android without the qualifier "lack."
Without user identifiers? I would bet money that the majority are logged into a Google account when searching, and even if not, Google still sets cookies.
You call it a "great deal"; I call it payola.
This is a *very* carefully worded statement that, in practice, means absolutely nothing for privacy:
El oh el. Google & Privacy..... those are two words that don't belong in the same sentence...

Why not? You just have to put the appropriate verb between Google and privacy. 😜
Private…

Only you, Google’s employees, and a select few intelligence agencies and their contractors from the Five Eyes and Israel can access this. Plus some black hat hackers, but we won’t detect them for 4 years anyway, so they’re harmless. What you don’t know can’t hurt you! (Don’t worry, they’re just running it to train targeting data for drones. Google’s a military company now; you can, uh, trust them!)
It all depends on how that word is defined.
With a history like this (https://www.thurrott.com/google/165728/google-caught-sneaky-location-data-collection), surely this is very private.

It’s fine! Why do you need privacy anyway? You’re proud of who you voted for, right? Why do you need to hide anyway?!
But I thought Apple was hopelessly behind on AI. Now I am starting to believe Apple is ahead!

Apple, in my opinion, was always ahead in AI and machine learning technologies. They were behind in generative AI, like image generation and LLMs. This isn’t because Apple was incapable; it wasn’t a priority. Their machine learning was focused on things like the camera, on-device search, etc. In addition, Apple gets a different level of scrutiny because of its customer base, so certain things aren’t tolerated. Gemini and OpenAI made up entire laws and headlines? No problem. Apple’s news-notification summaries get something wrong? Disastrous headlines everywhere.
As a developer working in the private/local AI space, I find it disappointing that, evidently, neither Google nor Apple is providing this technology to third-party developers to integrate with and leverage. In the case of Apple, I really don't know what the implementation context for the technology is at all. Ostensibly, the Foundation Models run entirely on-device, which is why the models are puny, with puny context, really only appropriate for highly localized, targeted purposes (not to mention ridiculously strict guardrails). Does this imply Apple is going to deploy more substantial models, server-side, through the Foundation Models framework, that would leverage its Private Cloud Compute?
Not in the documentation that I could find, for either FM or Cloud Compute. Similarly, it seems Google's only stated intent for its Private AI Compute is in service of its own products. What's interesting about the pitch, in that context, is that essentially Google is promoting how its tech protects you from...them.
Of course, a Pixel user interacting with Google's AI features has no visibility into the source code, so were Google to want to gather a user's personal information, they could simply send it on a parallel channel to an unsecured environment, either at that moment or in batch at a later time.
Most likely this was part of the deal where Apple pays Google a billion dollars, but on the premise that Google gets their privacy sh*t together.
This inadvertently helps Android users. Not bad at all.