It kneecaps their AI rival, OpenAI, since Gemini will now be the default model for just about every mobile device in the world. Plus it keeps another “competitor,” Apple, from releasing their own (though that's probably not really a factor, since Apple seems to be quite behind no matter what).
I'm not sure that Apple is "behind." They seem to have made the decision that all these models will be about the same in quality, and no one is going to make much money from them due to competition between the model builders. So, Google, OpenAI, Meta and others will be paying about $100-200 billion each annually to develop and deploy these things, and Apple will pay one of them $5B over a few years to access the tech from Apple devices.

This is such a good deal for Apple that I wonder about the viability of the entire AI business. If you can access Gemini from Apple devices for free (or a low cost), why would you pay much to OpenAI for access to ChatGPT? And $5B likely won't even cover Google's operating expenses for its models, much less the capital expenditures they're making.

Is "Math is hard" Barbie doing the business plans for the big AI players?
 
It is relevant because it keeps OpenAI from partnering with Apple further. That's my point.

Well, I suppose, but OpenAI appears to have decided they didn't want to do that anyway. Maybe they'd only have done it if it put "ChatGPT" everywhere on Apple devices. For them it's the same as if Apple had finally come up with their own very powerful LLM.

I'm sure at some point Apple will catch up, since we're seeing smaller and smaller gains with each frontier model. They'll be able to jump right in with whatever becomes the "de facto" standard for building LLMs towards the end of that curve and tweak from there.
 
It is probably more than just the open weights and 'good luck'. OpenAI passed on doing the porting work. That very likely means Google had to contribute to the porting work for their model too. It would be a dubious path if Apple held a bake-off between Google, Claude, and perhaps others and then had to do 100% of the fine-tuning from scratch themselves, especially if trying to get it all done on a reasonably short timeline. They would need access to expertise on the model's specific tendencies rather than spending tons of time 'rediscovering the wheel'. That said, I doubt Google is doing 100% of the porting work, but it probably isn't 0% either.

It would also make sense for Google (and Claude, etc.) to build expertise in running models on Apple Silicon, since their own independent apps could leverage local processing when possible. OpenAI seems mostly intent on building the most colossal cloud thing possible, plus a broad ecosystem around that cloud-first brain. Their setup is more like a black hole out to suck up everything, not to be someone's good partner.

Yeah, you're probably right. Apple also has to create a very good "router" model that takes requests and decides what should run on device versus off device. They've got a version of this already with the ChatGPT integration, and I guess it works relatively well, though it very simply decides whether a request is a local API call or world knowledge and hands it off. At some point it has to decide whether something is a complex local API request and use their private cloud instead. Take the example from their pulled advert, "what was the name of the person I met X months ago": that needs on-device APIs to pull the data, then a bigger model to process the more complex command, since I'm assuming the mini on-device LLM can't do that level of thought/processing (I could be wrong?).
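As a toy sketch of the kind of three-way routing decision described above; the tier names, inputs, and thresholds here are all invented for illustration, not anything Apple has published:

```python
# Hypothetical request router: on-device mini LLM, Private Cloud Compute (PCC)
# for complex requests over personal data, or an external frontier model for
# world knowledge. Purely illustrative.

def route(request_kind: str, needs_personal_data: bool, complexity: str) -> str:
    if request_kind == "world_knowledge":
        return "gemini"            # off-device frontier model
    if needs_personal_data and complexity == "complex":
        return "private_cloud"     # bigger model, still privacy-preserving
    return "on_device"             # simple local API calls stay local

# "What was the name of the person I met X months ago": personal + complex
print(route("local_api", needs_personal_data=True, complexity="complex"))
# → private_cloud

# "Turn the lights on": simple, purely local
print(route("local_api", needs_personal_data=False, complexity="simple"))
# → on_device
```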

When testing Shortcuts, the on-device model does seem very simple and gets confused easily. It can't do vision etc. either. But I guess it can be improved somewhat? Or they can quantize a bigger model down? There's a lot to tune there. I don't think Apple is happy needing an internet connection for everything the way, say, Alexa does just to turn lights on/off when those lights are all connected locally with HomeKit, and probably the same for calculator functions or accessing simple on-device data. I guess it'll process that data with Gemini only when the request is complex enough.
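For the "quantize a bigger model down" idea, here's a toy illustration of what weight quantization actually does: store weights as int8 plus a scale factor, trading a little precision for a much smaller memory footprint. This is a generic symmetric-quantization sketch, not Apple's actual pipeline:

```python
import numpy as np

def quantize_int8(w: np.ndarray):
    """Map float32 weights to int8 plus one per-tensor scale factor."""
    scale = np.abs(w).max() / 127.0
    q = np.round(w / scale).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    return q.astype(np.float32) * scale

w = np.array([0.5, -1.0, 0.25, 0.9], dtype=np.float32)
q, s = quantize_int8(w)
w_restored = dequantize(q, s)

# 4x smaller storage (1 byte vs 4 per weight), small reconstruction error
print(np.max(np.abs(w_restored - w)))  # worst-case error is under scale/2
```

Real deployments use finer-grained (per-channel or per-group) scales and lower bit widths, but the size/accuracy trade-off is the same idea.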
 
I'm not sure that Apple is "behind." They seem to have made the decision that all these models will be about the same in quality, and no one is going to make much money from them due to competition between the model builders. So, Google, OpenAI, Meta and others will be paying about $100-200 billion each annually to develop and deploy these things, and Apple will pay one of them $5B over a few years to access the tech from Apple devices.

Google pays Apple $15-20B per year for search placement. Apple probably isn't paying $5B at all. Apple is just going to take a smaller check ($10-15B). There are no 'out of pocket' costs here for Apple at all. Instead of taking cash, they take a service (which I suspect is tax deductible because it counts as "operating costs", so they possibly cut their tax bill at the same time).
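The netting arithmetic above, spelled out with the thread's own figures (all of which are estimates, not confirmed numbers):

```python
# Rumored/estimated figures from the discussion, not confirmed amounts.
search_low, search_high = 15e9, 20e9   # annual search-placement payment
service_value = 5e9                    # rough value of the Gemini deal

# If Google nets the service against the check instead of Apple paying cash,
# Apple's cash take lands in the $10-15B range mentioned above:
check_low = search_low - service_value
check_high = search_high - service_value
print(check_low / 1e9, check_high / 1e9)  # → 10.0 15.0
```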


P.S. Over time AI might make the Google search ad placement payment smaller. If that dropped down to $8-10B and the bartered service went up to $2-3B per year, then things get tighter.
 
The statement referred to their cloud technology, which is apparently part of the initial training of the models, but the models themselves will then be customized for Apple and will still run on Apple's own silicon servers (for PCC).
Where do you get that from? Google doesn't use Google cloud for training. Google Cloud is their hosting service they offer for Gemini. Apple uses Google Cloud to host iCloud too.
 
Where do you get that from? Google doesn't use Google cloud for training. Google Cloud is their hosting service they offer for Gemini. Apple uses Google Cloud to host iCloud too.
They didn’t say anything about “Google Cloud” as a hosting service in the announcement. They said “the next generation of Apple Foundation Models will be based on Google's Gemini models and cloud technology”. Google refers to this cloud technology often when talking about their AI model development. See: https://cloud.google.com/tpu
Also the announcement specifically said Apple Intelligence will continue to run on Apple devices and PCC for privacy.
 