It has been quite obvious for a while now that Cook only concerns himself with the stock price, not with leading any markets. That is not what Jobs did when he was alive. Jobs released so many innovative new products; Cook has released a few that were Jobs's ideas anyway, wasted time and money on everything else, and boosted that share price and stock value...

I would also add that a lot of the software that launched under Jobs was developed elsewhere and bought by Apple. Mac OS X/macOS itself is based on NeXTSTEP (and the Darwin foundation underlies iOS and all the other derivatives), Safari was based on KHTML, iTunes was based on SoundJam MP, and iDVD/DVD Studio Pro, Final Cut, Logic, and Siri were all developed by third parties. Countless other features that Apple has implemented over the years used open standards for ease of development. Apple back then knew it didn't need to do every damn thing from scratch.
Do not forget the graphical UI, developed by Xerox PARC, that Apple bought and MS then copied; Xerox corporate was too short-sighted to see its value.

Xerox could have [should have, IMO] been more dominant than Apple or MS, and sooner, because they had every advantage in the early 1980s except foresight. Much like today: all those folks here dissing the AVP because it is not an iPod similarly lack foresight.
 
Oh, no. There's a reason I added Google's AI text in search results to my filter list. It's incredibly inaccurate.
 
Please don't talk about things you know nothing about.

1. Why is Apple working with Google in this space? Because Apple trains its (MULTIPLE) models on TPUs.

2. Apple has at least two known "sexy" models, along with multiple more specialized models. The two sexy models are called AFM (Apple Foundation Model). They are both *combined* language and vision models.
The on-device model is ~3B parameters, quantized to 2 bits per parameter.
The server model runs on Apple HW (the GPUs of what are presumably M2 or M3 Ultras) and has about 200B parameters, quantized to about 3.5 bits per parameter.
Both are competitive with (not obviously worse or obviously better than) counterparts *at this size*, which is definitely not the largest size; guesses are that, e.g., ChatGPT 5 is ~8 times larger.
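
To put rough numbers on that (back-of-envelope, using only the figures quoted above, so treat it as approximate):
~3B parameters × 2 bits/parameter ≈ 6 gigabits ≈ 0.75 GB of weights, small enough to sit in an iPhone's RAM alongside the OS and apps;
~200B parameters × 3.5 bits/parameter ≈ 700 gigabits ≈ 87.5 GB of weights, which is why the server model needs the unified memory of an Ultra-class chip rather than a phone.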

Unusual (or at least non-obvious) aspects of these models include
- combined text and vision rather than two separate models
- multilingual, handling about 15 languages
- somewhat curated training rather than a raw flood of internet text (unclear if this curation helps or hinders performance, but it is there)
- emphasis on "IQ 100"-level training, not advanced training or reasoning. Apple wants the model to answer sensibly if you ask it to split a tip, but does not [for now...] care how it responds if you give it your calculus homework

3. BY FAR the most important difference of these models, compared to other models, is that they have been trained to handle a very specific task: a developer can, to simplify immensely, use natural and simple Swift to construct a query to be given to the LLM, and get back a response that is defined in terms of the structs and APIs of the calling app. This is not the same thing as "ask the LLM a question and get back text", and it's not the same thing as "ask the LLM a question and it gives you code that, hopefully, you can compile to do what you want".
The idea is that in a random app, like Uber Eats, you can say "I'd like to order that food we had two weeks ago, it was Asian, Thai I think, but I can't remember the name" and this will result in Uber Eats throwing up a useful order that you can click on. Look at what's involved here: the query has to go into the LLM (so that it can be "understood"), Uber Eats also has to provide the database of recent orders (so that the LLM has a clue what was ordered over the relevant time period), and the response can't just be a text string like "looks like you ordered Satay Chicken, and by the way that's Indonesian, not Thai"; it has to be some sort of structure that plugs into Uber Eats' APIs to allow the construction of a genuine order.
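
To make the shape of that concrete, here is a minimal sketch of what such a call could look like from the app side. I'm going from memory of the WWDC sessions, so the framework names (FoundationModels, @Generable, @Guide, LanguageModelSession, respond(to:generating:)) are approximate, and OrderSuggestion / suggestReorder are made-up app-side names, not anything Apple or Uber Eats actually ships:

import FoundationModels

// Hypothetical app-defined type. The @Generable annotation (as demoed at WWDC)
// tells the framework the model must answer with an instance of this struct,
// not with free-form prose.
@Generable
struct OrderSuggestion {
    @Guide(description: "Name of the dish the user most likely means")
    var dishName: String
    @Guide(description: "Identifier of the matching order in the app's own database")
    var orderID: String
}

// Hypothetical helper: the app hands the model its own context (recent orders)
// plus the user's fuzzy request, and gets back a typed value it can feed
// straight into its ordering APIs.
func suggestReorder(userRequest: String, recentOrders: String) async throws -> OrderSuggestion {
    let session = LanguageModelSession()   // wraps the on-device AFM model
    let response = try await session.respond(
        to: "Recent orders:\n\(recentOrders)\n\nUser request: \(userRequest)",
        generating: OrderSuggestion.self
    )
    return response.content
}

The point is the return type: the app gets back an OrderSuggestion it can plug into its own order-construction code, not a paragraph of text it has to parse.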

No-one else has anything like this. THIS is what I mean when I say that Apple is constructing an AI UI, not an AI model.

4. So the position Apple finds itself in is that
- it's trying to figure out how to utilize LLMs AS AN API.
- this is a research task, so it makes no sense to try to do this research at the same time as you're constantly modifying an LLM that costs $100M per training run! Instead you fiddle around with tiny models until you think you have the basic concepts required for the APIs (and their pieces within the OS runtime and Swift) all working.
- then you scale this up to a mid-sized model and validate that it still works.
- are they scaling it up to a much larger model? Maybe? Or maybe there is no point in doing that until we get a year or so of experience with this machinery to see what needs to be improved, changed, or discarded?


Apple is not playing the same game as Gemini, OpenAI etc. And it doesn't need to, just like Apple is not trying to compete with Google as a search engine. As long as the leading edge LLMs continue to provide as good an experience on Apple as they do anywhere else, then no-one feels any need to stop buying Apple HW just to get "optimal" ChatGPT.

This is all described in

Aspects of the Swift API were demo'd and examples given at multiple talks at WWDC earlier this year.
Apple did something very stupid last year with the announcement of Apple Intelligence before it was ready and before all this infrastructure was in place. That tells us that Apple Marketing were stupid in this case, and should have followed their normal rule of **** until you're ready to ship. But Apple Marketing is not Apple Engineering, and Apple Engineering have a plan in place grander than anything you can imagine.
Wow. Thank you much for that very useful commentary.
 
Several things wrong here....

Firstly, this is a clear admission by Apple that they have utterly failed in the AI market, totally. When you ignore the market for so long, your competitors don't just overtake you, they jump so far ahead that it is impossible for you to catch them in a time frame that is worth it.

Secondly, bang goes any and all security, along with the realisation by Apple that no, you cannot have a phone capable of running complex AI models locally on it... you HAVE to use the cloud!

IMO this alone should see Cook and several others kicked out; it is ALL on them. It should also make some people wonder why bother with Apple when you can go direct to Google for far less cost...

First they utterly failed, wasting billions and endless months on a car project that went nowhere, and in doing so they chose to completely ignore THE hottest, biggest, and most important tech market on the planet...
Read the analysis by name99 elsewhere in this thread. Hopefully it will enlighten you if you read carefully.
 
Read the analysis by name99 elsewhere in this thread. Hopefully it will enlighten you if you read carefully.

That is not an analysis, nor does it explain why Apple is so utterly far behind in AI and chose to ignore it for so long. I still stand by my original comment. It is far too late for Apple, sorry; everyone else is miles ahead in AI now, no matter what they do.
 
Well, this is officially pathetic. Apple is now asking Google to bail them out of their horrible Siri implementation. Why not go all the way: ditch Siri and make Google Assistant the default on iPhones?
 
It's insane that Apple hasn't developed its own LLM by now. Beginning of the end?
Not necessarily. LLMs are not compatible with privacy. Besides just being caught with their AI pants down, Apple's desire to at least pretend that privacy is still a thing met up with reality. On top of that, what AI talent they had was poached by companies willing to take the big risk for the big payoff; if it fails they will just file for bankruptcy and stiff the stockholders. Then there is the matter of the large-scale literature piracy used to train the LLMs. They didn't stop at Project Gutenberg.

Outsourcing the AI does not expose Apple to large-scale financial risk, and when the privacy-invasion suits start, Apple can just point the finger at the AI company. The piracy suits against AI companies have already started. A court has already ruled that collecting statistical data on word usage does not violate copyright, but regurgitating it into works for sale does indeed violate copyright.

Apple just needs to admit they lost this one and stay out of it. Make whatever AI they choose an optional third-party download, with a caveat that any data the user shares with the AI is outside of Apple's control.

AI-generated kiddie porn is now a thing too, with uncertain legal ramifications for the AI supplier. Apple just needs to back away from AI. Period.

Eventually the legal and market situations will settle down, but today is not that day.
 
Maybe it's just me, but without Gmail, Google Maps, YouTube, Chrome, Photos, Drive, etc., my iPhone would feel a lot less capable if I had to lean on Apple's counterparts.

One huge difference with all of those... none of them are baked in at the system level the way an assistant is.

I use and enjoy Google Maps, but I like being able to keep it "at bay" ... I would worry about the ability to do that if Google were literally "powering Siri".
 
Very sad that Apple is not working on their own LLM. Hopefully they are and this is just a bridge deal, because otherwise the future for Apple will be dim.
I’m sorry to have to ask this… but did you bother reading the article?

“While Apple is exploring partnerships with different AI companies to power an improved version of ‌Siri‌, Apple still has not made a decision on whether it will use a third-party AI solution or go with the LLM models it has been developing in-house.

Apple is testing multiple LLMs, including its own, to determine which will provide the best results to customers. There are two versions of the new ‌Siri‌ in development, including one that is powered by Apple's own models and one that runs on third-party models.”
 
The interesting/weird/scary thing about Grok is that it shows that an AI or LLM can be steered in a certain (political) direction.

Some years from now when someone gets a weird answer from an AI, people will ask “Which AI did you use to get that answer?” followed by “Oh yeah, you shouldn’t use that one because of <whatever>”

It sort of undermines the premise.
That is true of all LLMs, given that they are biased toward their training material. There is a reason Gemini was spitting out ridiculous images 12 months ago, and why ChatGPT would tell you it couldn't talk about politics when asked about one political candidate, then list the reasons to vote for another when asked.
 
Why would Apple want to integrate a manipulated AI model that leans heavily into conspiracy theories and neo-Nazism? Apple should indeed stay as far away as possible.
Weird, I have a sub to ChatGPT, Gemini (I got that for free for a year) and Grok due to my X sub... I have yet to run into any of those things, but overall Grok seems to be the most accurate when I ask it a question or ask it to complete a task.

But go off king
 
Part of me is happy that they are 'admitting' they can't catch up and do their own.

The other part is angry that they got to that point.
Not angry, but certainly very disappointed. But perhaps this is just a step on a roadmap: a big upgrade by using other AI services for now, while Apple continues to train its own models for Siri 3.0.

I just hope we get Siri 2.0 and not Siri 1.1
 