
macrumors bot
Original poster
Apple is developing its own large language model (LLM) that runs on-device to prioritize speed and privacy, Bloomberg's Mark Gurman reports.

hey-siri-banner-apple.jpg

Writing in his "Power On" newsletter, Gurman said that Apple's LLM underpins upcoming generative AI features. "All indications" apparently suggest that it will run entirely on-device, rather than via the cloud like most existing AI services.

Since they will run on-device, Apple's AI tools may be less capable in some instances than those of its cloud-based rivals, but Gurman suggested that the company could "fill in the gaps" by licensing technology from Google and other AI service providers. Last month, Gurman reported that Apple was in discussions with Google to integrate its Gemini AI engine into the iPhone as part of iOS 18. The main advantages of on-device processing will be quicker response times and superior privacy compared to cloud-based solutions.

Apple's marketing strategy for its AI technology will apparently be based around how it can be useful to users' daily lives, rather than its power. Apple's broader AI strategy is expected to be revealed alongside previews of its major software updates at WWDC in June.

Article Link: Gurman: Apple Working on On-Device LLM for Generative AI Features
 
I hope they know where they are going with this, because so far it seems unclear and muddled. If this is to be touted as the next big thing (which seems to be what Apple is doing), it needs to be seriously good, whether it is on device or in the cloud. If this is another Siri-style disaster, it will probably affect AAPL in the coming years.
 
There are huge benefits if this is done right. But I hope they allow the option for these aspects of the OS to be disabled… just as Siri can be disabled now…

AI should be a choice… especially at the OS level

I do admire apple doing as much as possible on device… this really can enhance privacy.
 
I just read the newsletter and this guy is really something. Cheaper iPhones? Oh yeah, keep 'em coming! I don't believe a word after what they did with the iPhone SE.
 
It'll be interesting to see their approach to this rapidly-growing segment, especially when they were late to the game.
 
"On-device" AI just means that Apple will argue for the (un)necessary hardware as an incentive for users to upgrade their iPhones. The "More Neural Engines! 25% faster execution than the previous generation!" spiel. It is a way of rapidly obsolescing current devices on the grounds that on-device AI is a speed and privacy feature.
 
Sooner or later, Apple, MS, Google, or some other big player will personalize LLMs by leveraging your personal data (iCloud/Notes/GitHub repos) to fine-tune an existing LLM into your best personalized AI.

Apple will need to persuade users that its way is the best/most secure way, or MS and OpenAI will eat its lunch.

The question will be how much processor power and memory are needed (and whether 8 GB of memory is too little) to make this run. I would think this would push Apple to raise minimum memory offerings on base models. If only Apple had found their Satya...
 
If Apple truly cared about privacy, their cloud AI would also be in-house.

This just shows that Apple is behind in cloud-based AI.
…and? the sky is blue man. 😂 we all know Apple’s privacy values are virtue-signaling to a degree; we all know Apple hasn’t invested big-time in cloud-based AI. I don’t envision a lot of privacy-minded users taking advantage of features that are made possible by licensing Gemini (or something from OpenAI).

as of right now, Apple sees an immediate need to capitalize on the AI boom, and they’re taking stock of their own gaps/filling them in because—surprise!—they’re trying to please shareholders (and users that want the latest and greatest) at the end of the day.

we can only hope that they’re working on their own cloud-based AI, reworking Siri from the ground up. given rumors in recent months about Apple being in talks with publishers about licensing material for an LLM, I don’t think it’s too far off. they’re kinda in a do-or-die moment with that, given Android is open enough that whichever AI toolset is the most useful to the everyday user can simply gain market dominance through that user base.
 
It'll be interesting to see if Apple sides with Google to use Gemini, considering its iCloud servers are based on Google's. I wonder how deep the partnership really goes...

This is an example of misinformation.

Apple operates its own data centers and integrates Google Cloud and Amazon AWS for various purposes. These external cloud services may be utilized to enhance Apple’s infrastructure during peak usage periods or for specific functionalities.
 
Things have changed guys! Some of us are running the free and open source Llama 3 with 70 billion parameters on our M3 Max MacBook Pros with 64 GB of RAM. Llama 3 with 70 billion parameters is close to, or more powerful than, OpenAI's GPT-4! Llama 3 with 8 billion parameters screams with speed and runs fine on lower-memory devices like the iPhone!
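For anyone wondering how a 70B model fits on a 64 GB Mac at all: the back-of-the-envelope math works out once you quantize the weights. A rough sketch (illustrative only — real runtimes like llama.cpp or MLX also need memory for the KV cache and activations, so these are lower bounds, and the 4-bit figure is an assumed common quantization level):

```python
# Back-of-the-envelope memory estimate for holding quantized LLM weights.
# This only counts the weights themselves, not KV cache or activations.

def weight_memory_gb(num_params: float, bits_per_weight: float) -> float:
    """Approximate GB needed just to store the model weights."""
    return num_params * bits_per_weight / 8 / 1e9

# Llama 3 70B at 4-bit quantization: ~35 GB -> plausible on a 64 GB M3 Max.
print(round(weight_memory_gb(70e9, 4), 1))   # 35.0

# Llama 3 8B at 4-bit: ~4 GB -> plausible on a high-memory phone.
print(round(weight_memory_gb(8e9, 4), 1))    # 4.0
```

At full 16-bit precision the same 70B model would need ~140 GB for weights alone, which is why quantization is what makes local inference on a laptop feasible.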

 
If Apple truly cared about privacy, their cloud AI would also be in-house.

This just shows that Apple is behind in cloud-based AI.
They said they would license code/tech. Of course it's in-house; they just need to pay some royalties, possibly, until they figure out a way that's their own.
 
Nobody wants to miss the boat on the possibility of The Next New Thing, so everyone is scrambling over so-called Artificial Intelligence. It could very well end up being a flop for everyone (like Siri), but for now, since it's all new, the rush is on… and unfortunately we're going to have to hear about it ad nauseam for the next few years. Sigh

The stakes are literally life-threatening for Apple. If Google ends up creating a superb AI, goodbye iPhone. The world will migrate to Android.
Betcha it's all hands on deck at a-inc, since they're behind the 8-ball on this.
 
I assume/hope Apple has a way to make sure that any data calls I make to Gemini by way of using new versions of Apple's own LLM/AI/Siri are not recorded by Google specific to me. In other words, I won't want to use Apple's AI if it means there is any way for Google to track what I am doing. That's half the reason I stay away from the Google ecosystem. I am not hiding anything, other than my right to nondisclosure without consent, which Google tramples all over. So far, I continue to believe Apple doesn't do this: they don't need to as part of their business model (they make little or no money by selling user data, compared to Google, which makes pretty much all its money this way). The more Apple integrates with Google, the more I fear that Apple will not be able to control the data of its users.
 
Apple should make sure they make it available for older systems, because so far I’m not planning on switching back to their biggest joke called Siri. Neither would I spend $1500 on a new phone for this just to test drive another useless assistant.

ChatGPT does an impressive job.
 
Without it being able to access the internet for information, I'm not sure how useful it would be on an iOS device. I mean, better than nothing I suppose, but don't most people want up-to-date information when using their mobile devices?
The way GPTs work is that they process a ton of data every so often to create their LLM (large language model). This costs millions of dollars every time they decide to update the model, so they don't do it very often (this is why Nvidia is raking in the dough). When you query an LLM, it gives you a result based on the data it was trained on, and, like you guessed, that data gets stale after a while. To compensate, some of them will also query the internet if you need fresher data.

Let's take ChatGPT, for example. Its current model's training data dates back to Feb 2023. If you request something newer than that, it will use Bing (Microsoft) and then process that data.

The answers you get when it uses the model's trained knowledge are vastly superior to the real-time ones.
 