
MacRumors

macrumors bot
Original poster
Apr 12, 2001


Apple has been quietly making a series of artificial intelligence related acquisitions and staff hires in a bid to bring on-device AI to its next-generation iPhones, claims a new report by the Financial Times.


According to the report, indications suggest Apple has been focusing on "tackling the technological problem of running AI through mobile devices." To that end, it has acquired several AI-related startups, most recently early last year, when it purchased California-based WaveOne, which offers AI-powered video compression.

According to a recent research note from Morgan Stanley, almost half of Apple's AI job postings have included the term "deep learning," which relates to the algorithms used to power generative AI.

Previous reports have suggested Apple has been testing its "Ajax" large language model (LLM) since early 2023, but in contrast to LLMs like OpenAI's ChatGPT, Apple's primary goal is to develop generative AI that works locally on-device, rather than being powered by cloud services in data centers.

The challenge in achieving that aim involves optimizing the LLM and reducing its size, as well as relying more heavily on high-performance mobile hardware and faster Apple silicon chips. For example, Apple is said to be planning a significant upgrade to the iPhone 16 microphone to improve a new AI-enhanced Siri experience.
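To see why shrinking the model matters so much for on-device use, here is a back-of-the-envelope sketch in Python. The 7-billion-parameter figure is a hypothetical example chosen because it is a common size for open on-device-class LLMs; Apple has not disclosed any model size for "Ajax" or an on-device Siri model.

```python
# Rough weight-storage estimate for an on-device LLM.
# Hypothetical numbers for illustration only; Apple has not disclosed
# the size of any on-device model.

def model_size_gb(num_params: float, bytes_per_param: float) -> float:
    """Approximate weight storage in gigabytes (1 GB = 1e9 bytes)."""
    return num_params * bytes_per_param / 1e9

params = 7e9  # a 7-billion-parameter model, similar in scale to Llama-class LLMs

print(f"fp16 weights: {model_size_gb(params, 2):.1f} GB")    # 14.0 GB
print(f"int4 weights: {model_size_gb(params, 0.5):.1f} GB")  # 3.5 GB
```

Even aggressive 4-bit quantization leaves several gigabytes of weights, which is why memory-focused research like the flash storage technique described below matters on phones, where RAM is far scarcer than in a data center.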

Just last month, Apple AI researchers said they had made a key breakthrough in deploying large language models (LLMs) on iPhones and other Apple devices with limited memory by devising a novel flash memory utilization technique.

Apple is said to be on schedule to announce a series of generative AI-based tools at its Worldwide Developers Conference (WWDC) in June, when iOS 18 will be previewed. Morgan Stanley analysts expect the mobile software will be geared towards enabling generative AI and could include its voice assistant Siri being powered by an LLM.

Article Link: Apple's Big AI Push Implied By Startup Acquisitions and Job Postings
 
Well, on tvOS, because there is no web content reader, Siri at least can't answer a lot of content-related questions via its command syntax, so you're left without much digital assistance for all those apps on the Apple TV. iOS/iPadOS/macOS could go a lot further because of info gleaned from the web when you need specific details.
 
I'd be happy if Siri could finally pronounce "Calle" when the phone is set to a language other than Spanish. Is this too much to ask?

I also love the feature from Samsung where you can write in your native language in a messenger app and it automatically shows a translated version (with your native copy smaller underneath). This would really help me learn another language. I attempted it once but got tired of copy-pasting back and forth between the translate app and Messages.
 
Siri has to be one of the biggest missed opportunities in Apple's history.
Had FMA and blew it because the "Operations" guy at the helm was content with switching up the layout of cameras year over year.
 
This is desperately needed; Siri is just so bad. Yesterday I asked it what the average temperatures of Barcelona and Rome are in March, and I got web links. I asked Siri to perform some basic currency conversions; on two occasions it gave me an answer, while on the other two I was given web links.
It is staggering how long Siri has been left to languish and fall behind; it's definitely the worst aspect of iOS.
 
Well, this bodes well for battery life of iPhone 16.
Can't come up with any reason why running LLMs on a mobile device would drain power. Apple magic!
 
Why couldn't Apple give users the option of using cloud vs on-device? For example, it could be a simple privacy setting which makes it clear that you can choose stronger data protections but at the expense of some performance. It's clear that there are some benefits to running these LLMs on the physical device, but the power of cloud computing cannot be denied.
 
The FT is referring to “edge AI,” which the overwhelming majority of IT journalists, analysts, and various studies indicate will be the future of AI, not “cloud AI.” Who better than Apple to capitalize on this? Looks like the new wave of the Apple “iPhone AI 1” is on its way.
 
We’ll see what Apple comes up with. On-device LLMs so far are underwhelming. Ones like Llama/Alpaca/Vicuña aren’t bad but are limited by model size and speed.

Apple engineers have been doing some great work with AI imaging related tools (some of these are on GitHub) and making good progress with other ML tools.

As for Siri, I’m not sure what you are all doing with Siri but it works well for me. 🤷
 
As for Siri, I’m not sure what you are all doing with Siri but it works well for me. 🤷

Siri understands what I'm saying (I have "Always Show Speech" enabled, so I can see what it thinks I said), but it's what it does with the info that's a problem. In the last few days:

1) I said "remind me that Ronnie will be away when I get to work". It added a reminder that simply said "Ronnie" and didn't set my workplace as a location.

2) I asked Siri to play a song by a band that I frequently listen to. It gave me a cover version of said song by some obscure band. Why isn't the OS smart enough to see that I listen to the band regularly and give me the version of the song that I most likely want to hear?

3) I asked for directions to a business that I was near (a road was closed and I had to re-route - not that familiar with the area so I asked for help), and instead Siri gave me directions to a town that shared the same name as the business. The business shows up in Maps, so why is that not the first choice instead of a town that's seven hours away?

4) On Apple TV, I asked it to "find the movie Maestro on Netflix" and it took me to the preplay screen for the movie "My Octopus Teacher". Not even close!

It's pretty good at setting a timer though.
 
Let's just hope Siri AI will be available with most of its features on previous iPhones, and not only the iPhone 16 and later.
 
In fairness to Siri, I have the same issues with both Google Assistant and Alexa. I'm not sure what Google has done, but Assistant has really gone downhill over the past few years.
 