
MacRumors

In the iOS 26.4 update coming this spring, Apple will introduce a new version of Siri that overhauls how we interact with the personal assistant and what it's able to do.


The iOS 26.4 version of Siri won't work like ChatGPT or Claude, but it will rely on large language models (LLMs) and has been rebuilt from the ground up.

Upgraded Architecture

The next-generation version of Siri will use advanced large language models, similar to those used by ChatGPT, Claude, and Gemini. Apple isn't implementing full chatbot interactions, but any upgrade is both better than what's available now and long overdue.

Right now, Siri uses machine learning, but it doesn't have the more advanced capabilities that LLMs impart. Siri relies on multiple task-specific models to complete a request, going from one step to another: it has to determine the intent of a request, pull out relevant information (a time, an event, a name, etc.), and then use APIs or apps to complete the request. It's not an all-in-one system.

In iOS 26.4, Siri will have an LLM core that everything else is built around. Instead of just translating voice to text and looking for keywords to execute on, Siri will actually understand the specifics of what a user is asking, and work out how to complete it.
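To make the architectural shift concrete, here is a deliberately simplified Python sketch (every name here is hypothetical, not Apple's code): the current design chains narrow, task-specific stages, while an LLM-core design hands the whole request to a single model that plans the steps itself.

```python
# Illustrative sketch only -- these functions are invented, not Apple's APIs.

# Old-style assistant: a chain of narrow, task-specific stages.
def classify_intent(text: str) -> str:
    # A small model (or keyword rules) picks one known intent.
    if "timer" in text:
        return "set_timer"
    return "unknown"

def extract_slots(text: str) -> dict:
    # A separate model pulls out structured values (time, name, etc.).
    return {"minutes": 10} if "10 minutes" in text else {}

def legacy_siri(text: str) -> str:
    intent = classify_intent(text)
    if intent == "unknown":
        # Anything outside the known intents dead-ends here.
        return "Sorry, I can't help with that."
    slots = extract_slots(text)
    return f"Calling API {intent} with {slots}"

# LLM-core design: one model understands the request and plans the steps.
def llm_core_siri(text: str, llm) -> str:
    plan = llm(f"Break this request into app actions: {text}")
    return f"Executing plan: {plan}"

print(legacy_siri("set a timer for 10 minutes"))
# A stub stands in for a real LLM in this sketch.
print(llm_core_siri("find the recipe Eric sent me and text it to Mom",
                    llm=lambda p: "[search Messages, locate recipe, compose text]"))
```

The point of the contrast: in the first design, every new capability needs a new intent and slot model; in the second, the model generalizes to requests nobody explicitly coded for.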

LLM Improvements

Siri today is usually fine for simple tasks like setting a timer or alarm, sending a text message, toggling a smart home device on or off, answering a simple question, or controlling a device function. Beyond that, it falls short: it doesn't understand anything more complicated, can't complete multi-step tasks, can't interpret wording that doesn't match the structure it expects, has no personal context, and doesn't support follow-up questions.

An LLM should solve most of those problems because Siri will have something akin to a brain. LLMs can understand the nuance of a request, figure out what someone actually wants, and take the steps to deliver that information or complete the requested action.

We already know some of what LLM Siri will be able to do because Apple described the Apple Intelligence features it wants to implement when iOS 18 debuted.

Promised Siri Apple Intelligence Features

Apple described three specific ways that Siri will improve: personal context, the ability to see what's on the screen to understand what the user is talking about, and the capability to do more in and between apps.

Siri will understand pronouns, references to content on the screen and in apps, and it will have a short-term memory for follow-up requests.

Personal Context

With personal context, Siri will be able to keep track of emails, messages, files, photos, and more, learning about you so it can help you complete tasks and find what you've been sent.
  • Show me the files Eric sent me last week.
  • Find the email where Eric mentioned ice skating.
  • Find the books that Eric recommended to me.
  • Where's the recipe that Eric sent me?
  • What's my passport number?
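The sample requests above are essentially retrieval over a personal index. As a toy illustration (the data model and function are invented, not Apple's), matching "Find the email where Eric mentioned ice skating" might look like:

```python
# Hypothetical sketch of personal-context retrieval; not Apple's data model.
from dataclasses import dataclass

@dataclass
class Item:
    kind: str      # "email", "file", "message", ...
    sender: str
    text: str

# A toy index of things the assistant has seen.
INDEX = [
    Item("file", "Eric", "Q3 budget spreadsheet"),
    Item("email", "Eric", "You have to try ice skating at the rink downtown"),
    Item("email", "Dana", "Lunch tomorrow?"),
]

def find(sender: str, keyword: str = "") -> list[Item]:
    # Filter the index by who sent the item and what it mentions.
    return [i for i in INDEX
            if i.sender == sender and keyword.lower() in i.text.lower()]

# "Find the email where Eric mentioned ice skating."
results = find("Eric", "ice skating")
```

The hard part an LLM adds is mapping loose natural language ("the recipe Eric sent me") onto a structured query like this one.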
Onscreen Awareness

Onscreen awareness will let Siri see what's on your screen and complete actions involving whatever you're looking at. If someone texts you an address, for example, you can tell Siri to add it to their contact card. Or if you're looking at a photo and want to send it to someone, you can ask Siri to do it for you.

Deeper App Integration

Deeper app integration means that Siri will be able to do more in and across apps, performing actions and completing tasks that are just not possible with the personal assistant right now. We don't have a full picture of what Siri will be capable of, but Apple has provided a few examples of what to expect.
  • Moving files from one app to another.
  • Editing a photo and then sending it to someone.
  • Getting directions home and sharing the ETA with Eric.
  • Sending an email you drafted to Eric.
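What ties these examples together is chaining: the output of one app action feeds the next. A minimal sketch of that idea (function names invented for illustration):

```python
# Hypothetical cross-app action chaining, in the spirit of
# "edit a photo and then send it"; not Apple's implementation.
def edit_photo(name: str) -> str:
    return f"{name} (edited)"

def send_to(item: str, recipient: str) -> str:
    return f"Sent {item} to {recipient}"

def execute(plan):
    # Each step consumes the previous step's output, if there is one.
    result = None
    for func, args in plan:
        result = func(*([result] if result else []), *args)
    return result

plan = [(edit_photo, ["IMG_0042.jpg"]), (send_to, ["Eric"])]
print(execute(plan))
```

An LLM-core assistant would produce a plan like this from the spoken request, then hand each step to the relevant app.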
Bigger Than Promised Update

In an all-hands meeting in August 2025, Apple software engineering chief Craig Federighi explained the Siri debacle to employees. Apple had attempted to merge two separate systems, which didn't work out.

There was one system for handling current commands and another based on large language models, and the hybrid approach was not working due to the confines of the current Siri architecture. The only way forward was to upgrade to the second-generation architecture built around a large language model.

In the August meeting, Federighi said Apple had successfully revamped Siri, and that Apple would be able to introduce a bigger upgrade than it promised in iOS 18.

"The work we've done on this end-to-end revamp of Siri has given us the results we needed," Federighi told employees. "This has put us in a position to not just deliver what we announced, but to deliver a much bigger upgrade than we envisioned."

Adopting Google Gemini

Part of Apple's problem was that it was relying on AI models that it built in-house, and that were not able to match the capabilities of competitors. Apple started considering using a third-party model for Siri and other future AI features shortly after delaying Siri, and in January, Apple announced a multi-year partnership with Google.

For the foreseeable future, Apple's AI features, including the more personalized version of Siri, will use a custom model Apple built in collaboration with Google's Gemini team. Apple plans to continue work on its own in-house models, but for now, it will rely on Gemini for many public-facing features.

Siri in iOS 26.4 will be more similar to Google Gemini than Siri today, though without full chatbot capabilities. Apple plans to continue to run some features on-device and use Private Cloud Compute to maintain privacy. Apple will keep personal data on-device, anonymize requests, and continue to allow AI features to be disabled.
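The privacy arrangement described above amounts to a routing decision per request. Here is a heavily simplified sketch of that idea (the names and rules are invented for illustration, not Apple's logic):

```python
# Hypothetical sketch of on-device vs. Private Cloud Compute routing;
# not Apple's implementation.
ON_DEVICE_CAPABLE = {"set_timer", "toggle_light"}
PERSONAL_KEYS = {"user_id", "contacts"}

def route_request(intent: str, payload: dict, ai_enabled: bool = True):
    if not ai_enabled:
        return ("disabled", None)       # AI features can be turned off entirely
    if intent in ON_DEVICE_CAPABLE:
        return ("on_device", payload)   # personal data never leaves the device
    # Strip identifying fields before anything goes to the cloud.
    anonymized = {k: v for k, v in payload.items() if k not in PERSONAL_KEYS}
    return ("private_cloud", anonymized)
```

The design choice this illustrates: simple requests stay local, and anything that must leave the device is anonymized first.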

What's Not Coming in iOS 26.4

Siri is not going to work as a chatbot, so the updated version will not feature long-term memory or back-and-forth conversations. Apple also plans to keep the same voice-based interface with limited typing functionality.

Apple's Embarrassing Siri Delay

In what became an infamous move, Apple went all-in showing off a smarter, Apple Intelligence-powered version of Siri when it introduced iOS 18 at the 2024 Worldwide Developers Conference. Apple said these features would come in an update to iOS 18, but right around when launch was expected, Apple admitted that Siri wasn't ready and wou...

Article Link: Why Apple's iOS 26.4 Siri Upgrade Will Be Bigger Than Originally Promised
 
Today, I can ask Siri: What is the volume level right now and she'll be like, sorry I can't help you with that, but I can say set the volume to 50% and she does it.

Another weird one is I'll be wearing my AirPods and I'll say, turn on Spatial Audio Fixed and she doesn't know how to do that either. So dumb.
 
the only reason I have Siri on is (wireless) CarPlay.
I have Apple Intelligence OFF on my 17PM, had it on initially but turned it off after ~2 months as I neither used it nor did it have benefits for me.
I will install 26.4 shortly after release and will/might do some testing with Siri.
 
I use Siri every morning to set a 00:11:15 timer (eleven minutes and 15 seconds) for my coffee brewing. At the end of the time, the watch chimes. Time then to go to the kitchen and pour the freshly made coffee first for my wife and then me.

Siri can usually spell words for me on the iPhone and sometimes on my Ultra 2 watch.

I will keep my expectations lower than the bottom of a snake's belly and probably will not be disappointed. Hopefully there will be fine-tuning on/off switches so I can fine-tune what I want to work and/or not use.

Supposedly, every piece of Apple gear I currently have, with the exception of the iPhone 13 mini and original HomePod speakers, will run Apple Siri/AI. Every device was ordered with maximum memory and largest SSD at the time of ordering. So now the fans are being cleaned and plugged into the walls to see what happens when 26.4 arrives. 😎 😳 😱
 
From what I can tell, the LLMs people use today are economically unfeasible and won't be around for long. They require huge data centers, huge amounts of water, tremendous amounts of electricity and cost trillions of dollars. Right now they're running these things on hopium.
 
From the description it looks like it will be something similar to Windows Recall + Copilot. Everyone hates Recall, and not many like Copilot. I'm wondering if this is gonna be different when Apple does conceptually the same thing, but it's just not gonna be Microsoft 🙂
 
Can it still set timers though?

In all seriousness, I’ll believe it when I see it. That said, I have a 14 Pro and zero plans to upgrade, so I guess none of this matters anyways.
“Hey Siri, set a timer for 10 minutes” to which she replies, “I found this on the web”. I personally just hope that when I tell her to navigate to Paddy Caughlin’s in Fort Atkinson, Wisconsin, she doesn’t decide I’m wanting to go to a restaurant in San Francisco.
 
Actually I just pray for Siri in the home to get a lot more capable with the new home hub. Better natural language understanding for one plus just the ability to answer questions would be great.
 
the only reason I have Siri on is (wireless) CarPlay.
I have Apple Intelligence OFF on my 17PM, had it on initially but turned it off after ~2 months as I neither used it nor did it have benefits for me.
I will install 26.4 shortly after release and will/might do some testing with Siri.
I agree “Apple Intelligence” is awful. Siri is just limited but at least functions without messing up too much.
 