> Whatever it takes, I guess? If they’re marketing an AI that can do a lot, then seemingly simple requests like that should be possible. Otherwise, it sucks. I’m being harsh, but really: it can either do those “seemingly” simple things or people lose confidence in it.

Without LLMs, every single use case has to be separately integrated by an Apple developer. Currently it’s not much more than sentence pattern matching.
Even with LLMs, it won’t be easy to make all sensible use cases work. I would expect many things to still not work when it launches in 2026, and to be added bit by bit over time.
> My iPhone 13 is more than 3 years old, but I noticed quite a reduction in battery life since I installed iOS 18.

Apple has to start working on making the iPhone 16 Pro Max last longer than 2 hours.
> I'm quite sure the new Siri will be quite good, but the 2026 release date clearly demonstrates what happened.

A bit behind the ball even for Apple, but if Apple Intelligence can't make Siri smarter, what will?
> I grow ever more frustrated seeing Apple expand into questionable new ventures (such as TVs and smart home hubs).

At this point, I'll be happy with any improvements. I mean, Apple needs to get a lot more serious about this, since Siri often can't even find music in my Apple Music library.
> Not sure that's currently possible, since Microsoft now has a pretty big stake in OpenAI.

Maybe Apple should just buy OpenAI?
> To be fair, if you enable the ChatGPT integration in the current 18.2 beta, “I found this on the web” is basically gone.

So Siri understands context but without greater knowledge? I think that tells us “I found this on the web” will still be around for a while.
> Yes and no.

Even hardcore fans have to admit that the 14-year pioneering role with Siri was a bit of a waste!
> Hope for better Siri keeps me alive

Probably depends when you die.
> The battery life, after upgrading to iOS 18.1, is fracking terrible; it's a joke on the 13(16) Pro Max. It isn't the phone, it's an iOS issue.

My iPhone 13 is more than 3 years old, but I noticed quite a reduction in battery life since I installed iOS 18.
Not long ago, I could leave my phone on the night stand with 30% of battery and find it with 20% the next morning.
These days I find the phone almost dead each morning, with the same average battery starting point.
And I have been very gentle with my charging, charging 45% of the time with a slow 7.5w Qi charger, 45% of the time with a super slow 5w cable and only the remaining 10% of the time with a 20w cable.
Going out for any reasonable amount of time, with no powerbank, has become impossible.
> Even more: long-term, this idea of on-device processing is pretty dumb. Is the network latency really so high that we need local LLMs?

iPhones need 16–24 GB of RAM first; then they can worry about AI. Plus, what happens when Siri gets pissed off because she doesn’t have enough RAM?
> What are competitors doing that Apple isn’t?

Holy S. Someone got really caught flat-footed here! So the rumors about being 2 years behind seem to be true. Incredible.
> Generative AI (like an LLM) is just a text/image/sound generation and transformation machine. Connecting it to actual functionality (such as controlling all kinds of iOS and app functions) in a robust way is not straightforward at all. It doesn’t have any concept of agency, or of time passing. The same goes for maintaining state/memory, because it doesn’t remember anything by itself. Everything it might need to know from prior context has to be re-fed to it internally for every single interaction. This is why most present-day AI functionality is oriented around generating or transforming some text or media, or around search.

Whatever it takes, I guess? If they’re marketing an AI that can do a lot, then seemingly simple requests like that should be possible. Otherwise, it sucks. I’m being harsh, but really: it can either do those “seemingly” simple things or people lose confidence in it.
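To make the statelessness point above concrete, here is a minimal sketch of how a chat app (not the model) has to carry the “memory”. The `generate` function is a hypothetical stand-in for an LLM call, not any real API: the only context it ever has is the message list handed to it on that one call, so the full history must be re-sent every turn.

```python
# Sketch: the model is stateless; the app re-feeds the whole history each call.

def generate(messages: list[dict]) -> str:
    # Toy stand-in for an LLM: it can only "know" what is in `messages` right now.
    return f"(reply based on {len(messages)} prior messages)"

history: list[dict] = []  # the app, not the model, holds the state

def ask(user_text: str) -> str:
    history.append({"role": "user", "content": user_text})
    # The ENTIRE conversation so far is re-sent on every single call.
    reply = generate(history)
    history.append({"role": "assistant", "content": reply})
    return reply

print(ask("Set a timer for 10 minutes"))  # model receives 1 message of context
print(ask("Make it 15 instead"))          # model receives 3 messages of context
```

Drop the re-feeding step and the second request (“Make it 15 instead”) becomes meaningless to the model, which is exactly the engineering burden the quoted post describes.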
> On-device processing will become even more desirable in the long term, especially with increases in the amount of data crunched and the number of times we use AI in a day. There are several reasons, among them being:

Even more: long-term, this idea of on-device processing is pretty dumb. Is the network latency really so high that we need local LLMs?