If you turn on the setting that shows what Siri hears on your phone screen, you can see what it hears vs what it does. I find that the transcription is usually exactly right. Where it goes wrong is in how Siri interprets the meaning of that.

Doing voice-to-text, I watch in spellbound awe as the words I speak are printed on the screen, then shake my head in bafflement as these correct words are themselves then "corrected." Your point that it can hear is correct. What it does in the next step is just so wrong in so many ways.

UPDATE - had to go back and fix the voice-entered text. Manually.
 
I’m guessing there’s a 99.9% chance the original HomePod won’t get any of these new Siri features, right? They’re going to force us to buy new hardware in order to get this software, when this is the only place we ever use Siri at all (partially because it’s garbage and partially because it’s most useful in the home).
 
Too late ... to make the iPhone the face of AI. For the mass market:

- the iPhone will see upgrade cycles of 3-4 years
- the Siri/assistant interface will open up under consumer/market/regulatory pressure, opening the door to other assistants (the way we can pick our search engine)
- Apple will throw a privacy tantrum until it stops selling (they already know the vast majority of people don't care)

And I'll be the first to ditch the dumbest assistant in the world and pay someone $10/month to run my life for me on my phone (while Apple keeps syncing my devices), and this won't be an extra cost to me because I won't be getting a new phone every 2 years.

I can't believe this, but I'm noticing how Siri alone has started to make me dislike my iPhone (and I've had nothing but iPhones since the first one). The more I use Claude/ChatGPT, the higher my expectations of a mobile assistant get -- the difference between the two when switching back and forth is becoming agonizing.

Sorry Apple -- more pixels, nits, storage ... and the privacy marketing nonsense won't cut it anymore.
 
Insiders from Apple's Siri team have expressed frustration over management's conservative decisions, which they feel are holding back innovation. After the release of ChatGPT, Apple quickly called an emergency meeting and decided to put their car project on hold to focus on improving Siri. This shift suggests that Apple is seriously stepping up its game, not merely following trends.
The biggest thing holding back Apple's services is their overzealousness on privacy. Yes, privacy is important, but it has to be balanced with making services genuinely useful.
 
I'm looking forward to Apple reintroducing "Siri with AI" as the killer feature on upcoming iPhones and calling it a beta again. Last time around, they called it a beta project while simultaneously advertising it as a reason to upgrade phones (the iPhone 4 never got Siri, but could run it with no problem if jailbroken).
 
I fail to see how privacy even comes into play here. Why would anything other than an anonymous request ever need to leave the device?

AI gets a lot of info from your device to work effectively. Best case, the personal info involved is very limited. One thing to remember when it comes to Apple: their definition of private means they don't sell your info, not that it isn't communicated to Apple. This interaction between your devices and Apple may make it difficult to run a truly interactive AI on device.
 
I think newer devices will run Siri's AI locally, but older devices may need to run it server-side. Perhaps on those Apple Silicon servers that we have been hearing about recently?
 
I just don’t have any faith that any amount of “new AI” can fix Siri. Most of the issues I’ve run into have as much to do with how Siri interfaces with built-in apps as with how well Siri understands the request. That means fixing the built-in apps and functions as well.

Simple things like requests to play music fail miserably on a regular basis. Straightforward requests to do something like connect to a VPN apparently could not be done unless I told Siri how via a Shortcut. That’s just a couple of examples of things that are built in, as in placed there by Apple, that Siri can’t handle. Unless the built-in functionality is handled, Siri is of little to no use to me.

I’m not looking to “have a conversation” with AI, or have it “summarize” something for me (it would not be in my nature to trust such output anyway), or anything like that. I want it to actually perform tangible tasks within the device for me.
 
Siri has become such a joke. If Apple truly does introduce a revolutionary new AI-based virtual assistant, they should probably give it a new name.
Introducing Apple Bob, his wife Jane, and dog Rover. Can't innovate my...
 
Since the revamped Siri is coming with iOS *18*, this is a good opportunity for Apple to frame it as coming of age / growing up.

Siri will finally be an adult. No wonder it's been childish and stupid forever.
 
That's what happens when you have bad parents.
 
That kind of thing is something an LLM can do well. A lot of the problems with Siri come from it not understanding the context and intent of a request. If it can use an LLM to interpret the intent, then it can do a better job of matching that intent to various app functions. Apple has a framework called App Intents that apps can use to publish their functions and how to use them. A smarter Siri could check App Intents and find apps that correspond to your request.
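To make that concrete, here’s a rough sketch of what the matching step could look like. This is purely illustrative: the PublishedIntent type, the catalog, and the prompt format are all made up, and a real system would hand the prompt to an on-device or server model rather than just printing it.

```swift
// Hypothetical sketch of LLM-based intent matching. Nothing here is a
// real Apple API; it only illustrates letting a language model choose
// from a catalog of machine-readable app functions.

struct PublishedIntent {
    let app: String
    let name: String
    let description: String
}

// A toy catalog of functions apps might expose.
let catalog = [
    PublishedIntent(app: "Music", name: "PlaySong",
                    description: "Play a song by title or lyric"),
    PublishedIntent(app: "Settings", name: "ConnectVPN",
                    description: "Connect to the configured VPN"),
]

// Build a prompt asking the model to pick the best-matching action.
// A real implementation would send this to an LLM and parse the reply.
func intentSelectionPrompt(for request: String) -> String {
    let options = catalog
        .map { "- \($0.app).\($0.name): \($0.description)" }
        .joined(separator: "\n")
    return """
    User request: "\(request)"
    Available actions:
    \(options)
    Reply with the one action that best fulfills the request.
    """
}

print(intentSelectionPrompt(for: "put on Don’t Worry About a Thing"))
```

The model’s only job here is mapping fuzzy language onto one of the published actions; actually executing it would still be up to the app.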

Music requests can be messier than we expect, and Siri is too rigid to handle them. An LLM would be able to determine that when someone asks for “Don’t Worry About a Thing” by Bob Marley, they really want the song titled “Three Little Birds,” and play that.

If the VPN app has an App Intent published or a shortcuts interface, an LLM Siri could make use of that.
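And for what the app side of that could look like, here’s a minimal sketch using Apple’s App Intents framework. To be clear, ConnectVPNIntent, VPNManager, and the trigger phrase are hypothetical stand-ins, not an existing Apple or third-party API.

```swift
import AppIntents

// Hypothetical stand-in for the app's own connection logic.
struct VPNManager {
    static let shared = VPNManager()
    func connect() async throws { /* app-specific connection code */ }
}

// An action the app publishes so Siri and Shortcuts can discover
// and invoke it without the user building anything themselves.
struct ConnectVPNIntent: AppIntent {
    static var title: LocalizedStringResource = "Connect to VPN"

    func perform() async throws -> some IntentResult & ProvidesDialog {
        try await VPNManager.shared.connect()
        return .result(dialog: "Connecting to your VPN…")
    }
}

// Phrases Siri can match; \(.applicationName) is the framework's
// placeholder for the app's display name.
struct VPNShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: ConnectVPNIntent(),
            phrases: ["Connect to VPN with \(.applicationName)"],
            shortTitle: "Connect VPN",
            systemImageName: "lock.shield"
        )
    }
}
```

Today Siri matches those phrases more or less literally; the hope is that an LLM front end could map a free-form request like “get me on the VPN” onto the same published intent.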

If you wanted a photo generated or something summarized from the web, Siri might call out to Midjourney or ChatGPT running on a server.

Obviously I don’t know that Apple will build their AI this way, but it does make use of the on-device app ecosystem they have built, and it fits with their intent to run on-device where possible and to build a useful assistant. I’ll be watching the WWDC announcements closely to see what they are ready to reveal this year.
 


Apple's shift to developing its own AI technology to keep up with competitors was detailed today in a report from The New York Times.


Citing sources familiar with Apple's work, the report explains that the decision to revamp Siri was made early last year by Apple's most senior executives. Senior vice president of software engineering Craig Federighi and senior vice president of Machine Learning and AI Strategy John Giannandrea are said to have spent several weeks testing OpenAI's ChatGPT to understand the ways in which the competitor made Siri look antiquated. The Siri team purportedly received less attention and fewer resources than other groups inside Apple, and the company has struggled to recruit and retain leading AI researchers.

Apple executives are said to be concerned that AI threatens the iPhone's market share because the technology has the potential to become a more compelling operating system, with an ecosystem of AI apps that undermines the App Store. Apple apparently fears the iPhone becoming a "dumb brick" compared with other technology.

This conclusion triggered a significant reorganization at Apple amid determination to catch up in the race to develop AI tools. The company moved to reallocate employees and resources toward AI, and the change of strategy was a contributing factor in the decision to cancel its electric vehicle project. Apple's upcoming iPhone 16 models will supposedly feature more memory to support AI features.
Apple is expected to reveal a series of AI tools at its WWDC keynote on June 10, including an improved version of Siri that is more conversational and capable, with the ability to "chat" rather than merely respond to individual queries. The company is working on making Siri better at handling tasks such as setting timers, creating calendar appointments, adding items to Reminders, and summarizing text. Apple plans to market the new version of Siri as a more private alternative to rival AI services because most requests will be processed on-device rather than remotely in data centers. See the full New York Times article for more information.

Article Link: Report: Revamped Siri to Be at the Core of Apple's New AI Strategy
When did Siri never look antiquated?
 
I agree an LLM can improve understanding of the user’s intent. My point is that that is only half the issue: the native apps themselves need to be fixed to serve Siri, and no amount of LLM will improve the success rate of requests to on-device apps unless you fix them.

For the Music app as an example: when I ask Siri to play a song by title and use the EXACT song title, and Siri says it can’t find it even though it is loaded on the device, I don’t see that as an interpretation-of-intent issue. Most of us agree Siri is subpar, but surely a request like “Play song titled [whatever, by exact title]” does not require an LLM to make sense. I don’t think Siri itself has a problem understanding that request. Whatever happens at the interface between Siri and the app and beyond is the problem, and it will continue to be so if you only address the Siri side (even with an LLM). If App Intents is supposed to be that interface, it doesn’t look like Apple itself is making full use of it.

And for the record, the VPN I was trying to connect to was the one built into iOS, not a separate VPN app. That’s why I focus on the native apps. Apple put them there. Apple built Siri and put it there, but didn’t ensure its own native functions are handled.

An amusing thing I found when trying to resolve the VPN issue (someone in the MR forums helped me with the Shortcuts, btw): I played around with the iPad as well, though I wanted the problem solved on the iPhone. I found that Siri understood some WiFi-related requests just fine, just not VPN. On my WiFi-only iPad, I could ask Siri to disconnect from WiFi and it would happily comply. Effectively asking Siri to commit suicide was fine, but asking it to connect to a VPN was not. SMH 😆
 
You didn’t quite understand what an app-published intent list is.

Also, the last paper Apple published contains very cool research on using computer vision to understand an OS’s UI, and on how good a model can be at understanding a user’s real intent (not an app-published intent) by doing so.
 
Tim Cook is a genius. When he took the reins from Jobs, Apple's market cap was $800 billion. Today, it's just south of $3 TRILLION. Tim is also now a billionaire, and heads one of the largest companies in the world. Is he a perfect CEO? No, of course not. But you can't name any CEO or human being who is. I'd bet you can't even come up with someone who would demonstrably do a better job of running a Fortune 5 company, either.
Right, because it would've been so difficult for anyone to take Steve Jobs' innovation and ride the wave for 15 years. :rolleyes:

The problem is Tim is a bean counter at a tech company. Even Steve Ballmer would've been a better fit... At least Ballmer was passionate about his products while CEO of Microsoft. Tim is clueless about tech. A CEO position at a tech company shouldn't belong to a former controller, revenue assurance officer, or accountant/CFO.

Who are you hiring as a hospital CEO? A doctor with extensive experience in medicine, or a bean counter? I'll hire the doctor who can make informed decisions. Leave those overseeing the budget and finances to general managers and CFOs.
 
And things like auto-correct -- I really don't understand... I get things suggested to me that are gibberish ... like not even actual words. Whatever they are doing there is so bad.
I’m glad you mentioned it, because autocorrect has been driving me crazy lately:

- Randomly Capitalizing words for No apparent reason
- Sometimes it jyst stops correcting or suggesting corectuons in certian apps
- Weird word options that aren’t wart I want
- Autocorrecting phrases that aren’t what I want

It’s especially bad with swipe typing and that’s frustrating because I like typing that way
 
Yea, that’s pretty much what I’ve been expecting from Apple.

People should remember that AI is not just one thing. It is a range of things, from machine-learning features in apps, to LLM chatbots, to LLM-powered assistants. Some of those will be from Apple and some from third parties.
 
What "perch"?

The majority of the world's smartphones are not iPhones. (Though in the US Apple is the dominant brand.)

In traditional small computer sales, Apple is fourth in overall units, behind Lenovo, Dell and HP.

This idea that Apple is some sort of monopoly (the way IBM was for a couple of decades, in enterprise computers) is just wrong.

Apple *is* more profitable than those other computer makers, in no small part by positioning itself as the aspirational brand.
The perch of being one of only two companies with their own platform on consumers’ most used piece of tech.
 
I use Siri a couple dozen times a day, every day of the week. It's *very rare* that there's an issue.

I notice that those complaining about Siri day in/day out are also the ones complaining about Tim Cook day in/day out. What a coincidence!
Lucky you!
 