When riding my motorcycle, I used to be able to use Siri to send, read, and reply to texts. Now she says “you will need to unlock your phone.” If I need to unlock my phone, then I don't need you.
How do you have Allow Access When Locked set?

PS Elsewhere I read if you enable Messages Notification Previews Always, you can say “Read New Message” when locked.

I just use my Apple Watch now.
 
What the hell? One step forward, two steps back. Apple just isn’t investing in Siri enough. It needs to be more useful, not less.
 
I believe that Apple is taking a different approach than what they originally planned. Those third-party Siri hooks have notably made Siri worse because they limit what you can say to Siri in natural language.

Today, Siri uses machine learning to suggest apps for common tasks, including third-party apps. So rather than having rigid commands wired directly into apps, a Siri that understands natural language will learn which apps you use for specific tasks. “Hey Siri, I need a ride” will bring up Uber because that's what you typically use to get rides.

The more you use Siri, the better it will get at bringing up the relevant app for common tasks, rather than you having to remember specific commands for each app.
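As a rough illustration of that idea (purely hypothetical logic, not Apple's actual implementation or any SiriKit API), a frequency-based suggester boils down to counting which app the user picks for each kind of task:

```python
from collections import Counter, defaultdict

class AppSuggester:
    """Toy model of learning which app a user prefers for a task category."""

    def __init__(self):
        # task category -> counts of apps the user chose for that task
        self.usage = defaultdict(Counter)

    def record(self, task, app):
        """Called each time the user completes a task with a given app."""
        self.usage[task][app] += 1

    def suggest(self, task):
        """Return the most frequently used app for this task, if any."""
        counts = self.usage.get(task)
        if not counts:
            return None  # no history yet, nothing to suggest
        return counts.most_common(1)[0][0]

suggester = AppSuggester()
suggester.record("ride", "Uber")
suggester.record("ride", "Uber")
suggester.record("ride", "Lyft")
print(suggester.suggest("ride"))  # Uber
```

The point of the sketch is that the suggestion quality depends entirely on usage history accumulating, which is exactly the "the more you use it, the better it gets" behavior described above.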
 
This claim has been made for years, but it's somewhat hollow. If I don't have a mechanism for indicating something I want to do frequently, then Siri has no way to get the process started. Even ASSUMING that
- the app is indicating to Siri various interactions
- Siri is set up to notice the pattern

For your example:
- Uber would have to be telling Siri that "app was used to take a ride" so that Siri knows that you use Uber for this purpose. (The system could know you launch the Uber app a lot, but how does it know that Uber is a ride-sharing app, if that's something it's going to LEARN rather than something hardwired?)

- How does Siri learn that "I need a ride" implies "call Uber" rather than "call Mom"? There is, specifically, no way for Siri to learn that there's a connection between calling Mom and going somewhere.

I am not saying something as obvious as "Siri needs total semantic understanding". I am making the much weaker point that to transition to the world you are describing requires a lot more data than Siri has access to, and much of it is negative (in the sense that Siri is highly unlikely to see the links between calling mom, and then being in a car ten minutes later; just as even today Siri does a somewhat terrible job of guessing where I'd like to drive when I bring up CarPlay. This could be better -- for example if I just spoke or texted with someone at work, in PRINCIPLE Siri could know that I'm more likely to have Work as one of my destinations. But that would require Siri tracking connections between recent phone/text interactions and driving destinations; it doesn't happen for free.)
 
Siri (Apple) has access to massive amounts of data indicating what apps do, from complex data parsing of which user search results lead to which app downloads, to data as simple as the category, tags and descriptions that developers submit with their apps.

You might’ve noticed lately that Siri has started asking follow-up questions when it needs more information. Sometimes when I tell Siri to turn on “the washroom” lights, she asks if I meant “the bathroom”. She hasn’t asked again after learning that they’re the same thing. If I ask her to set an alarm without indicating AM/PM, and it’s not clear enough to guess based on the current time, she asks if I meant later today or tomorrow morning. Under the scenario I described, in addition to the data Siri has to make a best guess, there’s nothing wrong with Siri asking whether I’d like to use the Uber app to get a ride. In fact, that’s how humans work. We ask for clarification if we’re not sure. That’s how we improve the accuracy of fulfilling requests from other humans (would you like Diet Coke or regular?) and that’s how we learn (1 cup of flour? How much is in a cup?).
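That ask-once-then-remember behavior can be sketched like this (again, hypothetical logic for illustration only, not how Siri actually stores or resolves names):

```python
class SynonymLearner:
    """Toy model: ask the user for clarification once, then remember the mapping."""

    def __init__(self, known_names):
        self.known = set(known_names)
        self.aliases = {}  # learned synonym -> canonical name

    def resolve(self, name, confirm):
        """Resolve a spoken name; confirm(guess) simulates asking the user."""
        if name in self.known:
            return name
        if name in self.aliases:
            return self.aliases[name]  # already learned, no need to ask again
        # Unknown name: ask the user about each known candidate
        for guess in self.known:
            if confirm(guess):
                self.aliases[name] = guess  # remember the answer for next time
                return guess
        return None

learner = SynonymLearner(["bathroom lights"])
asked = []

def confirm(guess):
    asked.append(guess)  # track how many times the "user" was asked
    return True

learner.resolve("washroom lights", confirm)  # triggers one clarifying question
learner.resolve("washroom lights", confirm)  # answered from memory, no question
print(len(asked))  # 1
```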

Under John Giannandrea, Siri has grown noticeably smarter and more human-like in the way it works with each iOS release, and even in mid-cycle releases. This is happening because Giannandrea shifted the approach that his predecessors had taken: rather than programming Siri what to do and giving developers hooks into Siri, he is enabling Siri to learn to interact with other apps by observation and by asking questions.

I believe that this is the correct approach and it’s already paying dividends. I’m looking forward to the coming releases after the legacy code of old Siri is no longer in the way and Giannandrea’s work is allowed to fully flourish.
 
The level of disregard by Apple toward improving Siri is mind-boggling. I ask a simple question like “What time is the St. Louis Cardinals baseball game today?” and it asks me which Cardinals, then gives me a list of teams, none of which are what I'm asking about. Every other assistant has no problem with this. It's like Apple is deliberately not spending any time on it.
 
Generally, I agree—Apple isn't a primary research kind of company—but I don't totally agree with your last point. I mean, yes, serious researchers do tend to work in larger groups, on longer-term research projects, and they tend to want to publish, so a "closed" corporate environment isn't super attractive. But I'd imagine an Apple salary next to an academic salary could be somewhat tempting... :)

Also, they obviously still have access to all the open research, which their internal R&D folks can leverage, discuss, extend, and so on. And occasionally—veeeery occasionally—they even publish something. Ha! But yeah, you're right about the divide between these two worlds.
Yes, I wasn't speaking so much to whether that was reality as to a good portion of the past criticisms of Apple's AI efforts: not capturing enough data for analysis, not being active in research/publishing, etc.

I actually suspect Siri's two biggest hindrances are:
  1. Apple doesn't operate their own search engine. Many of Siri's data sources come through partners rather than being captured by Apple themselves, so it is hard to fill in the gaps or add additional semantic structure to help it answer queries.
  2. Apple insists on broad language support. Amazon supports 8 languages, Google supports 12, and Apple supports 21, as well as many variant dialects. Third-party Siri integration is generally limited to techniques that not only fit into the Siri architecture but also work across all of those languages, or that let the user specify their own trigger phrase locally.
 
How do you have Allow Access When Locked set?

PS Elsewhere I read if you enable Messages Notification Previews Always, you can say “Read New Message” when locked.

I just use my Apple Watch now.

I will have to try that, but I don't like having message previews always visible. I've always had message notification previews set to show only when unlocked. Either way, something changed and Siri sucks haha
 
This is strange. Why would Apple deliberately want to limit Siri's functionality in this way? Unless they've decided that SiriKit is too limiting and there's a better, deeper integration planned in future updates to Siri. But still, why not just deprecate SiriKit interactions rather than blocking them completely?

On the other hand, perhaps this is just a cleanup of intents that Apple's metrics showed weren't being widely used anyway. Who's ever used Siri to book an Uber?
Maybe because with voice commands Siri has access to your data:

“Siri, read me my last message” (the very first ad on launch day of Siri with the iPhone 4S; dude running on the boardwalk).
- Texts/iMessages are highly personal and can relay personal information.

That's just one thought.
 

Sure, that doesn’t mean that your data (from other apps) can leak out of Siri and into 3rd-party apps, though. The SiriKit APIs seem explicitly designed to prevent that.
 