I expect many people refer to Siri as "she" because Siri defaults (or at least used to default) to a female voice, and there's a natural tendency to refer to other participants in a conversation with slightly more personalized pronouns, lest someone mistakenly assume you're referring to a lamp or couch or some other non-speaking object. I certainly don't think of Siri as a thinking AI. "She" is mostly convenient shorthand, in sentences where "it" may be referring to something else (the restaurant you want directions to or the light you want to turn off).

I also don't have unreasonable expectations of what Siri can do. I generally limit my requests to setting HomeKit scenes, setting timers, adding reminders, creating appointments (I'll let Siri handle the date, time, and title, because that's quick and gets it recorded while it's fresh in my mind, and then go fill in the details on another device), and doing basic math ("what's 473 times 12 times 17 times 1.5") and unit conversions. Oh and asking for directions to some specific destination while driving. I limit myself to these requests because I have a fairly high degree of certainty that Siri will understand and respond correctly to them.

My biggest problem with Siri - and I don't think it's marketing, more like hubris on the part of the designers - is that they have been unwilling to design in any sort of specific, published syntax, or any ability to have a meta conversation. In the former case, Siri tries to "make sense" of what you're saying, and often gets things wrong if you stray from their (unpublished and evolving) understood syntax. The huge problem is that she rarely says, "Sorry, I don't understand what you mean" - instead she makes huge assumptions that she knows what you mean, when she actually doesn't.

In the latter case, with no capability to have a conversation about the conversation - essentially, no editing mode - it means if I'm driving, and I ask for directions to a store, and she doesn't understand the name of the store, she will cheerfully start suggesting places that are clearly not what I want, and my only recourse is to try again, and probably get the same wrong result. There is no mechanism by which I can say, "Siri, you're misunderstanding the name of the destination, let me spell it for you: I K E A" (first name off the top of my head, not actually one she would misunderstand). I've had this happen with, say, a restaurant where the main word in the name sounds like some more common word, but if you search on the common word, you get a thousand matches. I've had occasions where I've had to resort to giving the name of some store that I remember is a few blocks away from my actual destination, simply because Siri can recognize that store name.

And, maddeningly, if she doesn't get it right the first time, she often assumes that it's you who aren't sure where you want to go, so she starts adding details that are meant to be helpful in making your choice ("this one is 2.8 miles away, and gets 3 stars, and is open until 8pm, would you like to try that one?"). When you're driving, and what you REALLY want to know is whether to turn left or right at an intersection that's coming up soon, and the actual problem is that Siri didn't parse the name correctly, having her waste more and more time giving useless details, trying to "help you make up your mind", is infuriating.

I see this a lot in code - the presumption that if you get to a particular point in the code, it's because the user has made mistake X, and now it's time to explain to them their mistake as if they're five years old (when often the program ended up there for some entirely different reason the developer didn't think of). I would MUCH rather have a "personal assistant" that doesn't make assumptions - if she doesn't understand exactly what I mean, she should say, "sorry, I don't quite understand that - could you refer to my syntax manual and try again?".
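To make the contrast concrete, here's a toy Python sketch (the command names and all the code are entirely made up by me, purely to illustrate the two design philosophies):

```python
KNOWN_COMMANDS = {"set timer", "turn on light"}

def assuming_assistant(utterance: str) -> str:
    # Picks whichever known command shares the most words with the
    # input and runs with it, even when the match is terrible.
    best = min(KNOWN_COMMANDS,
               key=lambda c: len(set(c.split()) ^ set(utterance.split())))
    return f"Doing: {best}"

def strict_assistant(utterance: str) -> str:
    # Refuses to guess: either the command is recognized, or it says so.
    if utterance in KNOWN_COMMANDS:
        return f"Doing: {utterance}"
    return ("Sorry, I don't quite understand that - "
            "could you refer to my syntax manual and try again?")

# "turn off light" is NOT a known command:
print(assuming_assistant("turn off light"))  # Doing: turn on light - a confident wrong guess
print(strict_assistant("turn off light"))    # an honest "I don't understand"
```

The first one is the Siri behavior I'm complaining about; the second is the assistant I'd actually want.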

But Apple's approach is to start out by pretending that Siri is fully conversant in English (or whatever language), and tell users to just ask questions, and then they try to handle whatever's thrown at Siri, and often fail badly - rather than making Siri fully understand/recognize a limited subset of English.

As a programmer, I've spent nearly my whole life dealing with rigid syntaxes, so I'd be quite happy issuing verbal commands using a specific syntax I'd looked up in a manual, rather than just having a sketchy "you could try things like..." list and having to guess.

And either publish specific recognized syntax for some new subject, or don't accept queries about that subject until you can be fully conversant in it. As just one example, Siri knows about all the HomeKit lights in my house. She knows them by name and where they are, and their current state. I can say, "turn on my bookcase light", and that works fine, as does "turn on my living room lights". If I say, "how many of my lights are on", she'll cheerfully say, "4 of your lights are on and 13 are off". But if I say, "which of my lights are on?" - and remember, she has all the information necessary to answer this question, including the names of every light - she will ALSO answer that with, "4 of your lights are on and 13 are off".

It's not a complicated request. It's not an obscure request (i.e. if you're already adding code to handle a count of the number of lights on/off, it's a fairly obvious next step to guess that someone might want to know which ones). Siri didn't mishear the word "which", and she understands the word in other contexts. Rather, they seem to have decided "this is a query about the state of the lights", and threw it to the same action on the back-end. The developers decided that answering the wrong question is "good enough". But if a human did this to you, you'd be annoyed with them. And with a human, you could explain the mistake to them and they'd likely get it right next time - obviously that doesn't work with Siri.
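My guess at what's happening on the back-end, as a Python sketch (the names and structure are mine, obviously not Apple's actual code):

```python
LIGHTS = {"bookcase": True, "desk lamp": True, "porch": False, "hall": False}

def count_report(lights):
    on = sum(lights.values())
    return f"{on} of your lights are on and {len(lights) - on} are off."

def which_report(lights):
    # The handler that apparently never got wired up - same data,
    # trivially available, just a different projection of it.
    on = [name for name, state in lights.items() if state]
    return "These lights are on: " + ", ".join(on) + "."

def route(query):
    # Both phrasings get bucketed as "light state query" and thrown
    # at the same back-end action, so "which" gets the count answer.
    if "lights" in query and ("how many" in query or "which" in query):
        return count_report(LIGHTS)
    return "I don't know how to answer that request."

print(route("which of my lights are on"))  # 2 of your lights are on and 2 are off.
```

The data needed for `which_report` is sitting right there; the router just never distinguishes the two questions.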

I would MUCH rather have Siri say, "I don't know how to answer that request" than have her pick a few words out of the sentence and assume she knows what you meant. And this gets back to my point: I would have been happier if Siri had debuted with a more limited and much more rigid syntax - you would have to phrase any given question/command in a very specific way, with the benefit of extremely high chances that Siri would correctly parse the request if you followed the template. And this would bring the benefit that if you issued a command that fit a particular request template, it would be pretty clear that the two words where the name of a HomeKit device is supposed to go must be the name of a HomeKit device. Siri would be given very strong contexts in which to interpret such names. I'm still gobsmacked that they thought it was a good idea to allow you to set a HomeKit scene by simply saying the name of the scene - it means that now any time you say anything to Siri, they first have to check to see if what you said is the name of a HomeKit scene before interpreting it in any other way. What if you name a scene using words like "play" or "set"? When you allow ambiguities like this, either you don't have access to a bunch of commands, or Siri has to start guessing which interpretation you meant. (If you say "Hey Siri, play time", are you asking for the right lighting scene for the kids to run around, or asking to play the song "Time" from Pink Floyd's Dark Side of the Moon?)
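A minimal Python sketch of that ambiguity, assuming (as it appears from the outside) that scene names are matched before anything else - all the names here are hypothetical:

```python
SCENES = {"play time", "movie night"}  # user-created scene names

def interpret(utterance):
    text = utterance.lower()
    # Scene names are checked before any other interpretation, so a
    # scene called "play time" permanently shadows the media command.
    if text in SCENES:
        return ("set_scene", text)
    if text.startswith("play "):
        return ("play_media", utterance[5:])
    return ("unknown", utterance)

print(interpret("play Time"))   # ('set_scene', 'play time') - never the Pink Floyd track
print(interpret("play Money"))  # ('play_media', 'Money') - no scene collides with this one
```

Once the scene exists, there is literally no way to phrase the media request that survives the first check.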

In addition to this, the unpublished syntax appears to be changing over time, constantly tinkered with on the back-end, but with no notice to the users about new or changed rules - for instance, I had many times where I'd say, "Hey Siri, play 'Accidental Tech Podcast'" (the bit in apostrophes is the literal name of the podcast), and Siri would say, "Playing Accidental Tech Podcast" and start playing the latest episode. And then there was a period of time where I'd give the same command and get back "There's no podcast named 'Accidental Tech'", and then some weeks later, the same command started working again.

If you took a job where one of the requirements was to know, say, French, and you said "yes, I know French", because you knew a smattering of French, and then your boss asked you a question in French and you gave a wildly incorrect answer because you only recognized two of the words and you just pretended to understand (sounds like a plot point for a sitcom episode)... they might consider firing you for lying about really knowing French.

Yet this is exactly the kind of thing that Siri does - they've coded her to pretend to be fully conversant in English (which I'm sure is their end goal), when in fact she only recognizes bits of it - and she doesn't say that she doesn't understand - frequently, instead, she fakes it, guessing that you probably mean something she does understand, and rushing off to do that thing. If an actual human personal assistant did this repeatedly, you'd get rid of them. I'm annoyed at the developers for taking this approach.

Don't put up a facade and try to backfill before anyone notices that it's fake. Instead, get it to be really good at recognizing a limited subset of the language - sentences/commands constructed in a particular way - and then slowly expand that syntax as time/resources permit - and publish the syntax ("Siri Syntax Guide v1.0", then 2.0, 3.0, etc.), so users know what to expect, rather than just encouraging them to ask in whatever format they feel like, hoping maybe Siri will understand.
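Something like this, as a toy Python sketch (the templates and action names are invented for illustration, not anything Apple publishes):

```python
import re

# An imagined "Siri Syntax Guide v1.0": each template is published verbatim,
# so users know exactly which slot holds a device name, a duration, etc.
TEMPLATES = [
    (r"^turn (on|off) my (.+)$", "homekit.power"),
    (r"^set a timer for (\d+) (seconds|minutes|hours)$", "timer.set"),
    (r"^directions to (.+)$", "maps.navigate"),
]

def parse(utterance):
    text = utterance.lower().strip()
    for pattern, action in TEMPLATES:
        m = re.match(pattern, text)
        if m:
            # A match gives strong context: in homekit.power, the second
            # capture MUST be a HomeKit device name - no guessing needed.
            return action, m.groups()
    # Anything that doesn't fit a template gets an honest error, never a guess.
    return "error.unrecognized", ()

print(parse("Turn on my bookcase light"))  # ('homekit.power', ('on', 'bookcase light'))
print(parse("play something nice"))        # ('error.unrecognized', ())
```

Expanding the language then just means publishing v2.0 with more templates, rather than silently changing what happens to existing phrasings.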

The problem isn't users asking ridiculous questions, the problem is Apple encouraging users to just ask natural language questions, as if they were speaking to a real person. (Encouraging that would be fine if they had written something with Jarvis levels of comprehension of language and context, but they're nowhere near there yet.)

When I confirmed that alarms and reminders are the only things Siri handles really well, I finally realized that it was all my fault for thinking that Siri could act like a real assistant, when in fact Siri is just an alarm clock manager.
 
Virtual assistants generally aren’t reliable; it’s hit-and-miss as to whether you’re understood. I wonder whether Apple did market research on whether users would trust Siri to assist with ordering, and received a poor response?
 
In my case navigation, text messages, and music are added to the list. But I get your point: if the virtual assistant isn’t 100%, then why even use it? I don’t bother with GA or Alexa for other reasons, but at least the few things that Siri does, Siri does well.
 
So, navigation, text messages, music, alarm clock manager… that’s certainly not 100%. As a result, since you say “why even use it” if it’s not 100%, then you’ve stopped using it for navigation, text messages, music and alarm clock manager?
 
This is admittedly frustrating as an end user who would love the convenience and improved functionality. However, every feature decision is a trade-off, and comes with a cost. Despite the reduced functionality in this specific area, it’s good to see such a large corporation really grappling with this issue, because it is a critical test of their fundamental values. That they are making decisions like these without being forced by legislation speaks volumes, and is a big part of why I continue to buy Apple products. What other companies are disadvantaged not so much by their competitors, but by their own fierce commitment to integrity?
 
Another thing to consider: the machine learning nodes in the Apple chips were a direct response to the problem of “How do we take advantage of ML without just shoving everything back to a central server?” Just the idea that Apple’s actually able to do so much in this area on-device is impressive, and it’s because of the privacy limitations.
 
Here is an example on why Siri frustrates me when I try to use it as a PA (outside of reminders, timers, and alarms).
Yesterday was Easter Sunday. My sister in law has her birthday on April 17th. Pretty cool for her. We were on a family call (speakerphone) and the question came up when this occurred last. Good question. So I asked Siri. oops.

Me: Hey Siri, What years did Easter fall on April 17th?
Siri: I don't know what you mean by "What years did Easter fall on April 17th"

Me: Hey Siri, In the last 100 years, how many times did Easter fall on April 17th?
Siri: 100 years.

Me: Hey Google, In the last 100 years, how many times did Easter fall on April 17th?
Google Assistant: 4 times. Easter has fallen on April 17th in 1927, 1938, 1949, and 1960.

A simple question and Siri was ...
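For what it's worth, the date of Easter is pure arithmetic - the anonymous Gregorian computus - so the "which years" version of the question is computable without any web search at all. A quick Python sketch:

```python
def easter(year):
    # Anonymous Gregorian algorithm (Meeus/Jones/Butcher).
    a = year % 19
    b, c = divmod(year, 100)
    d, e = divmod(b, 4)
    f = (b + 8) // 25
    g = (b - f + 1) // 3
    h = (19 * a + b - d - g + 15) % 30
    i, k = divmod(c, 4)
    l = (32 + 2 * e + 2 * i - h - k) % 7
    m = (a + 11 * h + 22 * l) // 451
    month, day = divmod(h + l - 7 * m + 114, 31)
    return month, day + 1

# Years in the last 100 where Easter fell on April 17th:
print([y for y in range(1923, 2023) if easter(y) == (4, 17)])
# -> [1927, 1938, 1949, 1960, 2022]
```

(Note that a full answer for the last 100 years also includes 2022 itself, which the Google answer above left out.)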
 
Well, Google IS pulling from Google’s data, as Apple doesn’t have a search engine. In most cases, the first link is good. In some, well, you can tell they parsed the wrong text from the page in order to bring that result to the top. :)
Fortunately, the Google app is free, so folks can download it and use it for all such requests for information that requires a huge dataset on the back end to search from.
 
It’s not any more convenient than just doing it through a web browser/app. Why would anyone want this?
I'm still old fashioned and just have a few sites bookmarked on my browsers, but it can be more convenient.

If you've got your phone on you, but would have to go upstairs to your computer, just pick up your phone and make the voice commands. For some people I've known, going upstairs is a huge hassle. Others can dictate speech more easily than typing or tapping, since they have dexterity issues.
 
Well, Google IS pulling from Google’s data, as Apple doesn’t have a search engine. In most cases, the first link is good. In some, well, you can tell they parsed the wrong text from the page in order to bring that result to the top. :)
Fortunately, the Google app is free, so folks can download it and use it for all such requests for information that requires a huge dataset on the back end to search from.

I get that, however, that is kind of my point. Siri is pretty much broken based on what it was originally intended to be and what it has become.

Take a look at Apple's Siri page - https://www.apple.com/siri/
All is well and good till you get to the bottom. Based on my questions to Siri and the examples Apple shows, I would expect it to easily answer them.
[Screenshot: sample questions from Apple’s Siri page]


Sadly more often than not, it can't. To answer many of these, Siri has to be able to access a significant dataset.

Personally I just wish it could get it right.
 
Taking a close look at the questions posted, you can see that these are all related to data sets Apple actually has access to (locally, on device(s)) or data sets they subscribe to (like sports scores). There are responses for TV, sports scores, HomeKit summons, Shazam, Wikipedia (under Siri Knowledge), messaging, and traffic.

What you asked was certainly something that COULD be calculated by an adept AI, but Google doesn’t even do that. It’s just reading the first entry of a Google search result which, in fact, wasn’t even produced by Google. :) I wonder what benefit these sites receive from making Google more useful, especially since a user might not click on the link now that they have the answer, so no ad views.
 

Sports team rosters and sports scores vs holiday calendar? What's playing at the movies? Or most of these below.
[Screenshot: more sample questions from Apple’s Siri page]

I'm not seeing the difference.
Some of the proposed questions would require internet search access.

If I understand your point, not only is Siri hamstrung, it is also seriously constrained in the data sets it can draw on to answer.
Then again, that better explains why some call Siri an Alarm Manager or similar term(s).

As for Google, why calculate it when you have the knowledge at your fingertips?
 
You’re correct, Google and Apple are indeed performing the same actions: pulling up a page with (what is hoped to be) accurate information and then reading it. I was just remarking that there’s nothing inherently clever or computationally impressive about Google’s approach; it’s the same as Siri’s. The main difference is that the data store Google has access to is massively greater than what Apple has, and that’s always going to be the case. And, as Google is unlikely to ever provide un-tracked access to their search engine, Siri’s ability to answer questions will always be constrained by what deals Apple can make with back-end data providers. :)

Fortunately, Google’s app is free and will likely remain so as the power of their search engine comes from those users that click the search results.
 
I think that’s really the point the article is making, that Siri is not as smart as other assistants, and it’s because of the limitations of stricter privacy. This is talking about purchasing specifically, but I’ve read that Siri sometimes not doing what we ask can be traced back to privacy limitations as well.
That’s a trade off I’m fine with. I’m positive Google and Amazon make a lot of money from their more functional voice assistants, ones that have better recognition and can make purchases, but it’s at the cost of higher personal data and looser safety. The fact that Apple leaves that money on the table tells me they’re much more serious about privacy and security. (Let’s bring up the issue of CSAM later if/when it drops.)
But this is what I don’t understand. Perhaps someone can enlighten me. Just because Apple would obtain this information via Siri if Siri were allowed to, for example, search your photos, it doesn’t follow that Apple would then have to sell that information. If Apple doesn’t want to know your information, fine: leave it unprocessed. It doesn’t have to be sold to anyone else either, but I want Siri to be able to search my photos. I want Siri to be able to do all that it can to help me, as an elderly woman. I depend on things that Siri should be able to do, like read my mail, etc. Apple doesn’t have to sell any of the information it gets. Or am I missing something?
 
Had an incident which, for me, calls into question just what Apple means by privacy.

My granddaughter saw a YouTube vid on the latest Avatar 2 trailer. She then asked me about the Avatar ride at Disney. We talked a bit about it and watched a couple of YouTube vids of the ride (POV). The question came up on how long it took to build that ride. So I asked “Hey Siri. How long did it take to build the Avatar ride at Disney?” Siri came back with “Here’s what I found about…” and displayed a list of websites on this topic. Well, that was lame (typical), so I asked “Hey Google. …” and Google read me off the top blurb about this topic. Same info that Siri displayed. Siri just doesn’t read this stuff, apparently.

So since they both found the same internet stuff - Google reads it aloud, Siri does not - how is Siri being more “private”? I suspect this is just a limitation Apple doesn’t want to pursue.
 
But this is what I don’t understand. Perhaps someone can enlighten me. Just because Apple would obtain this information via Siri if Siri were allowed to, for example, search your photos, it doesn’t follow that Apple would then have to sell that information. If Apple doesn’t want to know your information, fine: leave it unprocessed. It doesn’t have to be sold to anyone else either, but I want Siri to be able to search my photos. I want Siri to be able to do all that it can to help me, as an elderly woman. I depend on things that Siri should be able to do, like read my mail, etc. Apple doesn’t have to sell any of the information it gets. Or am I missing something?
I think it’s about more than just selling the data. Apple doesn’t even want to have the data, so they never have the choice of selling it or doing anything unsavory with it. Also, if the data is stored, then it can theoretically be obtained by a third party, whether the government, hackers, or rogue employees. So it’s a preventative measure. Apple seems to be very serious about privacy, and it shows, because even their products suffer for it.
 

We only know what little Apple tells us, and that is not much. Apple says privacy, alludes to privacy in some aspects, and we are left to guess the rest.

Wish they would impart details.
 
If you request a set of all the data Apple has on you, Apple legally has to provide it to you, all the companies do (And Apple makes it easy to not only see the data but delete it if you like). If that list contains more info than you think it should, then you can take the next action requesting details specific to you.

I don’t know if any vendor is going to spell out how they do what they do to prevent data from getting on that list as there’s very likely proprietary ways they’re accomplishing it. BUT getting the list then querying them about it will get something.
 
I think it’s about more than just selling the data. Apple doesn’t even want to have the data, so they never have the choice of selling it or doing anything unsavory with it. Also, if the data is stored, then it can theoretically be obtained by a third party, whether the government, hackers, or rogue employees. So it’s a preventative measure. Apple seems to be very serious about privacy, and it shows, because even their products suffer for it.
Yeah, there’s been so many stories of contractors, who really could not care less, getting hired at a company and then looking up people they know using the company’s internal systems, eventually using that information for their own benefit. Reducing what data they have in a central repository does mean limiting Siri, but it also means that any breach at Apple (even by way of social engineering) doesn’t net an attacker as much private data.
 
So since they both found the same internet stuff - Google reads it aloud, Siri does not - how is Siri being more “private”? I suspect this is just a limitation Apple doesn’t want to pursue.
Google likely does not allow Siri to “read” the information without providing non-anonymous identifying information about the requester (unlike, say, Wikipedia, where Siri WILL read the content). The privacy isn’t in the result, it’s in the tasks required to obtain the results.
 
If you request a set of all the data Apple has on you, Apple legally has to provide it to you, all the companies do (And Apple makes it easy to not only see the data but delete it if you like). If that list contains more info than you think it should, then you can take the next action requesting details specific to you.

I don’t know if any vendor is going to spell out how they do what they do to prevent data from getting on that list as there’s very likely proprietary ways they’re accomplishing it. BUT getting the list then querying them about it will get something.

You can ask, however I am not sure that they legally have to provide 100% of it.
That said, I have done this a few times, and while Apple delivers a document, you have to take it on faith that it is 100% complete, and you still do not know what Apple does, or is doing, with this info.

Still, I don’t see how a lot of this precludes Siri from being a far better PA especially in terms of looking up public information.
 