It's natural for folks to gravitate towards a pronoun when they are hearing a male or female voice though.
Humans are imperfect.

At least when it's addressing a "device", using an incorrect pronoun has no real consequences for anyone.
That's a plus I think.

What voice should an "it" have anyways?

These days it could be either!
 
I suspect many of us look back to the early days and realize what Siri could have become as we watch Google Assistant, Cortana, and Siri evolve.
While I appreciate the “privacy” claim, Siri could have been so much more before Apple went off on the privacy kick. Sadly it never was.
 
@mnsportsgeek

I played with having Siri read my messages, especially through my AirPods Pro.
However, trying to get Siri to execute any of the follow-up options was hit or miss at best.
After a few frustrating days I gave up.
Same here, I can't use Siri to dial, navigate, or read/reply to messages.

I have my iOS UI and Siri set to English, and German street names sound terrible when read by an English Siri. I also get messages from people who speak different languages (English, German, Spanish, Italian, Portuguese), and this doesn't work: Siri can't handle multiple languages, nor switch languages based on context.

I sometimes try to use Siri to change music while driving, and even for that it mostly sucks. I end up changing my music manually, because Siri takes too much attention and pisses me off.
 
To be fair, I doubt most people are using Alexa to order items on Amazon anyways. Too many things that could go wrong, and too clunky compared to simply using the app or a browser interface.
 
Imagine being on holiday in Hawaii, trying to order Havaianas (the flip-flops) with Siri - "Hey Siri, get me new Havaianas!" because yours broke - and Siri instead orders a few underage (new) Hawaiian girls, CSAM scanning kicks in, your wife opens the door, and the cops aloha their way in. I'd say Siri could destroy a marriage and land you in jail for life.
 
Same here, I can't use Siri to dial, navigate, or read/reply to messages.

[…]
As with everything in life, YMMV, and one person's failure using an assistant doesn't mean that experience scales to everyone.

My wife uses CarPlay with an iPhone XR. She uses Siri to dial, navigate, and read and reply to messages, like a pro. Other than that, and controlling music on my HomePod and HomeKit devices, I don't have much use for ANY digital assistant.
 
All of these comments are along the lines of "this would've been terrible anyway because Siri is terrible" but like…that's the point. Siri is both terrible AND doesn't have features like this because of the massive restrictions on what data it can use. Siri isn't trained on massive amounts of detailed user data like every other assistant. Thus it has fewer features and less reliability.
 
Same here, I can't use Siri to dial, navigate, or read/reply to messages.

I have my iOS UI and Siri set to English, and German street names sound terrible when read by an English Siri. I also get messages from people who speak different languages (English, German, Spanish, Italian, Portuguese), and this doesn't work: Siri can't handle multiple languages, nor switch languages based on context.

I sometimes try to use Siri to change music while driving, and even for that it mostly sucks. I end up changing my music manually, because Siri takes too much attention and pisses me off.

I can get Siri to dial reliably if the person I'm calling is set up under a unique name with only one number in Contacts.
Then it usually works.
Most of my contacts, especially work contacts and peers, have multiple phone numbers and email addresses.
 
Right -- which is why we are glad they aren't doing this, and also kvetching about how mostly awful and useless Siri is.
It's a sad state for Siri to be in, this far into its existence.
Rather than trust Amazon with my data from Alexa, I’m more comfortable with a diminished experience from Siri. Everybody’s mileage will vary on this.
 
I'm always amazed how many people refer to Siri as "she".

Siri is not a "she" (or "he") but an "it". At least in English, gendered pronouns should be reserved for referencing actual humans (or animals), not algorithms wrapped up in voice recognition utilities.

I wonder how much of the frustration with Siri (or other voice assistants) stems from unreasonable expectations that result from even subconsciously thinking of it as a real, thinking AI.

I'm generally OK with Siri's functionality, and I find it to be a great convenience in many ways. But it is still just a keyboard replacement connected to an advanced chat bot, and thus I don't expect it to succeed with arbitrary requests outside of particular syntaxes.

This illusion (admittedly pushed by marketing agents at all the big tech companies playing this market) that Siri is a kind of "plastic pal who's fun to be with" just feels ridiculous to me.
I expect many people refer to Siri as "she" because Siri defaults (or at least used to default) to a female voice, and there's a natural tendency to refer to other participants in a conversation with slightly more personalized pronouns, lest someone mistakenly assume you're referring to a lamp or couch or some other non-speaking object. I certainly don't think of Siri as a thinking AI. "She" is mostly convenient shorthand, in sentences where "it" may be referring to something else (the restaurant you want directions to or the light you want to turn off).

I also don't have unreasonable expectations of what Siri can do. I generally limit my requests to setting HomeKit scenes, setting timers, adding reminders, creating appointments (I'll let Siri handle the date, time, and title, because that's quick and gets it recorded while it's fresh in my mind, and then go fill in the details on another device), and doing basic math ("what's 473 times 12 times 17 times 1.5") and unit conversions. Oh and asking for directions to some specific destination while driving. I limit myself to these requests because I have a fairly high degree of certainty that Siri will understand and respond correctly to them.

My biggest problem with Siri - and I don't think it's marketing, more like hubris on the part of the designers - is that they have been unwilling to design in any sort of specific, published syntax, or any ability to have some sort of meta conversation. In the former case, Siri tries to "make sense" of what you're saying, and often gets things wrong if you stray from their (unpublished and evolving) understood syntax, with the huge problem being that she rarely says, "Sorry, I don't understand what you mean" - instead she makes huge assumptions that she knows what you mean, when she actually doesn't.

In the latter case, with no capability to have a conversation about the conversation - essentially, no editing mode - if I'm driving and I ask for directions to a store, and she doesn't understand the name of the store, she will cheerfully start suggesting places that are clearly not what I want, and my only recourse is to try again and probably get the same wrong result. There is no mechanism by which I can say, "Siri, you're misunderstanding the name of the destination, let me spell it for you: I K E A" (first name off the top of my head, not actually one she would misunderstand). I've had this happen with, say, a restaurant where the main word in the name sounds like some more common word, but if you search on the common word, you get a thousand matches. I've had occasions where I've had to resort to giving the name of some store that I remember is a few blocks away from my actual destination, simply because Siri can recognize that store name.

And, maddeningly, if she doesn't get it right the first time, she often assumes that it's you who aren't sure where you want to go, so she starts adding details meant to be helpful in making your choice ("this one is 2.8 miles away, and gets 3 stars, and is open until 8pm, would you like to try that one?"). When you're driving, what you REALLY want to know is whether to turn left or right at an intersection that's coming up soon; since the actual problem is that Siri didn't parse the name correctly, having her waste more and more time giving useless details, trying to "help you make up your mind", is infuriating.

I see this a lot in code - the presumption that if you get to a particular point in the code, it's because the user has made mistake X, and now it's time to explain to them their mistake as if they're five years old (when often the program ended up there for some entirely different reason the developer didn't think of). I would MUCH rather have a "personal assistant" that doesn't make assumptions - if she doesn't understand exactly what I mean, she should say, "sorry, I don't quite understand that - could you refer to my syntax manual and try again?".
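To make that concrete, here's a minimal sketch of the two styles of error handling - hypothetical Swift with invented names, not anyone's actual code:

```swift
enum LookupError: Error {
    case notFound
    case ambiguous([String])
    case unrecognizedInput
}

func presumptuousReply(to error: LookupError) -> String {
    // Assumes every failure means one specific user mistake,
    // even when the code got here for an entirely different reason.
    return "It looks like you mistyped the name. Please check your spelling."
}

func honestReply(to error: LookupError) -> String {
    // Reports exactly what went wrong, and admits it plainly
    // when the input simply wasn't understood.
    switch error {
    case .notFound:
        return "I couldn't find anything by that name."
    case .ambiguous(let candidates):
        return "That matches several things (\(candidates.joined(separator: ", "))). Which one did you mean?"
    case .unrecognizedInput:
        return "Sorry, I don't quite understand that - could you try again?"
    }
}

print(presumptuousReply(to: .unrecognizedInput)) // confident wrong diagnosis
print(honestReply(to: .unrecognizedInput))       // admits the actual problem
```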

But Apple's approach is to start out by pretending that Siri is fully conversant in English (or whatever language), and tell users to just ask questions, and then they try to handle whatever's thrown at Siri, and often fail badly - rather than making Siri fully understand/recognize a limited subset of English.

As a programmer, I've spent nearly my whole life dealing with rigid syntaxes, and I'd be quite happy issuing verbal commands using a specific syntax I'd looked up in a manual, rather than just having a sketchy "you could try things like..." list and having to guess.

And either publish specific recognized syntax for some new subject, or don't accept queries about that subject until you can be fully conversant in it. As just one example, Siri knows about all the HomeKit lights in my house. She knows them by name and where they are, and their current state. I can say, "turn on my bookcase light", and that works fine, as does "turn on my living room lights". If I say, "how many of my lights are on", she'll cheerfully say, "4 of your lights are on and 13 are off". But if I say, "which of my lights are on?" - and remember, she has all the information necessary to answer this question, including the names of every light - she will ALSO answer that with, "4 of your lights are on and 13 are off".

It's not a complicated request. It's not an obscure request (i.e. if you're already adding code to handle a count of the number of lights on/off, it's a fairly obvious next step to guess that someone might want to know which ones). Siri didn't mishear the word "which", and she understands the word in other contexts. Rather, they seem to have decided "this is a query about the state of the lights", and threw it to the same action on the back-end. The developers decided that answering the wrong question is "good enough". But if a human did this to you, you'd be annoyed with them. And with a human, you could explain the mistake to them and they'd likely get it right next time - obviously that doesn't work with Siri.
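Just to illustrate the shape of that bug, here's a toy intent router (hypothetical Swift with invented names - not Apple's code): both phrasings fall through to the same count handler, even though the names needed to answer "which" are already in hand:

```swift
struct Light { let name: String; let isOn: Bool }

let lights = [
    Light(name: "Bookcase", isOn: true),
    Light(name: "Living Room", isOn: false),
    Light(name: "Kitchen", isOn: true),
]

// The behavior described above: any query about light state,
// "how many" and "which" alike, gets the same count answer.
func collapsedRouter(_ utterance: String) -> String {
    let onCount = lights.filter { $0.isOn }.count
    return "\(onCount) of your lights are on and \(lights.count - onCount) are off."
}

// The obvious next step: distinguish the two intents, since the
// names of the lights that are on are sitting right there.
func splitRouter(_ utterance: String) -> String {
    let on = lights.filter { $0.isOn }
    if utterance.lowercased().hasPrefix("which") {
        return "These lights are on: " + on.map { $0.name }.joined(separator: ", ") + "."
    }
    return "\(on.count) of your lights are on and \(lights.count - on.count) are off."
}

print(collapsedRouter("which of my lights are on?")) // answers the wrong question
print(splitRouter("which of my lights are on?"))     // answers the question asked
```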

I would MUCH rather have Siri say, "I don't know how to answer that request" than have her pick a few words out of the sentence and assume she knows what you meant. And this gets back to: I would have been happier if Siri had debuted with a more limited and much more rigid syntax - you would have to phrase any given question/command in a very specific way, with the benefit of extremely high chances that Siri would correctly parse the request if you followed the template. This would also mean that if you issued a command fitting a particular request template, it would be pretty clear that the two words where the name of a HomeKit device is supposed to go must be the name of a HomeKit device. Siri would be given very strong contexts in which to interpret such names.

I'm still gobsmacked that they thought it was a good idea to allow you to set a HomeKit scene by simply saying the name of the scene - it means that now, any time you say anything to Siri, she first has to check whether what you said is the name of a HomeKit scene before interpreting it in any other way. What if you name a scene using words like "play" or "set"? When you allow ambiguities like this, either you lose access to a bunch of commands, or Siri has to start guessing which interpretation you meant. (If you say "Hey Siri, play time", are you asking for the right lighting scene for the kids to run around, or asking to play the song "Time" from Pink Floyd's Dark Side of the Moon?)
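To make the ambiguity concrete, here's a toy sketch (in Swift, with invented scene names - nothing here is Apple's actual code): once bare scene names are legal commands, every utterance has to be tested against the scene list before any other reading, and "play time" can never reach the music player:

```swift
import Foundation

let sceneNames = ["Play Time", "Movie Night"]  // hypothetical HomeKit scenes

func interpret(_ utterance: String) -> String {
    // Scene names are checked first, so they shadow every other command.
    if sceneNames.contains(where: { $0.caseInsensitiveCompare(utterance) == .orderedSame }) {
        return "Setting the '\(utterance)' scene."
    }
    if utterance.lowercased().hasPrefix("play ") {
        return "Playing the song '\(utterance.dropFirst(5))'."
    }
    return "Sorry, I don't understand."
}

// The lighting scene wins; the Pink Floyd track is unreachable by this phrasing.
print(interpret("play Time"))
```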

In addition to this, the unpublished syntax appears to be changing over time, constantly tinkered with on the back-end, but with no notice to the users about new or changed rules - for instance, I had many times where I'd say, "Hey Siri, play 'Accidental Tech Podcast'" (the bit in apostrophes is the literal name of the podcast), and Siri would say, "Playing Accidental Tech Podcast" and start playing the latest episode. And then there was a period of time where I'd give the same command and get back "There's no podcast named 'Accidental Tech'", and then some weeks later, the same command started working again.

If you took a job where one of the requirements was to know, say, French, and you said "yes, I know French", because you knew a smattering of French, and then your boss asked you a question in French and you gave a wildly incorrect answer because you only recognized two of the words and you just pretended to understand (sounds like a plot point for a sitcom episode)... they might consider firing you for lying about really knowing French.

Yet this is exactly the kind of thing that Siri does - they've coded her to pretend to be fully conversant in English (which I'm sure is their end goal), when in fact she only recognizes bits of it - and she doesn't say that she doesn't understand - frequently, instead, she fakes it, guessing that you probably mean something she does understand, and rushing off to do that thing. If an actual human personal assistant did this repeatedly, you'd get rid of them. I'm annoyed at the developers for taking this approach.

Don't put up a facade and try to backfill before anyone notices that it's fake. Instead, get it to be really good at recognizing a limited subset of the language - sentences/commands constructed in a particular way - and then slowly expand that syntax as time/resources permit - and publish the syntax ("Siri Syntax Guide v1.0", then 2.0, 3.0, etc.), so users know what to expect, rather than just encouraging them to ask in whatever format they feel like, hoping maybe Siri will understand.
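For what it's worth, a published syntax wouldn't need to be complicated. Here's a rough sketch of what a versioned, rigid command grammar might look like - the template shapes and patterns here are purely my own invention:

```swift
import Foundation

// "Siri Syntax Guide v1.0" (hypothetical): each command has exactly one
// shape, so a request either matches a template or fails loudly - no guessing.
let templates: [(shape: String, regex: String)] = [
    ("turn (on|off) <device>",      #"^turn (on|off) .+$"#),
    ("set a timer for <n> minutes", #"^set a timer for \d+ minutes$"#),
]

func parse(_ utterance: String) -> String {
    for template in templates {
        if utterance.range(of: template.regex, options: .regularExpression) != nil {
            return "Matched template: \(template.shape)"
        }
    }
    return "No template matched - see the syntax guide and try again."
}

print(parse("turn on bookcase light"))         // Matched template: turn (on|off) <device>
print(parse("could you maybe dim the lights")) // No template matched - ...
```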

The problem isn't users asking ridiculous questions, the problem is Apple encouraging users to just ask natural language questions, as if they were speaking to a real person. (Encouraging that would be fine if they had written something with Jarvis levels of comprehension of language and context, but they're nowhere near there yet.)
 
Letting Siri do the shopping is like letting your two year old do the shopping.
Fun fact: at one point, a relative's 3 year old realized she could pick up mama's or papa's phone and hold down the button and say, for instance, "add cookies to shopping list" and cookies would arrive in the house. That worked a number of times, with each parent assuming the other one had added the item, until the requests got a bit outlandish.
 
Studies have shown that a female voice is often perceived as more commanding. It's why military and other aircraft use a female voice as the warning and alert voice in the cockpit.
The way I recall it being explained was simply that fighter pilots (I believe the tech started in the military) were overwhelmingly male and were more likely to tune in a female voice.
 
But regardless, artificial intelligence/speech algorithms will never be perfect; it's not possible, because everybody varies in how they pronounce words and how quickly they talk, and tone, volume, and delivery all play into how well speech algorithms can decipher what the user is saying.
Agreed it'll never be perfect. The end goal is likely, "at least as good as a human would do". Beyond recognizing and collecting words, though, the big problem, and where Siri falls down the most, is parsing sentences and extracting the correct intended meaning out of them.
 
The only thing Siri is good at is setting alarms and reminders, and that's all it does. Sometimes when I give Siri a command, I'm even asked to 'unlock' my phone first. Well, I don't care what the Siri lovers think; as far as I'm concerned, Siri can only set alarms and reminders. Nothing else is Siri's forte.
In fact, I have a great suggestion for renaming Siri: Alarm Clock Manager, which describes Siri's purpose well.
 