Apple has steadfastly stuck to the notion that Siri should respond like, and be treated like, a human assistant, rather than giving her any specific syntax. They encourage people to ask questions the way they'd ask another human, with the developers racing to "build out the back-end" to handle this properly.
But answering in a natural-sounding voice with proper grammar, and the occasional touch of snark, isn't the only thing - or even the main thing - that would make her seem human... it's all about the content of the answers she gives.
Siri has repeatedly shown that the developers can be astonishingly naive about what to do with the sentences that Siri correctly hears and parses. They want you to treat her sort of like a person, but she frequently gives answers that, coming from a real person, would make you roll your eyes and walk away - or, if they were your actual paid assistant, make you consider firing them for incompetence (especially when it happens over and over). She'll show that she has all the information needed, and she at least pretends to have understood the question, but then she gives an answer that looks a lot like the developers had exactly one use case in mind at that point, and gave zero consideration to any other question that could end up in the same place. So they don't bother trying to really understand the question; they just run with their guess.
I've had this conversation more than once - same sort of naiveté about time as the birthday question in the story:
- "Hey Siri, how long until sunset?"
- "Sunset will be at 5:20pm today."
- "Hey Siri, how long until 5:20pm?"
- "It's one day until then."
- "Hey Siri, how many minutes until 5:20pm?"
- "It's 47 minutes until then."
So she knows when the sun is setting today, she knows what time it is now, and she can calculate the offset. But when she gets the question "how long until sunset?", even though she recognizes the various parts of the question, she pretends you asked "when is sunset today?". And if you ask how long until a particular time, and that time is very close to now, she assumes you're asking about that time tomorrow. Sure, if I asked how long until 5:20pm and it was 5:45pm, then "tomorrow" might be a legit, though unhelpful, answer, but if you're in the same hour, you need to consider the order of the events (and from a developer standpoint, this isn't rocket science). And "how long until sunset" is not an incredibly obscure question - it's not in the top ten, but I wouldn't be surprised if it's in the top thousand.
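To show how little rocket science is involved: here's a minimal sketch of the "how long until a time" logic, which only rolls over to tomorrow when the target time has actually passed. (The function name and the example timestamps are mine, just for illustration.)

```python
from datetime import datetime, timedelta

def minutes_until(target_hour, target_minute, now):
    """Minutes until the next occurrence of a clock time.

    If the target time is still ahead of us today, use today's
    occurrence; only if it has already passed do we roll over
    to tomorrow.
    """
    target = now.replace(hour=target_hour, minute=target_minute,
                         second=0, microsecond=0)
    if target < now:
        target += timedelta(days=1)
    return int((target - now).total_seconds() // 60)

# Asked at 4:33pm, 5:20pm is 47 minutes away - not "one day".
print(minutes_until(17, 20, datetime(2024, 1, 15, 16, 33)))  # 47
# Asked at 5:45pm, 5:20pm has passed, so tomorrow is legit.
print(minutes_until(17, 20, datetime(2024, 1, 15, 17, 45)))  # 1415
```

One comparison against the current time is all it takes to pick the right day.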
Also, there's a special place in hell for the developer who has Siri responding, about 10% of the time, to "Hey Siri, set reading" (a HomeKit scene that merely requires the HomePod mini to send a predefined JSON message to the Hue Hub) with something to the effect of, "Sorry, Carl, I can't contact your iPhone, please make sure it's on the same WiFi network". Yes, dammit, the HomePod and the iPhone are on the same WiFi network (and they're all in the same room), and you
DO NOT NEED my iPhone in order for you, the HomePod, to send a message to the Hue Hub.
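For the skeptical: recalling a scene on a Hue hub really is just one small JSON message over the local network. Here's a rough sketch using the Hue v1 REST API; the bridge IP, application key, and scene ID are placeholder values, since every installation has its own.

```python
import json
from urllib import request

# Placeholder values - a real setup has its own bridge IP,
# registered application key, and scene ID.
BRIDGE_IP = "192.168.1.10"
APP_KEY = "your-app-key"
SCENE_ID = "reading-scene-id"

def build_scene_request(bridge_ip, app_key, scene_id):
    """Build the PUT request that recalls a scene on the Hue bridge
    (v1 API): one tiny JSON body, sent directly over local WiFi,
    with no phone involved anywhere."""
    url = f"http://{bridge_ip}/api/{app_key}/groups/0/action"
    body = json.dumps({"scene": scene_id}).encode()
    return request.Request(url, data=body, method="PUT")

req = build_scene_request(BRIDGE_IP, APP_KEY, SCENE_ID)
# request.urlopen(req) would actually send it; omitted here since
# there's no bridge to talk to in a sketch.
print(req.get_method(), req.full_url)
```

Nothing in that exchange involves an iPhone, which is what makes the error message so maddening.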