


In an annual test comparing Google Assistant, Siri, and Alexa on smartphones, Loup Ventures' Gene Munster found that Siri was able to correctly answer 83 percent of questions, beating Alexa but trailing behind Google Assistant.

Munster asked each digital assistant 800 questions during the test to compare how each one responded. Alexa answered 79.8 percent of questions correctly, while Google Assistant answered 92.9 percent of questions correctly.

[Image: 2019 digital assistant test results chart]

Compared to last year, Siri has seen improvement. In the July 2018 test, Siri answered 79 percent of questions correctly, compared to the 83 percent answered right this time around. Alexa last year was at 61 percent while Google Assistant was at 86 percent, so there have been digital voice assistant improvements across the board.

[Image: chart of Siri's improvement over time]

This test covered smartphones specifically, comparing iPhones and Android devices. Munster says that smartphones were isolated from smart speakers because while underlying technology is similar, "use cases vary." Siri was tested on an iPhone running iOS 12.4, Google Assistant on a Pixel XL, and Alexa in the iOS app.

Questions were based on five categories and all assistants were asked the same 800 questions. Each question set was designed to "comprehensively test a digital assistant's ability and utility." Some of the sample questions across each of the categories:

[*]Local - Where is the nearest coffee shop?
[*]Commerce - Order me more paper towels.
[*]Navigation - How do I get to Uptown on the bus?
[*]Information - Who do the Twins play tonight?
[*]Command - Remind me to call Jerome at 2 pm today

Siri did best in the command, local, and navigation categories, faring less well in the information and commerce categories. Siri actually won out in the command category, but trailed behind Google Assistant in other categories.

[Image: chart of results by question category]
Munster says the rate of improvement "continues to surprise," given the notable gains each voice assistant has demonstrated over the course of the last few years.

In the future, Loup Ventures expects to see further improvements from extending the feature sets of each voice assistant.

Article Link: Siri Answers 83% of Questions Correctly in Test, Beating Alexa But Trailing Google Assistant

It has been my experience that either they are using people in these tests who have been trained in exactly how to say certain words in a way Siri will understand, OR I must have some kind of a speech impediment. Despite my having it, somehow, virtually every other human being I’ve ever spoken with has no problem understanding what I’m saying. Just Siri.

It is unreliable to the point of being useless, so I’ve turned it off on everything. Even when everything else works and I have a good, solid, fast, reliable internet connection, a good chunk of the time Siri would somehow misunderstand what I was saying. I got tired of it and gave up.

Siri... just another Apple failure I have no time for anymore.
 
I'll add my skepticism to the results of this report as well. As a family using Amazon and Apple products, we all have the same opinion that Alexa far outperforms Siri both in voice recognition and in useful responses. Siri has very consistently proven to be unreliable in basic intended function. Personally, I am down to only using Siri for sending text messages while I am driving, and I'd guess it transcribes my message completely correctly about 75% of the time. And this is coming from someone who has been involved in public speaking most of my life, including instructing, i.e., I know how to speak understandably. My kids only use Siri as a source of childish entertainment: "Let's see how silly Siri's response will be." It's often quite silly.

Also, if the user has to choose from a limited, pre-approved list of inquiries to get Siri to function properly, then it is clearly low on the "I" part of AI. Good AI shouldn't require an instruction manual to use successfully. The closer it operates to conversational, the more useful it is.

I see there is an Alexa app for the iPhone, but I have yet to try it out. I suppose if it is not accessible as easily as Siri is, then it might be a bit counterproductive to use.
 
Actually, Loup Ventures did test Cortana last year and as usual it came dead last:

"Google Assistant continued its outperformance, answering 86% correctly and understanding all 800 questions. Siri was close behind, correctly answering 79% and only misunderstanding 11 questions. Alexa correctly answered 61% and misunderstood 13. Cortana was the laggard, correctly answering just 52% and misunderstanding 19."
Siri works beautifully in my own experience.

Except that in 2017, Loup Ventures (and Gene Munster) put Siri dead last (even worse than Cortana) in that year's Voice Assistant study:

"Three HomePods were subjected to 782 queries by the firm, said analyst Gene Munster. While Siri understood 99.4 percent of them, it was only able to answer 52.3 percent of them correctly. The latter figure compares with rates of 81 percent for Google, 64 percent for Alexa, and 57 percent for Cortana."
So Gene doesn't let being a "heavy bull on Apple" get in the way of his objectivity.
This study wasn't done using "voice assistants" AKA smart speakers though, it was done just using phones, right? Ironically, Siri is far more limited on my HomePod. Not hating, I find she works pretty well under most circumstances on my newer devices.
 
Google Assistant has a much more diverse use case than the others and usually gives audible answers. Useful if you’re in hands free mode.

If, for example, I open Google Assistant on my iPhone and ask, “What does ‘sugoi’ mean in English?”, it answers. Same with questions in other languages, but sometimes I need to specifically state what language I want it to translate from. Siri cannot handle this very well.

Siri is locked in one language, so asking it how to get to Shimbashi station fails, whereas Google Assistant knocks it out of the park.

That said, if you know Siri’s limitations, it works fine for day-to-day requests. If you need to occasionally take it up a notch, just ask Siri to launch Google Assistant.
 
Siri definitely has some catching up to do with answering questions. Ask it a question about markets or futures and it does not know how to answer them. Those should be quite easy to answer. Wish there was an easy way to let Apple know that Siri should know how to answer those basic questions.
 
Siri definitely has some catching up to do with answering questions. Ask it a question about markets or futures and it does not know how to answer them. Those should be quite easy to answer. Wish there was an easy way to let Apple know that Siri should know how to answer those basic questions.
It knows how to look up the height of the Eiffel Tower and of the Empire State Building, but it can't tell me which one is taller.
In the iPadOS beta it doesn't always hear what I am saying, and speech still has some hiccups.
 
I’ve used them all: Siri is great at commands, Google is great for information, and Amazon is decent as well. They're all getting close to each other; in a few years there won't be a great deal of difference, they're all quite good.
 
[*]Local - Where is the nearest coffee shop?
[*]Commerce - Order me more paper towels.
[*]Navigation - How do I get to Uptown on the bus?
[*]Information - Who do the Twins play tonight?
[*]Command - Remind me to call Jerome at 2 pm today

And the responses:

Here's what I found on the web for coffee soup
I can't sort paper trowels in order
Playing Uptown girl by billy joel
I don't know who will win the game tonight
I've updated your reminder to call home


Most of Siri's brain-deadness is in the voice recognition portion of the system.

NAILED IT!
 



For me, Alexa is much more knowledgeable than Siri. I don’t use Siri in fact, I have turned her off on all of my devices.
 
Maybe Google Assistant is so good because of the amount of listening Google does?

I have zero doubt about that. I remember all those permissions from Google over the years on all their smartphones. Updates in the past used to reset settings to defaults, which pretty much gave Google the right to do anything they wanted.
 
You can take the aggregate of all the Macrumors, AppleInsider, 9to5, and Apple Reddit posts and get a sense of at least Apple related forum results.

By definition, if you aggregate the results of Apple-related forum posts you will get a sense of Apple-related fora. What you do not get is 1) an unbiased sample (people may be more likely to post if they have bad results) or 2) any representation of the general public’s usage (enthusiasts’ usage is unlikely to mirror the average user’s).

I think it's erroneous to completely throw out anecdotal because there are some valid inferences made there.

If Apple (or Amazon, Google, Microsoft, etc.) want to look through these results, they may find specific failures they might want to fix or use cases they may not have considered that they would want to add. Drawing general conclusions based on these anecdotes is not that valid because of its poor sample set.

Also do keep in mind though that Loup Ventures (and Gene Munster) is a heavy bull on Apple.

I am pretty sure they publish all the questions and the results. Very easy for others to duplicate.
 