Google Assistant is Still the Smartest Voice Assistant According to New Study
SEO firm Stone Temple says that Google Assistant continues to outpace Amazon Alexa, Apple Siri, and Microsoft Cortana in both the number of questions it attempts to answer and the number it answers correctly. The company asked each assistant the same 4,952 questions again in 2018 and this time differentiated between Google Assistant on a smartphone and through Google Home. The results show Google Assistant on a smartphone significantly outperformed all other voice assistants in both answer attempts and correct answers.
The study methodology asked each voice assistant identical questions and determined whether it:
- Answered verbally
- Drew the answer from a database (e.g. the Knowledge Graph)
- Acknowledged a third-party source (e.g. “According to Wikipedia…”)
- Understood the question
- Attempted to answer the question, and whether it did so correctly or incorrectly
“All four of the personal assistants include capabilities to help take actions on your behalf (such as booking a reservation at a restaurant, ordering flowers, booking a flight), and that was not something we tested in this study. Basically, we focused on testing which of them was the smartest from a knowledge perspective.”
This is an important note from the study: it is a knowledge test, not a capabilities test. That plays to Google’s strength in search, and the voice assistant did not disappoint. Google Assistant led in every category.
Cortana Matches Google Home and Alexa Steps Up its Game
Stone Temple conducted a similar study in 2017 using the same corpus of questions, so there is trend data as well as head-to-head comparative data. Google Assistant was tested only on Google Home in 2017, but in 2018 Stone Temple decided to test it on a smartphone as well to understand any difference in performance, even though both surfaces employ the same underlying technology. Given the results, that turned out to be a good decision. The year-over-year data yielded three additional findings of note:
- Microsoft Cortana on the Harman Kardon Invoke smart speaker rose to nearly match Google Home in the number of queries it attempted to answer (rising from 53.9% to 64.5%)
- Amazon Alexa rose from just 19.8% of questions attempted in 2017 to 53%, an increase of 2.7 times.
- Alexa, Google Assistant on Google Home and Siri all showed a decline in the percent of answers deemed “fully and correctly answered”
The study also pointed out two findings related to answer accuracy. Microsoft Cortana was the only voice assistant that actually increased its response accuracy over the year; its share of questions rated “fully and correctly answered” rose to 92.1%. And, “Alexa decreased its fully and correctly answered level a bit, but in light of increasing its attempted responses by 2.7x, this decrease is not a bad result.”
Featured Snippets are Common But Not Everything
Stone Temple also pointed out that answers including featured snippets remained plentiful and nearly identical to the 2017 results, where between 40% and 50% of responses were drawn from featured snippets. The exceptions were Cortana, which used featured snippets somewhat less in 2018, and Siri, which used them more. The Siri rise may be attributed to the voice assistant’s switch in late 2017 from Microsoft Bing to Google Search.