Amazon Developing AI Chip to Make Alexa Faster
The Information is reporting (N.B. paywall) that Amazon is developing its own AI chip specifically designed for use by the Alexa voice assistant in Amazon Echo smart speakers. Article author Aaron Tilley commented:
The chip should allow Alexa-powered devices to respond more quickly to commands, by allowing more data processing to be handled on the device than in the cloud… Developing a chip that can run AI algorithms could make Alexa-powered devices more appealing to consumers. It would mean the devices could handle some of the AI processing instead of shipping everything to the cloud.
It’s About Less Reliance on the Cloud
Local processing is the key factor behind this move. Alexa today is powered primarily from the cloud. Beyond the home, however, there are many use cases that require robust on-device processing, and that is particularly important for serving the automotive market. Voicebot was the first to report on Amazon Auto and its intent to implement offline use cases for Alexa. The same reporting also revealed that the capability would likely extend to smart speakers as well. From Voicebot's CES reporting:
“I caught up with Tom Taylor [senior vice president Amazon Alexa] after the [Panasonic] presentation and he said that the offline capabilities are important for automotive and will eventually show up in other products as well, like Amazon Echo. He mentioned that it would enable you to use Alexa for local functions even if the internet connectivity was down. An Amazon spokesperson confirmed that ‘this isn’t exclusive to cars.’ No launch date has been named.”
More Offline Use Cases
Currently, local Alexa processing is predominantly limited to wake word recognition, which determines whether a user is attempting to activate the assistant. Amazon then performs wake word verification in the cloud to confirm that the detection wasn’t a false positive. This secondary cloud verification was introduced by Amazon in May 2017 and materially reduced false positives. If Amazon intends to keep minimizing false wake word activations in offline use cases, that verification step will need to run on the device itself, which requires greater local processing power.
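To make that two-stage flow concrete, here is a minimal sketch in Python, assuming hypothetical `on_device_detector` and `cloud_verifier` components and an illustrative confidence threshold. This is not Amazon's implementation, only the shape of the decision a more capable on-device chip could change.

```python
# Minimal sketch of two-stage wake word handling.
# Component names, scores, and thresholds are illustrative assumptions only.

def on_device_detector(audio_frame: bytes) -> float:
    """Stand-in for a low-power on-device model returning a wake word score."""
    return 0.92  # placeholder score for illustration

def cloud_verifier(audio_clip: bytes) -> bool:
    """Stand-in for the secondary cloud verification described above."""
    return True  # pretend the cloud model confirms the activation

def should_activate(audio_frame: bytes, local_threshold: float = 0.85,
                    verify_locally: bool = False) -> bool:
    """Return True if the assistant should wake up.

    With a more capable on-device chip (verify_locally=True), the second
    verification pass could run locally instead of requiring the cloud.
    """
    score = on_device_detector(audio_frame)
    if score < local_threshold:
        return False  # no wake word detected
    if verify_locally:
        # Hypothetical on-device second-pass check replacing the cloud call.
        return score > 0.90
    return cloud_verifier(audio_frame)

if __name__ == "__main__":
    print(should_activate(b"\x00" * 320))                        # cloud-verified path
    print(should_activate(b"\x00" * 320, verify_locally=True))   # offline path
```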
Then there are the other functions that today go to the cloud. You will always need cloud access for the latest weather information or breaking news, but other functions could be executed in an offline operating mode if the device chip is powerful enough. Any number of first-party skills native to Alexa could be delivered locally with the right level of processing power.
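As a rough illustration of that idea, the sketch below routes requests between hypothetical local handlers and the cloud. The skill names, the connectivity check, and the handlers are all assumptions for illustration, not Amazon's actual design.

```python
# Hypothetical routing of built-in skills to local handlers when offline.

LOCAL_SKILLS = {
    "set_timer": lambda slots: f"Timer set for {slots.get('minutes', 5)} minutes.",
    "lights_off": lambda slots: "Turning the lights off.",
}

def cloud_available() -> bool:
    """Stand-in connectivity check; a real device would probe the network."""
    return False  # pretend we are offline

def cloud_handler(intent: str, slots: dict) -> str:
    """Placeholder for the cloud round trip used for everything else."""
    return f"(cloud) handled {intent}"

def handle_request(intent: str, slots: dict) -> str:
    # Prefer a local handler when one exists; otherwise fall back to the cloud.
    if intent in LOCAL_SKILLS:
        return LOCAL_SKILLS[intent](slots)
    if cloud_available():
        return cloud_handler(intent, slots)
    return "Sorry, I can't do that while offline."

print(handle_request("set_timer", {"minutes": 10}))
print(handle_request("weather_today", {}))
```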
Beyond this, there is the natural language processing (NLP) pipeline, which combines automatic speech recognition (ASR) and natural language understanding (NLU). A more powerful AI chip won’t match the breadth of Alexa’s cloud-based understanding, but specific domains could be understood correctly on the device without cloud access.
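To make the "specific domains" point concrete, here is a toy domain-limited local understanding step, assuming a hypothetical keyword matcher. A real on-device system would use trained ASR and NLU models, but the routing idea is the same: handle a few domains locally and defer everything else to the cloud.

```python
# Toy domain-limited NLU: classify a few local intents by keyword,
# otherwise defer to the cloud. Domains and keywords are illustrative.
from typing import Optional

LOCAL_INTENTS = {
    "timer": ["timer", "remind me in"],
    "smart_home": ["lights", "thermostat", "lock the door"],
    "volume": ["louder", "quieter", "volume"],
}

def classify_locally(utterance: str) -> Optional[str]:
    text = utterance.lower()
    for intent, keywords in LOCAL_INTENTS.items():
        if any(keyword in text for keyword in keywords):
            return intent
    return None  # outside the on-device domains

def understand(utterance: str) -> str:
    intent = classify_locally(utterance)
    if intent is not None:
        return f"local intent: {intent}"
    return "defer to cloud NLU"

print(understand("turn the lights off in the kitchen"))  # local intent: smart_home
print(understand("who won the game last night"))         # defer to cloud NLU
```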
Faster Response Time
Finally, there is speed. Reliance on the cloud introduces a delay in a user’s communication with Alexa. The round trip to the cloud and back is added to the processing time, which makes voice assistant responses noticeably slower than human conversation. An AI chip optimized for Alexa could enable more local processing of user queries and faster response times. The Information article quotes Chris Rowen, CEO of BabbleLabs, as saying:
The experience can be so much better if you move a substantial part of the speech recognition to the device.
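As a back-of-the-envelope illustration of why moving recognition onto the device helps, the sketch below simply adds up the pieces of a response. The millisecond figures are assumed values for illustration, not measured Alexa numbers.

```python
# Back-of-the-envelope response latency, cloud vs. on-device.
# All timings are assumed values for illustration only.

def total_latency_ms(network_rtt_ms: float, processing_ms: float) -> float:
    """Total response time is the network round trip plus processing time."""
    return network_rtt_ms + processing_ms

cloud = total_latency_ms(network_rtt_ms=150, processing_ms=300)  # assumed round trip
local = total_latency_ms(network_rtt_ms=0, processing_ms=350)    # no round trip

print(f"cloud path: {cloud:.0f} ms")
print(f"local path: {local:.0f} ms")
```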
Faster processing is a usability feature that every voice assistant is aiming to deliver. This factor, along with offline processing, is a key motivator behind Google and Apple also developing their own AI chips. The battle for voice assistant technology doesn’t only involve new languages and skills. New chip technology is another area where investment can pay off in a better user experience and broader use case applications.
Cover image credit: Nanalyze