Alexa in Cars Becomes a Student Driver Learning Personalized Voice Commands
Amazon is taking Alexa’s Teachable AI feature on the road, giving the automotive version of the voice assistant the ability to ask follow-up questions, infer meaning from a driver’s statements, and learn their preferences through direct instruction.
Alexa’s Teachable AI upgrade for cars arrives a year after the feature first rolled out, following a demonstration at the 2020 device event. The idea is to speed up and enhance the voice assistant’s personalization by making it adapt to the user instead of requiring the user to work around Alexa’s knowledge gaps. Alexa can learn how a driver prefers to speak either directly or indirectly. The indirect method uses follow-up questions to statements the driver makes to the voice assistant, such as saying they feel too hot or cold. Despite the lack of a direct command, Alexa will respond by asking the driver if they want the AI to roll the window down or adjust the car’s temperature setting.
The same goes for explicit instructions phrased in non-standard ways. Amazon describes a driver telling Alexa to set the seat to “my position” or to “set the AC to full blast.” Those are not commands Alexa understands out of the box, but the AI will ask what they mean and remember the answer for future requests. Alexa will even connect newly learned terms to relevant synonyms, so the driver can say they are boiling or frozen instead of just too hot or cold. The driver can also tell Alexa to forget any of the new commands and vocabulary by asking it to delete the last thing it learned or everything the driver has taught it.
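The learn-ask-remember loop described above can be sketched in a few lines of code. This is purely an illustrative model, not Amazon’s actual implementation: the phrases, action names, and synonym table are all assumptions made up for the example.

```python
# Illustrative sketch only: a minimal "teachable" command layer in the spirit
# of the feature described above. All names and mappings are hypothetical,
# not Amazon's actual Alexa implementation.

class TeachableAssistant:
    # Commands the assistant understands out of the box (hypothetical).
    BUILT_IN = {
        "roll the window down": "open_window",
        "set temperature": "adjust_climate",
    }
    # Indirect statements answered with a suggestion rather than an action.
    STATEMENTS = {
        "too hot": "offer: adjust_climate",
        "too cold": "offer: adjust_climate",
    }
    # Synonyms normalized before lookup (e.g. "boiling" -> "too hot").
    SYNONYMS = {"boiling": "too hot", "frozen": "too cold"}

    def __init__(self):
        # Phrases the driver has taught, newest last.
        self.learned = []  # list of (phrase, action) pairs

    def handle(self, phrase):
        phrase = self.SYNONYMS.get(phrase, phrase)
        # Taught phrases take priority; the most recent lesson wins.
        for taught, action in reversed(self.learned):
            if taught == phrase:
                return action
        if phrase in self.BUILT_IN:
            return self.BUILT_IN[phrase]
        if phrase in self.STATEMENTS:
            return self.STATEMENTS[phrase]
        return None  # unknown: the caller would start a teaching session

    def teach(self, phrase, action):
        self.learned.append((phrase, action))

    def forget_last(self):
        if self.learned:
            self.learned.pop()

    def forget_all(self):
        self.learned.clear()
```

In this toy model, an unrecognized phrase like “set the AC to full blast” returns `None`, prompting a teaching session; once taught, the phrase resolves to the learned action, and `forget_last` or `forget_all` mirrors the deletion options the article describes.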
“When humans converse, we recognize the participants, are able to connect the dots to understand the context, adapt to the conversational style of our audience, ask for clarifications, and contribute to the discussion with our own suggestions. We envision conversations with Alexa having the same intelligence and personality as if speaking with a family member or friend sitting in the adjacent seat,” Amazon senior product manager Ajinkya Gore explained in a blog post. “Now, customers can tailor Alexa to their natural way of speaking, thereby personalizing Alexa to suit their everyday vocabulary. When Alexa encounters an unfamiliar voice command, the AI will recognize the opportunity to learn, initiate a teaching session to understand the customer’s preferred outcome and retain the learning for future use.”
Alexa Car Equality
Amazon pushed out the Teachable AI upgrade barely a week after releasing Alexa Auto SDK 4.0. The updated platform deployed a more multimodal approach by expanding the Alexa Presentation Language (APL) and its visual options from home devices to the road. Experiences built with APL can adjust visual elements to their environment, allowing Alexa to change what information is displayed, and how, based on lighting conditions and whether the car is moving. Both upgrades are limited in their operating domains at launch, with APL centering on sharing local info and running smart home devices, while Teachable AI centers on environmental controls, windshield wipers, and interior lighting. In both cases, more domains and features are slated to join the list.
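The environment-aware rendering described above can be illustrated with a small sketch. The context fields and thresholds here are assumptions invented for the example, not part of the actual APL specification; they just show the kind of rule a template might encode.

```python
# Illustrative sketch only: how a multimodal layer might adapt a car's
# display to its environment, in the spirit of APL's context-aware
# templates. The field names and thresholds are hypothetical.

def choose_display(moving: bool, ambient_lux: float) -> dict:
    """Return hypothetical display settings for an in-car screen."""
    # Dim cabins get a dark theme; bright daylight gets a light one.
    theme = "dark" if ambient_lux < 50 else "light"
    if moving:
        # While driving: glanceable layout, minimal on-screen items.
        return {"theme": theme, "layout": "glanceable", "max_items": 1}
    # Parked: the full, detailed layout is safe to show.
    return {"theme": theme, "layout": "detailed", "max_items": 5}
```

For example, a moving car at night would get a dark, single-item glanceable view, while a parked car in daylight would get the full detailed layout.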
As the rapid-fire release schedule suggests, Amazon is keen to invest in automotive AI. Fiat Chrysler created the first Alexa Custom Assistant in February, followed by Garmin’s new voice assistant. Soon after, Nissan became the first to set up Amazon’s Connected Vehicle Skill API, and Ford announced plans in May to integrate Amazon Alexa directly into hundreds of thousands of vehicles over the air. Amazon also brought the Fire TV for Auto platform to Jeep and turned the new Range Rover into a mobile Echo smart speaker.
“We are always working to improve the in-cabin experience and our vision is to allow customers to converse with Alexa as if they are speaking with family or a friend,” Gore said. “Alexa’s goal for on-the-go experiences is to make driving easy, stress-free and enjoyable experience. This feature minimizes the burden of memorizing supported voice commands and thus, reduces cognitive load while driving. Customers now have more flexibility in how they interact with Alexa to complete popular tasks while keeping eyes on the road and hands on the wheel.”