Teachable Alexa

Amazon Rolls Out ‘Teachable AI’ Feature to Turn Alexa into a Student of Your Preferences

Alexa will now ask follow-up questions when it doesn’t understand a command or request, thanks to the new Teachable AI feature Amazon introduced on Friday. The feature, which Amazon demonstrated during its September devices event, lets users instruct the voice assistant about their preferences directly by voice instead of manually setting them up in the app or rephrasing requests to work around gaps in Alexa’s knowledge.

Schooling Alexa

Teachable AI essentially lets a user program definitions into Alexa in real time by voice. Whenever the voice assistant doesn’t recognize a reference, it asks the user to define it. The first phase of the feature covers smart home devices like lights and thermostats but will eventually extend to other kinds of commands. It works a bit like setting up keywords for Alexa Routines, except that instead of responding to a rote trigger, Alexa begins a conversation to understand what the custom command means. As an example, Alexa head scientist Rohit Prasad described asking for “Rohit’s reading mode.” Alexa won’t know what that is, but it will ask and learn that it means lowering the lights to 40% brightness.
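The interaction described above can be sketched in miniature: an unknown phrase triggers a clarifying question rather than a failure, and the answer is stored per account. This is purely an illustrative Python sketch; the class, method names, and command strings are hypothetical and not Amazon's actual API.

```python
# Hypothetical sketch of interactive teaching. SmartHomeTeacher and its
# methods are invented for illustration; they are not part of any Alexa SDK.

class SmartHomeTeacher:
    """Stores user-taught definitions for otherwise-unknown phrases."""

    def __init__(self):
        # Per-account store: phrase -> list of device actions
        self.definitions = {}

    def handle(self, phrase):
        if phrase in self.definitions:
            return {"status": "execute", "actions": self.definitions[phrase]}
        # Unknown phrase: ask the user to define it instead of failing
        return {"status": "clarify",
                "prompt": f"I don't know '{phrase}'. What should it do?"}

    def teach(self, phrase, actions):
        # Learned instantly and stored only for this account
        self.definitions[phrase] = actions


teacher = SmartHomeTeacher()
print(teacher.handle("Rohit's reading mode")["status"])   # clarify
teacher.teach("Rohit's reading mode",
              [{"device": "living-room lights", "brightness": 40}])
print(teacher.handle("Rohit's reading mode")["status"])   # execute
```

The key design point mirrored here is that nothing is preconfigured: the definition only exists after the clarifying exchange, and subsequent uses of the same phrase skip the question entirely.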

“This new capability helps Alexa get smarter by asking questions to fill gaps in her understanding—just like we do as humans,” Prasad wrote. “With interactive teaching, Alexa learns these definitions and associated actions instantaneously, and they are stored only for your account for future use.”

Alexa can even be taught how humans use declarative statements as indirect commands. Telling Alexa that it’s too dark in a room will prompt the voice assistant to ask whether the user wants the lights turned on or the curtains opened. Alexa then learns that being told it’s too dark is an indirect order to turn on the lights, giving it the appearance of intuition to anyone who didn’t witness the original teaching moment.
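The one-time clarifying exchange for declarative statements could look something like the sketch below. The function and the mapping are hypothetical illustrations, assuming a simple per-account statement-to-action store.

```python
# Illustrative only: a declarative statement ("it's too dark") becomes a
# learned command after a single clarifying exchange. Names are invented.

learned_intents = {}  # statement -> action, stored per account

def respond(statement, chosen_action=None):
    if statement in learned_intents:
        # Already taught: act immediately, no question needed
        return f"executing: {learned_intents[statement]}"
    if chosen_action is None:
        # First encounter: ask which action the statement implies
        return "Would you like me to turn on the lights or open the curtains?"
    # The user answered; remember the mapping for next time
    learned_intents[statement] = chosen_action
    return f"executing: {chosen_action}"


print(respond("it's too dark"))                        # asks the clarifying question
print(respond("it's too dark", "turn on the lights"))  # executing: turn on the lights
print(respond("it's too dark"))                        # executing: turn on the lights
```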

Indirect Inference

The feature is reminiscent of the Listen Learner technique developed by researchers from Apple and Carnegie Mellon University. Listen Learner keeps a smart speaker’s microphone on continuously and has the AI ask its owner what each sound it hears means. The idea is that the AI would learn, for example, what a dripping faucet sounds like and report it to the owner before they even realize there’s a problem. Though aimed more at environmental monitoring, the direct teaching method fits with Alexa’s newly inquisitive nature.

Teachable AI is only part of a larger AI enhancement Amazon announced in September. The indirect-command aspect of Teachable AI connects to the latent goal inference feature that debuted last month. Latent goal inference similarly gets the voice assistant to extrapolate when a request or goal is hinted at obliquely. Asking a history question might prompt Alexa to suggest a history quiz game, and asking how long something takes to cook might lead Alexa to offer to set a timer for that length of time. Both features will benefit next year when Amazon adds Natural Turn Taking to Alexa, enabling the voice assistant to continue a conversation without requiring repeated use of the wake word.
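The cooking-timer example of latent goal inference can be illustrated with a toy heuristic: if the answer to a cooking question contains a duration, offer a follow-on timer. This is a deliberately simplistic sketch under invented assumptions; Amazon's actual system uses learned models, not keyword rules.

```python
import re

# Hypothetical heuristic for latent goal inference: when a cooking-related
# question gets an answer containing a duration, suggest a matching timer.

def suggest_follow_on(question, answer):
    if "cook" in question.lower():
        match = re.search(r"(\d+)\s*minutes", answer)
        if match:
            return f"Would you like me to set a timer for {match.group(1)} minutes?"
    # No latent goal detected: stay silent rather than over-suggest
    return None


print(suggest_follow_on("How long does it take to cook rice?",
                        "About 20 minutes"))
# Would you like me to set a timer for 20 minutes?
```

Knowing when not to suggest anything is as important as the suggestion itself, which is why the sketch returns None for unrelated exchanges.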
