Amazon Alexa Latent Goals Feature Will Predict Customer Objectives and Proactively Suggest Follow-on Actions and Even Aid in Skill Discovery

Amazon revealed today that Alexa can now guess what you will want to do next after certain types of requests. Based on that prediction, Alexa will proactively suggest actions that users can simply confirm instead of having to invoke the voice assistant again to complete a subsequent task. The suggested follow-on actions can be fulfilled by first-party Alexa features as well as by third-party Alexa skills. According to an Amazon spokesperson, CNBC, Harry Potter Trivia, TED Talks, Ultimate History Quiz, and Xbox are examples of third-party Alexa skills taking advantage of the feature today. Some examples provided to Voicebot include:

CNBC

Customer: What’s the stock price of {X}?

Alexa: When the market closed on Monday, {X} traded at $201.15, down 1.99% since the previous close.

Alexa: Would you like to know the intraday high for {X} from the CNBC skill?

TED Talks

Customer: What happened today in history?

Alexa: On May 18, 1980…{answer}

Alexa: By the way, I can search for talks on history in the TED Talk skill. Want me to pull them up?

Ultimate History Quiz

Customer: When did the second world war end?

Alexa: The Second World War ended on September 2, 1945.

Alexa: Want to test your history knowledge with Ultimate History Quiz?

Image caption from Amazon: In this interaction, Alexa infers that a customer who asks about the weather at the beach may be interested in other information that could be useful for planning a beach trip. (Image credit: Glynis Condon)

Alexa Takes a Step Toward Better Skill Discovery

In each of these interactions, the question is posed to Alexa as a first-party service, that is, native Alexa, in case the first-party terminology is unfamiliar. After answering the question, Alexa asks whether the user wants to complete a follow-on action that is fulfilled by a third-party Alexa skill.

For now, the feature apparently does not work in the other direction. According to a clarification provided by Amazon, users inside a third-party Alexa skill will not get suggestions that would take them to other skills. That design sidesteps a potential objection from third-party skill developers, who might otherwise worry about Alexa interrupting a user session. Amazon is not saying the reverse direction will never be implemented, only that it is not part of today's announcement.

Of course, there are more common scenarios where the entire interaction takes place within native Alexa and never hands off to a third-party skill. One example offered by Amazon:

Steeping Tea

Customer: How long does it take to steep tea?

Alexa: Five minutes is a good place to start.

Alexa: Would you like me to set a timer for five minutes?

Alexa Latent Goals addresses three issues in one feature. First, users get added value for engaging with Alexa, which could increase the assistant's perceived value and help them discover new skills. Second, third-party skill developers get some help with the biggest hurdle many face today: discovery by new users. Third, Amazon potentially gets longer user sessions and more interactions, which increase its knowledge of user interests and the share of consumer attention it commands.

How Developers Can Take Advantage of Alexa Latent Goals

Developers who want to take advantage of potential referral traffic from Alexa Latent Goals should use the Name-Free Interaction Toolkit when building their Alexa skill. Amazon relayed to Voicebot that, “The Name-Free Interaction Toolkit allows developers to provide build-time signals, such as name-free intents and utterances, and runtime signals (canFulfillIntentRequest). When a latent goal is identified, Alexa can use these signals to identify target skills that can handle the sub-intents of the identified latent goal.”
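
To make the runtime signal concrete, here is a minimal sketch, assuming a raw AWS Lambda handler rather than the ASK SDK, of how a skill might answer a CanFulfillIntentRequest. The intent name GetIntradayHighIntent and the slot stockSymbol are hypothetical placeholders for your own interaction model, not details from Amazon's announcement.

# Minimal sketch: a raw AWS Lambda handler answering Alexa's CanFulfillIntentRequest.
# The intent and slot names below are hypothetical; substitute your skill's own model.
SUPPORTED_INTENTS = {"GetIntradayHighIntent": {"stockSymbol"}}

def lambda_handler(event, context):
    request = event.get("request", {})
    if request.get("type") == "CanFulfillIntentRequest":
        intent = request.get("intent", {})
        supported_slots = SUPPORTED_INTENTS.get(intent.get("name"))
        can_fulfill = "YES" if supported_slots is not None else "NO"
        # Report, per slot, whether the skill understands and can act on the value.
        slot_report = {
            slot_name: {
                "canUnderstand": "YES" if supported_slots and slot_name in supported_slots else "NO",
                "canFulfill": "YES" if supported_slots and slot_name in supported_slots else "NO",
            }
            for slot_name in (intent.get("slots") or {})
        }
        return {
            "version": "1.0",
            "response": {
                "canFulfillIntent": {"canFulfill": can_fulfill, "slots": slot_report}
            },
        }
    # Normal LaunchRequest / IntentRequest handling would go here.
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": "Welcome."},
            "shouldEndSession": True,
        },
    }

A skill that answers YES here, and also registers name-free intents and utterances at build time, gives Alexa the signals the quoted statement describes for identifying target skills when a latent goal is detected.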

This does not require the use of Alexa Conversations in your skill, a feature that is still in beta. Latent Goals builds on earlier developer features such as Name-Free Interactions and CanFulfillIntentRequest, as mentioned above. While Amazon will be rolling out Latent Goals and passing users along to third-party skills, developers will not have reporting in the near term that distinguishes Alexa referral traffic from direct invocations. Amazon told Voicebot that may come at a later date.

How it Works

Anjishnu Kumar and Anand Rathi of Amazon’s Alexa AI team outlined the technical approaches behind Latent Goals in an Amazon Science blog post this morning. The process starts with a deep-learning-based trigger model that determines whether a latent goal should be anticipated based on “several aspects of the dialogue context, such as the text of the customer’s current session with Alexa and whether the customer has engaged with Alexa’s multi-skill suggestions in the past.”
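
As an illustration only, and not Amazon's actual model, a trigger decision of this kind can be thought of as a classifier over dialogue-context features. The feature names and weights below are invented for the sketch; the production system is a deep learning model over much richer context.

import math

def latent_goal_trigger_score(features):
    """Toy stand-in for the trigger model: a hand-weighted logistic score."""
    weights = {
        "utterance_is_factual_question": 1.4,   # e.g. "When did the second world war end?"
        "matching_skill_available": 1.1,
        "user_accepted_past_suggestions": 0.9,
        "user_dismissed_past_suggestions": -1.6,
    }
    bias = -1.2
    z = bias + sum(weights.get(name, 0.0) * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))  # probability that a latent goal exists

def should_suggest(features, threshold=0.5):
    return latent_goal_trigger_score(features) >= threshold

# A history question from a user who has accepted multi-skill suggestions before.
print(should_suggest({
    "utterance_is_factual_question": 1.0,
    "matching_skill_available": 1.0,
    "user_accepted_past_suggestions": 1.0,
}))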

If a latent goal is likely, the system then considers whether there is a skill that can fulfill that objective. It employs semantic-role labeling to match the objective with Alexa skills and uses context-carryover features to reformulate the request so a skill can interpret it properly even though it didn’t originate in a third-party skill session. Alexa then uses machine learning models based on bandit learning to “track whether recommendations are helping or not, underperforming experiences are automatically suppressed.”
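
That feedback loop can be sketched as a simple multi-armed bandit, here with Thompson sampling plus a suppression rule. This is only a minimal illustration of the idea, not Amazon's implementation, and the candidate suggestion names are invented.

import random

class SuggestionBandit:
    """Toy bandit: track acceptance of each suggestion and suppress poor performers."""
    def __init__(self, candidates, min_trials=20, suppress_below=0.05):
        # Beta(1, 1) prior per candidate: [accepted, ignored] pseudo-counts.
        self.stats = {c: [1, 1] for c in candidates}
        self.min_trials = min_trials
        self.suppress_below = suppress_below

    def _acceptance_rate(self, c):
        accepted, ignored = self.stats[c]
        return accepted / (accepted + ignored)

    def active_candidates(self):
        # Once there is enough data, drop suggestions that customers keep ignoring.
        return [c for c, (a, i) in self.stats.items()
                if (a + i) < self.min_trials or self._acceptance_rate(c) >= self.suppress_below]

    def choose(self):
        candidates = self.active_candidates()
        if not candidates:
            return None  # stay quiet rather than make a suggestion that annoys users
        # Thompson sampling: draw from each Beta posterior and offer the best draw.
        return max(candidates, key=lambda c: random.betavariate(*self.stats[c]))

    def record(self, candidate, accepted):
        self.stats[candidate][0 if accepted else 1] += 1

bandit = SuggestionBandit(["ultimate_history_quiz", "ted_talks_history"])
suggestion = bandit.choose()
bandit.record(suggestion, accepted=False)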

Kumar and Rathi say that some data already “show that latent goal discovery has increased customer engagement with some developers’ skills.” The capability is available today in the United States.

Becoming A Proactive Assistant

This has the potential to be a significant development for Alexa. Voice assistants today are mostly designed to respond to user requests. Amazon has indicated that an increasing number of Alexa activities are now system-initiated as opposed to user-initiated. The most interesting of these is Alexa Hunches, which was first announced in the fall of 2018. That feature considers a user’s pattern of behavior and makes proactive suggestions that may meet an unarticulated user need.

For example, if you ask Alexa to execute the same routine every day when you arrive home, turning on the lights, starting music, and adjusting the thermostat, Alexa might ask whether you want those actions performed automatically whenever it notices the garage door open after 5:00 pm. That would save an utterance, and the home would already be set for the expected ambiance before you walk in the door. It is a small convenience that offers users a bit more personalization.

Amazon also announced Alexa Preference Learning at its product launch event in September. This feature enables you to tell Alexa more about your preferences and interests. This data can then be used to further personalize your Alexa experience.

Alexa Latent Goal identification will surely take in signals from personalization data. However, this is more about adding value to a particular user session than personalizing the experience. It can presumably offer user benefits even with very little or no personalization data. Based on the behavior of other users, Alexa can suggest follow-on actions that can aid in the discovery of new features (and in turn third-party skills) or simply eliminate the need for making an explicit request. This looks more like a true assistant that not only fulfills explicit requests but also anticipates needs and makes completing objectives easier.

The question now is how well this will work. None of the examples provided to me by Amazon or mentioned in the blog post actually delivered the outlined experience when I tried them on my own Alexa-enabled devices. That is not to say they won’t in the future; the system may need more data to learn, or it may have determined from my dialogue context and usage history that a latent objective was unlikely.

In addition, many developers have told me that earlier initiatives such as CanFulfillIntentRequest have shown little efficacy in driving skill discovery to date, which also calls into question their impact on user experience thus far. It remains to be seen how much these new features will change the Alexa experience or the discovery of third-party skills. However, Alexa Latent Goals is another example of how Amazon keeps moving the feature set forward to deliver a more proactive assistant. If it succeeds, it will subtly change how consumers view assistants and could serve as a foundation for adding more proactive agency to Alexa.
