
Google Nest Hub Max Debuts ‘Look and Talk’ and ‘Quick Phrases’ Features for Skipping Wake Words

Google introduced two ways to bypass Google Assistant wake words on the Nest Hub Max smart display at Google I/O this year. The rumored “Look and Talk” feature activates the voice assistant with eye contact, while the previously Pixel-exclusive Quick Phrases don’t require saying “Hey Google” to get Google Assistant’s attention. The tech giant also previewed upgrades to make it easier to converse with the voice assistant, which the company claims more than 700 million people use every month.

Quick AI Eye

The Quick Phrases feature rolling out to the Nest Hub Max first debuted on the Google Pixel 6 and Pixel 6 Pro last year as a way to skip saying ‘Hey Google’ or other wake words for some commands. Users can ask Google Assistant to set and change alarms, control connected lights, and share the time or weather. The feature relies on the Voice Match vocal identification tool to keep the voice assistant under the control of the device owner.
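
To make the idea concrete, here is a minimal sketch of how a fixed quick-phrase whitelist might be matched without a wake word and gated by a speaker-verification check. Every name here is hypothetical and invented for illustration; Google has not published its implementation, and the phrase list below is only a sample of the command types described above.

```python
# Illustrative sketch only, not Google's actual API or logic.
# A small whitelist maps enrolled quick phrases to device actions.
QUICK_PHRASES = {
    "set an alarm": "alarm.create",
    "stop the alarm": "alarm.stop",
    "turn on the lights": "lights.on",
    "turn off the lights": "lights.off",
    "what time is it": "info.time",
    "what's the weather": "info.weather",
}

def handle_utterance(transcript: str, speaker_is_owner: bool):
    """Dispatch a transcript to an action only when it is an enrolled
    quick phrase spoken by a (Voice Match-style) verified owner."""
    if not speaker_is_owner:
        return None  # unverified voices still need the wake word
    return QUICK_PHRASES.get(transcript.strip().lower())

print(handle_utterance("Turn on the lights", speaker_is_owner=True))  # -> "lights.on"
```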

Look and Talk connects the Google Nest Hub Max’s camera to the voice assistant, activating it when the user looks at the screen, a kind of facial recognition version of a wake word. Previously code-named Blue Steel as a reference to the look given by Ben Stiller’s character in the film Zoolander, the feature works from up to five feet away. The voice assistant relies on both Face Match and Voice Match to avoid accidental awakenings, which means those using Guest Mode still need to say the wake word.

“There’s a lot going on behind the scenes to recognize whether you’re actually making eye contact with your device rather than just giving it a passing glance,” Google Assistant vice president Sissie Hsiao explained in a blog post. “In fact, it takes six machine learning models to process more than 100 signals from both the camera and microphone — like proximity, head orientation, gaze direction, lip movement, context awareness and intent classification — all in real time.”
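
A hypothetical sketch of what fusing those signals into an activation decision could look like, loosely following Hsiao’s description. The signal names, thresholds, and structure below are invented for illustration; Google’s models and gating logic are not public.

```python
from dataclasses import dataclass

@dataclass
class Signals:
    proximity_m: float      # estimated distance to the user, in meters
    gaze_on_screen: float   # 0..1 confidence the user is looking at the screen
    lips_moving: float      # 0..1 confidence the user's lips are moving
    face_match: bool        # enrolled face recognized
    voice_match: bool       # enrolled voice recognized
    intent_score: float     # 0..1 classifier: speech is device-directed

def should_activate(s: Signals) -> bool:
    """Combine per-model signals into a single activation decision."""
    if not (s.face_match and s.voice_match):
        return False        # Guest Mode users fall back to the wake word
    if s.proximity_m > 1.5: # feature works from up to ~5 feet (~1.5 m)
        return False
    # Require sustained eye contact plus device-directed speech,
    # so a passing glance doesn't trigger the assistant.
    return (s.gaze_on_screen > 0.8
            and s.lips_moving > 0.5
            and s.intent_score > 0.7)
```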

Natural Chat

Google also previewed a future upgrade to the voice assistant designed to make talking to the AI more like talking to a human. Google Assistant will learn to gracefully handle moments when people say “um,” pause, or interrupt themselves mid-sentence. Instead of users needing to formulate a command in their heads to make sure they say it clearly enough for the AI to understand, Google Assistant will supposedly interpret speech the way people speaking to each other do, automatically filtering out self-corrections and other ways speech differs from writing.
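
As a toy illustration of the filtering problem, here is a sketch that strips filler words from a transcript before intent parsing. Real systems handle disfluencies inside the speech and language models themselves, not with regular expressions; this is only a simplified picture of the input-cleanup idea.

```python
import re

# Match standalone filler words ("um", "uh", "er", "hmm") plus any
# trailing comma/period and whitespace.
FILLERS = re.compile(r"\b(um+|uh+|er+|hmm+)\b[,.]?\s*", re.IGNORECASE)

def clean_transcript(text: str) -> str:
    """Remove filler words and collapse leftover whitespace."""
    text = FILLERS.sub("", text)
    return re.sub(r"\s+", " ", text).strip()

print(clean_transcript("um, play the, uh, new song by um Florence"))
# -> "play the, new song by Florence"
```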

“To make this happen, we’re building new, more powerful speech and language models that can understand the nuances of human speech — like when someone is pausing, but not finished speaking. And we’re getting closer to the fluidity of real-time conversation with the Tensor chip, which is custom-engineered to handle on-device machine learning tasks super fast,” Hsiao wrote. “Looking ahead, Assistant will be able to better understand the imperfections of human speech without getting tripped up — including the pauses, ‘umms’ and interruptions — making your interactions feel much closer to a natural conversation.”
