Google Assistant Plans Personalized Speech Recognition Feature
Google Assistant is working on a new feature called personalized speech recognition to improve how it adapts to the terms and names a user commonly says, as first spotted by 9to5Google. The upcoming feature appears in code within the Google app for Android, though it doesn’t yet have a specific release date attached.
Recognizing People
The update appears to build on the “Hey Google” hotword accuracy improvements the company introduced last year to reduce accidental awakenings, but it extends beyond just the wake word. Personalized speech recognition would store audio recordings on the device to better train Google Assistant to recognize what a user says, much as a user can manually recite the name of a contact so that the AI recognizes the pronunciation when asked to call that contact. The user can delete the data at any time, and it won’t be sent to the cloud, which should reassure those concerned about the privacy implications. The new feature is reminiscent of how the Google Nest Hub and Google Nest Mini locally process common requests to respond more quickly. As the description of the feature reads, “Store audio recordings on this device to help Google Assistant get better at recognizing what you say. Audio stays on this device and can be deleted any time by turning off personalized speech recognition.”
Google Assistant Learns
The feature’s development comes as the tech giant continues to push forward with its conversational AI. The company showcased the new LaMDA 2 model and introduced the new AI Test Kitchen app for experimenting with it at Google I/O 2022. LaMDA 2 improves the AI’s ability to engage in open-ended conversations, while the AI Test Kitchen lets those who apply and are chosen by Google try out new or unreleased AI tools and relay their ideas and criticisms directly to Google. Google Assistant also upgraded the Nest Hub Max smart display with new, personalized ways to start conversations. The “Look and Talk” feature activates the voice assistant through eye contact alone, while Quick Phrases don’t require saying “Hey Google” to get Google Assistant’s attention. In addition, a future upgrade will teach Google Assistant to ignore when people say “um,” pause, or otherwise interrupt themselves. Instead of needing to formulate a command in their head to make sure they say it clearly enough for the AI to understand, users will supposedly be able to speak to Google Assistant the way people speak to each other, since humans automatically filter out self-corrections and other ways speech differs from writing.