Google Apologizes for Sparking Voice Assistant Privacy Concerns, Announces Policy Changes
Google announced changes to the privacy policy for Google Assistant in a blog post on Monday, following the July revelations that contractors hired by Google listen to snippets of audio recorded by the voice assistant. The updated policy aims to rebuild trust in Google Assistant after a summer in which nearly every voice assistant developer came under criticism for how it stores and shares audio recordings, and two months after Google announced it was pausing its contractor program for review. The company also apologized, though not for the program itself but for failing to communicate how it handles customer data.
“It’s clear that we fell short of our high standards in making it easy for you to understand how your data is used, and we apologize. When we learned about these concerns, we immediately paused this process of human transcription globally to investigate, and conducted a full review of our systems and controls.”
Partial Privacy Changes for Google Assistant
Google’s biggest single change to how it handles audio recordings is that Voice & Audio Activity (VAA) data storage is now an opt-in program. Users must explicitly permit Google to record and store snippets of their audio. The audio reviewed by Google employees and contractors is used to improve how well Google Assistant understands the user’s voice and to enhance its recognition of different languages and accents.
It’s especially notable that the opt-in requirement applies to existing Google Assistant users as well as new ones. Anyone who wants to activate VAA will have to change their current settings, and Google is making a point of explicitly stating in those settings that humans may hear small bits of audio in order to improve the voice assistant. Google claims it has never saved recordings by default to send for review, although that claim may only mean it didn’t store recordings tagged as false wake-ups. Users can still review and delete any interaction they have with Google Assistant online, and Google said it will now automatically delete much more of the data it collects.
Depending on how many people choose to opt in to VAA, the share of each individual’s audio that gets reviewed may rise sharply. According to Google, humans review only 0.2% of the audio recorded by Google Assistant. If only 20% of Google Assistant users agree to join the program, Google would need to review 1% of each participant’s audio to match 0.2% of all audio recorded.
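That 1% figure follows from a simple proportion, assuming for the sake of illustration that the total volume of reviewed audio stays roughly constant and that opted-in users generate audio at the same rate as everyone else (both are assumptions of this sketch, not figures from Google):

\[
\text{reviewed share per participant} \;=\; \frac{0.002 \times A}{0.20 \times A} \;=\; 0.01 \;=\; 1\%
\]

where \(A\) is the total volume of audio recorded by Google Assistant. Fewer opt-ins would push that per-participant share even higher.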
Not Quite Apple, More Than Amazon
Google’s changes to its voice review program split the difference between the responses of Apple and Amazon, two of its biggest voice assistant competitors. Amazon has made an effort to help people understand its privacy rules, including how audio recordings are reviewed, but Alexa still records audio interactions unless people choose to opt out of the program. Meanwhile, Apple apologized for its review system a month ago and set up an opt-in system like Google’s new method. But Apple went a step further, ending its contractor program and restricting the data to Apple employees. That better defines the chain of responsibility for the recordings, because Apple knows precisely who has access to which recordings.
The test for all of these approaches is how much customers will trust these companies after their various privacy missteps. New privacy safeguards will reassure some, but they may not be enough to lure back previous users or attract new ones who want assurances about their privacy. Voice assistant developers will need to keep working to reassure their customer base, lest the technology’s potential be permanently damaged by how companies treat user data.