Google Contractors Listen to Belgian and Dutch Voice Assistant Conversations
Google contractors are listening to conversations recorded by Google Assistant in Belgium and the Netherlands, and several recordings were triggered without a wake word. The news, first reported this week by Belgian broadcaster VRT NWS, highlights the ongoing privacy debate around voice assistants, and just how unaware consumers are of what their smart speakers and other microphone-equipped devices are doing in the background.
Recording Without a Trigger
VRT NWS reported that it listened to more than 1,000 recordings from Google Assistant. An employee of a Google subcontractor showed the reporters how the system works and said that thousands of people around the world have access to it. The point is to improve speech recognition by checking and comparing what is said against how Google Assistant understands the command, helping the program better grasp how people speak and how to distinguish speech from other noises.
More alarming, the news outlet reported that 153 of the recordings it gained access to were not conversations with Google Assistant, but were somehow recorded anyway. Beyond innocent background conversations, the recordings included people being intimate with each other and sharing private information. One contractor said he had even heard what may have been a case of violence recorded by a Google Assistant-enabled device.
Google’s terms and conditions state that audio recordings are made when you start a conversation with Google Assistant on a smartphone or other device by using its wake word. What has been suspected, but not made explicit before, is how widely access is given to Google employees and contractors to the recordings, including information that shouldn’t have been recorded.
“We partner with language experts around the world to improve speech technology by transcribing a small set of queries — this work is critical to developing technology that powers products like the Google Assistant,” a Google spokesperson told Voicebot in an official statement. “Language experts only review around 0.2% of all audio snippets, and these snippets are not associated with user accounts as part of the review process.”
The officially reviewed snippets may be anonymized so that the contractors don’t know who is doing the talking, but VRT NWS used the information people shared with the voice assistant to easily track the speakers down and confirm it was their voice.
Google has been at pains to assure consumers that it is not eavesdropping on personal conversations, even those captured in the background after the device's wake word is triggered. The company offers a variety of tools to boost privacy and delete recordings, but the general public typically lacks rigorous digital security habits, so it may be some time before people routinely customize their privacy settings. Google does claim to take any breach of security very seriously, however.
“We just learned that one of these reviewers has violated our data security policies by leaking confidential Dutch audio data. Our Security and Privacy Response teams have been activated on this issue, are investigating, and we will take action,” the Google spokesperson said. “We are conducting a full review of our safeguards in this space to prevent misconduct like this from happening again.”
Google is not alone in running these programs. Amazon's Alexa and Apple's Siri assistants operate similarly, and for similar purposes. But people remain very concerned about privacy when it comes to digital assistants. Amazon has taken steps such as creating an Alexa privacy center website and implementing a voice command to delete Alexa recordings.
Nonetheless, a recent study by Microsoft found that 41 percent of voice assistant users are concerned about their privacy due to passive listening by their devices. Privacy concerns have even sparked individual efforts to build anti-passive-listening devices like Project Alias, ClappervsAlexa, and the Mycroft Kickstarter.
Currently, Amazon is facing lawsuits and FTC complaints alleging that it is violating the Children's Online Privacy Protection Act (COPPA), and governments are debating how to protect privacy while angling to use data from smart speakers for law enforcement. German officials are arguing over using voice assistant data in courtrooms, and Amazon fought a court order to hand over recordings from an Echo device to law enforcement in an Arkansas murder case, eventually complying only after the defendant consented.
With smart speaker ownership growing fast, up 40 percent last year according to a Voicebot survey, the fight over voice privacy isn’t likely to be resolved easily. The secretive nature of how voice assistants record and share data, and their imperfect selectivity in what they record, only makes revelations like this more explosive, and the need for useful solutions sharper.