The 4 Biggest Controversies For Voice Assistants in 2019
2019 has been a banner year for voice technology and voice assistants in many ways. The breadth and scope of the industry have extended into new shapes and verticals at what seems like an ever-increasing pace. The successes, however, have not come without growing pains. The companies behind the triumphs have stumbled more than once in the last 12 months. It’s worth noting some of the biggest snafus and negative publicity, both for how they shaped the landscape and for what they may portend for the coming year.
Contractors Listening to Private Conversations
The question of who is listening to voice assistant recordings, and what they are hearing, has been one of the longest-running controversies faced by voice assistant developers this year. In April, reports emerged that Amazon was sharing Alexa recordings with contractors as part of its quality control and improvement program. The problem was that many users didn’t know about it. On top of that, accidental awakenings meant the voice assistant sometimes recorded private information it was never supposed to hear. Similar revelations followed in July about Google Assistant, as well as Apple’s Siri, Microsoft’s Cortana, and the Clova assistant from Naver. Since then, most of the companies have at least paused their contractor programs and revised how they operate in some form. But while the new terms of service and adjustments to the quality control programs may improve the overall transparency of voice assistants, they don’t necessarily mean people will trust them again. Privacy is frequently cited as important to current and prospective voice assistant users, so these stories may linger and cause trouble for voice assistant makers in the future.
Security Vulnerabilities and the Great Google Action Outage
When Security Research Labs (SR Labs) told Amazon and Google about security vulnerabilities in Alexa skills and Google Actions over the summer, both companies took steps to correct the matter. In Google’s case, that response led to the removal of thousands of Actions worldwide for several days. The disappearance, which came without warning or explanation, caused a furor in the developer community. While the intent was to improve security, the circumstances generated a lot of ill will toward Google, especially as Amazon closed the equivalent holes in Alexa without any comparable disruption. Voice assistant security is a real concern, and the widespread adoption of voice assistants means incidents like this matter more than they once did, as does how they are handled. It’s why stories about using lasers to hijack voice assistants get so much attention, even if the actual danger of that happening is minimal.
Alexa’s Children’s Privacy Lawsuits
Privacy debates over voice assistants moved into the courtroom this year when Amazon found itself facing two lawsuits from parents alleging that Alexa was violating the privacy of their children. The California and Washington State lawsuits argue that recordings made and retained by Alexa are collected without parental consent and therefore violate privacy laws. Amazon only began offering child-focused Alexa content after the FTC issued updated guidance in 2017 on how the Children’s Online Privacy Protection Act (COPPA) applies to voice recordings, and the suits contend that Amazon’s retention of those recordings still lacks the consent the law requires. Awkwardly, the two lawsuits arrived just as Amazon was debuting its newest Echo Dot Kids Edition.
Google Duplex Faking AI
Google made a very big deal of its new Duplex feature when it launched at the end of last year, expanding it to 43 states in March. Duplex is supposed to use a voice AI to call businesses on the user’s behalf to make reservations or ask questions. As was exposed in May of this year, however, humans were often the ones making the calls. That Google couldn’t trust Duplex to handle all of the calls was bad enough, but hiding that humans were propping up the feature made the company look duplicitous. Once again, the error came down to trust and transparency, and how a lack of either makes voice assistant developers look bad.