LeakyPick

Can a Device That Detects Voice Assistant Transmissions Reduce Privacy and Hacking Concerns?

Voice assistants are facing more scrutiny than ever, including a new European Union inquiry into potential antitrust violations and ongoing U.S. lawsuits over children’s privacy. Academic researchers have now built a tool that could assuage some of those privacy concerns. The device, named LeakyPick, probes smart speakers and other voice assistant-enabled products, monitors their network transmissions, and alerts users when audio is being sent to the cloud without their knowledge.

LeakyPick Privacy

The LeakyPick, built by researchers from Darmstadt University, the University of Paris-Saclay, and North Carolina State University, resembles a simple box filled with circuit boards. The prototype costs only about $40 and is designed to be placed in every room where voice assistant-enabled tech operates. The device periodically plays sounds to test the voice assistants, then observes network traffic and alerts the owner when audio is sent to the cloud. It is configured to run its tests only when no one is home, so the periodic noises don’t become annoying enough to make LeakyPick unwelcome in the house. The researchers claim that their prototype detects audio transmissions with 94% accuracy.
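The basic probe-and-measure loop is easy to picture. Below is a minimal Python sketch of the idea, assuming a machine running the scapy packet library from a vantage point that can see the speaker’s traffic (e.g. the house router or a mirrored switch port); the device MAC address, the probe file, and the 3x burst threshold are all illustrative assumptions, not the researchers’ actual detection model.

```python
# Minimal sketch of LeakyPick-style probing. Assumes scapy is installed and
# the capture interface can see the smart speaker's traffic.
# DEVICE_MAC, PROBE_WAV, and the 3x threshold are illustrative assumptions.
import subprocess
import time

from scapy.all import Ether, sniff

DEVICE_MAC = "aa:bb:cc:dd:ee:ff"   # hypothetical smart speaker MAC address
PROBE_WAV = "probe.wav"            # hypothetical audio probe file
WINDOW_S = 5                       # seconds of traffic to measure

def outbound_bytes(window_s):
    """Total bytes the speaker sends on the wire during the window."""
    pkts = sniff(timeout=window_s,
                 lfilter=lambda p: p.haslayer(Ether)
                 and p[Ether].src.lower() == DEVICE_MAC)
    return sum(len(p) for p in pkts)

# 1. Measure the device's idle (baseline) traffic.
baseline = outbound_bytes(WINDOW_S)

# 2. Play an audio probe (e.g. a candidate wake word) into the room.
subprocess.run(["aplay", PROBE_WAV], check=True)
time.sleep(1)  # give the device a moment to react

# 3. Measure again; a large burst suggests audio is being uploaded.
after = outbound_bytes(WINDOW_S)
if after > 3 * max(baseline, 1):   # crude heuristic, not the paper's model
    print(f"Possible audio upload: {after} B after probe vs {baseline} B idle")
```

A real detector would want a sturdier statistical test than a fixed multiplier, since idle traffic volume varies considerably from device to device.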

LeakyPick, named for its function of picking up audio leakage, serves a second purpose for researchers: probing for the words and sounds that will wake a voice assistant. The LeakyPick team has already found 89 words that awaken Alexa and send audio to Amazon’s cloud. That’s nowhere near the more than 1,000 words and phrases recently compiled by other researchers that inadvertently awaken voice assistants, but it is exactly the kind of probing that lends itself to automation, as the sketch at the end of this section illustrates.

The research exemplifies why smart speaker owners are concerned about privacy. Last summer’s non-stop series of revelations that contractors were listening to audio recordings made by voice assistants upset people because they didn’t know that private information was being picked up inadvertently. The resulting program revisions theoretically addressed the issue, but the Irish whistleblower who first sounded the alarm about Siri’s program recently unmasked himself, saying not enough was being done. That led Ireland’s Data Protection Commissioner to start looking into possible privacy violations by Apple.
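As noted above, that wake-word hunting is easy to automate. Here is a hedged sketch of such a loop, reusing the hypothetical outbound_bytes() helper from the earlier sketch; pyttsx3 is a real offline text-to-speech library, while the candidate wordlist and threshold are purely illustrative, not the researchers’ actual test set.

```python
# Sketch of automating the wake-word hunt: speak candidate words aloud and
# record which ones are followed by an outbound traffic burst. Reuses the
# hypothetical outbound_bytes() helper from the previous sketch.
import time

import pyttsx3  # offline text-to-speech library

CANDIDATES = ["alexa", "alexis", "election", "unacceptable"]  # illustrative

engine = pyttsx3.init()
triggers = []
for word in CANDIDATES:
    baseline = outbound_bytes(5)
    engine.say(word)
    engine.runAndWait()        # blocks until the word has been spoken
    time.sleep(1)
    if outbound_bytes(5) > 3 * max(baseline, 1):
        triggers.append(word)  # the word likely woke the assistant
    time.sleep(10)             # let any triggered session wind down

print("Words that appear to wake the device:", triggers)
```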

Hacking the Sound

According to the researchers, accidental awakenings are also a vulnerable spot for malicious hacking of a voice assistant. Such attempts might normally go undetected, but LeakyPick would spot the resulting transmission to the cloud and let the rightful user know that an attack may have occurred.

“If LeakyPick can identify which network packets contain audio recordings, it can then inform the user which devices are sending audio to the cloud, as the source of network packets can be identified by hardware network addresses,” the researchers wrote in a paper this month for a cryptography and security conference. “This provides a way to identify both unintentional transmissions of audio to the cloud, as well as above-mentioned attacks, where adversaries seek to invoke specific actions by injecting audio into the device’s environment.”
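The attribution step the researchers describe is straightforward in principle: match each packet’s source hardware address against an inventory of known devices. A minimal sketch, again using the scapy packet library; the MAC-to-name table is a hypothetical inventory a user would maintain for their own household.

```python
# Sketch of attributing flagged traffic to a device by hardware address.
# The MAC-to-name inventory below is a hypothetical, user-maintained table.
from scapy.all import Ether, IP, sniff

KNOWN_DEVICES = {
    "aa:bb:cc:dd:ee:ff": "Echo Dot (kitchen)",    # hypothetical entries
    "11:22:33:44:55:66": "Google Home (office)",
}

def report(pkt):
    """Name the device behind any packet from a known MAC address."""
    if pkt.haslayer(Ether) and pkt.haslayer(IP):
        name = KNOWN_DEVICES.get(pkt[Ether].src.lower())
        if name:
            print(f"{name} -> {pkt[IP].dst} ({len(pkt)} bytes)")

sniff(prn=report, store=False, timeout=30)  # watch the network for 30 s
```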

But while accidental awakenings are widespread, audio hacking of smart speakers remains pretty much unknown in the wild. Worries about that kind of hacking are why Alexa has an option that mimics LeakyPick by playing one tone whenever the voice assistant starts listening and another when it finishes. And while the idea of loudspeakers and lasers hijacking a smart speaker is exciting, such attacks are difficult to pull off and relatively easy to guard against, with everything from voice passwords to simply hitting mute on a smart speaker when it’s not in use. There are also other, more mechanical privacy tools for smart speakers, like the ClappervsAlexa. Still, finding and plugging any vulnerability is a good strategy for voice assistant developers who want to foster trust in their platforms and reduce the concerns that keep many people from buying a smart speaker or using a voice assistant regularly.


European Union Regulators Begin Antitrust Inquiry of Voice Assistants and the Internet of Things

Lasers Can Hack Voice Assistants in Example Worthy of Mission Impossible But the Risk is Minimal for Consumers

More than 1,000 Phrases Will Accidentally Awaken Alexa, Siri, and Google Assistant: Study