
Alexa and Siri Listening – Witlingo’s Brielle Nickoloff Discusses Privacy and Consumer Concerns

Witlingo recently published a checklist comparing privacy considerations for Amazon Echo smart speakers and Apple iPhones. The company created it in response to frequent questions about the "always listening" privacy concerns associated with smart speakers. Many consumers express privacy concerns about always-listening smart speakers. What few people realize is that most smartphones today have an always-listening mode activated by default. And audio recording is just one of many ways smartphones should raise privacy concerns, yet they essentially get a free pass.

Siri and Alexa Listening Graphic: Created by Witlingo

Witlingo’s graphic highlights that smart speakers actually provide greater privacy than the devices most of us carry around throughout the day. It also illustrates just how freely users give away their privacy when using a smartphone, raising questions about whether smartphones should have so much access to our lives. I caught up with Brielle to understand the origin of the graphic, what she is learning from interacting with end users, and her thoughts on privacy and voice.

What do you do?

Brielle Nickoloff: My title is Lead UX Research and Design. So I focus on two things: I engage users and potential users of our products to understand the full context of their lives when using them, and then I design voice-first experiences for usability.

What prompted the graphic?

Brielle Nickoloff, Witlingo

Ahmed [Bouzid] and I had just finished interviewing a resident of an assisted living home in Baltimore, MD, as part of our field research. At Witlingo, we prioritize engaging with real clients so that we can build products and experiences that users love. There is no better way to do that than talking to real users (here is the full interview with Ms. Shirley Crowder: https://youtu.be/fILJmnZAdMI).

So, Ms. Crowder has been advocating and working hard to get other seniors access to Amazon Echo devices for an initiative that aims to reduce social isolation in aging populations. One of the main barriers she has come across is resistance to the idea that Alexa is "always listening," even though every one of these seniors trying out Alexa also has a smartphone! During the interview, she mentioned that one way she tries to convince others that the Echo is not especially invasive is by pointing out that their smartphones are far more invasive. On our car ride back, we realized that crystallizing this idea in a comparison graphic could be useful.

Of the items on your list, what do you think should be viewed as the biggest privacy issue and why?

For me, it’s the GPS tracking: having a device that knows my location at any given time and starts to detect patterns of behavior and routines over time. This thing is effectively plotting our lifestyles, and that is a pretty uncomfortable thought.

How do the concerns between voice and mobile compare?

Anyone working in the voice-first sphere has heard this same concern repeatedly: "It's super creepy to have something that could always be listening to you, in your kitchen." Touché! Our intention with this graphic was not to suggest that because smartphones are far more privacy-invasive than smart speakers, we should not be concerned about smart speaker privacy. On the contrary, we are hoping that the concerns raised by smart speakers will act as a catalyst for a real conversation about technology and privacy. That conversation is long overdue.

What are the voice assistant platforms doing right about privacy protection?

From the very first day, Amazon made all of the audio recordings it captures from users available not only to listen to but also to delete. Obviously, the mute button was also a good idea to alleviate concerns. The same holds for Google products. These are good features, and I believe they alleviate some concerns, or at least show that the companies are acting in good faith.

What do the voice assistant platforms need to do, if anything, in order to add further privacy protection for users?

Honestly, we should not expect these companies to act out of goodwill. They exist to make money, and we have seen how they spend a great deal of money fighting legislation that would force them to be more accountable. I will point to the obvious case of Facebook, which has so far been unable to solve the problem, even under extreme pressure. Tech companies are in the business of selling products, mainly convenience that people are willing to pay for. Protecting privacy, at least at this point, is an expense to them, and in a way it clashes with the mission of delivering convenience. The implementation of effective privacy strategies needs to be actively pushed by other constituencies such as consumers, privacy watchdog groups, and the government, through legislation.

What should voice assistant users be doing to protect their privacy?

If you are not comfortable with the speaker near you while having a sensitive conversation with someone, unplug the speaker. And don't forget to turn your phone off while you're at it. Short of that, mute your device while talking, and delete your audio files if you don't want them to remain on Amazon's or Google's servers. Beyond that, customers need to speak up more. They need to express their concerns to the companies whose products they are using. After all, they are customers, and privacy is a feature of the product. Just as consumers may be vociferous about wanting this feature or that feature, they should also make it clear that they don't want the information collected about them to be used in ways that are harmful to them. Consumers can also join watchdog groups and raise concerns with their congressional representatives. In other words, we should work toward being active consumers and active citizens.
