Report: Microsoft Contractors Review Cortana and Skype Recordings

Microsoft hired contractors to review recordings collected by its Cortana voice assistant and Skype's automatic translator, according to a report by Motherboard. The news follows similar revelations that Amazon, Google, and Apple shared consumer voice assistant recordings with contractors, including many recordings made without the user's knowledge.

Cortana and Skype Reviewed for Improvement

The Cortana and Skype user agreements both mention that recordings will be used to make the respective products better. By hunting for errors and adding corrections, Microsoft is teaching Cortana and Skype how to better perform their functions. But the agreements say nothing about contractors manually reviewing the recordings.

A key issue with people listening to voice assistant recordings, whether Cortana or any other service, is the private information contained in some of the audio. Identifying information about where the recordings come from gets stripped out before anyone listens, but that’s irrelevant if the audio includes the user discussing their personal information out loud. Smart speakers and other voice assistant-enabled devices are only supposed to start recording if explicitly called with the right wake word. False activations are all too common, however. Financial discussions, medical diagnoses, and plenty of intimate encounters all get picked up when the voice assistant erroneously detects the wake command. Some of them are eventually heard by a contractor.

According to a 2018 Voicebot survey, 28.5 percent of smart speaker users said they noticed false wake-ups at least once a day, and 43.7 percent reported them happening at least once per month. And that doesn't include the times they didn't notice the device activating.

Rethinking What to Hear

By this point, no one should be surprised that Microsoft contractors are listening to audio recorded by Cortana and Skype, accidental or not. A similar report about Amazon Alexa recordings came out in April, and July saw the same revelations about Google Assistant and Apple's Siri voice assistant.

The sudden focus on how these companies use voice recordings, and on the amount of information inadvertently picked up and heard by contractors, is already provoking changes. Google and Apple have put a pause on their quality assurance listening programs. Google decided to apply the temporary halt to the whole European Union; its announcement came out right as German authorities ordered the company to stop running the program in their country for an investigation that is supposedly unrelated.

Apple's pause applies worldwide, although the company has said it will restart the program as an opt-in system. Amazon hasn't paused its review program, but it has made adjustments to the service agreement. The terms now state outright that some of the data will be manually reviewed by humans, and they offer users a chance to opt out.

Do Security and Privacy Matter?

It is slightly ironic that Microsoft didn't anticipate people being upset about what its contractors hear. A study published by the company showed that 41 percent of voice assistant users are concerned about passive listening devices invading their privacy. Whatever the company decides to do about its contractor review program, there's no doubt that the debate over how to square privacy with the need to improve the software through manual review won't end any time soon.
