Microsoft Adds Voice Clip Privacy Controls
Microsoft is shifting its voice data collection policy to ostensibly give users more control over their information. Voice data collection is now opt-in, with the default changed to discarding user data, a marked departure from how voice assistants handled recordings before the series of stories last summer sent developers scrambling to rethink their policies.
Microsoft Speaks
Voice clips are the audio recordings made when using a voice product like a transcription or translation service. Under Microsoft’s updated policy, those clips are no longer automatically collected for review by employees and contractors as a way to improve performance. Instead, Microsoft will default to not collecting the data at all and will ask users to opt in to the program if they want to help improve the AI.
“The new settings for voice clips mean that customers must actively choose to allow people to listen to the recordings of what they said. If they do, Microsoft employees and people contracted to work for Microsoft may listen to these voice clips and manually transcribe what they hear as part of a process the company uses to improve AI systems,” Microsoft explained in a blog post. “The more transcripts Microsoft has of how real people talk from contributed voice clips, the better these AI systems will perform.”
Private Voices
That doesn’t mean people who don’t opt in can’t keep using the products and services; from the user’s perspective, nothing visible will change. Microsoft does carve out one exception to ending voice data collection, however. Though the voice clips themselves won’t be gathered by the company, information about the activity will be. That means transcripts of a conversation between a person and an AI will still be sent to Microsoft, though not the raw audio. Whether text or audio, the data is supposed to be anonymized before anyone looks at it, according to Microsoft.
Microsoft’s quality control and accuracy program came into the spotlight in 2019 after reports questioning its practices surfaced. Microsoft was one of many voice AI platforms to draw heat, prompting a rapid change in how such programs were handled. The storage of voice clips actually ended in October, but the company clearly hopes that making the program something users choose to join will appease those concerned with privacy. Though Microsoft has massively revamped the Cortana voice assistant, the policy change is reminiscent of how Google responded to similar pressure. In August, Google revealed that it was restarting human reviews of audio recorded by its software, a year after pausing the program. As with Microsoft, the program is now one people have to volunteer to be part of. Google also made the whole process more transparent overall.
Google, Promising Privacy, Asks Voice Assistant Users to Join Audio Review Program
Microsoft Cortana Recordings Reviewed in China Without Security: Report
Report: Microsoft Contractors Review Cortana and Skype Recordings