Microsoft Cortana Recordings Reviewed in China Without Security: Report
Microsoft sent recordings made by Skype and its Cortana voice assistant to a contractor in China without any security measures, according to a report in the Guardian. The potential exposure of private information put Microsoft’s quality control and accuracy program back in the spotlight months after earlier reports questioning its practices came to light.
Private Information Exposed
According to the contractor, Skype phone calls and conversations with Cortana were accessible through a web app on personal laptops at home in Beijing. There was no system in place to prevent criminals or Chinese authorities from acquiring the recordings, and the Microsoft accounts used to review them all shared the same password, with little in the way of background checks. Logins could theoretically be shared, and recordings could be saved onto the computer without any barriers to prevent it.
The Cortana and Skype user agreements allow Microsoft to use recordings to improve accuracy. A similar program existed for recordings made by the Xbox One game console until a comparable story about contractors reviewing those recordings came out; that program has since ended, and the console no longer supports Cortana at all. Microsoft uses these human reviews to upgrade Cortana and Skype, but the agreements previously said nothing about contractors manually listening to the recordings. Only after the reports this past summer did the company adjust the contracts.
“We review short snippets of de-identified voice data from a small percentage of customers to help improve voice-enabled features, and we sometimes engage partner companies in this work,” Microsoft said in a statement. “We’ve always disclosed this to customers and operate to the highest privacy standards set out in laws like Europe’s GDPR. This past summer we carefully reviewed both the process we use and the communications with customers. As a result we updated our privacy statement to be even more clear about this work, and since then we’ve moved these reviews to secure facilities in a small number of countries.”
Who is Listening?
Voice assistant developers have redoubled their efforts to reassure users that their privacy is secure since this past summer, when Microsoft, along with Apple, Google, Amazon, and other platform makers, had to explain how their programs work. Even when identifying information is stripped from a recording, the audio itself might include private information. That's especially true when the voice assistant is accidentally awoken. False activations happen frequently: 64% of U.S. voice assistant users say they unintentionally activate a voice assistant at least once a month, according to a survey by The Manifest. There's no way of knowing how often this happens without the speaker even noticing, but the Chinese contractor claimed to have heard accidental recordings, including what may have been violence.
Privacy is often cited as a vital concern by voice assistant users. These reports don't do the developers any favors in that regard, and even resolving the current issues won't undo the suspicion they originally ignited. The project of earning people's trust in voice technology, so that they are willing to fully engage with it, won't be complete any time soon.