Apple Apologizes for Letting Contractors Listen to Siri Recordings
Apple officially apologized for allowing contractors to listen to recordings made by its Siri voice assistant in a blog post published on Wednesday. The unsigned statement from the company vowed to make changes to improve customer privacy.
Sorry About Siri
Reports that Apple contractors were listening to Siri recordings first cropped up in late July, among a spate of similar reports about other voice assistants. The audio included interactions between people and the voice assistant, along with many conversations and noises that users probably had no idea were recorded for potential review. The point of the review was to check Siri’s responses and improve its interactions with users. Apple’s consumer licensing terms implied, though never explicitly stated, that human review for quality control and other tests would happen.
In the wake of those reports, Apple put a global pause on the contractor listening program. Even as it halted the program, Apple said the pause would only be temporary while it reviewed the details of the program’s operations. That audit is apparently over already.
“As a result of our review, we realize we haven’t been fully living up to our high ideals, and for that we apologize,” Apple said in its statement. “As we previously announced, we halted the Siri grading program. We plan to resume later this fall when software updates are released to our users.”
New and Improved Siri Review, Without Contractors
Apple’s plan for adjusting how it uses human review of Siri recordings has a few facets to it. The biggest change on the consumer side is that Apple will ask people whether they are willing to allow audio clips recorded by their Siri devices to be used to improve the voice assistant, turning the program into an opt-in system instead of the default approval granted when accepting Apple’s terms and conditions. Amazon made a similar move about a month ago but structured it as an opt-out system, so Apple is taking a step beyond its most responsive peer so far.
On Apple’s end, the company will no longer retain Siri recordings by default, keeping them only when users grant permission. Apple will still retain automatically generated transcripts of those recordings, however, which somewhat limits the impact of this change.
The last major adjustment is that Apple will restrict who listens to Siri audio recordings to its own employees. Contractors will no longer be involved in the quality control and improvement program.
As Voicebot pointed out in our Voice Insider newsletter on August 11th, this is the most logical step for Apple and other voice assistant makers to take. By limiting audio recordings to actual employees, Apple clarifies the chain of responsibility for the recordings, which in practice should provide added protections against misuse because Apple will know more precisely who has access to which recordings and will oversee the processes for handling the data.
The New Approach to Voice Assistant Improvement
Apple’s response to concerns about contractors listening to Siri recordings could be a template for how other companies caught in this situation handle their next steps. Since April, stories have emerged about contractors listening to recordings made by Amazon Alexa, then Google Assistant and Microsoft’s Cortana. Meanwhile, Facebook has potentially worse problems regarding recordings made over its Messenger platform. All of these companies have either paused their contractor listening programs or adjusted them in some way. Amazon recently updated its service agreement to offer the option to opt out of the program, and back in May 2019 it introduced both a new Alexa Privacy Hub and the ability for users to delete recordings with a voice command. With today’s announcement, we now have clarity about how Apple intends to respond to the situation. It will be interesting to see whether Facebook, Google, and Microsoft follow suit or adopt different approaches. At the very least, we have an emerging template showing what two of the tech giants believe is an appropriate compromise between privacy and the practice of reviewing user recordings to improve voice assistant performance.
How much any of these changes will make customers trust these companies more remains to be seen. Those who follow this kind of news closely may be happy to see new privacy safeguards in place, but the changes may not reassure current and potential voice assistant users already suspicious about what might be recorded. A recent Microsoft study found that 41% of voice assistant users are concerned about passive listening devices invading their privacy. That number won’t appreciably dip unless the companies behind these assistants make a visible effort to foster privacy as a value.