
Naver Defends Letting Contractors Review Clova Voice Assistant Recordings

Naver is the latest company facing criticism for letting contractors listen to recordings made by its Clova voice assistant, according to a report in the Korea Times. The South Korean company is defending the practice as both legal and not an invasion of user privacy.

Voice Assistant QA and Improvement

Naver sends small samples of recordings made during user conversations with Clova to contractors for quality control and to improve how the voice assistant functions. Identifying details are said to be stripped out of the recordings, which are usually only a few seconds long. Naver’s terms and conditions for Clova explicitly state that data collected by the voice assistant is stored and used to improve service quality before being destroyed. That hasn’t stopped the criticism, however.

Naver is continuing the trend of voice assistant developers facing backlash over their contractor-led quality control programs. Amazon faced the issue first after reports about Alexa recordings came out in April. July saw the same revelations about Google Assistant and Apple’s Siri, and August had barely begun before Microsoft was forced to respond to similar complaints about contractors listening to Cortana and Skype recordings. Facebook is facing its own, even more troubling questions about recordings of conversations on its Messenger platform.

All of the other voice assistant makers admitted that their assistants had accidentally recorded audio, including personal and even criminal activity. False activations are a problem for every voice assistant. A September 2018 Voicebot survey of 375 voice assistant users found that over 70% of device owners have noticed one of these “wake up” errors, and nearly 29% said the false wakeups occur daily. Despite that, Naver claimed in a statement that no conversations were recorded when users didn’t wake Clova, a claim that is statistically unlikely given the available data.

Changes Coming

For now, Naver is sticking to its defense of the contractor listening program. Many of the other companies in this situation are already changing how they run their programs. Apple officially apologized for its contractor listening program last week, not long after putting a global pause on it. After a review, Apple will now ask users for permission to use snippets of their recordings to improve Siri and will rely only on company employees to listen to the audio. Amazon took a slightly different route, setting up an opt-out system instead. Both companies have revised their terms and conditions to spell out more explicitly how their programs work.

Whether Apple’s or Amazon’s changes will set a template for privacy remains to be seen. Naver clearly feels confident in its defense of the program, but finding an appropriate compromise between user privacy and the practice of reviewing recordings to improve voice assistant performance is something all developers will eventually need to address.
