Google is listening to its users’ concerns — and not just literally.
The company said Monday it would improve the way it informs customers about how their Google Assistant voice recordings will be used. Part of that effort will include having users re-confirm their preferences for a setting that may share some audio recordings with human language experts.
Google’s announcement came two months after the Belgian news site VRT NWS obtained more than 1,000 audio recordings of Google Home and Google Assistant users, including some audio that had been mistakenly recorded. Google soon confirmed that it had made recordings available to a language expert who had partnered with the company, and that the expert had “violated our data security policies by leaking confidential Dutch audio data.” Following that, Google said it “paused” letting people listen to voice recordings from the devices.
Google Assistant is available on devices including the Google Home smart speaker (a Mini costs $49, while the regular size costs $99) and the Google Nest Hub, which retails for $129. Similar to Amazon’s Alexa, Google Assistant helps users play music, retrieve information from the internet, send messages and control their smart home devices.
In a blog post published Monday, Google Assistant senior product manager Nino Tasca said the company had recently “heard concerns about our process in which language experts can listen to and transcribe audio data from the Google Assistant to help improve speech technology for different languages.” “It’s clear that we fell short of our high standards in making it easy for you to understand how your data is used, and we apologize,” he said.
Google will now update its audio settings to alert users that opting in to the Voice & Audio Activity setting — which records users in an effort to boost voice recognition — may involve people listening to recordings of their voices. Those who already use Assistant will be able to re-confirm their Voice & Audio Activity preferences before Google allows people to listen in, a practice the company calls “the human-review process.”
The company said that the device’s default setting has always been to not store audio recordings, and that the clips aren’t linked to user accounts. Language experts hear only about 0.2% of audio snippets from users who have opted in to the VAA setting, Google said. “Going forward, we’re adding greater security protections to this process, including an extra layer of privacy filters,” the company added.
Google Assistant will “vastly reduce” how much audio data it stores, the company said, and later this year Google will automatically delete most VAA audio data “older than a few months.” The company also said it would improve its ability to detect unintentional audio activations. The recording is only supposed to happen when a user activates the device by saying “Hey Google” or “OK Google” or touches the microphone icon.
Google is far from alone in seeking to allay concerns about smart-speaker privacy. Amazon said last month that it would let customers opt out of letting humans listen to their Alexa recordings, after a Bloomberg report in April shed light on the practice. Apple, which came under scrutiny over a Guardian report of contractors allegedly listening to Siri recordings, apologized last month for not “fully living up to our high ideals” and overhauled the voice assistant’s privacy protections. Facebook, for its part, confirmed in April that it was developing a voice assistant to potentially work with products like its Portal smart displays and devices from its Oculus virtual-reality arm.
Class A shares of Google parent Alphabet Inc. have gained 18% this year, compared with a 15% rise for the Dow Jones Industrial Average and a 19% increase for the S&P 500 Index.