Limpid Look

Google Subcontractor Leaked Your “Hey Google!” Recordings

Google admitted on Thursday that more than 1,000 sound recordings of customer conversations with the Google Assistant were leaked by one of its partners to the Belgian news site VRT. Google contractors systematically listen to audio files recorded by Google Home smart speakers and the Google Assistant smartphone app. Around the world, including in Belgium and the Netherlands, people working for Google listen to these audio files to improve Google’s speech recognition. VRT was able to listen to more than a thousand recordings. Most of these recordings were made deliberately, after users said the wake word, but Google also listens to conversations that should never have been recorded, some of which contain sensitive information.

Not everyone is aware that everything you say to your Google smart speakers and your Google Assistant is recorded and stored. That much is clearly stated in Google’s terms and conditions. What people are certainly not aware of, simply because Google doesn’t mention it in its terms and conditions, is that Google employees can listen to excerpts from those recordings. Google’s reviewers may not see account data, but they still get to hear very private information, for example about a user’s health. Jef Ausloos, a researcher at the Centre for IT & IP Law at the University of Leuven in Belgium, told VRT that this means Google’s system may not comply with the GDPR, which requires explicit consent to collect health data.

Google today defended its practice of having workers listen to users’ Google Assistant queries, following the leak of 1,000 voice recordings to a media outlet. Google also said it will try to prevent future leaks of its users’ voice recordings.

Google: Leak violated data-security policy

Google responded to the VRT story in a blog post today.

“We just learned that one of [our] language reviewers has violated our data-security policies by leaking confidential Dutch audio data,” Google said. “Our Security and Privacy Response teams have been activated on this issue, are investigating, and we will take action. We are conducting a full review of our safeguards in this space to prevent misconduct like this from happening again.”

Google has previously disclosed that it hires language experts to listen to recordings, and it defended the practice in today’s blog post.

“As part of our work to develop speech technology for more languages, we partner with language experts around the world who understand the nuances and accents of a specific language,” Google wrote. “These language experts review and transcribe a small set of queries to help us better understand those languages. This is a critical part of the process of building speech technology and is necessary to creating products like the Google Assistant.”

We asked Google today if its internal employees also listen to the recordings. A company spokesperson answered “yes” and added that “we apply a wide range of safeguards to protect user privacy throughout the entire review process (both internally and with our affiliates).”

It’s clear that owning a Google Home or a similar Assistant device, and allowing it to listen to your sensitive daily conversations and spoken internet requests, involves at least some privacy compromise. Using any Google product does, because the company makes money by collecting that data, storing it, and selling targeted ads against it. But these findings contradict Google’s claims that it is doing everything it can to protect its users’ privacy, and that its software isn’t listening unless the wake word is uttered. Clearly, someone is in fact listening, somewhere else in the world. And sometimes they’re not supposed to be.
