Google admits to Google Assistant audio data leak
Yesterday, we reported that Google contractors might be listening in on your Google Home speaker. Today, Google admitted that Dutch audio data related to Google Assistant was leaked, saying that one of its language reviewers violated its data security policies by leaking confidential data.
I understand that language experts review and transcribe a small set of audio queries to help the AI better understand those languages. But I don’t understand why Google allows them to take these files offsite in the first place. Ideally, these experts should be invited to Google’s offices and the audio data played back to them in a controlled environment. Google can’t expect third-party contractors to safeguard its customers’ data. In today’s blog post, Google said it is conducting a full review of its safeguards in this space to prevent misconduct like this from happening again.
We apply a wide range of safeguards to protect user privacy throughout the entire review process. Language experts only review around 0.2 percent of all audio snippets. Audio snippets are not associated with user accounts as part of the review process, and reviewers are directed not to transcribe background conversations or other noises, and only to transcribe snippets that are directed to Google.
You can read Google’s full response from the source link below.
Source: Google