Technology giants Apple and Google have agreed to stop using humans to review voice assistant recordings. The decision comes as the companies take fire for alleged unethical practices involving their voice-powered home assistants.
Trained to respond to wake phrases like “OK Google” and “Hey Siri,” Google Home and Apple’s Siri constantly listen for potential voice commands. Sometimes, however, they wake up by mistake. When they do, they can capture people divulging highly sensitive information. But that’s only part of the reason data protection watchdogs have ordered the duo to stop having human employees review recordings.
Last month, Google came under fire after one of its language reviewers violated the company’s data security policies by leaking confidential Dutch audio data. Apple, for its part, has been scrutinized for a practice called “grading,” in which contractors reviewed recordings of people’s conversations with Siri captured through devices like the HomePod or the Apple Watch.
In fact, all three major sellers of personal assistants — Google, Amazon and Apple — have taken fire for the way they store and use people’s data and recordings. Data protection authorities are now saying enough is enough.
“The use of language assistance systems in the EU must comply with the data protection requirements of the GDPR,” said Johannes Caspar, Hamburg’s commissioner for data protection and freedom of information. “In the case of the Google Assistant, there are currently significant doubts.”
Hamburg’s authority can only enforce the ban for three months. Ireland’s Data Protection Commission, the lead supervisory authority for the companies’ EU operations, could extend the ban indefinitely and across the entire European Union.
“We are currently examining the matter,” said Graham Doyle, a spokesman for the Irish agency.
Amazon has yet to follow suit and halt human review of Echo recordings. All three companies claim the practice is necessary to improve their assistants’ algorithms.
Users of any of the three voice-activated assistants can delete recordings, clear their recording history, or disable the voice-listening functionality outright from the settings menus. Going the extra mile, Apple has pledged to make opting out easier and more transparent in a future software update.
Filip has 15 years of experience in technology journalism. In recent years, he has turned his focus to cybersecurity in his role as Information Security Analyst at Bitdefender.