
Apple Offers to Settle ‘Siri Eavesdropping’ Lawsuit for $95 Million

Filip TRUȚĂ

January 03, 2025


Apple has filed a motion to settle a class action lawsuit alleging that its digital assistant Siri leaked audio recordings to third parties.

Years in the making

The lawsuit, now some five years old, claims that “without the user’s consent, Apple recorded, disclosed to third parties, or failed to delete, conversations recorded as the result of a Siri activation.”

“Far from requiring a 'clear, unambiguous trigger' as Apple claimed in its response to Congress, Siri can be activated by nearly anything, including ‘[t]he sound of a zip’ or an individual raising their arms and speaking,” according to the complaint.

“Once activated, Siri records everything within range of the Siri Devices’ microphone and sends it to Apple’s servers,” allege the plaintiffs.

Some customers claim their web browser began serving ads as if Siri knew what they had been saying in private. One plaintiff, for example, says they started seeing ads for a specific medication after an in-person conversation with their physician.

$95 million in settlement fees

Apple aims to appease upset customers with $95 million, to be distributed on a pro rata basis according to the number of Siri-enabled devices that experienced an unintended Siri activation, according to the settlement terms.

Considering the broad claims made in the suit and the potentially high number of customers eligible for compensation, individual plaintiffs may end up getting only small awards.

If the settlement is approved, Apple must also provide non-monetary relief aimed at addressing the alleged violations, including confirmation that it has permanently deleted individual Siri audio recordings collected prior to October 2019. It would also publish a webpage detailing how users can opt in to the “Improve Siri” option and listing the specific information stored as a result of that opt-in.

Point taken

To Apple’s credit, soon after the controversy erupted, the company acknowledged that it could have handled Siri’s collection of audio samples better.

“We know that customers have been concerned by recent reports of people listening to audio Siri recordings as part of our Siri quality evaluation process — which we call grading,” the iGiant said in August 2019. “We heard their concerns, immediately suspended human grading of Siri requests and began a thorough review of our practices and policies […] As a result of our review, we realize we haven’t been fully living up to our high ideals, and for that we apologize.”

Apple then stopped retaining Siri recordings by default and thereafter used only computer-generated transcripts to continue improving the service. It also clarified that its review team would “work to delete any recording which is determined to be an inadvertent trigger of Siri.”
