25 September 2023

Listening in: Siri and her records of our intimate moments

Anna Washenko* says a new report has singled Apple out for criticism over how it reviews users’ voice recordings.

Voice assistants are growing in popularity, but the technology has seen a parallel rise in concerns about privacy and accuracy.

Apple’s Siri is the latest assistant to enter this grey area.

Last week, The Guardian reported that contractors who review Siri recordings for accuracy and to help make improvements may be hearing personal conversations.

One of the contract workers told The Guardian that Siri did sometimes record audio after mistaken activations.

The wake phrase is “hey Siri”, but the anonymous source said the assistant could also be triggered by similar-sounding words or even the sound of a zipper.

They also said that when an Apple Watch is raised and speech is detected, Siri will automatically activate.
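
To see why such false triggers happen, it helps to picture how a wake-phrase detector works in the abstract: the device continuously scores short windows of incoming audio against a model of the phrase and activates whenever a confidence threshold is crossed. The Python below is a generic, hypothetical sketch of that pattern, not Apple’s implementation; the scoring stub, threshold value and Frame type are all invented for illustration.

from dataclasses import dataclass

@dataclass
class Frame:
    samples: list  # one short window of raw audio samples

def wake_score(frame: Frame) -> float:
    """Hypothetical stand-in for the on-device model that scores how
    closely a window of audio matches the wake phrase. Real assistants
    run a small neural network here; we stub it out for illustration."""
    return 0.0

THRESHOLD = 0.85  # assumed confidence cutoff, chosen for the example

def should_activate(frame: Frame) -> bool:
    # The detector fires whenever the score crosses the threshold, so
    # anything acoustically close to the phrase -- a similar word, even
    # a zipper -- can push the score over the line.
    return wake_score(frame) >= THRESHOLD

Because the threshold trades missed commands against false alarms, any detector tuned to respond reliably will occasionally fire on sounds that merely resemble the phrase, and that is where accidental recordings begin.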

“There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on,” the source said.

“These recordings are accompanied by user data showing location, contact details, and app data.”

Apple has said it takes steps to ensure that users cannot be identified from the recordings sent to contractors.

The audio is not linked to an Apple ID, and less than 1 per cent of daily Siri activations are reviewed.

Apple also sets confidentiality requirements for its contract workers.

We reached out to Apple for further comment and will update this story if we receive a response.

Apple, Google and Amazon all have similar policies for the contract workers they hire to review these audio snippets.

But all three voice AI makers have also suffered similar privacy lapses, whether through whistleblowers going to the press or through errors that gave users access to audio files that weren’t theirs.

The tech companies have also all faced inquiries into their voice platforms recording conversations they weren’t supposed to capture.

Amazon recently outlined its policies for keeping and reviewing recordings in response to queries from US Senator Chris Coons.

A whistleblower report from a Google contractor said that workers had heard conversations between parents and children, private and identifying information, and at least one possible case of sexual assault.

These cases raise a series of questions.

What can Apple and its peers do to better protect user privacy as they develop their voice systems?

Should users be notified when their recordings are reviewed?

What can be done to reduce or eliminate accidental activations?

How should the companies handle the information their contractors accidentally overhear?

Who is responsible when dangerous or illegal activity is recorded and discovered, all by accident?

Voice assistants appear to be yet another instance of a technology being developed and adopted faster than its consequences have been fully thought through.

* Anna Washenko is a freelance writer in Los Angeles. She tweets at @AnnaGetsPithy.

This article first appeared at arstechnica.com.
