27 September 2023

Big bytes: Why Apple’s appetite for data matches every other company’s


Jerry Hildenbrand* says Apple’s Siri privacy blunder proves how valuable audio data is to building a digital assistant.


A piece from The Guardian back in July claiming that Apple “regularly” had third-party contractors listening to the things Apple customers were telling Siri has kicked off a bit of a stir.

The company that uses privacy to bolster its reputation was caught doing the same things that Google, Amazon, and every other company with a digital personal assistant were doing, and people were ready with digital pitchforks raised.

In response, Apple apologised and promised to change the way it collects data, saying third parties will no longer be involved in listening to saved recordings from users who have opted in.

This, of course, is exactly what any company would do when it was caught doing a thing its users don’t like.

The thing is, this move guarantees that Siri is always going to suck compared with the digital assistants offered by other companies.

Apple’s biggest blunder

Late last month, Apple released a full-on apology along with a promise to do better.

You should follow the link and read it if this sort of thing interests you, but here are the “important” bits: “First, by default, we will no longer retain audio recordings of Siri interactions.”

“We will continue to use computer-generated transcripts to help Siri improve.”

“Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests.”

“We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place.”

“Those who choose to participate will be able to opt-out at any time.”

“Third, when customers opt-in, only Apple employees will be allowed to listen to audio samples of the Siri interactions.”

“Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.”

These all sound like good ideas that should have been implemented from day one, but at least they are in place now.

They definitely should have been considered when the company was buying ad space to tell us all that only Apple cares about our privacy and those other companies will sell you down the river because they are inherently evil.

The worst thing about what Apple did was that it promised us it would never happen.

That’s the kicker.

What Apple did was in line with its privacy policy (though it would have been nice if we had known third parties were doing the listening), and these changes are all good things.

But that doesn’t matter because Apple was trying to convince us all that it would never, ever, ever pull a stunt like this because it was better than other tech companies.

And people bought it because nobody really thinks about how something like a digital assistant works or how it can be made to work better.

When caught with its hand in the cookie jar, Apple did the right thing.

But Apple promised it would never try to steal a cookie and that leaves many with a really bad impression.

Training AI isn’t magic

Earlier I said this move would ensure Siri always lags behind the competition.

That’s because of how you make something like a digital voice-activated assistant better.

Amazon and Google are super intrusive with the amount of data each collects.

Don’t make the mistake of thinking Apple doesn’t also collect a sickening amount of user data.

The difference is that Amazon and especially Google are very upfront about aggregating it all, so your experience with Alexa or Google Assistant is much more personal.

If you want a voice-activated product to understand people, you need to let programmers listen in.

Siri, for one reason or another, isn’t there yet and without incorporating more user data it never will be.

Apple seems fine with this, positioning Siri as more of a product you ask questions to get answers from, rather than something more proactive.

Users happily opt in when it comes to Google Assistant because it gives something they think is valuable in return.

If Siri can’t give you proactive information about your day-to-day life, there isn’t as much value in letting Apple listen to what you’re saying.

That’s a real problem when it comes to voice recognition.

People in different areas or from different backgrounds will always speak differently.

Accents, voice inflection, word choice and more mean any AI needs plenty of training to recognise how we talk, and it can’t get that from a written transcript.

They all do it

The best thing you can do when it comes to digital assistant tech is to remember that every company offering it grabs as much data about you as it can.

What is important is that you’re properly notified in advance of what data is being collected and how that data is stored and used.

If you trade your data away, make sure what you get in return is worth it.

Equally important is how a company reacts when we realise just what those terms and conditions really mean.

Apple told everyone upfront, as did Google, Microsoft, and Amazon, in the terms you agree to when you first use Siri, that data was being stored and possibly even listened to.

The language obviously wasn’t clear enough, otherwise we wouldn’t be surprised when this sort of thing happens, but it is there for us to read.

The gripe that third parties were the ones doing the listening is something that should have been addressed beforehand, but at least now it has been.

What you need to do is consider the value of the service(s) you receive in return for all that data.

If you love Siri or any other voice service from a tech company and think it’s worth trading your data for it, keep doing it.

Just make sure you know what you’re giving away in return for it.

* Jerry Hildenbrand writes for Mobile Nations and is an editor for Android Central. He tweets at @gbhil.

This article first appeared at www.androidcentral.com.
