Adam Clark Estes* says the latest scandal with the Amazon Echo illustrates the real-life privacy nightmare that always-on voice assistants bring into our homes.
What’s the most terrifying thing you can imagine an Amazon Echo doing?
Think realistically.
Would it be something simple but sinister, like the artificially intelligent speaker recording a conversation between you and a loved one and then sending that recording to an acquaintance?
That seems pretty bad to me.
And guess what: it’s happening.
A husband and wife in Portland, Oregon, recently received a disturbing call from the man’s employee.
“Unplug your Alexa device right now,” said the voice on the line.
“You’re being hacked.”
That would have been scary enough, but then the thoughtful employee explained that he had recently received audio files containing a conversation between the couple.
When they doubted him, the employee sent the files.
Sure enough, the couple’s Amazon Echo had shared a recording of a private conversation without the couple’s permission — and it wasn’t because of hackers.
It was because of Amazon.
Amazon recently admitted that the Portland couple had fallen victim to an “unlikely … string of events.”
Somehow, their Echo had misinterpreted background noise as a wake word and then another sound as a command to send a message and then another string of words as a command to send the recording to the man’s employee.
Amazon even claims that Alexa spoke up to confirm the action before sending, but the couple denies that the device ever asked for confirmation to send the message.
Heck, they didn’t even know they were being recorded in the first place.
To say, “This is some Black Mirror shit,” would not only be cliché, it would be an understatement.
This incident illustrates the real-life privacy nightmare that always-on voice assistants bring into our homes.
As with any internet-connected technology, smart speakers like the Amazon Echo and the Google Home confront consumers with the decision to trade privacy for convenience.
The terms of that trade-off remain unclear.
For now, we know that these devices record your commands in order to train their voice software to understand commands better.
We also know that Google and Amazon both hold a number of patents that would enable them to collect data from voice commands to do anything from making a judgment about a child’s level of “mischief” to gauging a person’s mood in order to personalise content or target ads.
Amazon specifically has already started experimenting with ads on Alexa-powered devices in the form of sponsorships and is reportedly in talks with companies about delivering ads based on voice commands.
If you ask how to remove a stain, for instance, Alexa might respond with a cleaning ad.
But right now, these are just ideas.
Present day reality is, in some ways, much more frightening.
The technology that powers internet-connected, voice-controlled devices is so new that we simply don’t know how or when these devices will fail.
And we definitely don’t know what the consequences might be when they do.
Scenarios like the Alexa oopsie above don’t even represent security issues.
They represent a fundamental design flaw in these apparently under-tested systems.
If Amazon Alexa and Google Assistant are supposed to improve as they collect more data and learn more about human speech, we can only conclude that there’s always a chance they will fail and do the wrong thing along the way.
We now know that might mean your Echo could record a private conversation between you and a loved one and send it to someone on your contact list.
That’s all assuming these devices work like they’re supposed to.
There are other ways that voice-controlled assistants can become compromised, including but not limited to software bugs, security shortcomings and government intervention.
For example, a touch panel bug turned some Google Home Minis into fully fledged surveillance devices last year.
Security researchers, meanwhile, have had a field day hacking Alexa and turning her into an always-listening spy.
And let’s not forget that Amazon has proven that it will hand over your Echo data to law enforcement if the situation demands it.
The US Federal Bureau of Investigation (FBI) may or may not be wiretapping Echo devices in the meantime.
If you’ve noticed that I haven’t mentioned Apple or Siri in all of this creepy surveillance business, you get a gold star.
The HomePod and other Siri-powered experiences simply haven’t been subject to so much scandal (yet).
That might be because Apple insists that all Siri commands are anonymised, encrypted and stored on the device.
Who knows if, in the long term, this means that Siri is a safer assistant than Alexa.
But for now, as far as we know, Apple’s technology simply hasn’t created the appalling sort of situation that leads to a couple’s private conversation being sent to a seemingly random person because of crappy software.
Yet.
As for Alexa, though, now feels like a moment of reckoning.
Amazon’s admission of the Echo error came at almost the same time as reports that Google Home had outsold Amazon Echo devices for the first time.
That’s probably a coincidence, but it makes you wonder if Amazon is in over its head when it comes to artificial intelligence and machine learning.
I’ve argued in the past that Google’s smart gadgets work better than Amazon’s.
And now, despite a bug here or there, I’m starting to feel like Alexa might just be dangerous in her inferiority.
Alexa’s screw-ups are scary.
The conversation recording is actually terrifying.
It’s a nightmare.
And it’s also one you can avoid.
Don’t buy an Echo.
And definitely don’t buy one for a friend.
* Adam Clark Estes blogs about the future for Gizmodo in New York. He tweets at @adamclarkestes and his website is adamclarkestes.com.
This article first appeared at www.gizmodo.com.au.