27 September 2023

Hackers access history: An Alexa bug could expose people’s voice history


Lily Hay Newman* says new research should make people think twice about the personal data their smart assistant stores.


Amazon has patched the flaw, but remember to lock down your voice assistant interactions.

Smart-assistant devices have had their share of privacy missteps, but they’re generally considered safe enough for most people.

New research into vulnerabilities in Amazon’s Alexa platform, though, highlights the importance of thinking about the personal data your smart assistant stores about you—and minimising it as much as you can.

Findings published on Thursday by the security firm Check Point reveal that Alexa’s Web services had bugs that a hacker could have exploited to grab a target’s entire voice history, meaning their recorded audio interactions with Alexa.

Amazon has patched the flaws, but the vulnerabilities could also have yielded profile information, including home address, as well as all of the “skills,” or apps, the user had added for Alexa.

An attacker could have even deleted an existing skill and installed a malicious one to grab more data after the initial attack.

“Virtual assistants are something that you just talk to and answer, and usually you don’t have in your mind some kind of malicious scenarios or concerns,” says Oded Vanunu, Check Point’s head of product vulnerability research.

“But we found a chain of vulnerabilities in Alexa’s infrastructure configuration that eventually allows a malicious attacker to gather information about users and even install new skills.”

For an attacker to exploit the vulnerabilities, they would first need to trick targets into clicking a malicious link, a common attack scenario.

Underlying flaws in certain Amazon and Alexa subdomains, though, meant that an attacker could have crafted a genuine and normal-looking Amazon link to lure victims into exposed parts of Amazon’s infrastructure.

By strategically directing users to track.amazon.com (a vulnerable page not related to Alexa, but used for tracking Amazon packages), the attacker could have injected code that allowed them to pivot to Alexa infrastructure. From there, they could send a special request, along with the target’s cookies from the package-tracking page, to skillsstore.amazon.com/app/secure/your-skills-page.

At this point, the platform would mistake the attacker for the legitimate user, and the hacker could then access the victim’s full audio history, list of installed skills, and other account details.
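To make that pivot concrete, here is a minimal sketch, in browser-side TypeScript, of the kind of credentialed cross-origin request an injected script could make. The endpoint is the one named above; the function name and request shape are illustrative assumptions, not Check Point’s actual exploit code.

```typescript
// Illustrative sketch only: a script injected into a vulnerable Amazon
// subdomain asks the victim's browser to call an Alexa endpoint on a
// sibling subdomain, so the browser attaches the victim's own cookies.
async function readSkillsPage(): Promise<string> {
  const response = await fetch(
    "https://skillsstore.amazon.com/app/secure/your-skills-page",
    {
      method: "GET",
      // Ask the browser to include the victim's amazon.com session cookies.
      credentials: "include",
    }
  );

  // Reading the response cross-origin only works if the server reflects the
  // attacker-controlled origin in Access-Control-Allow-Origin and also sends
  // Access-Control-Allow-Credentials: true, the kind of permissive
  // configuration the researchers describe.
  return response.text();
}
```

The point of the sketch is that the attacker never steals the cookies directly; the victim’s own browser supplies them, and the response is readable only because of the permissive server-side configuration Check Point describes.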

The attacker could also uninstall a skill the user had set up and, if the hacker had planted a malicious skill in the Alexa Skills Store, could even install that interloping application on the victim’s Alexa account.

Both Check Point and Amazon note that all skills in Amazon’s store are screened and monitored for potentially harmful behaviour, so it’s not a foregone conclusion that an attacker could have planted a malicious skill there in the first place.

Check Point also suggests that a hacker might be able to access banking data history through the attack, but Amazon disputes this, saying that information is redacted in Alexa’s responses.

“The security of our devices is a top priority, and we appreciate the work of independent researchers like Check Point who bring potential issues to us,” an Amazon spokesperson told WIRED in a statement.

“We fixed this issue soon after it was brought to our attention, and we continue to further strengthen our systems. We are not aware of any cases of this vulnerability being used against our customers or of any customer information being exposed.”

Check Point’s Vanunu says that the attack he and his colleagues discovered was nuanced and that it’s not surprising Amazon didn’t catch it on its own given the scale of the company’s platforms.

But the findings offer a valuable reminder for users to think about the data they store in their various Web accounts and to minimise it as much as possible.

Not a case of “OK, come on in!”

“This definitely wasn’t a case of an open door and ‘OK, come on in!’” Vanunu says.

“This was a tricky attack, but we’re glad Amazon took it seriously, because the implications could have been bad with 200 million Alexa devices out there.”

Though you can’t control whether Amazon has a bug in one of its far-flung Web services, you can minimise data on your Alexa account.

After blowback over hazy practices related to using human transcribers for some Alexa users’ audio snippets, Amazon made it easier to delete your audio history.

It’s important to do this regularly, because otherwise Amazon will store those recordings indefinitely.

To view and delete your Alexa history, open the Alexa app on your phone and go to Settings > History.

In this view, you can only delete entries one by one.

To delete en masse, go to Alexa Privacy Settings on Amazon’s Website and then choose Review Voice History.

You can also delete verbally by saying, “Alexa, delete what I just said” or “Alexa, delete everything I said today.”

*Lily Hay Newman is a senior writer at WIRED focused on information security, digital privacy, and hacking.

This story first appeared at wired.com.
