27 September 2023

Silent stalkers: How hackers can secretly talk to Siri and steal your data


Courtney Linder* says hackers can take advantage of a smartphone’s voice assistant by sending silent commands to your phone.



Your smartphone almost certainly features a voice assistant, whether that’s Google Assistant or Siri.

At best, these bots seem like harmless helpers that set alarms, play music, and order pizza.

At worst, they expose almost all of your device’s data, down to your text messages and private documents.

Researchers recently discovered a new type of smartphone cyberattack, called SurfingAttack, that leverages your voice assistant.

By sending commands to your phone through inaudible ultrasound waves, bad actors can force Google Assistant or Siri to make fraudulent calls with your mobile phone or retrieve SMS codes to gain access to your accounts.

The team — which hails from Michigan State University, the Chinese Academy of Sciences, the University of Nebraska-Lincoln, and Washington University in St Louis, Missouri — used an ultrasonic transducer, concealed beneath a table, to convert voice commands into a silent vibration that is indiscernible to human ears, but that voice assistants could pick up on.

Michigan State’s Qiben Yan told Popular Mechanics his lab already conducts research in mobile and Internet-of-Things (IoT) security. When he learned two years ago that voice attacks could be carried through the air using ultrasound, he and his team began considering whether such an attack could also be transmitted through solid materials, like a table.

Because Yan and his colleagues were already studying ultrasound testing, which he says is “widely used” for the inspection of metallic pipelines, it was a clear starting point for their work on SurfingAttack.

How SurfingAttack works

First, the target device is placed on a tabletop, perhaps a desk in an office.

Special software then creates the imperceptible commands, which an ultrasonic signal generator turns into an electrical signal.

After that, a piezoelectric transducer — which converts electrical signals into vibrations — allows the silent command to propagate through the table.

Both the transducer and a microphone are hidden beneath the table, keeping the setup concealed and recording whatever the voice assistant says in reply.
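To make that signal chain concrete, here is a minimal Python sketch of how a spoken command might be shifted into the ultrasonic band before being fed to a signal generator and transducer. It assumes simple amplitude modulation onto a 25 kHz carrier — a common construction for inaudible-command attacks — and the carrier frequency, sample rate, and file names are illustrative; the article doesn’t spell out the exact waveform the researchers generate.

```python
# Sketch only: amplitude-modulate a spoken command onto an ultrasonic carrier.
# The 25 kHz carrier, 96 kHz recording, and file names are assumptions for
# illustration; they are not taken from the SurfingAttack paper.
import numpy as np
from scipy.io import wavfile

CARRIER_HZ = 25_000  # sits inside the 20-40 kHz band the attack uses

def modulate_command(voice: np.ndarray, rate: int) -> np.ndarray:
    """Amplitude-modulate a baseband voice command onto an ultrasonic carrier."""
    voice = voice.astype(np.float64)
    voice /= np.max(np.abs(voice)) or 1.0          # normalise to [-1, 1]
    t = np.arange(len(voice)) / rate
    carrier = np.cos(2 * np.pi * CARRIER_HZ * t)
    # (1 + m(t)) * cos(2*pi*f_c*t): the energy sits around 25 kHz, above the
    # audible band, yet non-linearities in a phone's microphone hardware can
    # demodulate it back to the original command.
    return 0.5 * (1.0 + voice) * carrier

# Hypothetical 96 kHz mono recording of a voice-assistant command:
rate, voice = wavfile.read("command_96khz.wav")
ultrasonic = modulate_command(voice, rate).astype(np.float32)
wavfile.write("ultrasonic_command.wav", rate, ultrasonic)
```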

The researchers discovered they could successfully send commands to Siri and Google Assistant from up to 9m away and through various mediums, including glass, metal, and wood.

A soft tablecloth, however, muffled the signal.

“Theoretically, ultrasonic guided waves can transmit in any solid materials: for example, one scenario can be attacking phones placed on the floor, another can be attacking devices placed on a seat inside a vehicle,” says Yan, who is an Assistant Professor in the Department of Computer Science and Engineering at Michigan State.

While this kind of attack is less likely to happen in the wild because of the specific setup it requires — it would probably work best in a cubicle-style office — it does illustrate a smartphone weakness that hasn’t been considered in the past: voice assistants listen not only to commands within the range of human hearing, which spans roughly 20 hertz to 20 kHz, but can also discern commands at slightly higher frequencies just outside that range.

SurfingAttack uses frequencies in the 20–40 kHz range.

Yan says the team used Lamb waves, a particular type of ultrasonic guided wave, because of their inaudible frequency.

“[It] propagates in the solid materials without making any perceivable noise or vibration, which is exactly why we use this wave form,” he says.
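As a rough illustration of that frequency split, the snippet below estimates how much of a high-sample-rate recording’s energy falls in the 20–40 kHz band that SurfingAttack occupies. It is only a crude detection heuristic under assumed file names and an arbitrary threshold, not a defence the researchers propose.

```python
# Crude check for energy in the 20-40 kHz band used by SurfingAttack.
# Illustrative only: the file name and threshold are assumptions, and this is
# not a countermeasure described in the research.
import numpy as np
from scipy.io import wavfile

def ultrasonic_ratio(path: str, low_hz: float = 20_000, high_hz: float = 40_000) -> float:
    rate, samples = wavfile.read(path)               # assumes a mono file sampled well above 80 kHz
    spectrum = np.abs(np.fft.rfft(samples.astype(np.float64)))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / rate)
    band = (freqs >= low_hz) & (freqs <= high_hz)
    return spectrum[band].sum() / (spectrum.sum() + 1e-12)

if __name__ == "__main__":
    ratio = ultrasonic_ratio("mic_capture_192khz.wav")
    print(f"Fraction of energy in the 20-40 kHz band: {ratio:.3f}")
    if ratio > 0.1:                                  # arbitrary threshold for this sketch
        print("Possible inaudible-command activity")
```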

Take precautions

The researchers tested SurfingAttack on 17 different phone models and found it was successful on 15 of them, including the Google Pixel 1, 2, and 3; the Motorola G5 and Z4; the Samsung Galaxy S7 and S9; the Xiaomi Mi 5, Mi 8, and Mi 8 Lite; and the iPhone 5, 5s, 6 Plus, and X.

The only phones that resisted the attack were the Huawei Mate 9 and the Samsung Galaxy Note 10 Plus.

Luckily, it’s not very difficult to insulate yourself from this kind of attack, Yan says.

While the attack works best in public spaces — he points to situations where you may want to charge your smartphone at the airport, for instance, and leave it unattended on a tabletop — it’s going to be pretty tricky to pull this off in a private setting.

To stay safe, just keep your phone in your pocket, which will make it impossible for the silent command to surf along a table to your phone and activate the voice assistant.

Otherwise, you can use a thicker phone case or disable your voice assistant from the lock screen.

* Courtney Linder is Senior News Editor at Popular Mechanics. She tweets at @linderrama.

This article first appeared at www.popularmechanics.com/technology/security.
