Exploiting smartphone cables as antennae that receive silent, pwning voice commands

In “IEMI Threats for Information Security: Remote Command Injection on Modern Smartphones,” French government infosec researchers José Lopes Esteves and Chaouki Kasmi demonstrated a clever attack on smartphones that sends silent voice commands to Google Now and Siri by converting them to radio waves and tricking headphone cables into acting as antennas.
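
To make the mechanism concrete, here is a toy NumPy sketch, not the researchers’ actual tooling: it amplitude-modulates a stand-in “voice” waveform onto a carrier, the general principle behind coupling audio into a cable that behaves as an antenna. The sample rate, carrier frequency, and function names are all our own choices, scaled down so the simulation stays small.

```python
import numpy as np

FS = 10_000_000          # simulation sample rate: 10 MHz (toy value)
CARRIER_HZ = 1_000_000   # 1 MHz carrier, scaled down for the simulation;
                         # real carriers sit near the cable's resonant
                         # frequency (on the order of 100 MHz for ~1 m cords)

def am_modulate(baseband: np.ndarray, fs: int, carrier_hz: float,
                depth: float = 0.8) -> np.ndarray:
    """Classic AM: carrier amplitude scaled by (1 + depth * baseband)."""
    t = np.arange(len(baseband)) / fs
    return (1.0 + depth * baseband) * np.cos(2 * np.pi * carrier_hz * t)

# Stand-in for a recorded voice command: a 10 ms, 1 kHz tone.
t = np.arange(int(0.01 * FS)) / FS
voice = np.sin(2 * np.pi * 1_000 * t)

# `rf` is what a transmitter would radiate; the headphone cable, acting as
# an unintentional antenna, couples it into the mic input, where the audio
# front end recovers the baseband as if someone had spoken into the mic.
rf = am_modulate(voice, FS, CARRIER_HZ)
```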

The sexy thing about this is that it lets attackers control most of a phone’s functionality in an undetectable way. The attack today relies on specialized transmitters close to the device itself, but it’s easy to anticipate that one or both of those constraints could be removed through further development: the transmitter could be miniaturized into something you could carry in a coat pocket and use to pwn people near you on public transit, airplanes, etc.; and/or it could use higher power and phased-array antennas to focus a signal over longer distances.

As the researchers pointed out in their Hack in Paris 15 presentation, voice interfaces are proliferating in the Internet of Things, for devices that are too small or remote to accommodate screens or keyboards.

Although the latest version of iOS now has a hands-free feature that allows iPhone owners to send voice commands merely by saying “Hey Siri,” Kasmi and Esteves say that their attack works on older versions of the operating system, too. iPhone headphones have long had a button on their cord that allows the user to enable Siri with a long press. By reverse-engineering and spoofing the electrical signal of that button press, their radio attack can trigger Siri from the lock screen without any interaction from the user. “It’s not mandatory to have an always-on voice interface,” says Kasmi. “It doesn’t make the phone more vulnerable, it just makes the attack less complex.”
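
To see why spoofing the button press works, it helps to picture how headset firmware plausibly tells a long press from a short one. The sketch below is hypothetical, not Apple’s implementation; the threshold voltage and timing window are assumed values. The attack’s induced signal only has to look like a long enough dip on the mic line.

```python
PRESS_THRESHOLD_V = 0.15   # below this, treat the button as "pressed" (assumed)
LONG_PRESS_SEC = 1.0       # assumed long-press window

def classify_press(samples):
    """samples: iterable of (time_sec, mic_line_voltage) readings.

    On wired headsets the control button shorts the microphone line toward
    ground, so a press shows up as a voltage dip of some duration.
    """
    start = None
    for t, v in samples:
        pressed = v < PRESS_THRESHOLD_V
        if pressed and start is None:
            start = t                        # falling edge: press begins
        elif not pressed and start is not None:
            held = t - start                 # rising edge: press ends
            return "assistant" if held >= LONG_PRESS_SEC else "play_pause"
    return None

# A spoofed 1.2-second dip, sampled at 100 Hz, reads as a Siri trigger:
trace = [(i / 100, 0.0 if 0.1 <= i / 100 <= 1.3 else 1.8) for i in range(200)]
print(classify_press(trace))  # -> "assistant"
```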

Of course, security-conscious smartphone users probably already know that leaving Siri or Google Now enabled on their phone’s lock screen represents a security risk. At least in Apple’s case, anyone who gets hands-on access to the device has long been able to use those voice command features to squeeze sensitive information out of the phone, from contacts to recent calls, or even hijack social media accounts. But the radio attack extends the range and stealth of that intrusion, making it all the more important for users to disable voice commands on their lock screen.

The ANSSI researchers say they’ve contacted Apple and Google about their work and recommended other fixes, too: better shielding on headphone cords, which would force attackers to use a higher-power radio signal, or an electromagnetic sensor in the phone that could detect and block the attack. They also note that the attack could be prevented in software, by letting users create their own custom “wake” words that launch Siri or Google Now, or by using voice recognition to block out strangers’ commands. Neither Google nor Apple has yet responded to WIRED’s inquiry about the ANSSI research.
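
As a rough illustration of that last software fix, here is a sketch of gating commands on speaker similarity. The cosine threshold and the embedding vectors are placeholders for whatever real speaker-verification model a vendor might use; nothing here is Apple’s or Google’s actual implementation.

```python
import numpy as np

SIMILARITY_THRESHOLD = 0.8  # assumed tuning value

def accept_command(command_emb: np.ndarray, enrolled_emb: np.ndarray) -> bool:
    """Cosine-similarity gate: only accept voices close to the owner's."""
    cos = float(np.dot(command_emb, enrolled_emb) /
                (np.linalg.norm(command_emb) * np.linalg.norm(enrolled_emb)))
    return cos >= SIMILARITY_THRESHOLD

# An injected command synthesized by an attacker should land far from the
# owner's enrolled profile and be rejected:
owner = np.array([0.9, 0.1, 0.3])
attacker = np.array([-0.2, 0.8, 0.1])
print(accept_command(owner * 1.1, owner))  # True  (owner's own voice)
print(accept_command(attacker, owner))     # False (stranger / injected audio)
```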


Hackers Can Silently Control Siri From 16 Feet Away [Andy Greenberg/Wired]
