Amazon's Alexa can be hacked with lights and laser pointers - report

The researchers discovered that these voice-controllable systems can become subject to "light-based audio injection attacks."

Mike George, VP Alexa, Echo and Appstore for Amazon, speaks during the LG press conference at CES in Las Vegas (photo credit: REUTERS)
Researchers at Japan's University of Electro-Communications and at the University of Michigan have discovered a way to hack into Amazon's Alexa, Apple's Siri, Facebook's Portal and Google Assistant from just over a football field away (110 meters), they announced in a white paper released on Monday.
The hacking can be done, the report said, by simply shining lights and laser pointers at the microphones located on the devices.
The researchers discovered that these voice-controllable systems can fall subject to "light-based audio injection attacks": by shining light encoded with commands at the devices' built-in microphones, they were able to issue instructions such as "Alexa, record this conversation" or "Google, add [blank] to my shopping list."
"We find that five mW [milliwatts] of laser power – the equivalent of a laser pointer – is sufficient to obtain full control over many popular Alexa and Google smart home devices, while about 60mW is sufficient for gaining control over phones and tablets," the report said.
The researchers demonstrated that these methods can be used to completely take over voice-controllable systems from one building to another, and through closed windows and doors, at similar distances.
 
In addition, they showed that these light attacks don't require an expensive laser setup to succeed. The equipment costs less than $400: a $20 laser pointer, a $339 laser driver and a $28 sound amplifier.
Every smartphone and voice-controlled device contains a small plate, called a diaphragm, inside its microphone. When sound waves hit the diaphragm, it vibrates, and the microphone converts those vibrations into an electrical signal that the device interprets as sound.
The researchers replicated that process by aiming laser light at the diaphragm: varying the light's intensity makes the diaphragm move, producing the same kind of electrical signal, so the device responds exactly as it would to sound. Encoding a voice command in the light's intensity is what enables the attacker to dictate commands to the device.
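To illustrate the idea, here is a minimal sketch of how a recorded voice command could be ridden on a laser's intensity. It assumes a hypothetical audio file named command.wav and illustrative drive-current values; the actual hardware parameters used by the researchers differ.

```python
# Sketch: amplitude-modulating a voice command onto a laser driver's intensity
# input, the basic idea behind a light-based audio injection attack.
# File name and current values are illustrative assumptions.
import numpy as np
from scipy.io import wavfile

BIAS_MA = 200.0   # assumed DC bias current for the laser driver (mA)
SWING_MA = 150.0  # assumed peak audio swing around that bias (mA)

# Load a recorded voice command (hypothetical file).
rate, audio = wavfile.read("command.wav")
audio = audio.astype(np.float64)
audio /= np.max(np.abs(audio))  # normalize to [-1, 1]

# A laser's optical power tracks its drive current, so riding the normalized
# audio waveform on a DC bias makes the beam's intensity carry the command.
drive_ma = BIAS_MA + SWING_MA * audio

# A laser cannot emit "negative" light, so keep the drive non-negative.
drive_ma = np.clip(drive_ma, 0.0, None)

# drive_ma would then be fed to the laser driver's modulation input; the beam,
# aimed at the target's microphone port, moves the diaphragm the way sound
# would, and the assistant "hears" the encoded command.
```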
The researchers documented the security implications of these findings. They showed how a hacker could use light-injected voice commands to unlock smart-lock-protected front doors, open garage doors, shop online without the target's knowledge – and even locate and unlock the target's vehicle – if these are all connected to the voice-controllable system. The paper also discusses software countermeasures against these light-based attacks.
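One direction such countermeasures can take is comparing the signal across a device's multiple microphones, since a laser spot illuminates only one microphone while genuine speech reaches them all. The sketch below illustrates that idea only; the function name and threshold are assumptions, not the paper's implementation.

```python
# Sketch of a multi-microphone consistency check: flag a "command" that shows
# up on only one channel of the array. Threshold is an illustrative guess.
import numpy as np

def looks_like_light_injection(channels: np.ndarray, ratio_threshold: float = 10.0) -> bool:
    """channels: raw audio with shape (num_mics, num_samples)."""
    energies = np.mean(channels.astype(np.float64) ** 2, axis=1)
    loudest = float(np.max(energies))
    rest = np.delete(energies, int(np.argmax(energies)))
    typical_rest = float(np.median(rest))
    # Real speech excites every microphone; a laser excites only one, so a
    # large imbalance between the loudest channel and the rest is suspicious.
    return loudest > ratio_threshold * max(typical_rest, 1e-12)
```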

“This opens up an entirely new class of vulnerabilities,” Kevin Fu, an associate professor of electrical engineering and computer science at the University of Michigan, told The New York Times. “It’s difficult to know how many products are affected, because this is so basic.”
The researchers shared their findings with Tesla, Ford, Google, Apple and Amazon to discuss their concerns. All companies have said they are closely reviewing the conclusions of the research. 
“This is the tip of the iceberg,” Fu told the Times. “There is this wide gap between what computers are supposed to do and what they actually do. With the Internet of Things, they can do unadvertised behaviors, and this is just one example.”
An Amazon spokesperson, replying to a request for comment from the Times, said the company had never heard of "light-command hack[s]" until now.
However, the spokesperson offered interim measures, such as setting up a PIN for shopping with Alexa and for "other sensitive smart-home requests." Users can also mute Alexa, which disconnects the microphone.
The researchers say that the microphones would need to be completely redesigned to remedy the problem, and that covering the microphone with do-it-yourself fixes will not protect users from malicious command injection.
Daniel Genkin, one of the paper's co-authors and an assistant professor at the University of Michigan, told the Times that there is an easy fix for the time being: Leave your voice-controlled assistant out of the line of sight from outside your home, "and don’t give it access to anything you don’t want someone else to access."