Researchers use laser to hack Alexa, other voice assistants

Digital voice assistants like Amazon's Alexa have become household staples.

Usually you have to talk to voice assistants to get them to do what you want. But a group of researchers determined they can also command them by shining a laser at smart speakers and other gadgets that house virtual helpers such as Amazon’s Alexa, Apple’s Siri and Google’s Assistant.

Researchers at the University of Michigan and Japan’s University of Electro-Communications figured out they could do this silently and from hundreds of feet away, as long as they had a line of sight to the smart gadget. The finding could enable anyone (with motivation and a few hundred dollars’ worth of electronics) to attack a smart speaker from outside your house, making it do anything from playing music to opening a smart garage door to buying you stuff on Amazon.

In a new paper, the researchers explained that they were able to shine a light that had a command encoded in it (such as “OK Google, open the garage door”) at a microphone built into a smart speaker. The sounds of each command were encoded in the intensity of a light beam, Daniel Genkin, a paper coauthor and assistant professor at the University of Michigan, told CNN Business on Monday. The light would hit the diaphragm built into the smart speaker’s microphone, causing it to vibrate in the same way as if someone had spoken that command.
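
In signal-processing terms, the attack described above amounts to amplitude-modulating a laser's intensity with the audio waveform of a spoken command, so the microphone's diaphragm responds to the flickering light as if it were sound. The paper's actual attack tooling is not reproduced here; the following is a minimal, hypothetical Python sketch of that modulation step, with the bias current, modulation depth, and current limit chosen purely for illustration rather than taken from the research.

import numpy as np

def audio_to_laser_current(audio, bias_ma=200.0, depth=0.5, max_ma=400.0):
    """Map a voice-command waveform (float samples in [-1, 1]) onto a
    laser drive current via simple amplitude modulation.

    The laser sits at a DC bias current, and the audio waveform
    modulates the intensity around that bias. The MEMS microphone's
    diaphragm reacts to the varying light intensity much as it would
    to sound pressure, so the speaker "hears" the command.

    bias_ma, depth, and max_ma are illustrative values, not figures
    from the paper.
    """
    audio = np.clip(np.asarray(audio, dtype=float), -1.0, 1.0)
    current = bias_ma * (1.0 + depth * audio)   # AM around the bias point
    return np.clip(current, 0.0, max_ma)        # stay within the driver's limits

if __name__ == "__main__":
    # Stand-in for a recorded "OK Google, open the garage door" clip:
    # a 1 kHz test tone sampled at 48 kHz.
    sr = 48_000
    t = np.arange(sr) / sr
    test_tone = 0.8 * np.sin(2 * np.pi * 1000 * t)

    drive = audio_to_laser_current(test_tone)
    print(f"drive current range: {drive.min():.1f}-{drive.max():.1f} mA")

The resulting current samples would, in practice, be fed to a laser driver to convert the command into light intensity aimed at the microphone port.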

The researchers exploited the vulnerability in tests to issue commands such as opening a garage door. On a website related to the work, the researchers outline the equipment needed, which includes an under-$20 laser pointer, a $339 laser driver, and a $28 sound amplifier.

“If you have a laser that can shine through windows and across long distances — without even alerting anyone in the house that you’re hitting the smart speaker — there’s a big threat in being able to do things a smart speaker can do without permission of the owner,” said Benjamin Cyr, a graduate student at the University of Michigan and a paper coauthor.

Researchers said the Google Home device and first-generation Echo Plus could be commanded over the longest distance: 110 meters (about 361 feet). That distance, they noted, was simply the length of the longest space they had available for testing (a hallway).

The researchers noted that they haven’t seen this security issue being taken advantage of. One way to avoid any potential issues, though, is to make sure your smart speaker can’t be seen by anyone outside your home.

Researchers said the weakness can't truly be fixed without redesigning the microphones built into these devices, known as MEMS microphones, which would be far more complicated. Takeshi Sugawara, a visiting scholar at the University of Michigan and the paper's lead author, said one way to address it would be to add an obstacle that blocks a straight line of sight to the microphone's diaphragm.

Genkin said he contacted Google, Apple, Amazon and other companies to address the security issue.

Spokespeople for Google and Amazon said their companies are reviewing the research. Apple declined to comment.