That smart speaker in your home may not be as secure as you think. Researchers have discovered a new vulnerability in voice assistants like Amazon Alexa, Apple's Siri, and Google Home: a laser beam can be used to mimic the human voice and hack the system.

It's a little far-fetched, but here's the science behind it.

Kevin Fu, a professor at the University of Michigan, and cybersecurity researcher Takeshi Sugawara discovered that when they pointed a laser at a smart speaker's microphone and varied its intensity at a precise frequency, the microphone interpreted the incoming light just as it would sound. The researchers then experimented with modulating the laser's intensity to match the frequencies of a human voice.
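
In effect, the attack treats the laser's brightness as a stand-in for sound pressure: the "louder" the voice at a given instant, the brighter the beam. The sketch below is a simplified illustration in Python, not code from the research itself; the function and parameter names are ours, and a synthesized tone stands in for a recorded voice command. It shows the basic idea of amplitude-modulating a light source's intensity with an audio waveform.

```python
import numpy as np

SAMPLE_RATE = 44_100  # audio sample rate in Hz

def audio_to_laser_intensity(audio, bias=0.5, depth=0.4):
    """Map a normalized audio waveform onto a laser intensity signal.

    The laser stays on at a constant `bias` level, and the audio
    modulates the brightness around that bias (amplitude modulation).
    A microphone diaphragm hit by this flickering light can respond
    as if it had heard the original sound.
    """
    audio = audio / np.max(np.abs(audio))   # normalize to [-1, 1]
    intensity = bias + depth * audio        # modulate around the bias
    return np.clip(intensity, 0.0, 1.0)     # brightness can't go negative

# Stand-in for a spoken command: a one-second 440 Hz tone
t = np.linspace(0, 1, SAMPLE_RATE, endpoint=False)
voice = np.sin(2 * np.pi * 440 * t)

laser_signal = audio_to_laser_intensity(voice)
print(laser_signal.min(), laser_signal.max())  # stays within [0.1, 0.9]
```

In a real attack, a signal like this would drive the laser's power; here the output is just an array you could inspect or plot.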

The website Wired has an in-depth look at the research conducted so far.

Daniel Genkin, a professor at the University of Michigan who worked with the team, says it's plausible that sophisticated hackers could shine a light through a window to take control of a system.

"Your assumptions about blocking sound aren’t true about blocking light," he says. "This security problem manifests as a laser through the window to your voice-activated system."

It should be pointed out that a hacker would need sophisticated equipment and a great deal of intel to hack your phone or smart speaker. Also, many electronic door locks do not allow unlock commands from voice assistants because of security concerns.

But for now, it might not be a bad idea to move Alexa away from the window.
