A few weeks ago we reported on how Security Research Labs (SRLabs), a Germany-based hacking research group and think tank, found that Alexa and Google Home expose users to phishing and eavesdropping via third-party skills and apps.
Now, another study on the vulnerabilities of smart speakers, such as Amazon Echo, Apple HomePod, and Google Home, has been released.
Researchers at the University of Michigan and Japan’s University of Electro-Communications found that smart speakers can be hacked with modulated laser light.
Standing hundreds of feet away from a smart speaker, the researchers could manipulate its voice assistant using a laser beam encoded with commands.
For example, a laser could be encoded with information that would command the assistant to unlock your front door or order something through your Amazon account.
The encoding modulates the laser’s intensity so that the speaker’s microphone vibrates as if struck by sound waves, and the assistant mistakes the light for a human voice.
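Conceptually, this is amplitude modulation: the voice waveform rides on the laser's brightness, and the microphone's diaphragm recovers it as if it were sound. A minimal sketch of that idea in Python (the tone, bias, and depth values below are illustrative stand-ins, not parameters from the study):

```python
import numpy as np

SAMPLE_RATE = 44_100  # audio samples per second
DURATION = 0.01       # seconds of signal to model

# A stand-in for a recorded voice command: a simple 440 Hz tone.
t = np.arange(int(SAMPLE_RATE * DURATION)) / SAMPLE_RATE
voice = np.sin(2 * np.pi * 440 * t)  # normalized to [-1, 1]

# Amplitude modulation: add the voice signal to a constant laser bias
# so the intensity never goes negative (light has no negative
# brightness). `depth` controls how strongly the beam is modulated.
bias, depth = 1.0, 0.5
laser_intensity = bias * (1 + depth * voice)

# The microphone's diaphragm responds to the intensity fluctuations,
# effectively demodulating a scaled copy of the original voice signal.
recovered = (laser_intensity / bias - 1) / depth

print(np.allclose(recovered, voice))  # → True
```

The key constraint the bias term captures is physical: a speaker cone can push and pull air, but a laser can only brighten or dim, so the command has to be carried as fluctuations around a constant light level.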