As reported by Ars Technica, the attack, dubbed ‘Light Commands,’ was discovered by researchers at the University of Electro-Communications and the University of Michigan. It makes it possible to hijack Siri, Google Assistant, Alexa, and other voice assistants from a distance, as long as the attacker has line of sight to the device’s microphones.

Here’s how it works: the researchers found that modulating a laser’s intensity can make a device’s MEMS microphone respond as if it were picking up sound, letting an attacker encode otherwise inaudible voice commands into a beam of light aimed at the microphone.
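To make the mechanism concrete, here is a minimal conceptual sketch (not code from the paper) of the core idea: amplitude-modulating a laser’s intensity with an audio waveform. All names and values here are illustrative assumptions; a real attack involves actual laser-driver hardware, not software alone.

```python
import math

# Conceptual illustration of Light Commands: the laser's intensity is a
# constant bias plus an audio-driven modulation. A MEMS microphone struck
# by this light reacts as if it were hearing the audio signal.

SAMPLE_RATE = 44_100   # samples per second (illustrative)
DC_BIAS = 0.5          # constant laser intensity, normalized to 0..1
MOD_DEPTH = 0.4        # how strongly the audio modulates the beam

def audio_sample(t: float) -> float:
    """Stand-in 'voice command': a 440 Hz tone in the range [-1, 1]."""
    return math.sin(2 * math.pi * 440 * t)

def laser_intensity(t: float) -> float:
    """Normalized light intensity at time t: DC bias plus modulation."""
    return DC_BIAS + MOD_DEPTH * audio_sample(t)

samples = [laser_intensity(n / SAMPLE_RATE) for n in range(1000)]

# The modulated intensity stays within the physically meaningful 0..1 range,
# so the beam never needs to switch fully off or exceed its maximum power.
assert 0.0 <= min(samples) <= max(samples) <= 1.0
```

The key point the sketch captures is that the “sound” rides on the light as intensity variations, which is why the attack is silent and works through windows.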

As noted by Ars, the researchers have done limited testing with iPhones, tablets, smart speakers, and smart displays, but they believe that “all devices that use MEMS microphones [are] susceptible to Light Commands attacks.”

Light Commands do have some limitations: a malicious party needs a direct line of sight to the device and must aim the laser very precisely at the device’s microphone.

However, the researchers have carried out attacks under moderately realistic conditions. And since voice assistants that control smart home devices like door locks and garage doors typically require no authentication, the potential consequences are certainly concerning.

More interesting still, some of the tests were carried out with just an $18 laser pointer, a laser driver, and an audio amplifier, for a total cost of less than $400.

Check out more about Light Commands vulnerabilities in the videos below: