Researchers have discovered that it is possible to "speak" to voice assistants like Siri and Alexa in a frequency too high for the human ear to hear, Fast Company reports. This so-called "DolphinAttack" could potentially allow hackers to send commands to iPhones or Amazon Echos, asking the devices to call certain numbers or load dangerous websites. "[E]very iPhone and Macbook running Siri, any Galaxy phone, any PC running Windows 10, and even Amazon's Alexa assistant" are vulnerable to the high-frequency attack, Fast Company writes.

The problem stems from the fact that voice assistant microphones are able to "hear" sounds above 20 kHz, which adult human ears can't pick up. Electronics companies may build devices to hear that high in the first place because "analyzing software might need every bit of 'hint' in your voice to create its understanding," said NewDealDesign founder Gadi Amit.
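The article doesn't spell out the mechanism, but the published DolphinAttack research works by modulating an audible voice command onto an ultrasonic carrier; nonlinearity in the microphone's hardware then demodulates it back into the audible band, where the assistant's software hears it as normal speech. A minimal NumPy sketch of that idea, with a single 400 Hz tone standing in for a voice command and a quadratic term standing in for the microphone's nonlinearity (the specific frequencies and coefficients here are illustrative assumptions, not values from the research):

```python
import numpy as np

fs = 192_000                      # sample rate high enough to represent a 25 kHz carrier
t = np.arange(0, 0.05, 1 / fs)

voice = np.sin(2 * np.pi * 400 * t)        # stand-in for the "voice command" (400 Hz tone)
carrier = np.sin(2 * np.pi * 25_000 * t)   # ultrasonic carrier, above human hearing

# Amplitude-modulate the command onto the carrier: all transmitted
# energy sits near 25 kHz, so humans hear nothing.
tx = (1 + 0.5 * voice) * carrier

# A real microphone is slightly nonlinear; model that with a small
# quadratic term. Squaring the AM signal recreates the baseband tone.
rx = tx + 0.2 * tx ** 2

def band_energy(sig, lo, hi):
    """Total FFT magnitude of `sig` between `lo` and `hi` Hz."""
    spec = np.abs(np.fft.rfft(sig))
    freqs = np.fft.rfftfreq(len(sig), 1 / fs)
    return spec[(freqs >= lo) & (freqs <= hi)].sum()

# The 400 Hz tone is absent from the transmitted signal's audible band
# but present in the received signal after the nonlinearity.
audible_tx = band_energy(tx, 200, 600)
audible_rx = band_energy(rx, 200, 600)
```

In this toy model, `audible_rx` dwarfs `audible_tx`: the inaudible broadcast arrives at the assistant carrying an audible command.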

Hackers have to be fairly close to a device in order to attack it with high-frequency commands — an Apple Watch is vulnerable from several feet away, while a hacker would have to be within inches of an Amazon Echo to issue a successful DolphinAttack. That said, "hacking an iPhone seems like no problem at all," Fast Company writes. "A hacker would merely need to walk by you in a crowd."

In theory, you can protect against a DolphinAttack by turning off the "always on" setting for Siri or Google Assistant, or by muting Amazon Alexa or Google Home. Of course, this also defeats the devices' main purpose: responding to your voice. Read more about DolphinAttack at Fast Company. Jeva Lange