Researchers have discovered a way to hijack smart assistants using sounds inaudible to the human ear, raising security concerns about the voice-activated devices.

Amazon's Alexa, Apple's Siri and Google's Assistant, as well as voice assistants from Samsung and Huawei, are affected by the problem, which could let hackers take control of devices and command them to perform tasks such as downloading malicious software or opening the front door.

Researchers from Zhejiang University were able to take over popular gadgets including iPhones and MacBooks running Siri, newer Galaxy smartphones with Bixby, Amazon Echo speakers and PCs running Windows 10 with Cortana.
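The technique reportedly works by amplitude-modulating a voice command onto an ultrasonic carrier above the roughly 20 kHz ceiling of human hearing; nonlinearity in the device's microphone then demodulates the envelope back into the audible band, so the assistant "hears" a command that people cannot. A minimal sketch of that modulation step is below; the sample rate, the 30 kHz carrier, and the toy tone standing in for a spoken command are illustrative assumptions, not the researchers' exact parameters:

```python
import numpy as np

FS = 192_000         # sample rate high enough to represent an ultrasonic carrier
CARRIER_HZ = 30_000  # carrier above the ~20 kHz limit of human hearing (assumed value)

def modulate_ultrasonic(voice: np.ndarray, fs: int = FS, fc: float = CARRIER_HZ) -> np.ndarray:
    """Amplitude-modulate an audible signal onto an ultrasonic carrier.

    A microphone's nonlinear response can recover the envelope (the original
    voice) in the audible band, even though the transmitted sound is inaudible.
    """
    t = np.arange(len(voice)) / fs
    carrier = np.cos(2 * np.pi * fc * t)
    # Standard AM: the DC offset keeps the envelope non-negative.
    return (1.0 + voice) * carrier

# Toy "voice": a 1 kHz tone standing in for a spoken command.
t = np.arange(FS // 10) / FS
voice = 0.5 * np.sin(2 * np.pi * 1_000 * t)
signal = modulate_ultrasonic(voice)

# Verify the transmitted energy sits in the ultrasonic band, not near 1 kHz.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), 1 / FS)
peak_hz = freqs[np.argmax(spectrum)]
print(round(peak_hz))  # → 30000
```

Because all of the signal's energy sits around 30 kHz, a person standing next to the speaker hears nothing, while the demodulated envelope carries the command.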