Researchers made an alarming discovery regarding smart home assistants. These devices are not as foolproof as they seem: the researchers found a way to control them without the owners' knowledge. Using sounds imperceptible to the human ear, an attacker can issue commands to the devices and put their owners' security at risk.
Smart home assistants can be easily hijacked
Hackers could use this technique at any time to make a smart home assistant download harmful software. The consequences could be even more serious, as these assistants often control locks and other security devices. The assistants affected by this vulnerability include Siri, Amazon Alexa, and Google Assistant, as well as similar products from Huawei and Samsung.
The discovery was made by researchers from Zhejiang University in China. They took typical voice commands given to smart home assistants, and then converted them to ultrasonic frequencies. These cannot be heard by humans, but the devices can pick them up.
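The researchers' paper describes the details of the conversion; as a rough illustration, the idea of shifting an audible voice command into the ultrasonic range can be sketched with classic amplitude modulation. This is a simplified assumption about their method, not their actual code, and the 25 kHz carrier and 192 kHz sample rate are illustrative values:

```python
import numpy as np

SAMPLE_RATE = 192_000   # Hz; high enough to represent an ultrasonic carrier
CARRIER_FREQ = 25_000   # Hz; above the ~20 kHz limit of human hearing

def modulate_ultrasonic(voice, sample_rate=SAMPLE_RATE, carrier_freq=CARRIER_FREQ):
    """Amplitude-modulate a voice signal onto an ultrasonic carrier.

    `voice` is a 1-D array of samples in [-1, 1]. The modulated result is
    inaudible to humans, but nonlinearities in a device's microphone can
    demodulate it back into the original audible command.
    """
    t = np.arange(len(voice)) / sample_rate
    carrier = np.sin(2 * np.pi * carrier_freq * t)
    # Classic AM: shift the baseband voice spectrum up around the carrier.
    return (1.0 + voice) * carrier / 2.0

# Hypothetical stand-in for a recorded command: one second of a 440 Hz tone.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
fake_voice = 0.5 * np.sin(2 * np.pi * 440 * t)
ultrasonic = modulate_ultrasonic(fake_voice)
```

After modulation, the signal's energy sits around 25 kHz (the carrier plus sidebands), so a human standing next to the speaker hears nothing.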
Using this method, they managed to take over iPhones and MacBooks running Siri, Windows 10 PCs running Cortana, Amazon Echo speakers, and newer Samsung Galaxy models with Bixby. They made these devices open malicious links and execute other commands.
This poses a huge security issue
The problem lies both in the hardware and in the software of these devices. Their microphones shouldn't be able to pick up such sounds in the first place, and the software they run should verify that a command comes from a human voice and respond only then.
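One plausible software-side mitigation is to check whether a captured signal's energy falls in the human-audible range before acting on it. This is a hypothetical sketch, not a fix any vendor has confirmed shipping, and the 20 kHz cutoff and 50% energy threshold are assumed values:

```python
import numpy as np

def looks_ultrasonic(signal, sample_rate, cutoff_hz=20_000):
    """Flag signals whose energy is mostly above the audible cutoff.

    A human voice cannot produce a command whose energy sits above
    ~20 kHz, so such a signal can be rejected outright.
    """
    power = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    total = power.sum()
    return total > 0 and power[freqs > cutoff_hz].sum() / total > 0.5

def strip_ultrasonic(signal, sample_rate, cutoff_hz=20_000):
    """Zero out spectral content above the human-audible range."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    spectrum[freqs > cutoff_hz] = 0.0
    return np.fft.irfft(spectrum, n=len(signal))
```

A device applying such a filter before speech recognition would never "hear" the ultrasonic carrier, though a hardware microphone whose nonlinearity demodulates the attack before sampling would still need a physical fix.
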
Usually, a smart home assistant performs an action only if the command is issued from several inches away. The researchers, however, managed to control devices from several feet away. This widens the attack surface considerably, and should act as a warning for manufacturers to refine their devices and make them more secure.
Image Source: Wikimedia Commons