Techooid.com

Siri and Alexa Can Be Hacked Using Inaudible Voice Commands

In many homes, people are accustomed to talking to digital assistants such as Apple’s Siri and Amazon’s Alexa, but researchers have reported that these devices can also secretly take orders from someone else.

Because speech-recognition systems, which translate spoken words into machine-readable commands, are accessible and efficient, they are now widely used to control all kinds of devices.

Researchers at Zhejiang University reported that these digital assistants respond to voice commands inaudible to humans, an attack they termed DolphinAttack. Using it, they demonstrated that an attacker could covertly activate the assistants and issue commands without the owner’s knowledge.
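DolphinAttack works by amplitude-modulating a recorded voice command onto an ultrasonic carrier: humans cannot hear frequencies above roughly 20 kHz, but a microphone’s nonlinearity demodulates the envelope back into the audible band, so the assistant “hears” the command. A minimal numpy sketch of that modulation step (the 25 kHz carrier and modulation depth are illustrative choices, not the paper’s exact parameters):

```python
import numpy as np

def ultrasonic_am(baseband, fs, carrier_hz=25_000.0):
    """Amplitude-modulate an audio command onto an ultrasonic carrier.

    The result contains almost no energy below 20 kHz, so it is
    inaudible to people, yet a microphone's nonlinearity recovers
    the envelope (the hidden command) during recording.
    """
    t = np.arange(len(baseband)) / fs
    carrier = np.cos(2 * np.pi * carrier_hz * t)
    # Keep the modulation index below 1 so the envelope stays non-negative.
    envelope = 1.0 + 0.8 * baseband / np.max(np.abs(baseband))
    return envelope * carrier

# Toy "command": a 400 Hz tone standing in for recorded speech.
fs = 96_000  # sample rate high enough to represent ultrasonic content
t = np.arange(fs) / fs
command = np.sin(2 * np.pi * 400 * t)
signal = ultrasonic_am(command, fs)
```

Played through a speaker capable of ultrasonic output, `signal` is silent to a listener; squaring it (the simplest model of microphone nonlinearity) and low-pass filtering recovers the 400 Hz tone.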

Another group of researchers, at the University of California, Berkeley, reported that they could hide audio commands inside recordings of music. Audio that sounds like ordinary music to a human listener can instruct a digital assistant to open a door or secretly add an item to a shopping list.
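Hiding commands in music is an instance of an adversarial example: a perturbation, too small for a person to notice, that is optimized so a machine-learning model changes its output. The published attack optimizes against a full speech-to-text network; the sketch below shows the same idea against a hypothetical linear “command detector” stand-in (the model, names, and step size here are all illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a speech model: a fixed linear scorer over samples.
# score > 0 means "command detected". (Hypothetical, not a real assistant API.)
w = rng.normal(size=1024)

def detects_command(audio):
    return float(audio @ w) > 0.0

music = rng.normal(scale=0.1, size=1024)  # stand-in for a music clip

# Gradient-style step: nudge each sample in the direction that raises the
# model's score (for a linear model the gradient is just w), using a step
# size only just large enough to cross the decision boundary.
score = float(music @ w)
eps = (1.0 - score) / np.abs(w).sum()
adversarial = music + eps * np.sign(w)
```

The per-sample change `eps` is tiny compared with the music itself, so to a listener `adversarial` sounds like the original clip, yet the detector now fires on it. Against a real speech-to-text model the same principle requires iterative optimization rather than a single step.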

“We wanted to see if we could make it even more stealthy,” said Nicholas Carlini, a fifth-year Ph.D. student in computer security at U.C. Berkeley and one of the paper’s authors. “There is no evidence that these techniques have left the lab,” Mr. Carlini added.

Response

Amazon said that it doesn’t disclose specific security measures, but it has taken steps to ensure its Echo smart speaker is secure. Google said security is an ongoing focus and that its Assistant has features to mitigate undetectable audio commands. Both companies’ assistants employ voice recognition technology to prevent devices from acting on certain commands unless they recognize the user’s voice.

Apple said its smart speaker, HomePod, is designed to prevent commands from doing things like unlocking doors, and it noted that iPhones and iPads must be unlocked before Siri will act on commands that access sensitive data or open apps and websites, among other measures.
