
Alexa, Google Assistant, Siri Can Be Tricked by Hidden Malicious Voice Commands

While Amazon, Apple, and Google are busy making their voice assistants smarter, a group of researchers claims that the current iterations of these assistants are vulnerable. The researchers say they were able to send Amazon's Alexa, Apple's Siri, and Google Assistant malicious commands hidden in recorded music or innocuous-sounding speech.

According to a report by The New York Times, researchers in China and the US have been testing how hidden commands, undetectable to the human ear, can be sent to Alexa, Google Assistant, and Siri. These commands were reportedly able to make the artificial intelligence (AI) systems on smartphones and smart speakers dial phone numbers or open websites, all without the knowledge or consent of end users.

Back in 2016, a team of students from the University of California, Berkeley, and Georgetown University showed that they could hide commands in white noise played over loudspeakers and through YouTube videos, getting smart devices to turn on airplane mode or open a website.

Some of those Berkeley researchers have now claimed in a research paper that hidden commands can be embedded directly into music tracks or spoken text. This means attackers could exploit the vulnerability to control voice-enabled smart devices such as the Amazon Echo, Apple HomePod, and Google Home speakers, as well as smartphones, without users ever becoming aware of the access.

The researchers are said to have made slight changes to the original audio files, cancelling out the sound that speech recognition systems (including Mozilla's open source DeepSpeech voice-to-text software) would normally detect and replacing it with audio that machines transcribe differently. As a result, smart devices hear commands that are inaudible to the human ear.
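To illustrate the general idea, here is a minimal sketch of how such a gradient-based adversarial audio perturbation could be crafted, assuming a PyTorch setup. The ToySpeechModel, sample length, target encoding, and loss weights below are illustrative assumptions, not the researchers' actual code, which targeted full speech-to-text systems such as DeepSpeech. A small perturbation delta is optimized so the model transcribes the perturbed audio as an attacker-chosen phrase while staying quiet enough to escape notice.

```python
# Hedged sketch of a gradient-based adversarial audio attack, in the spirit of
# the Berkeley paper. ToySpeechModel is a stand-in for a real CTC-trained
# speech recognizer such as DeepSpeech; every name here is illustrative.
import torch
import torch.nn as nn

class ToySpeechModel(nn.Module):
    """Tiny stand-in for a CTC-trained speech-to-text network."""
    def __init__(self, n_classes=29):            # 26 letters + space + ' + CTC blank
        super().__init__()
        self.rnn = nn.GRU(input_size=1, hidden_size=64, batch_first=True)
        self.fc = nn.Linear(64, n_classes)

    def forward(self, waveform):                  # waveform: (batch, samples)
        h, _ = self.rnn(waveform.unsqueeze(-1))
        return self.fc(h).log_softmax(-1)         # (batch, time, classes)

model = ToySpeechModel().eval()
ctc_loss = nn.CTCLoss(blank=0)

benign = torch.randn(1, 2000)                     # stand-in for the benign recording
target = torch.randint(1, 29, (1, 20))            # encoded attacker phrase (assumed)
delta = torch.zeros_like(benign, requires_grad=True)
opt = torch.optim.Adam([delta], lr=1e-3)

for step in range(100):
    log_probs = model(benign + delta)             # transcribe the perturbed audio
    loss = ctc_loss(
        log_probs.transpose(0, 1),                # CTCLoss expects (time, batch, classes)
        target,
        torch.full((1,), log_probs.size(1)),      # input length
        torch.full((1,), target.size(1)),         # target length
    )
    loss = loss + 0.1 * delta.abs().max()         # penalize loud perturbations
    opt.zero_grad()
    loss.backward()
    opt.step()
    delta.data.clamp_(-0.05, 0.05)                # keep the change small in amplitude
```

In the actual attack, the perturbation is kept small enough that listeners hear only the original music or speech, which is why the hidden command goes unnoticed.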

In one demonstration, the researchers hid the command "OK Google, browse to evil.com" in a recording of the spoken phrase "Without the data set, the article is useless". They also embedded the command in a four-second clip from Verdi's Requiem. Separately, researchers from the Chinese Academy of Sciences and other institutions are said to have shown that they could control voice-activated devices with commands embedded in songs broadcast over the radio or played on YouTube.

"Companies have to ensure user-friendliness of their devices, because that's their major selling point," Tavish Vaidya, a researcher at Georgetown who wrote one of the first papers on audio attacks, told The New York Times. Notably, Amazon, Apple, and Google have yet to release a fix for the issue, which could affect a large number of smart device users.

Last month, security researchers at cyber-security firm Checkmarx revealed that they had created a 'skill' that enabled Amazon Echo devices to eavesdrop on conversations. That vulnerability, which left the Alexa assistant active even after a session ended, was fixed by Amazon after it received the researchers' report.

Source: NDTV News

