According to a report in The New York Times, researchers at the University of California, Berkeley, have demonstrated that they can stuff "silent" commands into music or spoken text that could potentially get a voice assistant to add something to a shopping list, control an IoT device, or worse.
Now before you toss your Echo into the ocean, you should know that there's no evidence that this type of attack has ever happened outside the lab.
Their work builds on a technique called DolphinAttack, which translates voice commands into ultrasonic frequencies too high for the human ear to detect.
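To give a rough sense of how an ultrasonic attack of this kind works, the sketch below (a simplification for illustration, not the researchers' actual code, and the file names are placeholders) amplitude-modulates a recorded voice command onto a carrier above the range of human hearing; nonlinearities in a microphone's hardware can demodulate it back into the audible band, so the assistant "hears" a command that people nearby do not.

```python
import numpy as np
from scipy.io import wavfile

# Hypothetical input: a voice command sampled at 192 kHz so the ultrasonic
# carrier can be represented (an assumption made purely for illustration).
rate, command = wavfile.read("command_192khz.wav")
command = command.astype(np.float64)
command /= np.max(np.abs(command))           # normalize to [-1, 1]

carrier_hz = 25_000                           # above typical human hearing (~20 kHz)
t = np.arange(len(command)) / rate
carrier = np.sin(2 * np.pi * carrier_hz * t)

# Amplitude modulation: the audible command rides on an inaudible carrier.
# A microphone's nonlinear response can shift it back down to baseband.
modulated = (1 + 0.5 * command) * carrier

wavfile.write("ultrasonic_command.wav", rate, (modulated * 32767).astype(np.int16))
```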
Nicholas Carlini, a PhD student at UC Berkeley, said the team wanted to make the commands even more stealthy, adding: "My assumption is that the malicious people already employ people to do what I do."
An Amazon spokesperson described the company's efforts to keep its line of voice-activated Echo smart speakers secure, which it says include "disallowing third party application installation on the device, rigorous security reviews, secure software development requirements and encryption of communication between Echo, the Alexa App and Amazon servers". The researchers, for their part, say they have found a way to cancel out the sounds that Google Assistant, Siri, and Alexa would normally hear and replace them with audio that cannot be heard by the human ear.
Apple has additional features to prevent the HomePod speaker from unlocking doors.
For many people, digital assistants have become a part of daily life: you might have Siri set a reminder for an upcoming appointment on your iPhone, or tell Alexa to order more laundry detergent from Amazon. In the demonstrations, the attack first muted the phone so the owner wouldn't hear the system's responses. The researchers used it to instruct smart devices to visit malicious sites, initiate calls, take pictures, and send messages.
By making the research and code public, the Berkeley scientists hope to get this kind of vulnerability fixed before it falls into the wrong hands.
The researchers were able to manipulate Mozilla's DeepSpeech speech-to-text software. The group provided samples of songs with embedded voice commands that make digital assistants do specific things, including visiting websites, turning on GPS, and making phone calls. They were able to hide the command "OK Google, browse to evil.com" in a recording of the spoken sentence "Without the dataset, the article is useless," and humans cannot detect the hidden command.
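Conceptually, this kind of attack is an optimization problem: add the smallest possible perturbation to a piece of audio so that the speech-recognition model transcribes the attacker's chosen phrase while a human still hears the original. The sketch below is a minimal, hypothetical illustration of that trade-off; it uses a toy classifier as a stand-in for DeepSpeech (which is actually attacked through its CTC loss over transcriptions), and the model, loss weighting, and command indices are placeholders, not the researchers' code.

```python
import torch

# Toy stand-in for a speech-to-text model: maps 1 s of 16 kHz audio to one of
# ten hypothetical "commands". Purely illustrative, not DeepSpeech.
torch.manual_seed(0)
recognizer = torch.nn.Linear(16000, 10)

audio = torch.randn(16000)      # the benign "cover" recording
target = torch.tensor([3])      # index of the hidden command the attacker wants heard

delta = torch.zeros_like(audio, requires_grad=True)
opt = torch.optim.Adam([delta], lr=1e-2)

for step in range(500):
    opt.zero_grad()
    logits = recognizer(audio + delta).unsqueeze(0)
    # Two competing goals: make the model "hear" the target command, while
    # keeping the added perturbation small enough that a listener won't notice.
    loss = torch.nn.functional.cross_entropy(logits, target) + 0.1 * delta.norm()
    loss.backward()
    opt.step()

adversarial_audio = (audio + delta).detach()
print("model now hears command:", recognizer(adversarial_audio).argmax().item())
```

The 0.1 weight on the perturbation norm is an arbitrary choice here; in practice that balance is what determines whether the embedded command stays imperceptible to human listeners.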