LightBlog

Thursday, May 10, 2018

Inaudible malicious Google Assistant commands can be hidden in music and other audio

A few years ago we started to see reports of how software from a company called SilverPush was using “audio beacons” to communicate with other devices. The company started off by embedding its beacons only into websites. It then expanded into TV, where commercials could embed an ultrasonic audio beacon that could communicate with any device that had the SilverPush software installed (a smartphone, laptop, tablet, etc.). The whole idea creeped a lot of people out, but now we’re hearing about something similar: Siri, Alexa, and Google Assistant can all receive inaudible commands that have been hidden in music.
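For the curious, the core trick behind an audio beacon is simple: hide a data-carrying tone near the top of the audible spectrum, where adult ears rarely notice it but a phone’s microphone still picks it up. Here is a minimal Python sketch of the idea; the 19 kHz carrier, 100 ms bit duration, and on-off keying scheme are our own illustrative assumptions, since SilverPush’s actual encoding is proprietary.

```python
# Illustrative sketch of data riding on a near-ultrasonic carrier.
# The carrier frequency, bit rate, and on-off keying are assumptions
# made for this example; real beacon formats are proprietary.
import numpy as np
import wave

SAMPLE_RATE = 44100   # CD-quality audio can represent up to 22.05 kHz
CARRIER_HZ = 19000    # near-ultrasonic: inaudible to most adults
BIT_DURATION = 0.1    # 100 ms per bit (hypothetical rate)

def encode_beacon(bits: str) -> np.ndarray:
    """On-off key a 19 kHz carrier: tone = 1, silence = 0."""
    samples_per_bit = int(SAMPLE_RATE * BIT_DURATION)
    t = np.arange(samples_per_bit) / SAMPLE_RATE
    tone = 0.3 * np.sin(2 * np.pi * CARRIER_HZ * t)
    silence = np.zeros(samples_per_bit)
    return np.concatenate([tone if b == "1" else silence for b in bits])

signal = encode_beacon("10110010")  # a made-up 8-bit beacon ID

with wave.open("beacon.wav", "wb") as f:
    f.setnchannels(1)
    f.setsampwidth(2)  # 16-bit PCM
    f.setframerate(SAMPLE_RATE)
    f.writeframes((signal * 32767).astype(np.int16).tobytes())
```

Mix a signal like that under a commercial’s soundtrack and any listening app only needs to band-pass around 19 kHz to read the bits back out.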

We’ve moved into an era where big data has become so valuable that companies are trying to figure out each and every way they can track you. The idea of having your actions tracked by a single website is bad enough. It takes things to a different level when half a dozen or more companies are tracking you as you go from website to website. Things get even worse when you realize these companies have been building a profile on you (even if it is anonymous) for years as you’ve browsed the internet.

According to a report from The New York Times, the popular virtual assistants many of us use every single day can be controlled by inaudible commands hidden in radio music, YouTube videos, or even white noise played over speakers. This sounds awfully similar to the SilverPush technique we heard about back in 2015. The report says these virtual assistants can be made to dial phone numbers, launch websites, make purchases, and access smart home accessories (such as a smart door lock).

There have even been cases where these virtual assistants were instructed to do things like take a photo or send a text message from up to 25 feet away, through a building’s open window. All of this has come to light thanks to work done by researchers in both China and the United States. Researchers at the University of California, Berkeley, have said that they can even alter audio files “to cancel out the sound that the speech recognition system was supposed to hear and replace it with a sound that would be transcribed differently by machines while being nearly undetectable to the human ear.”
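To get a feel for what the Berkeley researchers are describing: this class of attack, often called an “adversarial example,” runs gradient descent on a tiny perturbation that nudges the recognizer toward an attacker-chosen transcription while staying quiet. The toy Python sketch below shows that optimization loop against a stand-in classifier; the published attack targeted Mozilla’s DeepSpeech with a CTC loss over full transcriptions, which we don’t reproduce here.

```python
# Toy sketch of the gradient-based idea behind audio adversarial
# examples. The tiny "recognizer" below is a placeholder model, not
# a real speech system; the real attack targeted Mozilla DeepSpeech.
import torch

torch.manual_seed(0)

# Placeholder recognizer: maps 1 second of 16 kHz audio to 10 "word" logits.
recognizer = torch.nn.Sequential(
    torch.nn.Linear(16000, 64), torch.nn.ReLU(), torch.nn.Linear(64, 10)
)

x = torch.randn(16000)      # stand-in for benign audio (e.g., music)
target = torch.tensor([7])  # attacker-chosen transcription class
delta = torch.zeros_like(x, requires_grad=True)
opt = torch.optim.Adam([delta], lr=0.01)

for step in range(500):
    logits = recognizer(x + delta).unsqueeze(0)
    # Push the model toward the target while penalizing loud perturbations.
    loss = torch.nn.functional.cross_entropy(logits, target) \
         + 0.1 * delta.norm()
    opt.zero_grad()
    loss.backward()
    opt.step()

print("target now predicted:", recognizer(x + delta).argmax().item() == 7)
```

The `delta.norm()` penalty here plays, in miniature, the role of the perceptual constraint in the real attack: it keeps the added noise small enough that a human hears the original audio essentially unchanged.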

Apple, Google, and Amazon have all acknowledged that they are aware of these vulnerabilities, but all three have been rather vague about their existing mitigations. Amazon says it has taken steps (without saying which ones) to make sure Alexa is secure. Google says its virtual assistant has features to mitigate these undetectable commands. Apple says the HomePod is programmed not to perform certain tasks, such as unlocking a door, and insists that Siri on the iPhone and iPad is safe since the device has to be unlocked before it will execute such commands.
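Since none of the three companies will say what their mitigations actually are, here is one defense that gets brought up for ultrasonic injection specifically: low-pass filter the microphone signal before it ever reaches the recognizer. The Python sketch below is purely illustrative (the 8 kHz cutoff and filter order are our own assumptions), and note that it does nothing against adversarial perturbations hidden inside audible music.

```python
# One plausible defense against ultrasonic command injection: discard
# audio content above the human voice band before recognition. This is
# an illustration only; the vendors have not disclosed their mitigations.
import numpy as np
from scipy.signal import butter, sosfilt

SAMPLE_RATE = 44100
VOICE_CUTOFF_HZ = 8000  # human speech energy lives well below this

def strip_ultrasonics(audio: np.ndarray) -> np.ndarray:
    """Apply an 8th-order Butterworth low-pass filter at 8 kHz."""
    sos = butter(8, VOICE_CUTOFF_HZ, btype="low", fs=SAMPLE_RATE,
                 output="sos")
    return sosfilt(sos, audio)

# Example: a 19 kHz "command carrier" is attenuated to near silence.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
ultrasonic = np.sin(2 * np.pi * 19000 * t)
filtered = strip_ultrasonics(ultrasonic)
print(f"peak before: {ultrasonic.max():.2f}, "
      f"after: {np.abs(filtered).max():.4f}")
```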


Source: The New York Times | Via: VentureBeat



from xda-developers https://ift.tt/2G7C9rq
via IFTTT

