Malicious commands in music
LOL. Another way hackers can rip you off: malicious commands can be hidden in white noise.


Malicious Google Assistant commands can be hidden in music and innocuous-sounding speech recordings
Ben Lovejoy
- May. 10th 2018 9:18 am PT

A group of students from Berkeley has demonstrated how malicious commands
to Google Assistant, Alexa, and Siri can be hidden in recorded music or innocuous-sounding speech.
Simply playing the tracks over the radio, in a streaming music track, or in a podcast could allow
attackers to take control of a smart home …

The NY Times reports that the work builds on research that began in 2016.
Quote:Over the past two years, researchers in China and the United States have begun
demonstrating that they can send hidden commands that are undetectable to the
human ear to Apple’s Siri, Amazon’s Alexa and Google’s Assistant. Inside university
labs, the researchers have been able to secretly activate the artificial intelligence
systems on smartphones and smart speakers, making them dial phone numbers or
open websites. In the wrong hands, the technology could be used to unlock
doors, wire money or buy stuff online — simply with music playing over the radio.
The 2016 research demonstrated commands hidden in white noise, but the students
have this month managed to do the same thing in music and spoken text.
Quote:By making slight changes to audio files, researchers were able to cancel out the
sound that the speech recognition system was supposed to hear and replace it with
a sound that would be transcribed differently by machines while being nearly
undetectable to the human ear […]

They were able to hide the command, “O.K. Google, browse to evil.com” in a recording
of the spoken phrase, “Without the data set, the article is useless.” Humans cannot discern the command.
The Berkeley group also embedded the command in music files, including a four-second clip from Verdi’s “Requiem.”
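The idea described above — nudging an audio signal just enough that a machine transcribes it differently while a human notices nothing — can be illustrated with a toy sketch. This is not the researchers' actual attack (which targeted a full speech-recognition model); it is a minimal NumPy example, assuming a hypothetical linear "keyword detector", showing how small gradient-guided perturbations can flip a model's decision:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy "keyword detector": logistic-style linear model over
# raw audio samples. score > 0 means the detector "hears" the keyword.
w = rng.normal(size=256)
b = -1.0

def detect(x):
    return x @ w + b > 0

# A quiet benign audio frame the detector does NOT trigger on.
x = rng.normal(scale=0.01, size=256)

# FGSM-style perturbation: step along the gradient of the score
# (for a linear model that gradient w.r.t. x is just w) until the
# detector flips, keeping each per-sample change tiny.
eps = 1e-3
x_adv = x.copy()
for _ in range(1000):
    if detect(x_adv):
        break
    x_adv += eps * np.sign(w)

print(detect(x))       # False: original audio is ignored
print(detect(x_adv))   # True: perturbed audio triggers the detector
```

The real attacks do the same kind of optimization against a neural speech recognizer, with an added constraint that the perturbation stays below human perceptibility.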
Similar techniques have been demonstrated using ultrasonic frequencies.
Quote:Researchers at Princeton University and China’s Zhejiang University demonstrated that
voice-recognition systems could be activated by using frequencies inaudible to the human
ear. The attack first muted the phone so the owner wouldn't hear the system's responses.
The Berkeley researchers say that there is no indication of the attack method being
used in the wild, but that could easily change.
Quote:Nicholas Carlini, a fifth-year Ph.D. student in computer security at U.C. Berkeley and one of the
paper’s authors, [said that] while there was no evidence that these techniques have left the lab,
it may only be a matter of time before someone starts exploiting them. “My assumption is that
the malicious people already employ people to do what I do,” he said.

That's pretty funny. I wonder how long it will be before it turns into a CIA conspiracy story.
