Researchers Demonstrate Subliminal Smart Device Commands That Have Potential for Malicious Attacks

Discussion in 'News Discussion' started by MacRumors, May 10, 2018.

  1. MacRumors macrumors bot


    Apr 12, 2001

    Researchers in the United States and China have been performing tests to demonstrate that "hidden" commands, or those undetectable to human ears, can reach AI assistants like Siri and force them to perform actions their owners never intended. The research was highlighted in a piece today by The New York Times, which suggests that these subliminal commands can dial phone numbers, open websites, and perform other potentially malicious actions if the technique falls into the wrong hands.

    A group of students from the University of California, Berkeley and Georgetown University published a research paper this month, stating that they could embed commands into music recordings or spoken text. When played near an Amazon Echo or Apple iPhone, a person would just hear the song or someone speaking, while Siri and Alexa "might hear an instruction to add something to your shopping list." Or, more dangerously, to unlock doors, wire money from your bank, and purchase items online.


    The method by which the students were able to accomplish the hidden commands shouldn't be a concern for the public at large, but one of the paper's authors, Nicholas Carlini, believes malicious parties could already be making inroads with similar technology.
    Last year, researchers based at Princeton University and Zhejiang University in China performed similar tests, demonstrating that AI assistants could be activated through frequencies not heard by humans. In a technique dubbed "DolphinAttack," the researchers built a transmitter to send the hidden command that dialed a specific phone number, while other tests took pictures and sent text messages. DolphinAttack is said to be limited in terms of range, however, since the transmitter "must be close to the receiving device."

    In yet another set of research, a group at the University of Illinois at Urbana-Champaign showed that this range could be extended, demonstrating commands received from 25 feet away. As for the most recent group of researchers from Berkeley, Carlini told The New York Times that he was "confident" his team would soon be able to deliver successful commands "against any smart device system on the market." He said the group wants to prove to companies that this flaw is a potential problem, "and then hope that other people will say, 'O.K. this is possible, now let's try and fix it.'"

    For security purposes, Apple is stringent with certain HomeKit-related Siri commands, locking them behind device passcodes whenever users have passcodes enabled. For example, if you want to unlock your front door with a connected smart lock, you can ask Siri to do so, but you'll have to enter your passcode on an iPhone or iPad after issuing the command. The HomePod, on the other hand, purposefully lacks this functionality.

    Article Link: Researchers Demonstrate Subliminal Smart Device Commands That Have Potential for Malicious Attacks
  2. nwcs macrumors 68000


    Sep 21, 2009
    This is really clever. I wouldn’t have thought that the AIs would respond to non-vocal frequencies as they’re intended to listen to humans only. I would think that checking the frequency range of the command would be enough to counteract this problem fairly simply.
  3. SoN1NjA macrumors 68000


    Feb 3, 2016
    the pool
    That's crazy. Although I don't see this being an issue for most people, it demonstrates that if there's a will, there's a way
  4. amaier1986 macrumors newbie


    May 10, 2018
    Portland, Oregon
    HomePod directs me to use my phone to unlock my front door or open my garage doors. This potential issue seems to be somewhat under control with iOS.
  5. whizstachio macrumors newbie

    Jul 26, 2016
    --- Post Merged, May 10, 2018 ---
    Isn’t this what viruses do?
  6. Bacillus, May 10, 2018
    Last edited: May 10, 2018

    Bacillus Suspended


    Jun 25, 2009
    Embarrassing what people have to overcome to instruct Siri, but it must be such fun to see it work...
  7. jarman92 macrumors 6502

    Nov 13, 2014
    Agreed. But why wouldn't Apple have foreseen this and limited the frequency range in the first place? There's literally no need for phone mics to detect anything below/above human voice frequencies.
  8. fairuz macrumors 68000


    Aug 27, 2017
    Silicon Valley
    Guys, make sure to check out my new song on iTunes, and play it really loudly near your HomePod!
    Also, go Bears!
  9. Tech198 macrumors G5

    Mar 21, 2011
    Australia, Perth
    I guess when you start having AI or "machine learning" as one's ears, anything can be possible, including hacking :)

    Bet as a company, my face would be red by now if I over-stepped THAT mark. Kinda makes you wince when it all goes to hell
  10. BootsWalking macrumors 6502a

    Feb 1, 2014
    Finally a way to get Siri to properly recognize voice commands - embedding them in pop music.
  11. daveschroeder macrumors 6502

    Sep 14, 2003
    Madison, WI
    That is NOT "subliminal".

    I think you're looking for another word.
  12. JohnnyApple$eed macrumors member


    Feb 19, 2015
    "...while there was no evidence that these techniques have left the lab, it may only be a matter of time before someone starts exploiting them."

    Sir, you just exploited them yourself!

    I have a big secret I cannot tell you. But I'm going to tell you. Don't tell anyone I told you.
  13. H3LL5P4WN macrumors 68000


    Jun 19, 2010
    Pittsburgh PA
    My HomePods randomly respond to things on the TV speaker (since AirPlay 2 still isn't up and running). What's spooky was last night Siri said there was nothing to read; the dialogue on the TV had nothing to do with reading.
  14. Wide opeN macrumors 6502a


    Aug 27, 2010
  15. w5jck macrumors member


    Nov 9, 2013
    My thought is that the word should be "inaudible" and NOT "subliminal". As in, "The devices can react to inaudible commands."
  16. Deacon-Blues macrumors 6502a


    Aug 15, 2012
    This can easily be prevented by not having Siri listen, but rather using the home button. As an added bonus, Siri isn't listening to you all the time. Still, very clever.
  17. whooleytoo macrumors 604


    Aug 2, 2002
    Cork, Ireland.
    Reminds me of the time (long ago, pre-Siri) my Mac's display was waking randomly, while I was in the same room watching TV. Never knew what was causing it, until one day when the headphones were disconnected from it.

    TV: "....we don't have time for that..."
    Mac: <wakes display> "It's 7:15"

    Feels funny hearing your appliances having a conversation and not being involved in it!
  18. davbman macrumors newbie

    Sep 3, 2008
    What they don't point out is they must have "trained" Siri on the victim phone with the voice of the other Siri device, as Siri will not respond to anyone saying "Hey Siri" except the owner. That is for the iPhone, of course. The HomePod is a different story.

    They really need to show more info. I assume they picked the iPhone for more of a scare factor but left that part out. Alexa and many Android phones, meanwhile, still respond to anybody.
  19. Eriamjh1138@DAN macrumors 6502a


    Sep 16, 2007
    BFE, MI
    If the “words” are still being uttered but just sub or ultrasonically, then software can filter out those frequencies before processing into commands.

    Otherwise, fun security hack!
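    The filtering idea in this comment can be sketched in a few lines. This is a minimal illustration, not anything from the paper: it assumes SciPy/NumPy are available and simply band-passes incoming audio to the human voice range before any speech recognition runs, so ultrasonic content never reaches the recognizer.

    ```python
    import numpy as np
    from scipy.signal import butter, sosfilt

    def voiceband_filter(audio, sample_rate, low_hz=80.0, high_hz=8000.0):
        """Band-pass audio to the (rough) human voice range, discarding
        sub- and ultrasonic content before it reaches the recognizer."""
        nyquist = sample_rate / 2.0
        sos = butter(8, [low_hz / nyquist, high_hz / nyquist],
                     btype="bandpass", output="sos")
        return sosfilt(sos, audio)

    # Demo: a 300 Hz "voice" tone mixed with a 20 kHz ultrasonic carrier.
    sr = 48000
    t = np.arange(sr) / sr
    voice = np.sin(2 * np.pi * 300 * t)
    ultrasonic = np.sin(2 * np.pi * 20000 * t)
    filtered = voiceband_filter(voice + ultrasonic, sr)
    ```

    After filtering, the 20 kHz component is attenuated by orders of magnitude while the 300 Hz tone passes almost untouched. Note this only blocks the fully inaudible DolphinAttack-style carriers; the Berkeley work hides commands inside audible music, which in-band filtering alone cannot catch.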
  20. nwcs macrumors 68000


    Sep 21, 2009
    Not just Apple, they also mentioned Alexa.
  21. techwhiz macrumors 6502a

    Feb 22, 2010
    Northern Ca.
    This is why Google/Android requires training. If the voice is not correct to wake the assistant, then how would this attack work?

    It assumes you are listening to the song and have Google Assistant/Siri already open and accepting commands.
    Normally music stops when you launch the assistant.

    So this threat is akin to going outside, standing on one foot, and catching rain in an upside-down cup......
  22. SteveJobs2.0 macrumors 6502a

    Mar 9, 2012
    Because all it would take is for one person with a super high or super low voice to file a discrimination lawsuit.
  23. Tech198 macrumors G5

    Mar 21, 2011
    Australia, Perth
    My guess would be a "Barry White"
  24. DipDog3 macrumors 65816


    Sep 20, 2002
    No problem, Siri is so terrible that I already have Siri disabled on all of my devices.
  25. neutralguy macrumors 6502a

    Jun 5, 2015
    Sounds like a true smart speaker.. delegating all the smart work to the phone.

Share This Page