Researchers Demonstrate Subliminal Smart Device Commands That Have Potential for Malicious Attacks

Discussion in 'MacRumors.com News Discussion' started by MacRumors, May 10, 2018.

  1. MacRumors macrumors bot

    MacRumors

    Joined:
    Apr 12, 2001
    #1


    Researchers in the United States and China have been performing tests to demonstrate that "hidden" commands, undetectable to human ears, can reach AI assistants like Siri and force them to perform actions their owners never intended. The research was highlighted in a piece today by The New York Times, which suggests that in the wrong hands these subliminal commands could dial phone numbers, open websites, and perform other potentially malicious actions.

    A group of students from the University of California, Berkeley and Georgetown University published a research paper this month stating that they could embed commands into music recordings or spoken text. When the audio was played near an Amazon Echo or Apple iPhone, a person would hear only the song or the speech, while Siri and Alexa "might hear an instruction to add something to your shopping list." Or, more dangerously, an instruction to unlock doors, wire money from your bank, or purchase items online.
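    In broad terms, hidden commands like these are built as adversarial examples: a small perturbation is added to ordinary audio and tuned by gradient descent until a speech recognizer transcribes an attacker-chosen phrase, while remaining quiet enough that a listener notices only the original sound. The following minimal PyTorch sketch shows the shape of that optimization loop; the toy recognizer, its parameters, and the random "song" are illustrative stand-ins, not the researchers' actual models or data.

    ```python
    # Sketch of the adversarial-audio idea: nudge a waveform until a
    # recognizer transcribes an attacker-chosen phrase, keeping the change small.
    # ToyRecognizer is a stand-in, not any production speech-to-text system.
    import torch
    import torch.nn as nn

    torch.manual_seed(0)

    class ToyRecognizer(nn.Module):
        """Maps a 16 kHz waveform to per-frame log-probs over 28 symbols
        (CTC blank + 26 letters + space), the way a CTC speech model would."""
        def __init__(self, n_classes: int = 28):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv1d(1, 32, kernel_size=400, stride=160),  # ~25 ms frames
                nn.ReLU(),
                nn.Conv1d(32, n_classes, kernel_size=1),
            )

        def forward(self, wav: torch.Tensor) -> torch.Tensor:
            logits = self.net(wav.unsqueeze(1))             # (N, C, T)
            return logits.permute(2, 0, 1).log_softmax(-1)  # (T, N, C) for CTC

    model = ToyRecognizer()
    ctc_loss = nn.CTCLoss(blank=0)

    song = torch.randn(1, 16000)                 # one second of "music" (random here)
    target = torch.tensor([[8, 5, 12, 12, 15]])  # "hello" with a=1..z=26; 0 is blank
    target_lengths = torch.tensor([5])

    delta = torch.zeros_like(song, requires_grad=True)  # the hidden command
    optimizer = torch.optim.Adam([delta], lr=1e-2)
    eps = 0.05                                   # cap so listeners still just hear music

    for step in range(300):
        log_probs = model(song + delta)
        input_lengths = torch.full((1,), log_probs.size(0), dtype=torch.long)
        loss = ctc_loss(log_probs, target, input_lengths, target_lengths)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        with torch.no_grad():
            delta.clamp_(-eps, eps)              # keep the perturbation quiet

    adversarial_audio = (song + delta).detach()  # plays as music, decodes as text
    ```

    A real attack additionally has to survive playback through speakers, air, and microphones, which is considerably harder than fooling a model on raw samples.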


    The method by which the students were able to accomplish the hidden commands shouldn't yet be a concern for the public at large, but one of the paper's authors, Nicholas Carlini, believes malicious parties could already be making inroads with similar technology.

    Last year, researchers based at Princeton University and Zhejiang University in China performed similar tests, demonstrating that AI assistants could be activated through frequencies inaudible to humans. In a technique dubbed "DolphinAttack," the researchers built a transmitter to send a hidden command that dialed a specific phone number, while other tests took pictures and sent text messages. DolphinAttack is said to be limited in range, however, since the transmitter "must be close to the receiving device."
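    The physics behind DolphinAttack is simple enough to sketch: microphone hardware is slightly nonlinear, so a voice command amplitude-modulated onto an ultrasonic carrier gets demodulated back into the audible band inside the microphone itself, even though a person standing nearby hears nothing. The numpy illustration below shows that modulation step; the sample rate, carrier frequency, and the pure tone standing in for real speech are all made-up parameters.

    ```python
    # Sketch of DolphinAttack-style modulation: ride a "voice command" on an
    # ultrasonic carrier that humans can't hear but nonlinear mics demodulate.
    import numpy as np

    fs = 192_000        # sample rate high enough to represent ultrasound
    f_carrier = 25_000  # Hz: above human hearing, within many mics' response

    t = np.arange(fs) / fs                       # one second of timestamps
    command = 0.5 * np.sin(2 * np.pi * 300 * t)  # stand-in for a spoken command

    # Classic AM: carrier * (1 + m * signal). All transmitted energy sits
    # near 25 kHz, so bystanders hear nothing.
    m = 0.8
    carrier = np.sin(2 * np.pi * f_carrier * t)
    ultrasonic = carrier * (1 + m * command)

    # A microphone's nonlinearity (roughly y = a*x + b*x**2 + ...) squares the
    # input; the x**2 term contains a baseband copy of `command`, which the
    # mic's own low-pass filtering keeps while discarding the carrier.
    recovered = ultrasonic ** 2
    ```

    Ultrasound also attenuates quickly in air, which helps explain the short effective range.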

    In yet another study, a group at the University of Illinois at Urbana-Champaign showed that this range limitation could be pushed further, demonstrating commands received from 25 feet away. As for the most recent research, Carlini told The New York Times that he was "confident" his team would soon be able to deliver successful commands "against any smart device system on the market." He said the group wants to prove to companies that the flaw is a real problem, "and then hope that other people will say, 'O.K. this is possible, now let's try and fix it.'"

    For security purposes, Apple is stringent with certain HomeKit-related Siri commands, locking them behind device passcodes whenever users have passcodes enabled. For example, if you want to unlock your front door with a connected smart lock, you can ask Siri to do so, but you'll have to enter your passcode on an iPhone or iPad after issuing the command. The HomePod, on the other hand, purposely omits this functionality altogether.

    Article Link: Researchers Demonstrate Subliminal Smart Device Commands That Have Potential for Malicious Attacks
     
  2. nwcs macrumors 68000

    nwcs

    Joined:
    Sep 21, 2009
    Location:
    Tennessee
    #2
    This is really clever. I wouldn’t have thought that the AIs would respond to non-vocal frequencies as they’re intended to listen to humans only. I would think that checking the frequency range of the command would be enough to counteract this problem fairly simply.
     
  3. SoN1NjA macrumors 68000

    SoN1NjA

    Joined:
    Feb 3, 2016
    Location:
    peanut farm
    #3
    That's crazy. Although I don't see this being an issue for most people, it demonstrates that if there's a will, there's a way.
     
  4. amaier1986 macrumors newbie

    amaier1986

    Joined:
    May 10, 2018
    Location:
    Portland, Oregon
    #4
    HomePod directs me to use my phone to unlock my front door or open my garage doors. This potential issue seems to be somewhat under control with iOS.
     
  5. whizstachio macrumors newbie

    Joined:
    Jul 26, 2016
    #5
    Isn’t this what viruses do?
     
  6. Bacillus, May 10, 2018
    Last edited: May 10, 2018

    Bacillus Suspended

    Bacillus

    Joined:
    Jun 25, 2009
    #6
    Embarrassing what people have to overcome to instruct Siri, but it must be such fun to see it work...
     
  7. jarman92 macrumors 6502

    Joined:
    Nov 13, 2014
    #7
    Agreed. But why wouldn't Apple have foreseen this and limited the frequency range in the first place? There's literally no need for phone mics to detect anything below/above human voice frequencies.
     
  8. fairuz macrumors 68000

    fairuz

    Joined:
    Aug 27, 2017
    Location:
    San Jose and Berkeley, CA
    #8
    Guys, make sure to check out my new song on iTunes, and play it really loudly near your HomePod!
    Also, go Bears!
     
  9. Tech198 macrumors G5

    Joined:
    Mar 21, 2011
    Location:
    Australia, Perth
    #9
    I guess when you start having AI or "machine learning" as one's ears, anything is possible, including hacking :)

    Bet as a company my face would be red by now for having over-stepped THAT mark. Kinda makes you wince when it all goes to hell
     
  10. BootsWalking macrumors 6502a

    Joined:
    Feb 1, 2014
    #10
    Finally a way to get Siri to properly recognize voice commands - embedding them in pop music.
     
  11. daveschroeder macrumors 6502

    Joined:
    Sep 14, 2003
    Location:
    Madison, WI
    #11
    That is NOT "subliminal".

    I think you're looking for another word.
     
  12. JohnnyApple$eed macrumors member

    JohnnyApple$eed

    Joined:
    Feb 19, 2015
    #12
    "...while there was no evidence that these techniques have left the lab, it may only be a matter of time before someone starts exploiting them."

    Sir, you just exploited them yourself!

    I have a big secret I cannot tell you. But I'm going to tell you. Don't tell anyone I told you.
     
  13. H3LL5P4WN macrumors 65816

    H3LL5P4WN

    Joined:
    Jun 19, 2010
    Location:
    Pittsburgh PA
    #13
    My HomePods randomly respond to things on the TV speaker (since AirPlay 2 still isn't up and running). What's spooky was last night: Siri said there was nothing to read, when the dialogue on TV had nothing to do with reading.
     
  14. Wide opeN macrumors 6502a

    Wide opeN

    Joined:
    Aug 27, 2010
  15. w5jck macrumors member

    w5jck

    Joined:
    Nov 9, 2013
    #15
    My thought is that the word should be "inaudible" and NOT "subliminal". As in, "The devices can react to inaudible commands."
     
  16. Deacon-Blues macrumors 6502

    Deacon-Blues

    Joined:
    Aug 15, 2012
    Location:
    California
    #16
    This can easily be prevented by not having Siri listen, but rather using the home button. As an added bonus, Siri isn't listening to you all the time. Still, very clever.
     
  17. whooleytoo macrumors 603

    whooleytoo

    Joined:
    Aug 2, 2002
    Location:
    Cork, Ireland.
    #17
    Reminds me of the time (long ago, pre-Siri) my Mac's display kept waking randomly while I was in the same room watching TV. I never knew what was causing it, until one day the headphones were disconnected from it.

    TV: "....we don't have time for that..."
    Mac: <wakes display> "It's 7:15"

    Feels funny hearing your appliances having a conversation and not being involved in it!
     
  18. davbman macrumors newbie

    Joined:
    Sep 3, 2008
    #18
    What they don't point out is that they must have "trained" Siri on the victim phone with the voice of the other Siri device, since Siri won't respond to anyone saying "Hey Siri" except the owner. That's on the iPhone, of course; the HomePod is a different story.

    They really need to show more info. I assume they picked the iPhone for the scare factor but left that part out. Alexa and many Android phones, on the other hand, still respond to anybody.
     
  19. Eriamjh1138@DAN macrumors 6502

    Eriamjh1138@DAN

    Joined:
    Sep 16, 2007
    Location:
    BFE, MI
    #19
    If the "words" are still being uttered, just sub- or ultrasonically, then software can filter out those frequencies before processing the audio into commands.
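    A minimal sketch of that kind of pre-filter, assuming scipy is available; the cutoffs and sample rate are illustrative, not any vendor's real values:

    ```python
    # Sketch of a band-limiting defense: keep only roughly voice-band energy
    # before audio ever reaches wake-word or command recognition.
    import numpy as np
    from scipy.signal import butter, sosfiltfilt

    fs = 96_000  # assumed capture rate, high enough to see ultrasonic content

    # 4th-order Butterworth band-pass over an approximate speech band.
    sos = butter(4, [85, 8000], btype="bandpass", fs=fs, output="sos")

    def voice_band_only(audio: np.ndarray) -> np.ndarray:
        """Zero-phase band-pass: sub- and ultrasonic content is discarded."""
        return sosfiltfilt(sos, audio)

    def looks_like_voice(audio: np.ndarray, threshold: float = 0.5) -> bool:
        """Alternatively, flag clips whose energy is mostly out of band."""
        in_band = voice_band_only(audio)
        ratio = np.sum(in_band ** 2) / (np.sum(audio ** 2) + 1e-12)
        return ratio >= threshold
    ```

    Of course, this only helps against out-of-band tricks like DolphinAttack; the Berkeley-style commands hide inside the audible band, so filtering alone wouldn't catch those.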

    Otherwise, fun security hack!
     
  20. nwcs macrumors 68000

    nwcs

    Joined:
    Sep 21, 2009
    Location:
    Tennessee
    #20
    Not just Apple, they also mentioned Alexa.
     
  21. techwhiz macrumors 6502a

    Joined:
    Feb 22, 2010
    Location:
    Northern Ca.
    #21
    This is why Google/Android requires voice training. If the voice isn't the right one to wake the assistant, how would this attack work?

    It assumes you are listening to the song with Google Assistant/Siri already open and accepting commands, and normally music stops when you launch the assistant.

    So this threat is akin to going outside, standing on one foot, and catching rain in an upside-down cup......
     
  22. SteveJobs2.0 macrumors 6502a

    Joined:
    Mar 9, 2012
    #22
    Because all it would take is for one person with a super high or super low voice to file a discrimination lawsuit.
     
  23. Tech198 macrumors G5

    Joined:
    Mar 21, 2011
    Location:
    Australia, Perth
    #23
    My guess would be a "Barry White"
     
  24. DipDog3 macrumors 65816

    DipDog3

    Joined:
    Sep 20, 2002
    #24
    No problem: Siri is so terrible that I already have it disabled on all of my devices.
     
  25. neutralguy macrumors 6502a

    Joined:
    Jun 5, 2015
    #25
    Sounds like a true smart speaker... delegating all the smart work to the phone.
     
