What they don't point out is that they must have "trained" Siri on the victim phone with the voice of the other Siri device, since Siri will not respond to anyone saying "Hey Siri" except the owner. That's for the iPhone, of course; the HomePod is a different story.

They really need to show more info. I assume they picked the iPhone for the scare factor but left that part out. Alexa and many Android phones, meanwhile, still respond to anybody.
That's what I thought... when I set up "Hey Siri" on my iPhone, it was so that it would only respond to me. Others have tried to trigger Siri on my phone, but it will only respond to my voice. I also have a Google Home Mini, which will respond to anyone, but if someone other than me says something like "call Mom," it refuses, since it will only access my contact list if it recognizes that it is actually me asking. Seems like appropriate precautions are already in place; people just need to be sure to set up voice detection.
 
Researchers in the United States and China have been performing tests in an effort to demonstrate that "hidden" commands, or those undetectable to human ears, can reach AI assistants like Siri and force them to perform actions their owners never intended. The research was highlighted in a piece today by The New York Times, suggesting that these subliminal commands can dial phone numbers, open websites, and perform other potentially malicious actions if placed in the wrong hands.

A group of students from the University of California, Berkeley and Georgetown University published a research paper this month, stating that they could embed commands into music recordings or spoken text. When played near an Amazon Echo or Apple iPhone, a person would just hear the song or someone speaking, while Siri and Alexa "might hear an instruction to add something to your shopping list." Or, more dangerously, unlock doors, wire money from your bank, and purchase items online.


The method by which the students were able to accomplish the hidden commands shouldn't be a concern for the public at large, but one of the paper's authors, Nicholas Carlini, believes malicious parties could already be making inroads with similar technology.
Last year, researchers based at Princeton University and Zhejiang University in China performed similar tests, demonstrating that AI assistants could be activated through frequencies inaudible to humans. In a technique dubbed "DolphinAttack," the researchers built a transmitter to send a hidden command that dialed a specific phone number, while other tests took pictures and sent text messages. DolphinAttack is said to be limited in terms of range, however, since the transmitter "must be close to the receiving device."

In yet another study, a group at the University of Illinois at Urbana-Champaign showed this range limitation could be overcome, demonstrating commands received from 25 feet away. As for the most recent group of researchers from Berkeley, Carlini told The New York Times that he was "confident" his team would soon be able to deliver successful commands "against any smart device system on the market." He said the group wants to prove to companies that this flaw is a potential problem, "and then hope that other people will say, 'O.K. this is possible, now let's try and fix it.'"

For security purposes, Apple is stringent with certain HomeKit-related Siri commands, locking them behind device passcodes whenever users have passcodes enabled. For example, if you want to unlock your front door with a connected smart lock, you can ask Siri to do so, but you'll have to enter your passcode on an iPhone or iPad after issuing the command. The HomePod, on the other hand, purposefully lacks this functionality.

Article Link: Researchers Demonstrate Subliminal Smart Device Commands That Have Potential for Malicious Attacks

My amateurish guess is that they did some maths with inverse Fourier transform stuff that let them create a waveform the device hears as phrases inside the 20 Hz-20 kHz human range, but constructed entirely from frequencies below 20 Hz and above 20 kHz. I'm just bamboozled that these mics even respond to those frequencies, and that the speakers can reproduce them properly.
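Here's a toy numpy sketch of that folding idea (the numbers are invented for illustration, and this is only a guess at the mechanism, not necessarily what the researchers did): synthesize a tone whose only energy sits at 21 kHz, then "sample" it at 16 kHz with no anti-alias filter and watch it reappear at 5 kHz.

```python
import numpy as np

fs = 96_000                          # playback rate; 10 ms = 960 samples
spec = np.zeros(481, dtype=complex)  # rfft bins, spaced 100 Hz apart
spec[210] = 1.0                      # energy only at 21 kHz (inaudible)
wave = np.fft.irfft(spec, 960)       # time-domain waveform via inverse FFT

# A naive 16 kHz "microphone" with no front-end lowpass filter:
decimated = wave[::6]                # keep every 6th sample
folded = np.abs(np.fft.rfft(decimated))
print(folded.argmax() * 100)         # 5000 -> the tone reappears at 5 kHz
```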
 
What they don't point out is that they must have "trained" Siri on the victim phone with the voice of the other Siri device, since Siri will not respond to anyone saying "Hey Siri" except the owner. That's for the iPhone, of course; the HomePod is a different story.

They really need to show more info. I assume they picked the iPhone for the scare factor but left that part out. Alexa and many Android phones, meanwhile, still respond to anybody.
Not really true. I trained my iPhone and it quite often responds to people on TV saying "are you serious." Ironically, my other Android phone doesn't respond to people saying "Google."

Not everyone can trigger Siri on my phone but way too many can.
 
Too soon to know if this is a real concern, but it is pretty cool. For obvious reasons, the smart locks I know of don't allow voice unlock, and ordering is limited to your pre-entered address on Amazon. Still, there is at least some potential for fun pranks if you can get a friend to play something you recorded over a speaker.
 
If there is one thing Siri really needs, it is "dolphins talking with Darth Vader" detection
 
What they don't point out is that they must have "trained" Siri on the victim phone with the voice of the other Siri device, since Siri will not respond to anyone saying "Hey Siri" except the owner. That's for the iPhone, of course; the HomePod is a different story.

They really need to show more info. I assume they picked the iPhone for the scare factor but left that part out. Alexa and many Android phones, meanwhile, still respond to anybody.
Or maybe they withheld some key steps from the published report so that it wouldn't immediately be exploited by others.
 
Because all it would take is for one person with a super high or super low voice to file a discrimination lawsuit.
Even young children's voices fall within the overall range of human speech, and an abnormally deep voice (significantly deeper than any normal person would be expected to have) would still produce harmonics at higher frequencies, especially where consonants are concerned; you can't very well growl your S's and T's and so on. :D

Besides, I don't think discrimination lawsuits work quite that way anyway.
 
How in the heck is Siri going to wire money using some hidden voice command? It can send payments, but you have to authenticate. It also isn't advanced enough to buy things for you. I call BS! Maybe this is true for Alexa, but you can't do anything dangerous with Siri, because it's either locked down or not advanced enough to do those things. Even if they get a command through to unlock my door, I'll have to approve it.
 
Right, and playing records backward reveals demonic messages. Subsonic warfare, anyone?
 
Reminds me of the mosquito ringtones that kids used for a while to hide calls from older folk :)

As some have already noted, this should prompt makers to filter input to voice frequencies. Easy.
That was literally the first thing that came to mind: the super-high-pitch ringtones, lol.
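For what it's worth, band-limiting input to voice frequencies really is simple in software; here's a minimal sketch with scipy (the function name and cutoffs are my own guesses at a sensible speech band). Note this only helps against near-ultrasound tricks that survive sampling; it can't undo aliasing that already happened at the ADC.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def voice_band(x: np.ndarray, fs: float, lo: float = 80.0, hi: float = 7500.0) -> np.ndarray:
    """Band-limit audio to roughly the speech band before wake-word detection."""
    # 8th-order Butterworth bandpass, run forward and backward for zero phase shift
    sos = butter(8, [lo, hi], btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, x)

# e.g. for a 16 kHz mic stream: filtered = voice_band(samples, fs=16_000)
```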
 
Viruses have already been found to be transmitted to nearby devices using their speakers and microphones. Luckily Macs can’t get viruses and Siri can’t understand anything.
 
Consequence of not using a proper analog anti-aliasing filter before the ADC? If the signal is not analog lowpass filtered to half the sample rate before sampling, then frequencies above half the sample rate 'alias' down into frequencies within the normal sample range. For example, if the sample rate is 40 kHz, then sounds between 20 kHz and 40 kHz (outside the normal hearing range) look identical to the sampler as sounds between 0 Hz and 20 kHz (inside the normal hearing range). This is the audio equivalent of the moiré patterns you sometimes see in photos of textures finer than the pixel density of the camera or monitor.
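You can check that folding with the exact numbers above in a couple of lines of numpy (just a sketch): a 25 kHz tone sampled at 40 kHz produces exactly the same samples as a 15 kHz tone.

```python
import numpy as np

fs = 40_000                 # sample rate (Hz), so Nyquist is 20 kHz
n = np.arange(400)          # 10 ms worth of samples

ultrasonic = np.cos(2 * np.pi * 25_000 * n / fs)  # inaudible 25 kHz tone
audible    = np.cos(2 * np.pi * 15_000 * n / fs)  # audible 15 kHz tone

# Without an analog lowpass filter ahead of the ADC, the sampler cannot
# tell them apart: 25 kHz folds down to 40 - 25 = 15 kHz.
print(np.allclose(ultrasonic, audible))  # True
```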

If so, this would imply they tried saving a few cents by either leaving out the front-end analog lowpass filter or using a very cheap first-order filter, which isn't something that could be fixed in firmware. A cheap filter can be tricked by just increasing the volume of the sound, and since you can't hear it anyway, it doesn't matter how loud they make it. This must be the approach they used, since the article states, "While DolphinAttack has its limitations — the transmitter must be close to the receiving device — experts warned that more powerful ultrasonic systems were possible."

Fixing this properly requires designing in stronger, higher-order front-end filtering, which ultimately should only help Siri actually recognize you when you do speak. And we already know she needs all the help she can get ;)

Here is a white paper describing the phenomenon...

http://www.ni.com/white-paper/54448/en/
 
I’m not worried about Siri picking up subliminal commands hidden in music.

Siri can’t even hear me if I yell into the microphone unless there is absolute silence around me.

If I'm in my car, I have to turn off the heater and the radio, and make sure there's no road noise at that moment.

If I’m at home and there’s music playing anywhere in the house, it has to be turned off.

If she can't tell that I'm speaking unless it's absolutely silent, then I'm not worried about subliminal commands.

But... I'll keep an eye out for Siri randomly saying "Sorry, I didn't get that," which is pretty much all Siri seems to be able to do.
The Cult of Mac article mentions porn sites. Oh, Siri... you know me so well. lol But I am a Google Assistant kind of man.

Oh, the horror... listening to music and a porn site opens. Umm... don't they realize we already get porn site redirects from everything? Even MacRumors has sent me to "be the thousandth user" for a free $1000 Walmart gift card (or whatever scam redirect) twice today.

Half the music today is about sex anyway. So you're listening to dirty music and a video pops up. I expect most iPhones probably already have porn on them anyway. I'm not even surprised anymore when someone shows me something on their phone and nude pictures scroll by in their photo album.
Right, and playing records backward reveals demonic messages. Subsonic warfare, anyone?

If only playing music backwards were as easy as it used to be. It was fun to spin records backwards and try to hear what the activists claimed was in there.

Wait... I think it said “uuuurrrrr Naaaaaa Veeeee Duuuurrrnnn ennnnn Elllllll”.

Somebody save me.
 