There will come a time, I believe, when we all must make a decision: keep letting our privacy erode and continue using these devices, or make a stand and say no, I reject this stuff and refuse to buy it.

That day is probably coming over the next ten years.
 
Now imagine our world filled with these things, or even robots that can kill or harm people. Then what, you'll put the robot or an automated car in jail? Grim future
 
Plenty of science fiction out there to theorize about questions like that.
 
Amazon admitted that it happens from time to time. "Amazon takes privacy very seriously. We investigated what happened and determined this was an extremely rare occurrence. We are taking steps to avoid this from happening in the future."
I didn’t mean this is the only time it’s ever happened, but I do think that if this was widespread, we would hear about it more.
 
The bug is probably that the conversation was sent to a contact, instead of the intended recipient ;)
 
Because these stories are trying to be frightening. :rolleyes:

What I want to know is:

- How come they didn't notice the swirling blue light? Was it behind them, on a shelf, or what?

- Exactly what words were used that confused it.

- What skill was used, how did it send (email?), and how much of the conversation was sent.

- Did the skill say anything afterwards, like "Okay, sent Tom your message."

Again, these things are gonna happen. Heck, I can't count the number of times I've been butt dialed and treated to someone's conversation without them meaning to :p
Okay, just read more info about the sequence of events on CNN from Amazon:

"Echo woke up due to a word in background conversation sounding like 'Alexa.'

Then, the subsequent conversation was heard as a 'send message' request," Amazon said in a statement.

"At which point, Alexa said out loud 'To whom?' At which point, the background conversation was interpreted as a name in the customers contact list.

Alexa then asked out loud, '[contact name], right?' Alexa then interpreted background conversation as 'right'.

As unlikely as this string of events is, we are evaluating options to make this case even less likely."

--

So, TWICE the Echo asked for more info and confirmation, while lit up to show it was active.
All good questions. I have 4 Echo Dots and an Echo Spot around my house, and I have had a couple of accidental activations as well, but they were immediately recognized (i.e. the blue swirling light and Alexa asking me a follow-up question). Also, in nearly every case I knew exactly what I had just said that sounded like "Alexa".
 
To be fair, there have been general industry stories here for a long time, so that's hardly new or strange. And realistically speaking the headline is essentially what happened (minus the specific "woman" part).

That's the SPECIFIC part that they are sensationalizing.

If you click what I'm responding to, which was a response to me, I actually said:

From reading the story it seems like it recorded the couple's conversation about the floors, but the subject line and the article just keep saying "woman" over and over???
 
But that part of it isn't really what makes it any more or less sensationalized. If it was the same title but with "couple's private conversation" instead of "woman's private conversation", it wouldn't really make much of a difference as far as sensationalism goes.
 