Apple has responded to the article by simply stating ‘you can turn off location tracking in Siri or turn Siri off altogether’, which pretty much confirms everything reported in the article...

Remember that next time you lot on here attempt to belittle Google or Amazon for supposedly tracking you or recording everything you say. They don’t; they only record after the keyword, just like Siri... only with Google and Amazon you can turn off the ability for your recordings to be used. Not with Apple you can’t.

What’s on your iPhone stays on your iPhone. So long as you turn off Siri...


http://9to5mac.com/2019/07/26/siri-privacy-concerns-report/

Apple really are two-faced hypocrites, shocker.

Everything Siri should be opt in, it should be disabled by default if Apple markets privacy.
 
Everything Siri should be opt in, it should be disabled by default if Apple markets privacy.

Yep, I was so annoyed by the media and the users on here slagging off Alexa, because I damn well knew you can opt out of your recordings being used to improve it in the settings! Yet here’s Apple doing exactly the same with little to no vetting of the people listening (Amazon states its workers have to sign NDA contracts), and with no opt-out option bar turning off Siri, yet it sticks huge privacy billboards up...

Actually I saw this coming a mile away. I thought it was a very, very dangerous path they were going down when they started this privacy ad campaign. Let’s see if the advertising regulators in the EU have them for false or misleading advertising. That’ll dent their reputation a bit.
 
Crazy how in just a few short years, having something like Siri or Alexa is considered normal.
 
People don't understand that any record that exists can be subpoenaed. Apple's anonymization techniques leave many holes, and they can be compelled to undo them by court order. There's nothing stopping a prosecutor from subpoenaing Apple for all audio records from a particular location at a particular time to solve a crime.

We've all traded convenience for real privacy.

And I remind everyone:
From: lisa jackson@apple, To: john.podesta, Date: 2015-12-20 17:26

"Thousands of times every month, we give governments information about Apple customers and devices, in response to warrants and other forms of legal process. We have a team that responds to those requests 24 hours a day. Strong encryption does not eliminate Apple's ability to give law enforcement meta-data or any of a number of other very useful categories of data."
https://www.sott.net/article/332223...-bar-to-giving-govt-data-Apple-told-Democrats

And note that privacy is not about "you"; it is about the potential for coercion and blackmail (of, for example, the judicial branch), health and safety violations, whistle-blowing, etc.

Note that this is not a political post but a legal and policy one, which shows that it is Apple's actual policy to turn over information, not necessarily only with a subpoena, as their privacy rhetoric is not legally binding AFAIK.
 
So I used to tell everyone that the reason Siri is so much worse than the others is that they don't have real people listening in to make corrections, and that I'd take privacy over the better performance of other assistants. But that seems not to be the case. So what the **** Apple, I brag about Apple privacy to all my Android friends (friends with Android phones, not robot friends) and this is how you repay me. SMH
If anything, you should SMH at yourself for not having common sense and not realizing how things really work.
I am willing to bet the high number of false triggers with the Apple Watch has nothing to do with "Hey Siri", but rather with the digital crown constantly being accidentally depressed due to its placement on the device. This has been a frustrating design flaw of the watch since its launch; but hey, form over function, right?
Simple, flip it. Button is now on the other side. Done.
 
No matter how much privacy is touted by companies, including Apple, it is difficult to trust any of them. Don't let any company have a spy in your house. Period.

That pretty much includes any device with a mic and camera then.

He'll either defend it or discredit the source; it'll be one of the two.

While I do like Rene Ritchie -- he seems like a good guy -- I think in the long run, he'll begin to erode away his credibility in the eyes of his readers/viewers if he continues in the self-appointed defense-against-mass-hysteria-resulting-from-fake-news role.

He needs to become more balanced, nuanced, and to develop an aesthetic and functional appreciation that is not predicated solely on what Apple did and does.

He also has to stop writing how people speak and speaking how people write.

Nah, he's preaching to the choir. There is a definite group of people who cannot countenance any criticism of Apple, they are his audience for the most part.


Is the issue only when Siri is invoked, or is Siri listening and recording audio 24/7? To capture drug deals, sex, etc., someone would have to be using Siri at the same time?

Accidental triggers. When the user has the hands-free 'Hey Siri' feature enabled, the device listens for the key phrase; it is often triggered accidentally by random conversation and starts recording.
For all those saying Apple apologists are silent:

I don't read much of the other person, but I like Jason Snell: very honest, never apologetic in situations that don't warrant it.
 
No personally identifiable information is recorded, so it can’t be misused. Non-issue.
Pretty bold claim there. The voice itself is identifiable information unless they randomly auto-tune it.
Who removes the information if someone says a home address?
 
When would Siri record conversations between two users? It’s not how Siri works. For this to be true it would have to be recording call data, which I’m sceptical it does.

I think you're misunderstanding the issue.

The claim is that Siri is often accidentally triggered and then records private conversations that people are having in person.
 
Apple has responded to the article by simply stating ‘you can turn off location tracking in Siri or turn Siri off altogether’, which pretty much confirms everything reported in the article...

Remember that next time you lot on here attempt to belittle Google or Amazon for supposedly tracking you or recording everything you say. They don’t; they only record after the keyword, just like Siri... only with Google and Amazon you can turn off the ability for your recordings to be used. Not with Apple you can’t.

What’s on your iPhone stays on your iPhone. So long as you turn off Siri...


http://9to5mac.com/2019/07/26/siri-privacy-concerns-report/

Apple really are two-faced hypocrites, shocker.
Repeating the same hyperbole multiple times doesn’t make it true. That type of tactic never did.
And I remind everyone:
From: lisa jackson@apple, To: john.podesta, Date: 2015-12-20 17:26

"Thousands of times every month, we give governments information about Apple customers and devices, in response to warrants and other forms of legal process. We have a team that responds to those requests 24 hours a day. Strong encryption does not eliminate Apple's ability to give law enforcement meta-data or any of a number of other very useful categories of data."
https://www.sott.net/article/332223...-bar-to-giving-govt-data-Apple-told-Democrats

And note that privacy is not about "you"; it is about the potential for coercion and blackmail (of, for example, the judicial branch), health and safety violations, whistle-blowing, etc.

Note that this is not a political post but a legal and policy one, which shows that it is Apple's actual policy to turn over information, not necessarily only with a subpoena, as their privacy rhetoric is not legally binding AFAIK.
A three-year-old article? Might anything have changed in three years? But I do think there was something else going on, and it also has nothing to do with the Siri privacy policy.
 
If Apple considers itself to uphold privacy standards beyond others, it had better actually go beyond what others do in practice.

No third parties should be able to listen to what is said in private for frivolous reasons such as improving AI. Once a voice is recorded, the user is no longer anonymous, even more so when location data is provided. This technical problem should be solved so AI can keep evolving without privacy breaches.
 
I’ve always found Siri and its ilk creepy and not seemingly actually very useful, and always turn it off. I prefer sending text messages and emails than talking on the phone and that’s with real human people(!), so why would I want to control searches and other functionality of my phone with my voice? Nothing wrong with typing commands or using a GUI as far as I’m concerned.

This story didn’t really surprise or shock me. I’d be similarly unsurprised if it was revealed that worse happens elsewhere with rival voice assistant services.
 
I'm also skeptical of this claim from the contractor. Apple's privacy policy is pretty clear on this point, even if it doesn't mention the human oversight. I believe names and addresses might sometimes be heard if spoken aloud, but not that the audio is sent along with contact information, addresses, or location data.

There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data.

You don't even need an Apple ID attached to it; those three data points alone provide a fairly high probability of identifying the user. The site below shows something similar.

https://cpg.doc.ic.ac.uk/individual-risk/

And the problem isn't that alone; it is Apple's definition of privacy, as Steve Jobs put it (D8 conference, June 2010):


For some reason the embedded media doesn't support a time-marked link, but the three minutes are well worth a watch; the relevant part starts at around ~2:30.


And to me this incident doesn't feel like Steve's Apple.
 
It's the price that comes along with technology and automation. They can't improve it without checking performance. I don't like anyone hearing my personal conversations any more than the next person, but they'll never know me or anything about me, so I couldn't really care less at the end of the day.

As someone who often overhears conversations at work, trust me, it gets old and we'd prefer not to hear it. After a few years you've heard everything and there's nothing fun or exciting about it.
 
To be fair, Apple gives you a warning that your recordings will be stored on its servers if you use Siri, and for this reason I do not use it.

Plus, anonymizing does not accomplish much, since the voice itself can be identified.
 
I think Cortana is on its way out anyway, Microsoft will be removing it from the Xbox in the next update, so that’s millions of devices losing it...

I always said Apple was utterly hypocritical with its heavy advertising of privacy. Amazon and Google are quite open that for an AI to learn and get better it needs to be taught, but it seems Apple does the same yet doesn’t tell anyone... whilst Siri remains really crap.
I think a lot changed behind the scenes when the former Google guy came over. Siri seems to have improved at a faster rate since that announcement was made than all the years previously. That’s not a compliment, just an observation. Siri still needs a lot of work. Only now, I can see evidence the work is being put in.
 
First it was Alexa, Then Google, and now Apple.

Stop listening to everyone's rhetoric about safety and use your own judgement about what you say or do around smart technology.
 
I am willing to bet the high number of false triggers with the Apple Watch has nothing to do with "Hey Siri", but rather with the digital crown constantly being accidentally depressed due to its placement on the device. This has been a frustrating design flaw of the watch since its launch; but hey, form over function, right?
I have had a couple of instances where Siri was accidentally triggered on my Apple Watch or iPhone. In nearly every case, something very close to "Hey Siri" had just been spoken (often involving phrases such as "they seriously..."). But by far, my most frequent accidental triggering of Siri occurs by accidentally depressing the digital crown on my Apple Watch. When I go for bike rides I always hit the water drop icon on my Apple Watch control center to disable the screen taps and the digital crown button so I don't accidentally trigger Siri while changing hand positions on my handlebars.
 
More Siri actions need to be able to be done on the device, with nothing sent to a server.

An option in the Privacy section of Settings should let you disable this ‘analyzing’ they do.

That solves most of these privacy problems.
 
People....calm down. There IS a huge difference between Apple and the rest of these devices. Details and facts matter here.

Google/Amazon devices...ALWAYS RECORDING. Humans listening. Recordings subpoenaed by authorities.
https://techcrunch.com/2018/11/14/amazon-echo-recordings-judge-murder-case/
https://www.cnn.com/2017/03/07/tech/amazon-echo-alexa-bentonville-arkansas-murder-case/index.html

Apple devices...Recording AFTER YOU ACTIVATE SIRI. Humans listening to a small portion to improve recognition. Stripped of personal data.

Now the contractor is stating that they have heard drug deals, doctors convos, sexual activity...yes they have...from people activating Siri inadvertently!!!

The article is sensationalizing this and making it seem like Apple is always listening through Siri. They are not. Move along. Nothing to see here.

Please do not spread those lies. The articles you linked do not even say full-time recording, and they pertain only to Amazon, not Google.
 
How do these recordings happen? Someone says something during sex or a drug deal that sounds like “hey Siri”?
 