Apple Suspends Program That Lets Employees Listen to Siri Recordings for Quality Control, Opt Out Option Coming

So it wasn't fake news after all, whatever some wanted to believe :)

:) Well, this shows what kind of a company Apple is :)

Claiming to be a privacy-first company, I wonder what else they're hiding or not telling their customers :)

Who knows, maybe the T2 chip, with its own CPU and OS inside, is sending data to Apple's servers without users knowing.
 
Apple needs to review Siri for quality and improvement… I get that. Apple says as much in its terms of use, and one should expect such things… I get that too. It's supposedly anonymised to protect one's privacy, but I'm not sure how you can claim that when Siri is listening in while you're saying your name and giving out your credit card details, and the chance of a "contractor" recognising a voice certainly isn't zero… I still get that they realistically need to review something.

They all do it, Google, Amazon and Samsung mostly, and it's nothing unusual. That said, and this is where I have the big problem: the use of "what happens on your iPhone, stays on your iPhone" is clearly a falsehood.

Additionally, while my friend with an iPhone may be OK with what they're signing up for, I certainly didn't sign up for anything when their phone is listening to what I'm saying in its near vicinity. Yes, it works in reverse, where someone's Android phone is probably listening in just the same, actually probably more, but still, no one else is using a slogan like "what happens on your iPhone, stays on your iPhone".

I understand the need for this. What I would like to see from everyone is an explicit notification to the device owner that a review incident occurred, what it was that got reviewed (you could click a link or something to listen to it) and by whom (by "whom" I mean an anonymised identifier of the contractor or employee). If the service provider has no means of contacting the owner, that device is excluded from review outright.
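If it helps make the idea concrete, here's a minimal sketch of what such a review-notification record could look like. Everything here is hypothetical; no such Apple API exists, and all the names are invented for illustration:

```python
# Hypothetical review-notification record, sketching the proposal above.
# None of these field names come from a real Apple API.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ReviewNotification:
    occurred_at: datetime   # when the snippet was graded
    audio_url: str          # link the owner can follow to hear the clip
    reviewer_id: str        # anonymised ID, e.g. "contractor-7f3a"
    reviewer_role: str      # "contractor" or "employee"

def eligible_for_review(has_contact_channel: bool) -> bool:
    """The rule proposed above: no way to notify the owner, no review."""
    return has_contact_channel
```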
 
I could have sworn Tim said he values privacy and that it is his utmost concern and priority to keep it that way. If Apple wants to conduct quality control, he could ask his employees to opt in and participate, just as employees participated in testing their new upcoming credit card.
 
It's more like they get your DNA: your voice in the raw is pretty unique and can be matched to you with high accuracy...
Though it should be possible to prevent human reviewers from having access to tools that can record or analyse your voice. Apple as an institution knows an awful lot about you, but that doesn't mean any individual inside Apple (or contracted by Apple) should ever have access to more than small fractions of it.
I could have sworn Tim said he values privacy and that it is his utmost concern and priority to keep it that way. If Apple wants to conduct quality control, he could ask his employees to opt in and participate, just as employees participated in testing their new upcoming credit card.
I'm pretty sure that, language- and dialect-wise, Apple's employees are far from representative of the whole Siri user base. And this probably extends to subject matter, music choices, address names, etc.
 
What I'm curious about: given that there's no specific option to not send audio snippets from Siri activations for quality control, were the broader privacy settings honored with regard to Siri? If one specifically went to Privacy > Diagnostics & Usage and disabled all diagnostic options, plus went to Privacy > Analytics and disabled all analytics options, and those snippets were still sent to contractors, that would be a breach of trust. If it only happened when you allowed diagnostics or analytics to be shared, it's not so bad; I'd expect that data to be used for improvement.

I thought there was an option to allow diagnostic and analytics information to be shared during iOS setup, like on macOS, but I'm finding no reference to it. This should be one of the options during initial setup.

What I do wonder: why is Apple using contractors for this? They have a Siri development team; why not hire people to do this internally? It's not as though Siri quality control is a short-term project. It seems more like a permanent position for a lot of people.

Personally, I've never used "Hey Siri" or voice activation with any other assistant, and I don't intend to in the future, at least not until I can set a custom activation phrase and voice assistants become genuinely useful. So far I've found them far too limited for my uses outside of simple dictation, and even that is annoying at times.
 
Additionally, while my friend with an iPhone may be OK with what they're signing up for, I certainly didn't sign up for anything when their phone is listening to what I'm saying in its near vicinity. Yes, it works in reverse, where someone's Android phone is probably listening in just the same, actually probably more,
Which raises the question: even if Siri had zero human review, wouldn't we be around other people's Android devices often enough that a lot could "leak" that way anyway?

but still, no one else is using a slogan like "what happens on your iPhone, stays on your iPhone".
Part of the problem is processing power and memory. Currently, Siri cannot run effectively on your phone alone (meaning its recordings have to be sent to Apple, even before considering the human-review part). A couple of months ago, Google announced that it had managed to shrink its voice assistant enough (I think the key aspect was memory) that it could operate on the phone alone without needing to phone home. Phones will get better, as will the algorithms, but all else equal, so will the scope of voice assistants and the databases they tap into. Hopefully Apple is able to match what Google has done here in the not-too-distant future.

I understand the need for this. What I would like to see from everyone is an explicit notification to the device owner that a review incident occurred, what it was that got reviewed (you could click a link or something to listen to it) and by whom (by "whom" I mean an anonymised identifier of the contractor or employee). If the service provider has no means of contacting the owner, that device is excluded from review outright.
That's an interesting idea, but boy, could that freak people out.
 
You must be confused about how that works. A low-power local circuit listens for the activation word, and only then activates the rest of the system. This has been explained again and again.

Exactly, the processing of the trigger phrase happens on-device. The Always On Processor (part of the Mx motion coprocessor) continuously analyzes the microphone output using a deep neural network acoustic model. If it thinks it hears "Hey Siri", it wakes up the main Ax processor, which re-checks the wake phrase using a more powerful and accurate model.

If the wake phrase passes both checks, the audio data is sent to the cloud, where it is analyzed further. If a false activation is detected (e.g. "hey seriously"), the server sends a cancellation command and the device goes back to sleep.

There’s a lot more detail available at:

https://machinelearning.apple.com/2017/10/01/hey-siri.html
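To make the two-stage idea concrete, here's a rough sketch in Python. The function names, scores and thresholds are all made up for illustration; Apple's actual design is described in the post linked above.

```python
# Illustrative two-stage wake-phrase cascade. Scores and thresholds are
# placeholders, not Apple's actual values.

def small_dnn_score(frame: bytes) -> float:
    """Stage 1: tiny acoustic model on the always-on coprocessor (stub)."""
    return 0.7  # pretend probability that this frame contains "Hey Siri"

def large_dnn_score(frame: bytes) -> float:
    """Stage 2: larger, more accurate model on the main Ax processor (stub)."""
    return 0.95  # pretend probability from the heavier re-check

def handle_frame(frame: bytes) -> str:
    if small_dnn_score(frame) < 0.5:
        return "stay asleep"        # cheap check failed; nothing else wakes up
    if large_dnn_score(frame) < 0.9:
        return "go back to sleep"   # stage-1 false alarm caught on-device
    return "stream to server"       # server can still cancel ("hey seriously")

print(handle_frame(b"\x00" * 320))  # -> "stream to server"
```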

I'm not sure exactly how Apple does it, but I know that Amazon and Google both include the pre-activation recorded buffer when sending audio data to the cloud for processing. In Alexa's case, I've read the buffer is pretty long, something like 30 seconds. So that's 30 seconds of pre-activation conversation that may be sent to the cloud.

Again, I don't know how long Apple's buffer is - but presumably there must be one for the low-power coprocessor to do its thing.

The point is that even stuff you may not have intended to be recorded (pre-trigger conversation) may still be uploaded to a server.
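For anyone wondering what that buffer mechanically is: a rolling window of the most recent audio that is constantly overwritten. A minimal sketch (the 5-second length and 16 kHz rate are arbitrary illustrations, not any vendor's actual figures):

```python
# Rolling pre-trigger audio buffer: old samples fall off the back as new
# ones arrive, so a few seconds before the wake word are always in memory.
from collections import deque

SAMPLE_RATE = 16_000   # 16 kHz mono, a common speech sample rate (assumed)
BUFFER_SECONDS = 5     # arbitrary illustration; real devices vary

ring = deque(maxlen=SAMPLE_RATE * BUFFER_SECONDS)

def on_microphone_samples(samples):
    """Runs continuously, wake word or not."""
    ring.extend(samples)

def on_wake_word_confirmed():
    """Whatever is still in the ring, including speech from *before*
    the trigger, is the audio that could accompany an upload."""
    return list(ring)
```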
 
I'm pretty sure that, language- and dialect-wise, Apple's employees are far from representative of the whole Siri user base. And this probably extends to subject matter, music choices, address names, etc.
It is still not a valid reason or excuse to tap into a customer's device, even if it may be anonymous. As Tim has already stated, he believes in privacy and prides himself on Apple's continued efforts to maintain it; however, using quality control as an excuse for collecting and/or listening to recordings is still not right and, IMO, violates one's privacy. If Apple wants to improve Siri in whichever language and dialect, they will have to find other means, even if that means hiring people for that specific purpose.
 
It is still not a valid reason or excuse to tap into a customer's device, even if it may be anonymous. As Tim has already stated, he believes in privacy and prides himself on Apple's continued efforts to maintain it; however, using quality control as an excuse for collecting and/or listening to recordings is still not right and, IMO, violates one's privacy. If Apple wants to improve Siri in whichever language and dialect, they will have to find other means, even if that means hiring people for that specific purpose.

I don't think they can improve Siri without tons of recordings. Machine learning/AI needs a lot of samples. If only Apple had been upfront about how it uses recordings from Siri and provided an opt-out option, this would not be an issue.

The real issue here is that, for a company that BOASTS so much about privacy (to the point of running big print ads about it in the streets), they BETRAYED the trust of their users by not telling them exactly how their data is used. Security and privacy always start with TRUST, and this shows Apple can't be trusted.
 
Basically, Apple's doing the "oh **** we got caught" reaction. They knew they were in the wrong. They were doing the holier-than-thou act while they were doing it too.
 
I can't believe some of you think this is actually anonymous. The article states they get your location and contacts. I don't know about you, but even a 9-year-old could figure out who you are from those two things.
 
Just the fact that Apple has the ability to get raw audio from your phone, period, should be a huge red flag for everyone! Your phones are listening to you and the people around you!
You're five years late. Apple added "Hey Siri" in 2014. But of course you can still disable "Hey Siri" (I do). That massively limits the risk of anything you don't want heard being sent to Apple (though you can still trigger Siri accidentally with the button).
 
Hahahahahahaha oh dear.... so that newspaper report was true then, and yet again Apple has been caught with its trousers down...

The company is becoming more and more devious by the day. Oh, but remember, "what's on your iPhone stays on your iPhone"... yeah, right!

I'll stick with my Alexa, as Amazon told me what they do with it from day one and lets me opt out!
You seem really strangely excited about this... If you're excited about this, it must mean you're not an Apple fan, and if you're not an Apple fan, then why are you on an Apple-centric website? Differing views are cool, but you just seem overly giddy about this...
 
As others have said, it should be opt-in, not opt-out, if they are going to do it at all.

Just out of curiosity, I'd like to know how they can be sure supposedly "anonymised" recordings contain no personally identifying data before humans hear them. If that could be assured, why would humans need to listen at all? It sounds paradoxical to me... they're listening to improve accuracy, but claiming that before the audio gets to those people, they do something that would itself require an excellent level of accuracy... huh?! :confused:
 