
MacRumors

macrumors bot
Original poster
Apr 12, 2001
54,485
16,566



Apple is suspending a Siri program that allows employees to listen to Siri recordings for quality control purposes, reports TechCrunch.

Apple plans to review the current process, in which workers listen to anonymized Siri recordings to determine whether Siri is hearing queries correctly or being activated accidentally.

[Image: hey-siri.jpg]

Apple also plans to release a future software update that will let Siri users opt out of having their Siri queries included in this evaluation process, known as grading.
"We are committed to delivering a great Siri experience while protecting user privacy," Apple said in a statement to TechCrunch. "While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading."
The decision to suspend the program and offer an opt-out option comes following a report from The Guardian that shared details gleaned from one of the contractors working on evaluating Siri queries.

The employee expressed concern about Apple's lack of disclosure regarding the human oversight and said that contractors who work on the program have overheard confidential medical information, drug deals, recordings of couples having sex, and other private details from accidental Siri activations.

When The Guardian report came out, Apple confirmed that a small number of anonymized Siri requests are analyzed for the purpose of improving Siri and dictation. While Apple anonymizes Siri data that's evaluated for quality control and Siri improvement, its current privacy policy and security documents do not explicitly mention human oversight.

Article Link: Apple Suspends Program That Lets Employees Listen to Siri Recordings for Quality Control, Opt Out Option Coming
 

alexhardaker

macrumors 6502a
Sep 12, 2014
522
447
“When The Guardian report came out, Apple confirmed that a small number of anonymized Siri requests are analyzed for the purpose of improving Siri and dictation. While Apple anonymizes Siri data that's evaluated for quality control and Siri improvement, its current privacy policy and security documents do not explicitly mention human oversight.“

When I read their policies (after their new policy pages came out), I took it to mean that they reviewed them. I don’t understand the shock behind this. How else are they meant to be improved if they aren’t listened to by other people? I’m assuming the contact info is only used when you say “Hey Siri, call my dad”. If it takes that stuff constantly no matter what, then that’s a bit of a concern.

It’s good they’re letting people opt out of it. That should’ve been there from day one
 

heov

macrumors 6502
Aug 16, 2002
297
727
Just because you took their policy one way doesn't mean the average joe did too. Apple is a generally transparent and upfront company, ESPECIALLY when it comes to privacy. They should have been clearer that they listen to recordings, even if they did "nothing wrong."

This is not unlike throttle-gate. Basically do something sort of shady, get caught, have an opt out option, then get sued.
 

Zenithal

macrumors G3
Sep 10, 2009
9,669
10,811
Except that Apple's handling of security issues this past year, whether its own or those of third parties using its platform(s), could be described as remedial.
 

Waxhead138

macrumors 6502
May 18, 2012
442
510
“When The Guardian report came out, Apple confirmed that a small number of anonymized Siri requests are analyzed for the purpose of improving Siri and dictation. While Apple anonymizes Siri data that's evaluated for quality control and Siri improvement, its current privacy policy and security documents do not explicitly mention human oversight.“

When I read their policies (after their new policy pages came out), I took it to mean that they reviewed them. I don’t understand the shock behind this. How else are they meant to be improved if they aren’t listened to by other people? I’m assuming the contact info is only used when you say “Hey Siri, call my dad”. If it takes that stuff constantly no matter what, then that’s a bit of a concern.

It’s good they’re letting people opt out of it. That should’ve been there from day one

Agree with the opt out part. Maybe the leaker is just waaaay too concerned. As long as the anonymity part is maintained, then no harm no foul. That said....sounds/phrases that might accidentally trigger Siri could be a concern.....but I can say, for myself anyway, the number of times I've accidentally triggered Siri via voice can be counted on one hand since launch. Physical activation is different, although still few examples in my experience.
 

User 6502

macrumors 6502
Mar 6, 2014
417
1,399
This is ridiculous. Siri is already by far the worst assistant out there; without grading it will never improve and be competitive. They should have just added an opt-out (and given those who opt out the current Siri, keeping the grading-improved one for those who opt in).
 

apolloa

Suspended
Oct 21, 2008
12,318
7,797
Time, because it rules EVERYTHING!
Hahahahahahaha oh dear.... so that newspaper report was true then, and yet again Apple has been caught with its trousers down...

The company is becoming more and more devious by the day, oh but remember what’s on your iPhone stays on your iPhone... yeah right!

I’ll stick to my Alexa as Amazon tell me what they do with it from day one and allow me to opt out!
 

apolloa

Suspended
Oct 21, 2008
12,318
7,797
Time, because it rules EVERYTHING!
Changed that to reflect the sentiment of the ADL. Any other company reported on here doing this would have righteous, vitriolic hate thrown its way by the paragraph load.

Yeap, soooo many defenders in here claimed it was lies, the contractor made it up, the newspaper made it up, it's bogus.....
How far would Apple have to go before they stop blindly defending them do you think?
At least Siri’s so crap most people just don’t bother using it anyway.
 

PickUrPoison

macrumors G3
Sep 12, 2017
8,131
10,721
Sunnyvale, CA
You're forgetting that Siri has to be listening to EVERYTHING in order to respond to Siri requests.

You must be confused about how that works. A low power local circuit listens for the activation word, and only then activates the rest of the system. This has been talked about again and again.

Exactly, the processing of the trigger phrase happens on-device. The always-on processor in the motion coprocessor continuously analyzes the microphone output using a deep neural network acoustic model. If it thinks it hears "Hey Siri", it wakes up the main Ax processor, which re-checks for the wake phrase using a more powerful and accurate model.

If the wake phrase is confirmed by both checks, the first part of the audio data is further analyzed as the data is sent to the cloud for further processing. If a false activation is detected (e.g. “hey seriously”), the server sends a cancellation command and the device goes back to sleep.

There’s a lot more detail available at:

https://machinelearning.apple.com/2017/10/01/hey-siri.html
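The two-stage gating described above can be sketched in a few lines. This is a hypothetical illustration only, not Apple's implementation: the score threshold stands in for the low-power acoustic model on the coprocessor, and the stricter phrase re-check stands in for the more accurate model on the main processor. All function names here are invented for the example.

```python
def stage1_cheap_check(score: float, threshold: float = 0.5) -> bool:
    """Stand-in for the always-on coprocessor: a crude confidence score
    compared against a deliberately permissive threshold."""
    return score >= threshold

def stage2_accurate_check(phrase: str) -> bool:
    """Stand-in for the main processor's re-check: a stricter match on
    the normalized phrase."""
    return phrase.strip().lower() == "hey siri"

def should_wake(phrase: str, score: float) -> bool:
    """Wake the system only if the cheap gate fires AND the accurate
    re-check confirms; a server-side check could still cancel later."""
    return stage1_cheap_check(score) and stage2_accurate_check(phrase)

# "hey seriously" may slip past the cheap gate but fails the re-check,
# mirroring the false-activation example in the post above.
events = [("hey siri", 0.9), ("hey seriously", 0.8), ("hey siri", 0.2)]
wakes = [should_wake(phrase, score) for phrase, score in events]
```

The point of the layering is power efficiency: the expensive, accurate model only runs after the cheap always-on one fires, so the main processor can stay asleep most of the time.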
 

Jsameds

Suspended
Apr 22, 2008
3,525
7,986
If all the data servers for voice assistant were hacked and their contents leaked, Apple’s would be the only one where the data is useless.
You might say something that identifies you. I don't know how they can anonymize this.

That’s a you problem.
 

Jim Lahey

macrumors 65816
Apr 8, 2014
1,180
2,014
Sunnyvale
Changed that to reflect the sentiment of the ADL. Any other company reported on here doing this would have righteous, vitriolic hate thrown its way by the paragraph load.

You may not be aware, but the internet is populated by billions of individual people with individual views and opinions. It’s not a hive mind, where everything you read on forums is posted by the same consciousness ;)
 