Apple Suspends Program That Lets Employees Listen to Siri Recordings for Quality Control, Opt Out Option Coming

Discussion in 'MacRumors.com News Discussion' started by MacRumors, Aug 1, 2019.

  1. MacRumors macrumors bot

    MacRumors

    Joined:
    Apr 12, 2001
    #1


    Apple is suspending a Siri program that allows employees to listen to Siri recordings for quality control purposes, reports TechCrunch.

    Apple plans to review the current process, in which workers listen to anonymized Siri recordings to determine whether Siri is hearing requests correctly or being activated accidentally.


    Apple in the future also plans to release a software update that will let Siri users opt out of having their Siri queries included in this evaluation process, called grading.
    The decision to suspend the program and offer an opt-out option follows a report from The Guardian that shared details gleaned from one of the contractors working on evaluating Siri queries.

    The contractor expressed concern about Apple's lack of disclosure regarding the human oversight and said that contractors who work on the program have overheard confidential medical information, drug deals, recordings of couples having sex, and other private details from accidental Siri activations.

    When The Guardian report came out, Apple confirmed that a small number of anonymized Siri requests are analyzed for the purpose of improving Siri and dictation. While Apple anonymizes Siri data that's evaluated for quality control and Siri improvement, its current privacy policy and security documents do not explicitly mention human oversight.

    Article Link: Apple Suspends Program That Lets Employees Listen to Siri Recordings for Quality Control, Opt Out Option Coming
     
  2. alexhardaker macrumors 6502

    Joined:
    Sep 12, 2014
    #2
    “When The Guardian report came out, Apple confirmed that a small number of anonymized Siri requests are analyzed for the purpose of improving Siri and dictation. While Apple anonymizes Siri data that's evaluated for quality control and Siri improvement, its current privacy policy and security documents do not explicitly mention human oversight.”

    When I read their policies (after their new policy pages came out), I took it to mean that they reviewed them. I don’t understand the shock behind this. How else are they meant to be improved if they aren’t listened to by other people? I’m assuming the contact info is only used when you say “Hey Siri, call my dad”. If it takes that stuff constantly no matter what, then that’s a bit of a concern.

    It’s good they’re letting people opt out of it. That should’ve been there from day one
     
  3. ds2000 macrumors 6502

    ds2000

    Joined:
    May 24, 2012
    #3
    So long as it is actually anonymized, I don't really care
     
  4. Waxhead138 macrumors 6502

    Joined:
    May 18, 2012
    #4
    Well, when it comes to the couples having sex part....maybe there is just a surge in the popularity of the name Siri, and it gets yelled out loud a lot....
     
  5. heov macrumors regular

    Joined:
    Aug 16, 2002
    #5
    Just because you took their policy one way doesn't mean the average Joe did too. Apple is a generally transparent and upfront company, ESPECIALLY when it comes to privacy. They should have been clearer that they listen to recordings, even if they did "nothing wrong."

    This is not unlike throttle-gate. Basically do something sort of shady, get caught, have an opt out option, then get sued.
     
  6. Zenithal macrumors G3

    Joined:
    Sep 10, 2009
    #6
    Except Apple's responses this last year could be described as remedial when it comes to security issues, whether caused by Apple itself or by third parties utilizing its platform(s).
     
  7. Waxhead138 macrumors 6502

    Joined:
    May 18, 2012
    #7
    Agree with the opt out part. Maybe the leaker is just waaaay too overly concerned. As long as the anonymity part is maintained, then no harm no foul. That said....sounds / phrases that might accidentally trigger Siri could be a concern.....but I can say, for myself anyway, the number of times I've accidentally triggered Siri via voice can be counted on one hand since launch. Physical activation is different, although still few examples in my experience.
     
  8. WalterTizzano macrumors 6502

    WalterTizzano

    Joined:
    Mar 6, 2014
    #8
    This is ridiculous. Siri is already by far the worst assistant out there; without grading it will never improve and be competitive. They should have just added an opt-out (and given those who opt out the current Siri, keeping the grading-improved version for those who opt in).
     
  9. albebaubles macrumors 6502

    albebaubles

    Joined:
    Feb 9, 2010
    Location:
    low Sierra
  10. fairuz macrumors 68020

    fairuz

    Joined:
    Aug 27, 2017
    Location:
    Silicon Valley
    #10
    You might say something that identifies you. I don't know how they can anonymize this.
     
  11. 555gallardo macrumors regular

    555gallardo

    Joined:
    Jan 16, 2016
    Location:
    Slovakia
    #11
    And the number of times I've intentionally triggered Siri via voice and gotten her to do what I wanted successfully can be counted on a clumsy lumberjack's hand. :p
     
  12. WBRacing macrumors 65816

    Joined:
    Nov 19, 2012
    Location:
    UK
    #12
    Changed that to reflect the sentiment of the ADL. Any other company reported on here doing this would have righteous, vitriolic hate thrown its way by the paragraph load.
     
  13. JuanGuapo macrumors 6502a

    JuanGuapo

    Joined:
    May 21, 2009
    Location:
    Los Angeles, CA
    #13
    Pity nobody but a lowly contractor had any scruples about this practice for as long as it went on.

    Ethics and progress seldom make good bedfellows.
     
  14. miniyou64 macrumors 6502a

    miniyou64

    Joined:
    Jul 8, 2008
    #14
    Siri is beyond useless. I think it worked better in 2011
     
  15. OriginalMacRat macrumors 6502

    Joined:
    Mar 9, 2007
    #15
    You're forgetting that Siri has to be listening to EVERYTHING in order to respond to Siri requests.
     
  16. tgwaste macrumors 6502a

    tgwaste

    Joined:
    Sep 18, 2013
  17. thasan macrumors 65816

    Joined:
    Oct 19, 2007
    Location:
    Germany
    #17
    "Oh Siri yesss..." :D
     
  18. apolloa macrumors G4

    Joined:
    Oct 21, 2008
    Location:
    Time, because it rules EVERYTHING!
    #18
    Hahahahahahaha oh dear.... so that newspaper report was true then, and yet again Apple has been caught with its trousers down...

    The company is becoming more and more devious by the day, oh but remember what’s on your iPhone stays on your iPhone... yeah right!

    I’ll stick to my Alexa as Amazon tell me what they do with it from day one and allow me to opt out!
     
  19. Lone Deranger macrumors 68000

    Lone Deranger

    Joined:
    Apr 23, 2006
    Location:
    Tokyo, Japan
    #19
    Thank god for whistle blowers and journalists. Imagine all the **** corporations would be able to get away with without having the public eye shine on their activities.
     
  20. apolloa macrumors G4

    Joined:
    Oct 21, 2008
    Location:
    Time, because it rules EVERYTHING!
    #20
    Yeap soooo many defenders in here made claims it was lies, the contractor made it up, the newspaper made it up, it's bogus.....
    How far would Apple have to go before people stop blindly defending them, do you think?
    At least Siri’s so crap most people just don’t bother using it anyway.
     
  21. cyanite macrumors member

    Joined:
    Sep 28, 2015
    #21
    You must be confused about how that works. A low power local circuit listens for the activation word, and only then activates the rest of the system. This has been talked about again and again.
     
  22. novakk86 macrumors member

    novakk86

    Joined:
    Apr 9, 2018
  23. PickUrPoison, Aug 2, 2019
    Last edited: Aug 2, 2019

    PickUrPoison macrumors 601

    Joined:
    Sep 12, 2017
    Location:
    Sunnyvale, CA
    #23
    Exactly, the processing of the trigger phrase happens on-device. The always-on processor in the Mx motion coprocessor continuously analyzes the microphone output using a deep neural network acoustic model. If it thinks it hears “Hey Siri”, it wakes up the Ax processor and re-checks for the wake phrase using a more powerful and accurate algorithm.

    If the wake phrase is confirmed by both checks, the first part of the audio data is further analyzed as the data is sent to the cloud for further processing. If a false activation is detected (e.g. “hey seriously”), the server sends a cancellation command and the device goes back to sleep.

    There’s a lot more detail available at:

    https://machinelearning.apple.com/2017/10/01/hey-siri.html
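
    To make that two-stage gating concrete, here's a minimal, hypothetical sketch in Swift. Nothing below is Apple's real API; the function names, scoring, and thresholds are illustrative stand-ins for the design described above:

    // Hypothetical sketch of the two-stage "Hey Siri" gating described
    // above. The scores, thresholds, and names are illustrative
    // stand-ins, not Apple's actual implementation.

    /// Stage 1 stand-in: the tiny acoustic model the always-on
    /// coprocessor runs continuously over the microphone buffer.
    func lowPowerScore(_ audio: [Float]) -> Float {
        // A real detector would run a small DNN over acoustic features;
        // this placeholder just returns the mean absolute amplitude.
        guard !audio.isEmpty else { return 0 }
        return audio.map { abs($0) }.reduce(0, +) / Float(audio.count)
    }

    /// Stage 2 stand-in: the larger, more accurate model that runs on
    /// the main processor only after stage 1 fires.
    func mainProcessorScore(_ audio: [Float]) -> Float {
        lowPowerScore(audio) // placeholder for a heavier model
    }

    func handleAudioChunk(_ chunk: [Float]) {
        let wakeThreshold: Float = 0.4     // assumed tuning values
        let confirmThreshold: Float = 0.6

        // Stage 1: cheap always-on check; otherwise nothing wakes up.
        guard lowPowerScore(chunk) > wakeThreshold else { return }

        // Stage 2: wake the main processor and re-check with a
        // stronger model before anything leaves the device.
        guard mainProcessorScore(chunk) > confirmThreshold else { return }

        // Only now would audio be forwarded for server-side processing,
        // and the server can still cancel a false trigger ("hey seriously").
        print("Wake phrase confirmed; sending request to the server")
    }

    handleAudioChunk([0.8, 0.7, 0.9]) // would pass both checks

    The point of the design is that the expensive model (and any network traffic) only runs after the cheap always-on check fires, which is why the device isn't streaming everything it hears to Apple.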
     
  24. Jsameds macrumors 68040

    Joined:
    Apr 22, 2008
    #24
    If all the data servers for voice assistants were hacked and their contents leaked, Apple's would be the only one where the data is useless.
    --- Post Merged, Aug 2, 2019 ---
    That’s a you problem.
     
  25. Jim Lahey macrumors 6502

    Jim Lahey

    Joined:
    Apr 8, 2014
    Location:
    Great Britain
    #25
    You may not be aware, but the internet is populated by billions of individual people with individual views and opinions. It’s not a hive mind, where everything you read on forums is posted by the same consciousness ;)
     
