Contractors Working on Siri 'Regularly' Hear Recordings of Drug Deals, Private Medical Info and More, Claims Apple Employee

Discussion in 'MacRumors.com News Discussion' started by MacRumors, Jul 26, 2019.

  1. MacRumors macrumors bot

    MacRumors

    Joined:
    Apr 12, 2001
    #1


    Contractors who work on Siri regularly hear confidential medical information, drug deals, recordings of couples having sex, and other private information, according to a report from The Guardian that shares details collected from a contractor who works on one of Apple's Siri teams.

    The employee who shared the info is one of many contractors around the world who listen to Siri voice data collected from customers to improve the Siri voice experience and help Siri better understand incoming commands and queries.


    According to The Guardian, the employee shared the information out of concern over Apple's lack of disclosure about the human oversight, though Apple has confirmed several times in the past that this review takes place, and the practice has been outlined in past reports as well.
    In a statement, Apple confirmed to The Guardian that a small number of anonymized Siri requests are analyzed for the purpose of improving Siri. A small, random subset (less than 1 percent) of daily Siri activations are used for grading, with each clip only lasting for a few seconds.
    Apple has not made its human-based Siri analysis a secret, but its extensive privacy terms don't appear to explicitly state that Siri information is listened to by humans. The employee said that Apple should "reveal to users" that human oversight exists.

    The contractor who spoke to The Guardian said that "the regularity of accidental triggers on the watch is incredibly high," and that some snippets were up to 30 seconds in length. Employees listening to Siri recordings are encouraged to report accidental activations as a technical problem, but aren't told to report on the content of those recordings.

    Apple has an extensive privacy policy related to Siri and says it anonymizes all incoming data so that it's not linked to an Apple ID and provides no information about the user. Still, the contractor claims that user data showing location, contact details, and app data is shared, and that names and addresses are sometimes disclosed when they're spoken aloud. To be clear, Apple says that all Siri data is assigned a random identifier and, contrary to the contractor's claim, does not include location or contact details.
    While Apple's Siri privacy policy and security documents do not mention human oversight specifically, they are detailed and provide information on how Siri recordings are used.

    As stated in Apple's security white paper, for example, user voice data is saved for a six-month period so that the recognition system can use it to better understand a person's voice. The voice data that's saved is identified using a random identifier that's assigned when Siri is turned on, and it is never linked to an Apple ID. After six months, a second copy is saved sans any identifier and is used by Apple for improving Siri for up to two years. A small number of recordings, transcripts, and associated data without identifying information is sometimes used by Apple for ongoing improvement of Siri beyond two years.

    Apple's privacy website has a Siri section that offers up more info, explaining that all Siri queries are assigned a random identifier not associated with an Apple ID. The identifier is reset whenever Siri is turned off and then on again, and turning Siri off deletes all user data associated with a Siri identifier.
    Those concerned about Siri triggering accidentally on devices like the iPhone, Apple Watch, and HomePod can turn off the "Hey Siri" feature and activate Siri manually instead; Siri can also be turned off entirely.

    Article Link: Contractors Working on Siri 'Regularly' Hear Recordings of Drug Deals, Private Medical Info and More, Claims Apple Employee
     
  2. radio893fm macrumors 6502

    Joined:
    Aug 11, 2004
    Location:
    Boston
    #3
    Timmy in 5, 4, 3, 2, 1... "No! Impossible. Never happened."
     
  3. mi7chy macrumors 603

    mi7chy

    Joined:
    Oct 24, 2014
    #4
    Apple can do a much better job of spinning this in a positive way.
     
  4. HackerJL macrumors regular

    Joined:
    Sep 19, 2009
    #5
    This here is the next week of headlines blowing this out of proportion. Brace yourself.
     
  5. Pelea macrumors 6502

    Joined:
    Oct 5, 2014
    #6
    So what? Apple employs the best people and even if they hear that it is a PRIVILEGE to have the personal attention of apple
     
  6. Tivoli_ macrumors member

    Joined:
    Dec 14, 2017
    #7
    No matter how much privacy is touted by companies, including Apple, it is difficult to trust any of them. Don't let any company have a spy in your house. Period.
     
  8. forerunnerg34 macrumors 6502

    forerunnerg34

    Joined:
    Oct 6, 2015
    Location:
    Ecuador
    #9
    oh you are evil :D:D
     
  9. ersan191 macrumors 65816

    Joined:
    Oct 26, 2013
    #10
    Apple made it pretty clear that Siri is anonymized - contractors absolutely should not be able to see "contact details" of whoever the recording is from. This would be huge if that were the case, which I find hard to believe. If names and addresses are referenced separately then what does "contact details" even entail?
     
  10. weup togo macrumors 6502

    Joined:
    May 6, 2016
    #11
    People don't understand that any record that exists can be subpoenaed. Apple's anonymization techniques leave many holes, and they can be compelled to undo them by court order. There's nothing stopping a prosecutor from subpoenaing Apple for all audio records from a particular location at a particular time to solve a crime.

    We've all traded convenience for real privacy.
     
  12. Glockworkorange, Jul 26, 2019
    Last edited by a moderator: Jul 27, 2019

    Glockworkorange macrumors 68000

    Glockworkorange

    Joined:
    Feb 10, 2015
    Location:
    Chicago, Illinois
    #13
    There will be a Rene Ritchie video defending this soon. If it were Google/Facebook, he'd go *******.
     
  13. newyorksole macrumors 68040

    Joined:
    Apr 2, 2008
    Location:
    New York.
    #14
    Let me just throw away every piece of technology in my apartment.
     
  14. NickName99 macrumors 6502

    NickName99

    Joined:
    Nov 8, 2018
    #15
    Sounds like they just need to improve the vetting of these contractors. Naturally some private information will end up in Siri requests sometimes.

    I occasionally have access to sensitive information when I’m debugging using restored client database backups. I take it very seriously, I don’t go poking around, I delete the database when I’m done working on the issue. I treat it with the same respect I would want a fellow professional to treat my data with.
     
  15. jclo Editor

    jclo

    Staff Member

    Joined:
    Dec 7, 2012
    Location:
    California
    #16
    I'm also skeptical of this claim from the contractor. Apple's privacy policy is pretty clear on this point, even if it doesn't mention the human oversight. I believe names and addresses might sometimes be heard if spoken aloud, but not that recordings are sent along with contact information, addresses, or location data.
     
  16. heov macrumors regular

    Joined:
    Aug 16, 2002
    #17
    Lol would love to see what users defending Apple here had to say when MacRumors posted about Amazon and Google employees listening in.

    So basically Apple does exactly what Amazon and Google do- listen. Whodathunk the only way to improve recognition is to have a human grade it.
     
  17. GrumpyMom macrumors G3

    GrumpyMom

    Joined:
    Sep 11, 2014
    #18
    HomePod gets triggered super easily. She butts into a lot of conversations in our house. I’ve never seen my watch activate accidentally but my iPhone sometimes does.

    Alexa is also nosy and sometimes blurts out things that make me think a person is directly on the other end. One day I complained about not feeling well and Alexa self activated and wished me to get better soon. I was not able to trigger that reaction again. Even the first time I didn’t say anything remotely that sounded like “Alexa” nor did I notice the light come on indicating it had been triggered.

    With Google it’s easy to go in and see all of your recordings and delete them. At least delete them from your own eyes. Who really knows if they’re deleted on Google’s end of things. I quite doubt it.
     
  18. FontGeek macrumors newbie

    FontGeek

    Joined:
    Sep 15, 2018
    #19
    Let me know your "anonymous" location. I'll gladly come pick up your tech for you
     
  20. robjulo macrumors 65816

    Joined:
    Jul 16, 2010
    #21
    But....but...but Google and Alexa.

    It’ll be interesting to see how this is spun by the usual defenders.
     
  21. muadibe macrumors 6502

    Joined:
    Oct 11, 2010
    #22
    This is why all my devices require a button press to activate Siri, and the reason I didn’t get a HomePod. Having any continuous listening device just doesn’t seem like a good idea to me.
     
  22. heov macrumors regular

    Joined:
    Aug 16, 2002
    #23
    I think the main point is Google and Amazon seem to be unfairly criticized that humans are listening. That is their main criticism. Apple is not immune to this. Basically, what happened to Google a few weeks back can happen to Apple, too. https://www.cnbc.com/2019/07/11/google-admits-leaked-private-voice-conversations.html
     
  23. rtomyj macrumors 6502a

    rtomyj

    Joined:
    Sep 3, 2012
    #24
    You misspelled Google there. Though Apple does disclose this, it might be a good idea if they also explicitly say humans are the ones listening.

    Imagine if Apple were an ad company with a personal assistant *shudders*
     
  24. gaximus macrumors 6502a

    Joined:
    Oct 11, 2011
    #25
    So I used to tell everyone that the reason Siri is so much worse than the others, is because they don't have real people listening in to make corrections. And that I take privacy over the better performance of other assistants. But that seems to not be the case. So what the **** Apple, I brag about Apple privacy to all my Android friends (friends with Android phones, not robot friends) and this is how you repay me. SMH
     
