
MacRumors

macrumors bot
Original poster
Apr 12, 2001
63,560
30,891



Apple has suspended the grading program that used contractors to listen to Siri recordings for quality control purposes, but in a new report, The Irish Examiner (via The Verge) gives some additional insight into how it worked.

According to one of the contractors who worked on Siri grading in Cork, Ireland, employees were expected to listen to more than 1,000 Siri recordings per shift. Most recordings were a few seconds in length, and "occasionally" employees would hear personal data or snippets of conversation. Contractors primarily heard Siri commands, though.

[Image: hey-siri.jpg]

Each recording was "graded" based on different factors, such as whether or not a Siri activation was accidental or if the query was something the personal assistant could or couldn't help with.

The employee said that Siri user details were kept anonymous, and that he or she mostly heard recordings with Canadian, Australian, and UK accents.
"I understood the reasons why the company was doing it but I could see why people would feel it was a breach of privacy because they weren't telling people. I think the lack of consent was the issue."
Data analysts who worked with Globetech, a Cork-based firm, were told this week that their work with Apple has been terminated. Apple and Globetech have not commented on how many employees were let go, but The Irish Examiner says that more than 300 contractors working on transcription and grading for Apple may have lost their jobs.

Apple last week told Globetech that it would be ending all transcription and voice grading work, and Globetech has confirmed that it will no longer be providing these services to Apple.

Prior to Apple's decision to end all grading and transcription work with Globetech, Apple prohibited employees from bringing their cell phones to work after the original story from The Guardian hit. In that report, an anonymous contractor said that employees working on Siri often heard private data including confidential medical information, drug deals, recordings of couples having sex, and more.

Following that story, where the employee also called out Apple for not properly disclosing human-based Siri grading to its customers, Apple announced that it would temporarily suspend the program worldwide.

Apple said it would review the process that's currently used, and also add a feature to let people opt out of allowing their Siri recordings to be used for quality control purposes. In a statement to The Irish Examiner, Apple said that it is still evaluating its grading processes and is "working closely" with partners to reach the "best possible outcome" for all involved.
"We believe that everyone should be treated with the dignity and respect they deserve -- this includes our own employees and the suppliers we work with in Ireland and around the world. Apple is committed to customer privacy and made the decision to suspend Siri grading while we conduct a thorough review of our processes. We're working closely with our partners as we do this to ensure the best possible outcome for our suppliers, their employees and our customers around the world."
It's not clear if and when Siri grading will resume, but it's likely going to remain suspended until Apple is able to release a software update that adds a toggle allowing customers to opt out.

Apple is facing a class action lawsuit over the issue, which claims Apple did not inform consumers that they are regularly being recorded without consent.

Article Link: Apple Contractors Listened to 1,000+ Siri Recordings Per Shift
 

zahuh

macrumors regular
Oct 22, 2004
219
1,474
Liam: Hey Siri

Siri: (Doo doo)

Liam: ...I don't know who you are....I don't know what you want... If you're looking for ransom, I can tell you I don't have money... but what I do have are a very particular set of skills. Skills I have acquired over a very long career. Skills that make me a nightmare for people like you. If you let my daughter go now, that will be the end of it - I will not look for you, I will not pursue you... but if you don't, I will look for you, I will find you... and I will kill you.


Siri: bro I’m just a contractor.
 

nordique

macrumors 68000
Oct 12, 2014
1,976
1,600
Considering how much Apple beats the privacy drum, I can understand why consumers would feel slighted, beyond the consent factor. If I knew Apple had contractors listening to my Siri commands I’d be upset too, even if 99% of those are calculations or setting alarms
 

now i see it

macrumors G4
Jan 2, 2002
10,644
22,254
Same ole story: Apple gets caught red-handed doing something sneaky, then once they're caught they backtrack and say it's all a misunderstanding (while stopping what they were doing).

I have a novel idea! Why not not do stuff that'll get you into hot water in the first place instead of doing it and waiting to get caught?
 

jayducharme

macrumors 601
Jun 22, 2006
4,535
5,995
The thick of it
I just assumed that this practice was part of the "send analytics to Apple" checkbox when you activate any new Apple device. It wasn't like the contractors were continuously monitoring people's audio; they were simply checking Siri's response when a query was made in order to improve Siri's accuracy. No one was able to connect someone's name with the query. I can see why this makes good fodder for the media, but IMO it's being overblown.
 

Chrjy

macrumors 65816
May 19, 2010
1,095
2,098
UK
I do find it odd that Apple promotes privacy and yet does stuff like this. It kind of disappoints me because I sincerely hope they are not like the others but when stuff like this surfaces, it makes me lack confidence.

And before anyone says it's in the small print somewhere, it's the principle that matters. If you are going to brag about it, make damn sure that you follow it up.
 

Mdracer

macrumors regular
Jul 1, 2016
160
836
I wonder if incentives were offered to employees who listened to the most private requests, or who got through the most recordings per hour.
 

profets

macrumors 603
Mar 18, 2009
5,115
6,146
I just assumed that this practice was part of the "send analytics to Apple" checkbox when you activate any new Apple device. It wasn't like the contractors were continuously monitoring people's audio; they were simply checking Siri's response when a query was made in order to improve Siri's accuracy. No one was able to connect someone's name with the query. I can see why this makes good fodder for the media, but IMO it's being overblown.

While I mostly agree, it seems this was happening regardless of whether you checked that box to send analytics to apple.
 

BlueShirt11

macrumors newbie
Jun 19, 2019
17
72
Sounds like some idiot contractor who made it sound way worse than it actually was just got 300 people laid off...

I personally don't understand how you would expect the process to improve if it wasn't randomly sampled and corrected. Sounds like exactly what I would have expected to be going on, with zero concern.
 

Rob_2811

Suspended
Mar 18, 2016
2,569
4,253
United Kingdom
Sounds like some idiot contractor who made it sound way worse than it actually was just got 300 people laid off...

I personally don't understand how you would expect the process to improve if it wasn't randomly sampled and corrected. Sounds like exactly what I would have expected to be going on, with zero concern.

Like the battery thing, vague language buried in some fine print somewhere isn't enough. Particularly, as in this case, when you are shouting about privacy to anybody who will listen.

Apple is deliberately vague when it comes to this kind of thing, take this paragraph from their iCloud security overview document. It's very misleading.

End-to-end encrypted data

End-to-end encryption provides the highest level of data security. Your data is protected with a key derived from information unique to your device, combined with your device passcode, which only you know. No one else can access or read this data.

These features and their data are transmitted and stored in iCloud using end-to-end encryption:

  • Home data
  • Health data (requires iOS 12 or later)
  • iCloud Keychain (includes all of your saved accounts and passwords)
  • Payment information
  • QuickType Keyboard learned vocabulary (requires iOS 11 or later)
  • Screen Time
  • Siri information
  • Wi-Fi passwords


https://support.apple.com/en-gb/HT202303
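
The idea in that quoted paragraph — a key derived from something unique to the device combined with a passcode only the user knows — can be illustrated with a small sketch. This is a conceptual illustration only, not Apple's actual scheme; the secret, passcode, and parameters here are made up for the example:

```python
import hashlib
import os

# Conceptual sketch of "key derived from device info + passcode".
# NOT Apple's real implementation; values are illustrative.
device_secret = os.urandom(32)   # stands in for device-unique entropy
passcode = b"123456"             # known only to the user

# Derive an encryption key from both inputs; neither alone suffices.
key = hashlib.pbkdf2_hmac("sha256", passcode, device_secret,
                          200_000, dklen=32)

# A different passcode (or a different device secret) yields a
# completely different key, so no one else can recover the data.
other_key = hashlib.pbkdf2_hmac("sha256", b"654321", device_secret,
                                200_000, dklen=32)
assert key != other_key
```

The point is simply that the derived key depends on both factors, which is why Apple's document says "no one else can access or read this data" — though whether that claim holds in practice is exactly what's being debated here.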
 

JPack

macrumors G5
Mar 27, 2017
12,558
23,273
Sounds like some idiot contractor who made it sound way worse than it actually was just got 300 people laid off...

I personally don't understand how you would expect the process to improve if it wasn't randomly sampled and corrected. Sounds like exactly what I would have expected to be going on, with zero concern.

If it was "worse than it actually was," Apple would have provided a statement. Apple has been silent for a month.
 

coolfactor

macrumors 604
Jul 29, 2002
7,076
9,767
Vancouver, BC
Considering how much Apple beats the privacy drum, I can understand why consumers would feel slighted, beyond the consent factor. If I knew Apple had contractors listening to my Siri commands I’d be upset too, doesn’t matter if 99% of those are calculations or setting alarms

I'm puzzled. When setting up a new iOS device, one of the questions is "Do you agree to share information with Apple to help make their products and services better?". Every single consumer has the option to decline.

If declining, is the Siri data not shared? Or was Siri an exception to this agreement, hence the media firestorm?
 