
MacRumors

macrumors bot
Original poster

Apple today announced that it will resume its Siri quality evaluation process in the fall with several privacy-focused changes.

hey-siri.jpg

Going forward, Apple will no longer retain audio recordings of Siri interactions by default, and it will only gather audio samples from users who opt in to the grading program; those who participate will be able to opt out at any time. When a customer does opt in, only Apple employees will be allowed to listen to the audio samples.

Apple says it will work to delete any recording which is determined to have resulted from Siri being triggered inadvertently.

These changes come after The Guardian reported that Apple contractors "regularly" heard confidential information while grading anonymized Siri audio samples. Following the report, Apple suspended the grading program and began conducting a review of its process, and it has now apologized over the matter.
As a result of our review, we realize we haven't been fully living up to our high ideals, and for that we apologize. As we previously announced, we halted the Siri grading program. We plan to resume later this fall when software updates are released to our users -- but only after making the following changes:

• First, by default, we will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve.

• Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time.

• Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.

Apple is committed to putting the customer at the center of everything we do, which includes protecting their privacy. We created Siri to help them get things done, faster and easier, without compromising their right to privacy. We are grateful to our users for their passion for Siri, and for pushing us to constantly improve.
Prior to suspending the grading program, Apple says it reviewed less than 0.2 percent of Siri interactions, along with their computer-generated transcripts, to measure how well Siri was responding and to improve its reliability, including whether the user intended to invoke Siri and whether Siri responded accurately.

In its press release, Apple emphasizes its commitment to protecting user privacy and outlines how Siri adheres to that. The company does not use Siri data to build a marketing profile of any user, for example, and it also uses a random identifier to keep track of the data while it is being processed.
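
As a rough illustration of the random-identifier idea (a hypothetical Swift sketch only; the AnonymizedSiriRequest type and anonymize function are invented for this example, not Apple's implementation), a request tagged with a freshly generated random ID can be tracked through processing without any link to the account that made it:

```swift
import Foundation

// Hypothetical sketch: tag each request with a random identifier instead of
// an account-level ID, so the pieces of a single interaction can be correlated
// during processing without knowing which user they came from.
// This is an illustration only, not Apple's actual code.
struct AnonymizedSiriRequest {
    let requestID: UUID        // random, generated per interaction
    let transcript: String     // computer-generated transcript
    let timestamp: Date
}

func anonymize(transcript: String) -> AnonymizedSiriRequest {
    // UUID() is random every time; nothing here links back to an Apple ID.
    AnonymizedSiriRequest(requestID: UUID(), transcript: transcript, timestamp: Date())
}

let request = anonymize(transcript: "Play the soundtrack to Monsters Inc.")
print("Processing request \(request.requestID): \(request.transcript)")
```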

Apple has shared a new support document with more details on Siri privacy and grading.

Article Link: Apple Apologizes Over Siri Privacy Concerns, Will Resume Grading Program in Fall With Several Changes
 
I'd really like an option to send a specific query for auditing.

My most recent frustration (I'm on the iOS 13 Public Beta): I say "Hey Siri, play the soundtrack to Monsters Inc." and the onscreen text transcribes it correctly.

Siri replies: "Okay, here is the podcast Wait Wait, Don't Tell Me..."

As opposed to when I say "Hey Siri, play the soundtrack to Wall-E," which is transcribed as "Wally." So I don't necessarily think it gets it right... but it played a random song called Wally, or by someone called Wally. *shrug* That one I can kind of explain.

What can I say, my kid is on a Pixar kick.
 
Congrats, guys, your unwarranted freakout made Siri worse for everyone. You only have yourselves to blame the next time you’re cursing at it because it can’t understand you.
 
Why is Apple making it opt-in rather than opt-out?? There is no reason not to send your data to Apple. Apple is a loving company and won’t do anything bad; they just want to make Siri better!!
 
What is "grading"? Are these the people that figure out if a request should go through as you wait, or is this way after the part where Siri couldn't understand what you were asking? If it is after the fact what does this "grading" do? Does it teach Siri that people having sex is not a request? I don't technically understand what this process is actually doing or improving.
 
What is "grading"? Are these the people that figure out if a request should go through as you wait, or is this way after the part where Siri couldn't understand what you were asking? If it is after the fact what does this "grading" do? Does it teach Siri that people having sex is not a request? I don't technically understand what this process is actually doing or improving.
They probably listen to the request and then grade Siri’s response for relevance, accuracy, etc. and then the grades are used to train Siri to respond more accurately. Just my guess.
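
As a rough, entirely hypothetical illustration of that guess (the record fields and the responseAccuracy helper below are my own invention, not anything Apple has described), a grading record might pair a transcript with a reviewer's judgments, and the scores could then be aggregated:

```swift
import Foundation

// Purely hypothetical sketch of what a grading record might look like,
// extrapolating from the guess above -- not Apple's actual process or code.
struct SiriGradingRecord {
    let requestID: UUID              // random per-interaction identifier
    let transcript: String           // computer-generated transcript of the request
    let responseSummary: String      // what Siri actually did or said
    let transcriptionCorrect: Bool   // did Siri hear the request correctly?
    let responseRelevant: Bool       // did the response address the request?
    let inadvertentTrigger: Bool     // accidental "Hey Siri" activation?
}

// A reviewer's scores could be rolled up into a simple accuracy metric
// that serves as feedback for improving the models.
func responseAccuracy(of records: [SiriGradingRecord]) -> Double {
    let intentional = records.filter { !$0.inadvertentTrigger }
    guard !intentional.isEmpty else { return 0 }
    let correct = intentional.filter { $0.transcriptionCorrect && $0.responseRelevant }
    return Double(correct.count) / Double(intentional.count)
}
```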
 
Congrats, guys, your unwarranted freakout made Siri worse for everyone. You only have yourselves to blame the next time you’re cursing at it because it can’t understand you.


Apple should have apologized, pulled Siri completely, and taken this year to buy or develop an AI that is designed for the future and can at least do what the others can do. I believe it’s possible for Siri to do what Google Assistant can do without the extra data... and PLEASE get rid of the “hey” every time I have to invoke Siri.
 
Having Apple employees listen to Siri requests that don’t work for some reason seems like a good idea in order to fix the problem. I don’t mind it if it’s anonymized (which it is). There should have been a way to opt out of it from the get-go, but here we are. It’s fixed now.
 
I couldn't care less if Apple listens to my Siri requests to help make Siri better. It's not like I talk about personal matters with Siri.

For those complaining about Siri, she gets better in iOS 13, right?
 
You’d rather they don’t apologize and change their practices at all?

Changing practices is important. However, the apology, "As a result of our review, we realize we haven't been fully living up to our high ideals," is meaningless. They are changing their approach, though. After being caught in the iPhone battery shenanigans, they made up an excuse when they apologised. This time there's no excuse, just sorry.
 
Apple should have apologized, pulled Siri completely, and taken this year to buy or develop an AI that is designed for the future and can at least do what the others can do. I believe it’s possible for Siri to do what Google Assistant can do without the extra data... and PLEASE get rid of the “hey” every time I have to invoke Siri.

How can they do that if people aren't letting them use recordings?
 
These things should have been in place from day one, but they are all good changes. Isn't this how things are supposed to work? A corporation overreaches, gets called out, and publicly makes changes. Cynically, I think Apple making these changes is more likely to get Amoogle to change their process than all the negative press about the spying their devices do.
 