
MacRumors

macrumors bot
Original poster
Apr 12, 2001



Apple is facing a class action lawsuit [PDF] for employing contractors to listen to and grade some anonymized Siri conversations for the purpose of quality control and product improvement.

Apple's Siri practices were highlighted in a recent report in which one of the contractors claimed that the workers evaluating Siri recordings often hear confidential medical information, drug deals, and other private details when Siri is activated accidentally.


The lawsuit, filed in a Northern California court today (and shared by CNBC's Kif Leswing), accuses Apple of "unlawful and intentional recording of individuals' confidential communications without their consent," violating California privacy laws when accidental Siri activations are recorded and evaluated by humans.

Siri Devices are only supposed to record conversations preceded by the utterance of "Hey Siri" (a "wake phrase") or through a specific gesture, such as pressing the home button on a device for a specified amount of time. California law prohibits the recording of oral communications without the consent of all parties to the communication.

Individuals who have purchased or used Siri Devices and interacted with Siri have not consented to Apple recording conversations where "Hey Siri" was not uttered or where they did not otherwise perform a gesture intending to activate Siri, such as pressing and holding down the home button on a device for a certain period of time.
As outlined in its privacy policies, Apple collects some anonymized Siri recordings for the purpose of improving Siri and, presumably, cutting down on accidental Siri activations. These recordings are analyzed by humans and can include details recorded when Siri mishears a "Hey Siri" trigger word.

The lawsuit claims that Apple has not informed consumers that they are "regularly being recorded without consent," though it also cites Apple's privacy policy, which states that such data can be used to improve its services.

The plaintiffs in the case, one of whom is a minor, claim to own an iPhone XR and an iPhone 6 that they would not have purchased had they known that their Siri recordings were stored for evaluation. The plaintiffs are seeking class action status for all individuals who were recorded by a Siri device without their consent from October 12, 2011 to the present.

The lawsuit asks for Apple to obtain consent before recording a minor's Siri interactions, to delete all existing recordings, and to prevent unauthorized recordings in the future. It also asks for $5,000 in damages per violation.

Apple has temporarily suspended its Siri evaluation program while it reviews its processes in light of the contractor's claims. Prior to the suspension of the program, Apple said that a small, random subset (less than 1%) of daily Siri requests is analyzed to improve Siri and dictation, with requests not associated with a user's Apple ID.

Apple plans to release a future software update that will let Siri users opt out of having their Siri queries included in the evaluation process, something that's not currently possible. All collected Siri data can be cleared from an iOS device by turning Siri off and then on again, while accidental recordings can be prevented by disabling "Hey Siri."

Article Link: Apple Facing Lawsuit for 'Unlawful and Intentional' Recording of Confidential Siri Requests Without User Consent
 
Reactions: macfacts

ersan191

macrumors 68000
Oct 26, 2013
Siri Devices are only supposed to record conversations preceded by the utterance of "Hey Siri" (a "wake phrase") or through a specific gesture, such as pressing the home button on a device for a specified amount of time. California law prohibits the recording of oral communications without the consent of all parties to the communication.

Individuals who have purchased or used Siri Devices and interacted with Siri have not consented to Apple recording conversations where "Hey Siri" was not uttered or where they did not otherwise perform a gesture intending to activate Siri, such as pressing and holding down the home button on a device for a certain period of time.
Uhh... did I miss something? I don't think anyone ever suggested that it was recording outside of those times.
 
Reactions: gixxerfool

ersan191

macrumors 68000
Oct 26, 2013
Yeah, apparently the sound of a zipper can trigger Siri. There was an article about the different sounds that could trigger "Hey Siri."
Is accidental triggering the basis of this lawsuit? Because it sounds like they are claiming Apple is intentionally recording conversations outside of the wake word, which they definitely are not.
 

mmomega

macrumors demi-god
Dec 30, 2009
DFW, TX
My wife was watching iJustine, and she said "Hey Temi" (sounds almost like "Tammy") to some new personal robot she got. Yep, that activated my HomePod, which can rarely even turn my bedroom lamps on properly.
 
Reactions: DeepIn2U and Huck

JetBlack7

macrumors 68030
May 14, 2011
Portugal
1st: Siri is activated on command, so it's not on all the time.

2nd: Google is always listening, yet no lawsuit.

3rd: They should have disclosed this issue from the start.
 
Reactions: DeepIn2U

uajafd

macrumors member
Jun 17, 2015
"Apple in the future plans to release a software update that will let Siri users opt out of having their Siri queries included in the evaluation process, something that's not possible at the current time."

Not nearly good enough. This should be opt-in, not opt-out. Also, I imagine this gives some grounds for a GDPR lawsuit.

I also read the whole privacy statement displayed when you try to enable Siri. Nowhere in it does it say that humans listen to snippets of your conversations.
 

MaxinMusicCity

macrumors regular
Mar 20, 2013
Nashville
I think it's more than QA work... There's something more to this story...

Yeah, apparently the sound of a zipper can trigger Siri. There was an article about the different sounds that could trigger "Hey Siri."
Who is saying it's 1%? I think that's BS from Apple... I bet it's every Siri encounter. No reason it couldn't be... No reason the government couldn't catch all of that or force Apple to.
 

sir1963nz

macrumors 6502a
Feb 9, 2012
"What happens on your iPhone stays on your iPhone" ... apparently not.

Apple has made HUGE statements about privacy, "Privacy is King", and it turns out that is 'misleading' at best.

Will I opt in? Nope, because you have shown that you are no more trustworthy than any other company when it suits you.
 