Looks like some lawyer is going to make $46 million and each of us will make a buck fifty-seven. WOO-HOO, CAN'T WAIT.

I still can't believe the number of suckers who sign up for these class action lawsuits. The lawyers are ALWAYS the big winners, while the consumption-junkie public gets squat. It just seems so idiotic and such a huge waste of time...
 
Good thing for checks and balances. Hopefully the "what happens on your iPhone stays on your iPhone" false-advertising class action lawsuit is next.
 
It is opt-in: when you set up your phone, it asks whether you want to enable Siri. You can say no. You can turn Siri off. If you never use Siri, your voice will never be captured.

The ONLY way to improve voice recognition is to compare what the algorithm THINKS you said with what you ACTUALLY said. The only way to make that comparison is to have a person who understands the language in question listen to the recording and compare it to the algorithm's output. There is no other way for it to work, not just from a technological standpoint but from a logical one.
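To make that comparison concrete, here's a minimal sketch of the standard metric that kind of human grading feeds into, word error rate. This is my own illustration with made-up phrases, not Apple's actual pipeline:

```python
# Word error rate (WER): edit distance between what a human grader heard
# and what the model transcribed, normalized by the reference length.
# Illustrative only; the example phrases are invented.

def word_error_rate(reference: str, hypothesis: str) -> float:
    ref = reference.lower().split()
    hyp = hypothesis.lower().split()
    # dp[i][j] = edits to turn the first i reference words into the first j hypothesis words
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # substitution
    return dp[len(ref)][len(hyp)] / max(len(ref), 1)

# What the grader heard vs. what the algorithm THOUGHT it heard:
print(word_error_rate("set a timer for ten minutes",
                      "set a time for ten minutes"))  # 1 of 6 words wrong -> ~0.167
```

Without the human-supplied reference transcript, there is nothing to score the model's guess against, which is the whole point.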

For privacy and practical reasons, Apple doesn't listen to every single Siri conversation. The conversations it does review are decoupled from the associated Apple ID (at least that's what Apple claims; they could be lying, but that would be a stupid thing to do, and it would be a bigger issue than the listening itself). Assuming Apple is being honest about that, you've got random people listening to random conversations, which is necessary for Apple to improve Siri.

Sometimes those conversations include personal details, not because Apple linked them, but because the person speaking said them. There's not a lot Apple can do about that. They can TRY to train the algorithms to recognize personally identifiable information and exclude those kinds of clips, but the ability to do that reliably would mean Siri is extremely advanced, which it probably isn't at this point. So from time to time, random clips containing personal information are going to be reviewed; it's really not avoidable.
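For the curious, here's a bare-bones sketch of what that decoupling could look like. To be clear, the identifiers, sampling rate, and structure are all my invention, not Apple's published process:

```python
# Hypothetical sketch: sample a small random fraction of clips for human
# review and strip the account identifier before a grader sees anything.
# Note the limitation described above: ID-stripping can't remove personal
# details that were SPOKEN in the audio itself.
import random
import uuid
from dataclasses import dataclass

@dataclass
class SiriClip:
    apple_id: str     # linked at capture time
    audio_path: str
    transcript: str   # what the model thought it heard

def sample_for_review(clips: list[SiriClip], fraction: float = 0.002) -> list[dict]:
    chosen = random.sample(clips, k=max(1, int(len(clips) * fraction)))
    return [
        {
            "review_id": uuid.uuid4().hex,   # random token, no path back to the Apple ID
            "audio_path": clip.audio_path,
            "transcript": clip.transcript,
        }
        for clip in chosen
    ]
```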

So why not just let people opt out but still use Siri? Well, it sounds like Apple is going to do that, but it's a PR move, not a product-quality move. It's meant to assuage people who are outraged over something they agreed to in the first place, something that is a natural consequence of using Siri (or anything similar). So now that they're going to allow it, what's the problem? The problem is that the more people opt out, the smaller the pool of data becomes, the less randomized it becomes, and the less ability there is to improve Siri. At first, if only a handful of people drop out, it's not likely to have an impact, but the more people who drop out, the more dependent the system is on the people who don't. Both groups still benefit from the work being done by a shrinking number of participants. Paranoid people who overreacted to a non-story become a drain on the system, hopefully not enough to impact product improvement overall, but possibly. So it's potentially damaging from a practical standpoint.
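To see why that matters, here's a toy simulation with entirely made-up numbers. If the people most likely to opt out are also the heavy users whose audio the model most needs, the review pool doesn't just shrink, it skews:

```python
# Toy model of opt-out selection bias. All rates are invented for illustration.
import random

random.seed(1)
# Suppose 30% of users are "heavy" Siri users whose clips are most useful for training.
users = [{"heavy": random.random() < 0.3} for _ in range(100_000)]

def review_pool(opt_out_light: float, opt_out_heavy: float):
    pool = [u for u in users
            if random.random() >= (opt_out_heavy if u["heavy"] else opt_out_light)]
    heavy_share = sum(u["heavy"] for u in pool) / len(pool)
    return len(pool), round(heavy_share, 3)

print(review_pool(0.0, 0.0))   # everyone in: ~100,000 users, ~30% heavy
print(review_pool(0.2, 0.6))   # with opt-outs: pool shrinks to ~68,000 AND skews to ~18% heavy
```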

If you are worried about Siri recording you, that's fair; some people value that kind of privacy highly, and I have no problem with that. I don't have an Echo or a Google Home for that very reason: I don't trust Google/Amazon with my privacy the way I do Apple, and some people don't trust any of them. HOWEVER, if that's the case, if you don't trust these systems, then you should not USE them. This is not a case of listening being an optional extra; it is a necessary component being limited by people who want all the advantages without paying any price, expecting others to pay it for them. It's selfish.

So by all means, opt out; it's not hard to do, you just turn off Siri. You've had that option since day one.

Amazon sells me a device at what I consider a very reasonable price. They provide a clear way to use Alexa while opting out of having my voice recordings used for improvements. They have acted in good faith.

Apple sells me a device at what I consider a very high price. They constantly act superior about security, yet refuse to provide a clear way to use Siri while opting out of having my voice recordings used for improvements. They have not acted in good faith.

Why should I care about improving Siri for Apple? I'm a (highly) paying customer. Why should I give Apple any benefit of the doubt when they can't even be bothered to respect me enough to provide a checkbox? Why should I forgive Apple for sending recordings of me to third parties when they deliberately ran an expensive ad campaign lying that they didn't? And the alternative is that they throw their toys out of the pram and don't let you use Siri at all?

It's Apple who are selfish and want you to pay a high price, have no options, and improve their product for them.

All companies want your money and information, but Apple are the most hypocritical about it.
 
Apple sells me a device at what I consider a very high price. They constantly act superior about security, yet refuse to provide a clear way to use Siri while opting out of having my voice recordings used for improvements. They have not acted in good faith...
Nope. Their terms and conditions specify exactly what happens. Additionally, security is not privacy. They act superior about security because they are taking active steps to make sure your user data is handled according to their privacy policy. So Apple has acted in good faith.

Why should I care about improving Siri for Apple? I'm a (highly) paying customer. Why should I give Apple any benefit of the doubt when they can't even be bothered to respect me enough to provide a checkbox? Why should I forgive Apple for sending recordings of me to third parties when they deliberately ran an expensive ad campaign lying that they didn't? And the alternative is that they throw their toys out of the pram and don't let you use Siri at all?
You don't have to do anything. You don't have to buy their products, use Siri, or anything else. You can sell any Apple devices you have and move entirely to Android. It's up to you to figure out what you want from Apple, and if they aren't meeting your needs, move on.

The review of Siri recordings by third parties was in the TOS, and Apple doesn't have to tell you anything beyond that. Security and privacy are not points in time; they are a process, and to that end Apple is updating its Siri review policy.

It's Apple who are selfish and want you to pay a high price, have no options, and improve their product for them.
Right, Apple is selfish, and Amazon is the epitome of corporate largesse with regard to security and privacy.

All companies want your money and information, but Apple are the most hypocritical about it.
You caught them dead to rights. Saw right through their thin veil of hypocrisy. ;)

Of course, with hundreds of millions of customers, some people will think differently.
 
Did I miss it, or has nobody actually noticed that it's stated in BOLD PRINT in the iOS EULA that your Siri recordings are sent to and used by Apple and its subsidiaries? This lawsuit will go in the trash lol
 

Attachments

  • [screenshot of the iOS EULA passage]
Did I miss it, or has nobody actually noticed that it's stated in BOLD PRINT in the iOS EULA that your Siri recordings are sent to and used by Apple and its subsidiaries? This lawsuit will go in the trash lol

EULAs are not legally binding
 
However, you can't say Apple was secretive about this. Hopefully this lawsuit will go the same way as the FaceTime bug lawsuit.

Burying it in hundreds of pages of documentation while putting "what happens on your iPhone stays on your iPhone" on billboards shows Apple is dishonest.
 