> Where did I complain about Siri?

And then complain about issues with Siri.
> Proof for iOS 18? I'll wait while this is dug up. Nothing like 4 year old research along with an insult to disprove a point.

Gizmodo didn't do the research. Not able to refute any of it then?
Here's a link to the researcher's Twitter account showing the code as it was running in real time.
You are as gullible as the day is long if you think your constantly connected pocket computer, made by a megacorp that recently went to court to defend its collusion in the Google search monopoly, is looking out for your privacy.
> The original complaint seems to believe Apple does in fact surreptitiously record for targeted advertising. Shame this didn't play out in court; perhaps Apple didn't want it to because of discovery.

But...
In these types of cases, are we to believe that Siri was accidentally triggered, recorded what was said, AND that Apple then sold that to advertisers, causing ads to be shown? Seems unlikely.
I believe there is no doubt that Apple provided recordings to third parties for Siri research, but how are we to make the jump to advertisers?
Apps can and do have access to the microphone; were the plaintiffs perhaps mistaken about how their talk of Air Jordans etc. was disseminated to advertisers? This is far more likely.
> My thoughts exactly. Or Apple could claim Siri is so bad it's not capable of something this advanced.

That's a laugh. When does Apple back down from defending itself?
Could the real reason Apple is settling be that it doesn't want it to get out that it was selling user data to advertisers? This would be revealed during the discovery phase if the case went to trial, as Apple would have to turn over such data.
> iOS 14.6? How many years ago? Here on earth it's 4 years ago; uncompressed time.

Wait all you like. It makes no odds if they stopped doing it when they got caught.
Also, not sure what time zone you are in, but November 2022 is not 4 years ago here on planet Earth.
> Apple's privacy stance can't be dismissed either. Your Twitter post about iOS 14.6. You and I will believe what we want, old chap.

iOS 15 dropped three years ago. Regardless of time elapsed, it shows that Apple's privacy claims can't be taken at face value.
> Privacy has an interesting history.

Impossible! Tim told us that privacy is a fundamental human right!!
> Clearly you don't understand what the CSAM scanning was then.

The meme doesn't say anything about people not being reported for CSAM either.
It points out the irony of Apple refusing to unlock the San Bernardino shooter's iPhone, arguing at the time that any backdoor in iOS could be exploited by bad actors, and then later proposing to systematically scan all iPhones for CSAM, which is obviously open to abuse as well.
Apple didn't go ahead with its proposed CSAM scanning, so does Apple think CSAM is ok too?
How is scanning a hash of a known image database of CSAM open to abuse?
> How is scanning a hash of a known image database of CSAM open to abuse?

Whatever. If you don't care, you don't know.
> I think Apple believes their stance or, more likely, have collectively convinced themselves that they are privacy focused, so naturally why wouldn't people believe what they do (as long as it aligns with shareholder value; otherwise they won't follow that path and will take another route, like their problematic relationship with China). I see it as mostly performative, but it's important to remember the wise words of George Costanza when evaluating Apple's stance: "it's not a lie if you believe it."
>
> Apple is expert at convincing itself of things that do not align with reality; to be charitable, let's call some of their statements disingenuous. We can all recount times where Apple said something that was situationally true but not objectively true, or times when they were silent and afraid to say anything (the 2FA iCloud incident last year). Institutionally they seem so awfully sure of their position, which is an insight into how they view their customers' ability to evaluate what they say. Brand loyalty is cult-adjacent. Watch out for that Kool-Aid. (I say this as a huge fan of the company.)

Apple's privacy stance can't be dismissed either. Your Twitter post about iOS 14.6. You and I will believe what we want, old chap.
> As someone who had a hand injury, I probably could have had ads sent to me about doctors and rehab. I haven't posted about it on FB or Instagram, and what do you know, I haven't had anything sent to me nor been targeted. I do believe Apple does believe in privacy; the Siri thing should never have happened, and Apple should have thought that out. Meta/X/Google will do anything they can to bypass Apple's privacy measures, as they need the info for more ad revenue. It just makes sense.

I've been talking to Siri since it launched and never experienced anything like this.
In fact, I've had many(!) moments where I felt it would have helped my experience if it was able and allowed to suggest products and services to me. But, as always, it's only ever able to talk about trivia-esque facts and then "search the web for that" for everything else.
The best Siri has done with pairing my data with ads is sometimes suggesting apps from businesses when I travel around, based on my location. But I feel that's about it (and it only works on occasion). It's great when it works, as it saves me a little time finding the app myself. But, of course, that's something I've enabled and not happening without my permission.
With that in mind, I can entertain that there's some truth to what plaintiffs are alleging. Sure, I don't know the exact details, but I know that some apps and tech corporations have been doing something like that. So why not also Apple and its advertisers? It's not impossible just because Apple is branding itself and selling many of their products on "privacy".
But I also have to employ some critical thinking and suggest that getting ads for (generic, mainstream, hugely popular) clothing and fast-food brands, like Nike and Olive Garden, will happen to most of us a couple times a year.
The same goes for surgical treatments: millions of people are researching all kinds of surgeries every day. I've never had any elective surgeries, and don't look for beauty and health topics online or in apps, but I have still gotten many ads for surgeries and various clinics.
Furthermore, all websites and many apps are constantly trying to build a "profile" on users to serve them ads, even for users who aren't logged in, haven't used the site before, etc. With just a few interactions, your IP address, and maybe GPS location, they can make some very good guesses about who you are and what you want to buy.
Not super specific, revealing guesses that would identify any particular individual. But something like "it's March and users in this location, who are online at this hour, probably belong to this age range, who have clicked these things on the site, usually shop for these mainstream clothing brands or look up locations of these fast-food chains. Let's send some New Balance and Subway ads their way. If they don't click that then they belong to a different demographic and want Under Armour and McDonald's".
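To make that concrete, here is a toy sketch of that kind of coarse, rule-based bucketing. Every signal name, category, and rule below is invented for illustration; real ad systems use learned models over far more features, but the point is how little data it takes to pick a plausible mainstream bucket.

```python
from dataclasses import dataclass, field

@dataclass
class Signals:
    """Coarse, non-identifying signals an ad network might observe.
    All field names here are hypothetical."""
    hour_of_day: int                      # local hour of the request
    region: str                           # coarse location from IP or GPS
    clicked_categories: list = field(default_factory=list)

def guess_segment(s: Signals) -> str:
    """Toy rule-based bucketing: map a few weak signals to an ad segment."""
    if "sneakers" in s.clicked_categories and s.hour_of_day < 12:
        return "mainstream-athletic"      # think New Balance, Subway
    if "fast_food" in s.clicked_categories:
        return "value-fast-food"          # think McDonald's
    return "generic-broad-reach"          # fallback: hugely popular brands

print(guess_segment(Signals(hour_of_day=9, region="US-CA",
                            clicked_categories=["sneakers"])))
# -> mainstream-athletic
```

Nothing in that sketch knows who you are; it only needs a handful of interactions to land on "popular brand X", which is exactly why generic Nike or Olive Garden ads prove very little on their own.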
I'm undecided.
But in the meantime, I'll keep using Siri every day, constantly berating and belittling it for its incompetency while "share audio recordings with Apple to help make Siri better" is confidently toggled on, thank you very much! 😏
> I'm awake and fully caffeinated. Having humans listen to a small percentage of recordings without metadata to improve the service was necessary before AI became as competent as it is now, and is FAR and away a different situation from selling that data, linked to you, to advertisers.

Wake up!
Apple contractors 'regularly hear confidential details' on Siri recordings: workers hear drug deals, medical details and people having sex, says whistleblower (www.theguardian.com)
Confirmed: Apple Caught In Siri Privacy Scandal, Let Contractors Listen To Private Voice Recordings: Atherton Research's Principal Analyst and Futurist Jeb Su unpacks the ramifications of Apple letting outside contractors access private conversations of Siri users. (www.forbes.com)
> I think Apple believes their stance or, more likely, have collectively convinced themselves that they are privacy focused... (I say this as a huge fan of the company.)

Let me condense this a bit. People will believe what they want.
The winning move here is to go forward with the lawsuit and prove their words about privacy have teeth, so something is rotten. I would take anything Jobs said to the bank because his rep was at stake; I don't believe Cook operates under the same moral compunction.
> You can't see the difference between giving an entire phone to law enforcement, in an unlocked condition, on the basis of a fact-finding expedition, and controlled scanning of known data related to a child abuse database? Fair enough. San Bernardino was literally a test case for not needing a warrant to search for something without a specific suspicion, and having the CSAM legislation is no different to having a biosecurity dog walking around an airport, which already occurs, but with real life victims.

Because Apple can be compelled to use it to spy on citizens, the principle is no different than the San Bernardino phone.
You put that provision in place and it's out there, for better or worse. That is fine if all it is doing is preventing the sharing of CSAM, but what happens when it goes beyond that?
If Apple were going to do this, they should've just unlocked the San Bernardino phone, no questions asked.
> And people who quote Seinfeld as "wise words" don't provide much confidence in what they say.

Let me condense this a bit. People will believe what they want.
> This should seriously be against the law. Since when did spying on people become the norm for these tech companies?

Siri? I thought it was Meta listening to us. This still happens all the time: every time I discuss something, it gets advertised to me on Facebook and Instagram.
Client-side scanning (CSS) gives access to data on users’ devices, including stored data, which “brings surveillance to a new level”, according to analysis from academics at Harvard Kennedy school, Massachusetts Institute of Technology (MIT) and the University of Cambridge, among others.
They write that the technology, which introduces background software on users’ devices, “tears at the heart of privacy of individual citizens” but is also fallible and could be evaded by those meant to be targeted, and misused.
In Bugs in Our Pockets: The Risks of Client-Side Scanning, a 46-page analysis of CSS published on the open-access website arXiv on Friday, the authors say: "In reality, CSS is bulk intercept, albeit automated and distributed … CSS makes law-abiding citizens more vulnerable with their personal devices searchable on an industrial scale.
“Plainly put, it is a dangerous technology. Even if deployed initially to scan for child sex-abuse material, content that is clearly illegal, there would be enormous pressure to expand its scope. We would then be hard-pressed to find any way to resist its expansion or to control abuse of the system.”
Apple’s plans, unveiled this year, involve a technique called “perceptual hashing” to compare photos with known images of child abuse when users upload them to the cloud. If the company detects enough matches, it would manually review the images before flagging the user account to law enforcement.
Apple paused implementation after a backlash from privacy campaigners last month, but not before researchers managed to construct vastly different images that produced the same fingerprint and thus would appear identical to Apple's scanning system, creating false positives.
Others managed to do the reverse: change the mathematical output of an image without changing how it looks at all, thereby creating false negatives.
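To see why those collision results matter, here is a minimal sketch of threshold-based matching over perceptual hashes. The 16-bit hashes, the database, and the threshold are all toy values invented for illustration; this is not Apple's actual NeuralHash scheme, only the general shape of such a system.

```python
# Toy perceptual-hash matching. Real systems use much longer learned
# hashes, but the failure modes are the same in kind.

KNOWN_DB = {0b1010_1100_0011_0101}  # pretend database of flagged hashes

def hamming(a: int, b: int) -> int:
    """Count the differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_db(h: int, threshold: int = 2) -> bool:
    """Flag h if it lands within `threshold` bits of any known hash."""
    return any(hamming(h, k) <= threshold for k in KNOWN_DB)

# A benign image engineered to land near a known hash (false positive),
# and a flagged image perturbed until its hash moves away (false negative):
crafted_benign  = 0b1010_1100_0011_0100   # 1 bit away   -> flagged anyway
perturbed_match = 0b0101_0011_1100_0101   # 12 bits away -> slips through
print(matches_db(crafted_benign))   # True  (false positive)
print(matches_db(perturbed_match))  # False (false negative)
```

Both directions are exactly what the researchers demonstrated: the matcher only ever sees the hash, so anyone who can steer the hash can steer the verdict.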
The report’s authors say people may also try to disable scanners or avoid using devices such as iPhones with CSS. They added: “The software provider, the infrastructure operator, and the targeting curator must all be trusted. If any of them – or their key employees – misbehave, or are corrupted, hacked or coerced, the security of the system may fail.”
While CSS may be mooted as intended to target specific content, the report warns: “Come the next terrorist scare, a little push will be all that is needed to curtail or remove the current protections.”
> Or they have more important fish to fry. Either way, settling is not an admission of guilt, as many here believe.

The original complaint seems to believe Apple does in fact surreptitiously record for targeted advertising. Shame this didn't play out in court; perhaps Apple didn't want it to because of discovery.
> What happens? My wife and I did an experiment where we threw keywords, such as prostate cancer, HIV, etc., into Siri and iMessage chat, and guess what? Nothing.

This still happens to this day. I don't know exactly under what circumstances they are listening, but they are using the data spoken in FaceTime calls and sent in iMessage for targeted advertising. There are very specific things I have been shown ads for within 30 minutes of speaking about them on a call or sending them in iMessage, things that were not searched for or visited on other devices. Encryption be damned; that doesn't mean anything once it's decrypted on somebody's servers and has had keywords parsed.
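Whatever one believes about the accusation itself, the technical claim at the end is at least mechanically trivial: end-to-end encryption only protects data in transit, so if plaintext ever exists on a server (or is pulled off a device), keyword parsing takes a few lines. A toy sketch, with every keyword, category, and message invented for illustration; this is not a claim about anything Apple actually runs.

```python
# Toy post-decryption keyword parsing. All keywords, categories, and the
# sample message below are hypothetical.
AD_KEYWORDS = {
    "sneakers": "athletic-apparel",
    "surgery":  "medical-services",
}

def extract_interests(plaintext: str) -> set:
    """Map words in a decrypted message to ad-interest categories."""
    words = set(plaintext.lower().split())
    return {cat for kw, cat in AD_KEYWORDS.items() if kw in words}

print(extract_interests("Thinking about new sneakers after my surgery"))
# -> {'athletic-apparel', 'medical-services'}
```

The hard part of the allegation is not the parsing, it is the pipeline: someone would have to be decrypting and retaining the plaintext in the first place, which is precisely what the experiment quoted above failed to find evidence of.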