So much ignorance on display in this thread, especially for a tech forum.

This was a problem, but it isn't responsible for any of the things people are worried about or claim are happening. Data brokers, social graphs, and telemetry all work together so well that a lot of things seem to be occurring that aren't. Much of that can be stymied but not entirely prevented, and it takes a ton of work.

The real outstanding issue with regard to Apple and privacy right now is the holes in their VPN stacks, at least on non-Mac devices. It's absurd that Lockdown Mode doesn't default to a kill switch, or even offer one as an option. And some of Apple's network activity (including the newly introduced RCS support) often goes around the tunnels, provably so.

Apple also needs to improve their VPN support and build in the ability to run IKEv2 and WireGuard simultaneously. Right now an on-device "firewall" generally has to occupy the VPN slot with IKEv2, which leaves no room for a WireGuard VPN. That's bad, because IKEv2 is not really considered strong anymore, even though it's sufficient for blocking on-device app telemetry traffic via DNS. This should be greatly expanded in a new iOS version, ideally via a fully featured firewall within the OS for those who want to take advantage of it.
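To make the DNS-blocking part concrete, here's a minimal sketch of the technique those on-device firewall apps rely on inside their tunnel: pull the queried hostname out of a raw DNS packet and check it against a blocklist. (The domains are invented, and a real implementation also has to handle TCP queries, compression pointers, and so on.)

```swift
// Hypothetical blocklist; real firewall apps ship curated telemetry-domain lists.
let blocklist: Set<String> = ["telemetry.example.com", "ads.example.net"]

/// Extracts the question name from a raw DNS query payload
/// (12-byte fixed header, then length-prefixed labels).
func queryName(fromDNSPayload payload: [UInt8]) -> String? {
    var i = 12                     // skip the fixed DNS header
    var labels: [String] = []
    while i < payload.count {
        let len = Int(payload[i])
        if len == 0 { break }      // zero-length root label ends the name
        guard len < 64, i + 1 + len <= payload.count else { return nil } // no compression pointers
        labels.append(String(decoding: payload[(i + 1)...(i + len)], as: UTF8.self))
        i += 1 + len
    }
    return labels.isEmpty ? nil : labels.joined(separator: ".")
}

/// The whole "firewall" decision: drop the query if its name is on the list.
func shouldBlock(_ payload: [UInt8]) -> Bool {
    guard let name = queryName(fromDNSPayload: payload) else { return false }
    return blocklist.contains(name.lowercased())
}
```

Nothing about that requires IKEv2 specifically; the protocol only matters because it's the slot the firewall occupies.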

Broadly speaking, Apple is the most secure device provider for consumers en masse, but there are a couple of flavors of Android that are significantly better, and Apple should have people using those and learning from them. The reboot to a "first boot" state after an N-hour inactivity timeout that they just introduced is one such feature, for example.
 
Gizmodo didn't do the research. Not able to refute any of it then?

Here's a link to the researcher's Twitter account showing the code as it was running in real time.



You are as gullible as the day is long if you think your constantly connected pocket computer, made by a megacorp that recently went to court to defend its collusion in Google's search monopoly, is looking out for your privacy.
Proof for iOS 18? I'll wait while this is dug up. Nothing like four-year-old research along with an insult to disprove a point.
 
But...



In these types of cases, are we to believe that Siri was accidentally triggered, recorded what was said AND Apple then sold that to advertisers causing ads to be shown? Seems unlikely.



I believe there is no doubt that Apple provided recordings to third parties for Siri research, but how are we to make the jump to advertisers?

Apps can and do have access to the microphone; were the plaintiffs perhaps mistaken about how their talk of Air Jordans etc. was disseminated to advertisers? This is far more likely.
The original complaint does seem to allege that Apple surreptitiously records for targeted advertising. Shame this didn't play out in court; perhaps Apple didn't want it to because of discovery.
 
That's a laugh. When does Apple back down from defending itself?

Could the real reason Apple is settling be that Apple doesn't want it to get out that it was selling user data to advertisers? This would be revealed during the discovery phase if the case went to trial, as Apple would have to turn over such data.
My thoughts exactly. Or Apple could claim Siri is so bad it's not capable of something this advanced.
 
I had a very interesting instance of this…

I am one of the coaches on my son’s youth hockey team. He hurt his leg at one of the practices and I was discussing with the other coaches what we should do. Didn’t have my Watch on and my phone was in my pocket.

We got to the car, and when CarPlay booted up, I got a Siri suggestion in Apple Maps for directions to Urgent Care. True story. I was in awe.

Edit* added “Apple” Maps
 
The meme doesn't say anything about people not being reported for CSAM either.

It points out the irony of Apple refusing to unlock the San Bernardino shooter's iPhone, because they argued at the time that any backdoor in iOS could be exploited by bad actors, and then later proposing to systematically scan all iPhones for CSAM, which is obviously open to abuse as well.

Apple didn't go ahead with its proposed CSAM scanning, so does Apple think CSAM is ok too? :oops:
Clearly you don't understand what the CSAM scanning was, then.

How is scanning hashes against a known database of CSAM images open to abuse? Whatever. If you don't care, you don't know.
 
How is scanning hashes against a known database of CSAM images open to abuse?

A whole bunch of ways. Once it's implemented, concerns open up about which database we're scanning against and for what purpose, particularly in different jurisdictions.

Also, once you open up the potential for that type of technology, you're going to be requested, if not forced, to implement it by various governments around the world for all manner of purposes.

It’s not the type of tool you want to implement at all, if it can be avoided.

Doing so creates endless and growing issues.
 
Clearly you don't understand what the CSAM scanning was, then.

How is scanning hashes against a known database of CSAM images open to abuse? Whatever. If you don't care, you don't know.


Because Apple can be compelled to use it to spy on citizens; the principle is no different than the San Bernardino phone.

You put that provision in place and it's out there, for better or worse. That's fine if all it's doing is preventing the sharing of CSAM, but what happens when it goes beyond that?

If Apple were going to do this, they should've just unlocked the San Bernardino phone, no questions asked.
 
Apple's privacy stance can't be dismissed either. Your Twitter post is about iOS 14.6. You and I will believe what we want, old chap.
I think Apple believes their stance or, more likely, they have collectively convinced themselves that they are privacy-focused, so naturally why wouldn't people believe what they do (as long as it aligns with shareholder value; otherwise they won't follow that path and will take another route, like their problematic relationship with China). I see it as mostly performative, but it's important to remember the wise words of George Costanza when evaluating Apple's stance: "it's not a lie if you believe it."

Apple is expert at convincing itself of things that do not align with reality. To be charitable, let's call some of their statements disingenuous; we can all recount times when Apple said something that was situationally true but not objectively true, or times when they were silent and afraid to say anything (the 2FA iCloud incident last year). Institutionally they seem so awfully sure of their position, which is an insight into how they view their customers' ability to evaluate what they say. Brand loyalty is cult-adjacent. Watch out for that Kool-Aid. (I say this as a huge fan of the company.)

The winning move here would have been to go forward with the lawsuit and prove their words about privacy have teeth; settling instead suggests something is rotten. I would take anything Jobs said to the bank because his rep was at stake. I don't believe Cook operates under the same moral compunction.
 
I've been talking to Siri since it launched and never experienced anything like this.

In fact, I've had many(!) moments where I felt it would have helped my experience if it was able and allowed to suggest products and services to me. But, as always, it's only ever able to talk about trivia-esque facts and then "search the web for that" for everything else.

The best Siri has done with pairing my data with ads is occasionally suggesting apps from businesses as I travel around, based on my location. But I feel that's about it (and it only works on occasion). It's great when it works, as it saves me a little time finding the app myself. But, of course, that's something I've enabled, not something happening without my permission.

With that in mind, I can entertain that there's some truth to what the plaintiffs are alleging. Sure, I don't know the exact details, but I know that some apps and tech corporations have been doing something like that. So why not also Apple and its advertisers? It's not impossible just because Apple brands itself and sells many of its products on "privacy".

But I also have to employ some critical thinking and suggest that getting ads for (generic, mainstream, hugely popular) clothing and fast-food brands, like Nike and Olive Garden, will happen to most of us a couple times a year.

The same goes for surgical treatments: millions of people are researching all kinds of surgeries every day. I've never had any elective surgeries, and don't look for beauty and health topics online or in apps. But have still gotten many ads for surgeries and various clinics.

Furthermore, all websites and many apps are constantly trying to build a "profile" on users to serve them ads, even ones that aren't logged in, haven't used the site before, etc. With just a few interactions, your IP address, and maybe GPS location, they can quite accurately make some very good guesses about who you are and what you want to buy.

Not super specific, revealing guesses that would identify any particular individual. But something like "it's March and users in this location, who are online at this hour, probably belong to this age range, who have clicked these things on the site, usually shop for these mainstream clothing brands or look up locations of these fast-food chains. Let's send some New Balance and Subway ads their way. If they don't click that then they belong to a different demographic and want Under Armour and McDonald's".
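Mechanically, that kind of bucketing is trivial. A toy sketch of the idea (every signal, rule, and brand pairing here is invented for illustration):

```swift
struct Signals {
    let hourOfDay: Int         // from the request timestamp
    let regionCode: String     // from coarse IP geolocation
    let clickedSneakers: Bool  // from a few on-site interactions
}

func pickAd(for s: Signals) -> String {
    switch (s.clickedSneakers, s.hourOfDay) {
    case (true, 6..<22):  return "New Balance"  // daytime sneaker-clicker bucket
    case (true, _):       return "Nike"         // night-owl sneaker-clicker bucket
    case (false, 11..<14) where s.regionCode.hasPrefix("US"):
                          return "Subway"       // US lunch hour, no sneaker interest
    default:              return "McDonald's"   // fallback demographic guess
    }
}

// A lunchtime US visitor who never clicked sneakers gets the Subway ad:
print(pickAd(for: Signals(hourOfDay: 12, regionCode: "US-CA", clickedSneakers: false)))
```

No microphone required, just a handful of weak signals and a lookup table.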

I'm undecided.

But in the meantime, I'll keep using Siri every day, constantly berating and belittling it for its incompetency while "share audio recordings with Apple to help make Siri better" is confidently toggled on, thank you very much! 😏
As someone who had a hand injury, I probably could have had ads sent to me about doctors and rehab. I haven't posted about it on FB or Instagram, and what do you know, I haven't had anything sent to me or targeted at me. I do believe Apple believes in privacy; the Siri thing should never have happened, and Apple should have thought that out. Meta/X/Google will do anything they can to bypass Apple's privacy measures, as they need the info for more ad revenue. It just makes sense.
 
Wake up!

I'm awake and fully caffeinated. Having humans listen to a small percentage of recordings, stripped of metadata, to improve the service was necessary before AI became as competent as it is now, and it is FAR and away a different situation from selling that data, linked to you, to advertisers.
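For what it's worth, the "small percentage, without metadata" claim is easy to picture. A rough sketch, with hypothetical field names, of the shape of that pipeline (not Apple's actual code):

```swift
import Foundation

struct Recording {
    let deviceID: UUID      // identifying metadata
    let location: String?   // identifying metadata
    let audio: Data
}

struct GradingSample {
    let audio: Data         // audio only: no device ID, no location
}

/// Randomly sample ~1% of recordings and drop identifying metadata
/// before any human grader sees them.
func sampleForGrading(_ all: [Recording], rate: Double = 0.01) -> [GradingSample] {
    all.filter { _ in Double.random(in: 0..<1) < rate }
       .map { GradingSample(audio: $0.audio) }   // metadata stripped here
}
```

If that's how it worked, a grader hears audio but has nothing linking it back to you, which is a very different thing from an advertising pipeline.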

Something like 1% of Apple's revenue comes from advertising; why would they risk their reputation on something like that? It doesn't make sense from a business perspective.

I'm going to let it shake out in the courts. If there is anything to it, they will find out, but I'd be surprised and highly disappointed if they actually did anything like that.
 
I think Apple believes their stance or, more likely, they have collectively convinced themselves that they are privacy-focused, so naturally why wouldn't people believe what they do (as long as it aligns with shareholder value; otherwise they won't follow that path and will take another route, like their problematic relationship with China). I see it as mostly performative, but it's important to remember the wise words of George Costanza when evaluating Apple's stance: "it's not a lie if you believe it."

Apple is expert at convincing itself of things that do not align with reality. To be charitable, let's call some of their statements disingenuous; we can all recount times when Apple said something that was situationally true but not objectively true, or times when they were silent and afraid to say anything (the 2FA iCloud incident last year). Institutionally they seem so awfully sure of their position, which is an insight into how they view their customers' ability to evaluate what they say. Brand loyalty is cult-adjacent. Watch out for that Kool-Aid. (I say this as a huge fan of the company.)

The winning move here would have been to go forward with the lawsuit and prove their words about privacy have teeth; settling instead suggests something is rotten. I would take anything Jobs said to the bank because his rep was at stake. I don't believe Cook operates under the same moral compunction.
Let me condense this a bit. People will believe what they want.
 
Because Apple can be compelled to use it to spy on citizens; the principle is no different than the San Bernardino phone.

You put that provision in place and it's out there, for better or worse. That's fine if all it's doing is preventing the sharing of CSAM, but what happens when it goes beyond that?

If Apple were going to do this, they should've just unlocked the San Bernardino phone, no questions asked.
You can't see the difference between giving an entire phone to law enforcement, in an unlocked condition, on the basis of a fact-finding expedition, and controlled scanning of known data against a child-abuse database? Fair enough. San Bernardino was literally a test case for not needing a warrant to search for something without a specific suspicion, and the CSAM scanning is no different to having a biosecurity dog walking around an airport, which already occurs, but with real-life victims.

Let me condense this a bit. People will believe what they want.
And people who quote Seinfeld as "wise words" don't inspire much confidence in what they say.
 
Siri? I thought it was Meta listening to us. This still happens all the time: every time I discuss something, it gets advertised to me on Facebook and Instagram.
This should seriously be against the law. Since when did spying on people become the norm for these tech companies?
 
First I've heard of this lawsuit… and I don't have any doubts the thing is always listening. The apps themselves certainly are, Facebook being one. Speak about someone and they'll magically be at the top of your feed.
 
You can't see the difference between giving an entire phone to law enforcement, in an unlocked condition, on the basis of a fact-finding expedition, and controlled scanning of known data against a child-abuse database? Fair enough. San Bernardino was literally a test case for not needing a warrant to search for something without a specific suspicion, and the CSAM scanning is no different to having a biosecurity dog walking around an airport, which already occurs, but with real-life victims.


And people who quote Seinfeld as "wise words" don't inspire much confidence in what they say.

Yeah, you know, for me the combined research of Harvard, Cambridge, and MIT > Steve off the internet. Especially when Steve can't get the facts right when citing other examples.

The FBI obtained a court order under the All Writs Act for the San Bernardino phone, so no, they were not asking for information to be supplied without the order of a court.


Client-side scanning (CSS) gives access to data on users’ devices, including stored data, which “brings surveillance to a new level”, according to analysis from academics at Harvard Kennedy school, Massachusetts Institute of Technology (MIT) and the University of Cambridge, among others.


They write that the technology, which introduces background software on users’ devices, “tears at the heart of privacy of individual citizens” but is also fallible and could be evaded by those meant to be targeted, and misused.

In "Bugs in Our Pockets: The Risks of Client-Side Scanning", a 46-page analysis of CSS published on the open-access website arXiv on Friday, the authors say: “In reality, CSS is bulk intercept, albeit automated and distributed … CSS makes law-abiding citizens more vulnerable with their personal devices searchable on an industrial scale.

“Plainly put, it is a dangerous technology. Even if deployed initially to scan for child sex-abuse material, content that is clearly illegal, there would be enormous pressure to expand its scope. We would then be hard-pressed to find any way to resist its expansion or to control abuse of the system.”

Apple’s plans, unveiled this year, involve a technique called “perceptual hashing” to compare photos with known images of child abuse when users upload them to the cloud. If the company detects enough matches, it would manually review the images before flagging the user account to law enforcement.

Apple paused implementation after a backlash from privacy campaigners last month but not before researchers managed to construct vastly different images that produced the same fingerprint and thus would appear identical to Apple’s scanning system, creating false positives.

Others managed to do the reverse: change the mathematical output of an image without changing how it looks at all, thereby creating false negatives.

The report’s authors say people may also try to disable scanners or avoid using devices such as iPhones with CSS. They added: “The software provider, the infrastructure operator, and the targeting curator must all be trusted. If any of them – or their key employees – misbehave, or are corrupted, hacked or coerced, the security of the system may fail.”

While CSS may be mooted as intended to target specific content, the report warns: “Come the next terrorist scare, a little push will be all that is needed to curtail or remove the current protections.”


The bit about enormous pressure to expand its scope is key; they then go on to cite examples of Apple bowing to state pressure in both Russia and China.
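For anyone wondering how a perceptual-hash match can produce both the false positives and the false negatives described in the article, here's a toy average-hash sketch. (Apple's NeuralHash is a neural network, not this; but the matching step, comparing hashes by Hamming distance, has the same shape.)

```swift
/// 64-bit average hash of an 8x8 grayscale image (pixel values 0...255):
/// each bit records whether that pixel is brighter than the image's mean.
func averageHash(_ pixels: [[UInt8]]) -> UInt64 {
    let flat = pixels.flatMap { $0 }                       // 64 pixels
    let mean = flat.reduce(0) { $0 + Int($1) } / flat.count
    var hash: UInt64 = 0
    for (i, p) in flat.enumerated() where Int(p) >= mean {
        hash |= 1 << UInt64(i)
    }
    return hash
}

/// A "match" is just a small Hamming distance between hashes.
func matches(_ a: UInt64, _ b: UInt64, threshold: Int = 5) -> Bool {
    (a ^ b).nonzeroBitCount <= threshold
}
```

Because the match is fuzzy by design, researchers could craft visually unrelated images whose hashes collide (false positives) and nudge real images just past the threshold (false negatives), exactly as the quoted report describes.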
 
This still happens to this day: data spoken in FaceTime calls and sent in iMessage gets used for targeted advertising. There are very specific things I have been shown ads for within 30 minutes of speaking about them on a call or sending them in iMessage, things that were never searched for or visited on other devices. Encryption be damned; that doesn't mean anything once it's decrypted on somebody's servers and has had keywords parsed.
 
This still happens to this day. I don't know exactly under what circumstances they are listening, but they are using the data spoken in FaceTime calls and sent in iMessage for targeted advertising. There are very specific things I have been shown ads for within 30 minutes of speaking about them on a call or sending them in iMessage, things that were never searched for or visited on other devices. Encryption be damned; that doesn't mean anything once it's decrypted on somebody's servers and has had keywords parsed.
What happens? My wife and I did an experiment where we threw keywords, such as prostate cancer, HIV, etc., into Siri and iMessage chats, and guess what? Nothing.
 