But what is the alternative? Google collects even more data and you definitely can’t opt out (no toggle even). Linux phone?

Something like that... Maybe a Light Phone or something less smart and less connected. I'm not really as interested in carrying an internet-connected device as I used to be.
 
hahahahahahaha
Apple completely lost it.
Looking for ways to get out of the ecosystem from now on
I am not too far behind with this notion, but as others have said, not a lot of great alternatives that I know of. I was actually ready to walk away from Apple on my next phone purchase when the original CSAM business came about, and had they not reversed that, my current phone would not be an Apple one.

It’s not about any one thing. It’s lots of little nits, aggravations, impositions, and intrusions that over time push some to the conclusion that the Apple premium price is simply not worth it anymore.

On the AI features themselves, I’m starting to wonder if there might be a market for “AI free” devices, meaning you don’t even give up resources to it if you don’t use it. When everyone else is pushing it so hard, what if Apple could legitimately claim “It’s your device. With us, if you turn AI features off, it really means OFF, and your device will be free of it.” I don’t know about anyone else, but that would get my attention.
 
You need to educate yourself on how Apple AI works. Most requests are processed on your device, and if they need to be sent to the cloud, it is done on a private cloud session. If they need to be processed by a 3rd party, it will request it first.
As I already wrote, this is not a valid reason for an opt-in if you take data protection seriously:
No - the privacy measures that Apple is supposedly taking cannot justify an opt-out.
- These measures may be incomplete and flawed.
- These measures are not fully transparent.
- To my knowledge, these measures are not audited by a competent independent third party.
 
I agree with you 100%, but that's not how the market sees it. All they care about is AI, AI, AI and the more obvious the AI is and more users sign up, the more their stock has a chance of going up.
Yes, but Apple can call anything “AI” and then go out and say that people are using it.
The “depth effect” on the lock screen is technically using AI… but you don’t hear anyone complaining about it because it doesn’t have a label attached.
 
Even Apple themselves admitted to it potentially spitting out “unexpected results”, as well as inaccurate ones.

Considering this, then, don’t you think Apple Intelligence being opt-in (as the beta feature it unequivocally is) makes more sense than pushing it to everybody, THEN having people opt out?
No, because Apple Intelligence only “spits out” anything like that if you ask it to.
I am very critical of Apple Intelligence; so far a lot of the stuff they’ve implemented isn’t exactly useful, and the notification summaries have been nothing short of a disaster.
However, the absolute refusal from people in this thread to actually learn anything about what they are talking about is ridiculous.
Even enabled by default, Apple Intelligence doesn’t just start throwing things at you.
The automatic features, like notification summaries and ChatGPT integration with Siri, have to be manually enabled and customized. In the process of setting them up, there are plenty of splash screens telling you that they might not be that accurate.
Outside of that, though, Apple Intelligence is not what you think it is. It is not ChatGPT, it doesn’t fundamentally change the way you use your phone.
It gives you some extra buttons in the keyboard, Photos app, and throughout the operating system, that’s it. You don’t even have to use them. If you don’t use them, your phone functions the exact same way it did before. AI changes nothing fundamental yet, and it is certainly not sending off your data unless you specifically ask it to complete a task.
 
Apple never learns. Remember when they forced a U2 album on everyone whether they wanted it or not?
New software features, which are already added yearly, are completely different from an album.
That’s like complaining that there are new wallpapers built into iOS that weren’t there before.
 
As I already wrote, this is not a valid reason for an opt-in if you take data protection seriously:
No - the privacy measures that Apple is supposedly taking cannot justify an opt-out.
- These measures may be incomplete and flawed.
- These measures are not fully transparent.
- To my knowledge, these measures are not audited by a competent independent third party.
I'm pretty sure they said there was 3rd party auditing.


The process involves multiple Apple teams that cross-check data from independent sources, and the process is further monitored by a third-party observer not affiliated with Apple.

 
As I already wrote, this is not a valid reason for an opt-in if you take data protection seriously:
No - the privacy measures that Apple is supposedly taking cannot justify an opt-out.
- These measures may be incomplete and flawed.
- These measures are not fully transparent.
- To my knowledge, these measures are not audited by a competent independent third party.
So basically what you’re saying is, you haven’t read any of Apple’s paperwork, or you have and don’t want to believe them.
And if you don’t want to believe that they are telling the truth on this, why would you believe them on anything else?
In fact, if you think that they are “lying” about their privacy practices in their legal documents, why on earth would you be using their products at all, AI or not?

“Verifiable transparency

We consider allowing security researchers to verify the end-to-end security and privacy guarantees of Private Cloud Compute to be a critical requirement for ongoing public trust in the system. Traditional cloud services do not make their full production software images available to researchers — and even if they did, there’s no general mechanism to allow researchers to verify that those software images match what’s actually running in the production environment. (Some specialized mechanisms exist, such as Intel SGX and AWS Nitro attestation.)

When we launch Private Cloud Compute, we’ll take the extraordinary step of making software images of every production build of PCC publicly available for security research. This promise, too, is an enforceable guarantee: user devices will be willing to send data only to PCC nodes that can cryptographically attest to running publicly listed software. We want to ensure that security and privacy researchers can inspect Private Cloud Compute software, verify its functionality, and help identify issues — just like they can with Apple devices.

Our commitment to verifiable transparency includes:

  1. Publishing the measurements of all code running on PCC in an append-only and cryptographically tamper-proof transparency log.
  2. Making the log and associated binary software images publicly available for inspection and validation by privacy and security experts.
  3. Publishing and maintaining an official set of tools for researchers analyzing PCC node software.
  4. Rewarding important research findings through the Apple Security Bounty program.
Every production Private Cloud Compute software image will be published for independent binary inspection — including the OS, applications, and all relevant executables, which researchers can verify against the measurements in the transparency log. Software will be published within 90 days of inclusion in the log, or after relevant software updates are available, whichever is sooner. Once a release has been signed into the log, it cannot be removed without detection, much like the log-backed map data structure used by the Key Transparency mechanism for iMessage Contact Key Verification.

As we mentioned, user devices will ensure that they’re communicating only with PCC nodes running authorized and verifiable software images. Specifically, the user’s device will wrap its request payload key only to the public keys of those PCC nodes whose attested measurements match a software release in the public transparency log. And the same strict Code Signing technologies that prevent loading unauthorized software also ensure that all code on the PCC node is included in the attestation.

Making Private Cloud Compute software logged and inspectable in this way is a strong demonstration of our commitment to enable independent research on the platform. But we want to ensure researchers can rapidly get up to speed, verify our PCC privacy claims, and look for issues, so we’re going further with three specific steps:

  • We’ll release a PCC Virtual Research Environment: a set of tools and images that simulate a PCC node on a Mac with Apple silicon, and that can boot a version of PCC software minimally modified for successful virtualization.
  • While we’re publishing the binary images of every production PCC build, to further aid research we will periodically also publish a subset of the security-critical PCC source code.
  • In a first for any Apple platform, PCC images will include the sepOS firmware and the iBoot bootloader in plaintext, making it easier than ever for researchers to study these critical components.
The Apple Security Bounty will reward research findings in the entire Private Cloud Compute software stack — with especially significant payouts for any issues that undermine our privacy claims.”
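To make the “append-only and cryptographically tamper-proof transparency log” idea from the quote above concrete, here is a toy sketch in Python. This is my own illustration under simplified assumptions, not Apple’s actual PCC log format or APIs: each entry’s hash chains to the previous entry’s hash, so rewriting any earlier entry invalidates every later hash, and a client would only trust a node whose attested measurement appears in a log that verifies cleanly.

```python
import hashlib


class TransparencyLog:
    """Toy append-only log: each entry hash chains to the previous one,
    so any retroactive edit is detectable. Illustrative only; not
    Apple's real PCC transparency log."""

    GENESIS = "0" * 64  # placeholder "previous hash" for the first entry

    def __init__(self):
        self.entries = []  # list of (measurement, chained_hash)

    def append(self, measurement: str) -> str:
        # Chain the new entry's hash to the previous entry's hash.
        prev = self.entries[-1][1] if self.entries else self.GENESIS
        h = hashlib.sha256((prev + measurement).encode()).hexdigest()
        self.entries.append((measurement, h))
        return h

    def verify(self) -> bool:
        # Recompute the whole chain; any tampered entry breaks it.
        prev = self.GENESIS
        for measurement, h in self.entries:
            expected = hashlib.sha256((prev + measurement).encode()).hexdigest()
            if h != expected:
                return False
            prev = h
        return True

    def is_listed(self, measurement: str) -> bool:
        # A client would only send data to a node whose attested
        # software measurement appears in the verified log.
        return any(m == measurement for m, _ in self.entries)
```

In the real system the client additionally checks a cryptographic attestation from the node itself; the sketch only shows why an append-only hash chain makes silent removal or substitution of a published release detectable.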
 
Apple has lived long enough to be the good guy who became the villain. It’s a real shame that Apple and Tim Cook have sunk this low, spending all their R&D money on Apple Intelligence, and it’s so bad that they have to sneakily turn it on by default so those who aren’t tech savvy won’t know any different. I’ve supported Apple for years buying their products, but next time around, it’s time to look at other competitors. The beauty of the internet is I can buy something from overseas easily. Shame on Apple…
 
Apple has lived long enough to be the good guy who became the villain. It’s a real shame that Apple and Tim Cook have sunk this low, spending all their R&D money on Apple Intelligence, and it’s so bad that they have to sneakily turn it on by default so those who aren’t tech savvy won’t know any different. I’ve supported Apple for years buying their products, but next time around, it’s time to look at other competitors. The beauty of the internet is I can buy something from overseas easily. Shame on Apple…
This is just straight up fear mongering and propaganda. Apple has one of the most strict AI policies for privacy out there.
 
Can we make AI requests without involving Siri/Search? I'm under the impression that Apple can collect the AI requests, per their Siri and Search policies.
And this comment right here is exactly why this entire conversation is ridiculous.
Siri up to this point is mostly the same as it was in iOS 17, just with a new interface and slightly better speech recognition. Outside of that, it works identically to the version of Siri on iOS 17.
There is an option in the settings to use ChatGPT as an extension, but you have to manually enable it, just like you would a third-party keyboard or a Safari extension.
“AI requests through Siri” makes no sense, those are not and never have been separate things.
Apple has published plenty of documentation. If anything, if Apple’s wording is to be believed, the “Apple Intelligence” version of Siri is actually *more* secure, because it specifically goes through their Private Cloud Compute servers, which are different from the regular Apple servers that Siri has been using since 2011.
 
As I already wrote, this is not a valid reason for an opt-in if you take data protection seriously:
No - the privacy measures that Apple is supposedly taking cannot justify an opt-out.
- These measures may be incomplete and flawed.
- These measures are not fully transparent.
- To my knowledge, these measures are not audited by a competent independent third party.
it's been running for a while on my machine and has yet to send anything off device

you have to actually use it for something to trigger "private cloud compute" and so far I haven't been able to do that
 
Apple has lived long enough to be the good guy who became the villain. It’s a real shame that Apple and Tim Cook have sunk this low, spending all their R&D money on Apple Intelligence, and it’s so bad that they have to sneakily turn it on by default so those who aren’t tech savvy won’t know any different. I’ve supported Apple for years buying their products, but next time around, it’s time to look at other competitors. The beauty of the internet is I can buy something from overseas easily. Shame on Apple…

how are they a villain here exactly?

because they spent money on a feature you aren't interested in?
 
And this comment right here is exactly why this entire conversation is ridiculous.
Siri up to this point is mostly the same as it was in iOS 17, just with a new interface and slightly better speech recognition. Outside of that, it works identically to the version of Siri on iOS 17.
There is an option in the settings to use ChatGPT as an extension, but you have to manually enable it, just like you would a third-party keyboard or a Safari extension.
“AI requests through Siri” makes no sense, those are not and never have been separate things.
Apple has published plenty of documentation. If anything, if Apple’s wording is to be believed, the “Apple Intelligence” version of Siri is actually *more* secure, because it specifically goes through their Private Cloud Compute servers, which are different from the regular Apple servers that Siri has been using since 2011.

Sorry? I was trying to get clarity on whether or not AI requests can be collected by Apple. Perhaps "AI requests" means different things to different people--and perhaps I shouldn't have used "AI" as shorthand for "Apple Intelligence?"

Apple Intelligence, IMO, was the original subject (as opposed to ChatGPT-integrated-AI). But I might have misread.
 
Sorry? I was trying to get clarity on whether or not AI requests can be collected by Apple. Perhaps "AI requests" means different things to different people--and perhaps I shouldn't have used "AI" as shorthand for "Apple Intelligence?"

Apple Intelligence, IMO, was the original subject (as opposed to ChatGPT-integrated-AI). But I might have misread.

AI requests are not collected by Apple
 