When iOS 18 communicates with ChatGPT, Apple obscures IP addresses and prevents OpenAI from storing user requests, and the same technique will be used in iOS 26 to maintain privacy.
As long as the user doesn’t have an OpenAI account and just uses the free plan.

From what I know, and correct me if I’m wrong, the moment you sign in to enjoy your Plus subscription on the iPhone, these privacy measures disappear.
 
Hmm, it seems to me like this just kicks the ball further down the road… GPT integration was only supposed to be a temporary band-aid until Apple had its own AI system figured out. The issue is that GPT-5 moves the bar even higher. Can Apple ever replace GPT with its own system without it being a major downgrade? The way things look now… it seems unlikely.
 
...plus the addition of Apple Intelligence to Apple Shortcuts - that is by far the biggest and most useful thing Apple Intelligence has done so far.

Now you can build anything you can think of and link it together with an LLM - be it the tiny on-device model if you need something very simple to work offline, Private Cloud Compute for something more complex but data-sensitive, and GPT-5 for the most complex stuff.
Agreed. Shortcuts with AI is so much more powerful and hopefully it will help more people use Shortcuts. It is one of the most powerful apps in iOS and unfortunately very few people use it.
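The three-tier routing described above - tiny on-device model, Private Cloud Compute, GPT-5 - boils down to picking the cheapest model that can plausibly handle a request. Here's a rough Python sketch of that idea; the `Request` fields, thresholds, and tier names are all invented for illustration, not anything Apple has published:

```python
from dataclasses import dataclass

@dataclass
class Request:
    prompt: str
    complexity: int      # rough 0-10 difficulty estimate (made up for this sketch)
    sensitive: bool      # does the prompt contain personal data?
    needs_network: bool  # does answering require live information?

def pick_model(req: Request) -> str:
    """Pick the cheapest tier that can plausibly handle the request."""
    if req.complexity <= 3 and not req.needs_network:
        return "on-device"       # tiny local model, works offline
    if req.sensitive:
        return "private-cloud"   # Private Cloud Compute-style tier, data not retained
    return "gpt-5"               # most capable, third-party model
```

So a simple offline note summary stays on-device, something harder but personal goes to the private cloud, and everything else escalates to GPT-5.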
 
Following a ChatGPT-5 article in theregister.com, I asked
“Who won the 2024 U.S. presidential election?”
It stumbled and couldn’t give an answer earlier this morning.
Just tried again and it told me Joe Biden won re-election in 2024 and was inaugurated in 2025.
…. Great… following right in Siri’s footsteps.
 
One word: Kickbacks
Not currently. Apple and OpenAI are both on record saying no money is changing hands in either direction. Other AI companies want Apple to either purchase their company or license their LLM for a fee. So the reality is Apple doesn't feel pressured to spend money to give us choices. It's hard to beat free.

Why not Gemini?
I would love Gemini integration. Already have my action button mapped to Gemini Live and use it daily to check my Gmail and add things to Google Calendar. It's great. But I doubt Apple wants to get us to rely on Google services more. So they are in no rush to do anything. They might see Gemini as the slippery slope where more people start migrating from iCloud to Gmail and such and have fewer reasons not to switch to Android. Google may actually have to go the kickback route if they are willing (and not stopped by the DoJ) to sweeten the deal. (I know that sort of contradicts my first point, but I think Google is an exceptional case where Apple would tread a bit more carefully before giving Gemini keys to the kingdom.)

After MechaHitler? Never going to happen.
 
‘Siri will tap into the latest AI model when Apple's own systems can't handle specific requests’

So that’s everything but timers then.

And maybe even timers sometimes.
 
Still trying to understand this emphasis on Live Translation as a primary new feature. How often do people find themselves in conversations with someone who speaks a language they don't understand at all, but who is also close enough, or patient enough, to stand there waiting while they fiddle with their AirPods and phone in the middle of the exchange? And if the other person is also wearing a translating device, I feel you likely would have already been able to communicate on some level before setting up this device-assisted translation exchange.

Maybe there's a large use-case I'm not seeing, but I travel for work and am constantly surrounded by people who speak languages I don't. Most speak a common language (usually English) well enough to get by, and that's especially true of working professionals.
 


Apple Intelligence will integrate OpenAI's newly launched ChatGPT-5 model when iOS 26 arrives next month, Apple has confirmed (via 9to5Mac).

chatgpt-apple-haiku.jpg

The integration means Siri will tap into the latest AI model when Apple's own systems can't handle specific requests. ChatGPT-5, which OpenAI announced Thursday, offers enhanced reasoning capabilities, improved coding tools, and better voice interaction and video perception compared to the current GPT-4o model powering Apple Intelligence.

Currently, ChatGPT can be invoked selectively within Apple Intelligence for tasks like web searches, document queries, and Visual Intelligence on iPhone 15 Pro and later models. Users can access these features without an OpenAI account, although linking one enables subscription benefits.

Apple Intelligence will also gain new capabilities in iOS 26, including Live Translation for real-time conversation interpretation in FaceTime and Messages, plus Visual Intelligence upgrades for systemwide content searching.
When iOS 18 communicates with ChatGPT, Apple obscures IP addresses and prevents OpenAI from storing user requests, and the same technique will be used in iOS 26 to maintain privacy. The software update will arrive alongside the expected iPhone 17 launch next month.

Article Link: iOS 26 to Bring ChatGPT-5 Integration to Apple Intelligence
And as an example, you show a ChatGPT-written haiku? Suggest you take a look at the Apple Intelligence Shortcuts in the Gallery in the Shortcuts app.
 

Attachment: IMG_1813.jpeg
Following a ChatGPT-5 article in theregister.com, I asked
“Who won the 2024 U.S. presidential election?”
It stumbled and couldn’t give an answer earlier this morning.
Just tried again and it told me Joe Biden won re-election in 2024 and was inaugurated in 2025.
…. Great… following right in Siri’s footsteps.

You can tell OpenAI has been around Apple a little too much.. 😂

Captura de pantalla 2025-08-08 a la(s) 09.36.07.png
 
Still trying to understand this emphasis on Live Translation as a primary new feature. How often do people find themselves in conversations with someone who speaks a language they don't understand at all, but who is also close enough, or patient enough, to stand there waiting while they fiddle with their AirPods and phone in the middle of the exchange? And if the other person is also wearing a translating device, I feel you likely would have already been able to communicate on some level before setting up this device-assisted translation exchange.

Maybe there's a large use-case I'm not seeing, but I travel for work and am constantly surrounded by people who speak languages I don't. Most speak a common language (usually English) well enough to get by, and that's especially true of working professionals.
My Apple Watch already does two-way conversation translation, and for me it’s a safety thing. Live Translation, assuming it works well (I haven’t tried it), is, I thought, for phone/FaceTime call use? That would be very handy, especially for things like talking to my wife’s relatives in Spain.
 
You can tell OpenAI has been around Apple a little too much.. 😂

View attachment 2535680
Apple recently published a paper on reducing hallucinations and incorrect information in LLM-based artificial intelligence systems. One of the fixes is to use the Internet as a resource, same as humans do. Siri gets it right because of that.
 

Attachment: IMG_1814.jpeg
Apple recently published a paper on reducing hallucinations and incorrect information in LLM-based artificial intelligence systems. One of the fixes is to use the Internet as a resource, same as humans do. Siri gets it right because of that.
The knowledge cutoff of the GPT-5 model is before the election. It's important to understand this as it's one of the key details of any LLM. You can augment this by either injecting things into the system prompt (Claude has a specific note about the election for this) or searching external sources e.g. the web.
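Both augmentation strategies mentioned above - injecting a note into the system prompt and feeding in search results - can be sketched in a few lines of Python. This is only an illustration of the message-building step, not a real API call; the cutoff date and the search snippet are placeholders, not actual model or news data:

```python
from datetime import date

# Hypothetical cutoff for illustration; the real value varies by model.
KNOWLEDGE_CUTOFF = date(2024, 9, 30)

def build_messages(question: str, search_snippets: list[str]) -> list[dict]:
    """Compose a chat request that compensates for a stale knowledge cutoff."""
    system = (
        f"Today is {date.today():%Y-%m-%d}. Your training data ends around "
        f"{KNOWLEDGE_CUTOFF:%B %Y}. For events after that date, rely only on "
        "the search results below, and say so if they are insufficient.\n\n"
        + "\n".join(f"- {s}" for s in search_snippets)
    )
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": question},
    ]

msgs = build_messages(
    "Who won the 2024 U.S. presidential election?",
    ["AP, Nov 2024: Donald Trump wins the 2024 U.S. presidential election."],
)
```

Without the injected snippet, a model with a pre-election cutoff has no way to answer correctly - which is exactly the Biden-reinauguration hallucination reported earlier in the thread.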
 
The knowledge cutoff of the GPT-5 model is before the election. It's important to understand this as it's one of the key details of any LLM. You can augment this by either injecting things into the system prompt (Claude has a specific note about the election for this) or searching external sources e.g. the web.

Yet Grok manages to have up to the minute information on current affairs.

I was asking it the other day why AMD stock dropped 6% after the earnings announcement, and it had current, up-to-date information.

What good is an AI that has a time-based lapse on knowledge?
 
"Write me a Haiku about Apple the computer company?"
That's a wind-up, right?
If you're trying to convince people that you will make Siri a more capable assistant, then you've got to come up with something better than that for your pitch, after all these years of missteps and complaints.
Here's hoping that integration with a frontier LLM brings us something a bit more valuable than haiku on your iPhone.
If you want to write Haiku, then write Haiku. It's an act of expression and something like that you need to do yourself or there's no point.
'nuff said.
 
I am curious why not any LLM? I would prefer Grok to be on my iPhone vs ChatGPT. Or swap in Gemini 2.5 Pro or Claude.

No thanks. I'll continue to disable Apple Intelligence on my devices, and use the Grok app instead.

Sure, if you need more antisemitic, neo-Nazi and conspiracy-theory speech on your phone…. But I rather prefer Apple to stay far away.
 