
MacRumors

macrumors bot
Original poster
Apr 12, 2001
64,156
31,882


As part of its Apple Intelligence feature set, Apple on Monday announced a partnership with OpenAI that will allow Siri to access ChatGPT directly in iOS 18, iPadOS 18, and macOS Sequoia to provide better responses in relevant situations.


In conversation with reporters after the WWDC keynote, Apple's senior VP of software engineering Craig Federighi revealed that as Apple Intelligence evolves, the company eventually wants to give its users a choice between different AI models, and suggested that Google Gemini could be an option in the future.

"We think ultimately people are going to have a preference for certain models they want to use, maybe one that's great for creative writing or one that they prefer for coding," said Federighi. "Maybe Google Gemini in the future. I mean, nothing to announce right now, but that's our direction."

Federighi said that Apple's decision to start with ChatGPT was because the company wanted to "start with the best." ‌Siri‌ will leverage GPT-4o for free, with no need to create an account. Requests are not logged and IP addresses are obscured, while ChatGPT subscribers will also be able to access paid features within the experiences.

Apple Intelligence features are not included in the first beta of iOS 18, and will instead become available for testing later this summer. iOS 18 is expected to be publicly released in September.

Article Link: Apple Likely to Add Google Gemini and Other AI Models to iOS 18
 

dugbug

macrumors 68000
Aug 23, 2008
1,887
2,020
Somewhere in Florida
So it wasn't clear to me from watching the videos: where is the line drawn between the Apple Intelligence features built by Apple and running on Apple silicon servers, and the plugins enabling ChatGPT/Gemini?

edit: From ArsTechnica answering this very question... sort of:
First up is Siri, which can tap into ChatGPT to answer voice questions. If Siri thinks ChatGPT can help answer your question, you'll get a pop-up permission box asking if you want to send your question to the chatbot. The response will come back in a window indicating that the information came from an outside source. This is the same way Siri treats a search engine (namely, Google), so how exactly Siri draws a line between ChatGPT and a search engine will be interesting. In Apple's lone example, there was a "help" intent, with the input saying to "help me plan a five-course meal" given certain ingredient limitations. That sort of ultra-specific input is something you can't do with a traditional search engine.

Siri can also send photos to ChatGPT. In Apple's example, the user snapped a picture of a wooden deck and asked Siri about decorating options. It sounds like the standard generative AI summary features will be here, too, with Apple SVP of Software Engineering Craig Federighi mentioning that "you can also ask questions about your documents, presentations, or PDFs."
 
Last edited:

kraistt

macrumors newbie
Mar 8, 2023
14
25
So it wasn't clear to me from watching the videos: where is the line drawn between the Apple Intelligence features built by Apple and running on Apple silicon servers, and the plugins enabling ChatGPT/Gemini?
I mean, the demo was pretty clear: Siri will suggest using a third-party AI to answer the question; you have to accept, and then the request is sent to OpenAI
 

Lift Bar

macrumors regular
Nov 1, 2023
213
438
So it wasn't clear to me from watching the videos: where is the line drawn between the Apple Intelligence features built by Apple and running on Apple silicon servers, and the plugins enabling ChatGPT/Gemini?
It wasn’t clear to me which AI wrote the presentation. Had a little ChatGPT flavor, maybe some Gemini thrown in. Who knows, maybe even Grok got thrown in for a real knee-slapper!
 

kraistt

macrumors newbie
Mar 8, 2023
14
25
So which features use the plugin? I assume that without enabling a third-party model there are still new Siri capabilities?
we don’t know

we know that they have their own models for A LOT of stuff


In the following overview, we will detail how two of these models — a ~3 billion parameter on-device language model, and a larger server-based language model available with Private Cloud Compute and running on Apple silicon servers — have been built and adapted to perform specialized tasks efficiently, accurately, and responsibly. These two foundation models are part of a larger family of generative models created by Apple to support users and developers.
 

Stiksi

macrumors 6502
Dec 7, 2007
391
570
Seems Apple has no problem leveraging OpenAI's completely unlicensed dataset. Noted. I knew to expect this from Microsoft, but I still had a little hope for Apple to do the right thing.

All big tech companies are trying to outrun legislation and grab as much data and IP for their models as they can before it catches up. Privacy and copyright seem to be yesteryear’s things.
 

gigapocket1

macrumors 68020
Mar 15, 2009
2,283
1,766
I can already imagine the server crashes that are about to happen on iOS 18 public release day…
I don't know how you even start planning for something at that scale
 

bluegt

macrumors 6502
Jul 3, 2015
418
428
Seems Apple has no problem leveraging OpenAI's completely unlicensed dataset. Noted. I knew to expect this from Microsoft, but I still had a little hope for Apple to do the right thing.

All big tech companies are trying to outrun legislation and grab as much data and IP for their models as they can before it catches up. Privacy and copyright seem to be yesteryear’s things.
The world has changed, my friend.

When our neighbours across the Pacific don't even have those P and C words in their vocabulary, we need to adapt to compete.
 

Lord of the Pies

macrumors regular
Sep 2, 2016
100
180
South Africa
I can already imagine the server crashes that are about to happen on iOS 18 public release day…
I don't know how you even start planning for something at that scale
Doubt it. Most of the AI features are on-device and limited to the iPhone 15 Pro series. That's a small chunk of the iPhone userbase. And OpenAI has already said on their blog that paid ChatGPT Plus users can link their account and enjoy paid benefits on their Apple devices. So they aren't giving away Plus for free.
 


Jackbequickly

macrumors 68030
Aug 6, 2022
2,717
2,778
Sounds like the user has to turn on the off-device AI feature, so you have a choice. I just want Siri to be able to do stuff inside of iOS. Not looking to find the meaning of life.
 

Fuzzball84

macrumors 68020
Apr 19, 2015
2,228
5,098
The drive to embed AI deep into a variety of OSes is at full speed. For a lot of us it's still not 100 percent clear what is doing what behind the scenes, what training data and other data sets the current iteration of AI has been "trained" on, and what mitigations are in place for security and privacy. Yes, they talk about on-device AI processing. But a lot of what we do is on device, and security remains a concern.

AI undoubtedly has huge benefits, but there is a massive rush to be first, and a patch-it-later mentality. The legal and moral ramifications of many of these technologies still haven't been fully communicated and discussed in public. The ramifications could be huge.
 

nicolas_s

macrumors regular
Nov 22, 2020
157
518
It's weird, people have been complaining for years that Apple is late in AI, that Siri is dumb and needs AI, that AI features on other devices like Samsung are so cool compared to Apple.

ChatGPT had the fastest growing user base in history, and everybody seems to love generative AI for everything including writing, photo editing or image generation.

But now that Apple has introduced these features, all of a sudden everybody hates AI 🤔
 
Last edited:

CarAnalogy

macrumors 601
Jun 9, 2021
4,414
8,081
I commented on this yesterday. ChatGPT does in fact use Bing (it says so when you ask it to look something up), so this is kind of like changing the default search engine.

I wonder how big a check Google is going to write to get Gemini on there before the end of the year.
 

Lord of the Pies

macrumors regular
Sep 2, 2016
100
180
South Africa
What is going on, truly? How are the checks and balances supposed to work between Apple and third parties?
ChatGPT Enterprise users' chats aren't used for training, and there are some other minor privacy features compared to regular (free and paid) users. It seems Apple has those benefits for all of its users.

There's really nothing else they can do. Gemini does about the same, and Apple has a good relationship with Google, so I'd be surprised if the choice between ChatGPT and Gemini took long to arrive. I really hope Claude is also integrated, but that seems more problematic.
 

Etc_

macrumors newbie
Sep 10, 2022
29
102
The drive to embed AI deep into a variety of OSes is at full speed. For a lot of us it's still not 100 percent clear what is doing what behind the scenes, what training data and other data sets the current iteration of AI has been "trained" on, and what mitigations are in place for security and privacy. Yes, they talk about on-device AI processing. But a lot of what we do is on device, and security remains a concern.

AI undoubtedly has huge benefits, but there is a massive rush to be first, and a patch-it-later mentality. The legal and moral ramifications of many of these technologies still haven't been fully communicated and discussed in public. The ramifications could be huge.
Apple made a smart move during their presentation, in my opinion. They showcased all the new systems without the AI stuff first (never even mentioned it once) and only then the AI stuff. To me that suggests they still develop their systems without deep AI integration in mind and just put the whole AI thing on top, while maintaining the impression that it is deeply embedded. I bet you can disable it entirely and never notice a thing.

Additionally, Google pays Apple a substantial amount of money to remain the default search engine on iPhones. Naturally, Google wants this "arrangement" to extend to AI features as well.
 

swingerofbirch

macrumors 68040
There are so many threads I don't know where to post my questions so I guess I will here:

1) If Apple Intelligence needs to at times revert to cloud-based computing power, why is it limited to certain devices based on their local processors? If you mix local and cloud, why couldn't any device just use the cloud services for all of it if doesn't have enough computing power?

2) Did they specify which parts are being offloaded to Apple's servers versus on device?

3) The only parts of this that seem computationally demanding to me are the writing tools and image generation, which seem like a completely different feature set than the other new Apple Intelligence features, which are essentially an expanded "Siri Suggestions." If they could have "Siri Suggestions" features on models going back a long ways, why not now? Will "Siri Suggestions" continue to exist? Will they exist as they do now with no upgrades for users below iPhone 15 Pro?

4) And I think I already know the answer to this, but Writing Tools is on Apple's servers, whereas anything creating new information from scratch goes to ChatGPT? It seems like Writing Tools is the closest Apple is getting to encroaching on ChatGPT's territory (well also with image creation which competes with DALL-E which OpenAI owns), and I can only imagine they want to continue taking ownership of that themselves, unless they get some super lucrative deal with OpenAI like they did with Google Search, but it doesn't seem likely given that OpenAI is not ads based.

What's frustrating for anyone below iPhone 15 Pro is that Apple has removed Siri features, like using Siri to search for photos, that they are now re-adding. I had hoped for Siri improvements that would make it functional at what it was originally advertised to do, which it so often fails at or has regressed from altogether (like removing the ability to search photos by parameters such as videos I took in June), but it now looks like you have to upgrade your device for any improvement, unless there are general Siri upgrades for all models they haven't announced. The announcements were so scattershot and noncohesive that maybe describing general Siri improvements would have made things more confusing. This had a bit of the feel of their big Services keynote, which felt similarly clunky. Too much business strategy and not enough about the end result.
 