The OpenAI connection is opt-in. If you don't want it, you don't have to use it. No data is sent without your consent.
And we all know that when you don't pay, you are the product… so… yeah.
OpenAI are definitely getting something in return. Just, perhaps not in cash.
Explains the stock surge.
No money now but when revenue is generated Apple will take a chunk.
Are we the product now?
Yes, they will run on-device AI for the simpler tasks, and other, bigger tasks you can send to OpenAI.
I don't quite understand how Apple's system is set up.
So, does this mean that Apple have developed their own genAI framework and trained an LLM in 18 months? Or have OpenAI provided the ChatGPT framework to Apple, which they then run on Apple servers and train with their own LLMs? Is this going to be like Maps, where the Apple servers/LLMs start out quite limited (hence "world data" questions are sent to ChatGPT), but over time Apple will add more servers and train the LLMs with wider data sets so they can eventually operate an end-to-end Apple AI service?
- There is on-device processing for most tasks
- There is cloud processing for some heavy tasks on Apple-owned & operated servers
- Some requests on "world data" are optionally sent to ChatGPT (in a way that OpenAI can't build up a picture of your requests over time)
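For anyone who thinks better in code, here is a minimal sketch of the three-tier flow described in the bullets above. All of the type and function names are made up for illustration; this is not Apple's actual API, just the routing logic as these posts describe it.

```swift
import Foundation

// Hypothetical request categories, loosely mirroring the three tiers
// described above (on-device, Apple's Private Cloud, optional ChatGPT).
enum RequestKind {
    case simple          // e.g. summarise a notification
    case heavy           // e.g. a long-document rewrite
    case worldKnowledge  // e.g. "who won the 1986 World Cup?"
}

enum Destination {
    case onDevice
    case privateCloudCompute
    case chatGPT
    case declined
}

// Illustrative router: the names and the consent callback are assumptions,
// not Apple's real API. It only encodes the flow described in the post.
func route(_ kind: RequestKind, userConsentsToChatGPT: () -> Bool) -> Destination {
    switch kind {
    case .simple:
        return .onDevice
    case .heavy:
        return .privateCloudCompute
    case .worldKnowledge:
        // The hand-off is opt-in per request; nothing goes to OpenAI
        // unless the user explicitly agrees.
        return userConsentsToChatGPT() ? .chatGPT : .declined
    }
}

// Example: a world-knowledge question where the user taps "Allow".
let destination = route(.worldKnowledge, userConsentsToChatGPT: { true })
print(destination)  // chatGPT
```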
In your dreams. Of course they are getting data; why do you think you have to consent for the query to be addressed by ChatGPT? Clearly, whichever privacy arrangement is in place for 'Apple Intelligence' does not apply when you choose to use ChatGPT.
They're not getting any.
It's the other way around... like Google paying Apple to be the default search engine.
Yes, they have a smaller 3B model running on device and a larger 'GPT-4 Turbo' rival running on the Private Cloud.
I don't quite understand how Apple's system is set up.
So, does this mean that Apple have developed their own genAI framework and trained an LLM in 18 months? Or have OpenAI provided the ChatGPT framework to Apple, which they then run on Apple servers and train with their own LLMs? Is this going to be like Maps, where the Apple servers/LLMs start out quite limited (hence "world data" questions are sent to ChatGPT), but over time Apple will add more servers and train the LLMs with wider data sets so they can eventually operate an end-to-end Apple AI service?
- There is on-device processing for most tasks
- There is cloud processing for some heavy tasks on Apple-owned & operated servers
- Some requests on "world data" are optionally sent to ChatGPT (in a way that OpenAI can't build up a picture of your requests over time)
To put it more accurately, we are the leverage. Apple has aggregated the best customers in the world, and the beauty of this is that they are not afraid to use this access to their user base as leverage in discussions with developers and suppliers.
Of course Apple is not paying money.
It's paying with their users, we're the currency.
You are comparing Apples to Oranges. MS's reasons for working with OpenAI are long-term; Apple's reasons are short-term:
There's the difference between Apple and Microsoft.
Apple acts like “We aren’t paying you! You need us. You should thank us!”
Microsoft is “Here’s $10 billion and you can use our servers”
Someone think of the AAPL shareholders!
To put it more accurately, we are the leverage. Apple has aggregated the best customers in the world, and the beauty of this is that they are not afraid to use this access to their user base as leverage in discussions with developers and suppliers.
Here's what I think is happening here.
1) Apple clearly has little desire to come up with their own ChatGPT / Gemini competitor, and why should they? It costs a ton of money to train and develop, and it's not something Apple has a competitive advantage in or can monetise directly. What Apple does have is a massive user base that it can offer OpenAI access to, and on Apple's terms.
2) At the same time, I would rather not see Apple partner with Google on this, because I am getting flashbacks to Google Maps all over again. Instead, what we get is ChatGPT integrated with iOS at a system level, which should be able to fight off a Gemini-integrated Android OS on a features level.
3) From OpenAI's perspective, they may also not have much choice in this matter. Google already has Gemini, and if they don't take Apple up on this deal, Apple may well find another LLM provider willing to partner with them, even if that alternative is not as good as theirs. Despite not getting any user data from Apple, OpenAI can at least get access to Apple's user base and maybe sell more ChatGPT subscriptions that way (and Apple gets a 30/15% cut out of this as well, presumably).
If my understanding is correct, this encapsulates what I find most impressive about Apple (which I think many people still either don't get, or love to mischaracterise as them exhibiting monopolistic behaviour). Apple, better than anyone else, understands its position as an aggregator in the value chain, and does not hesitate to bring that position to bear on companies and developers in order to serve its interests (and by extension, the interests of its user base as well) on its terms.
For a company which, until recently, was criticised as being way behind in the AI race, I think that Apple is once again right on time and right where they need to be. They have so far avoided a Microsoft-level controversy, and I believe their promise of privacy and security will go down well with their users (and with consumers in general), making Apple well-positioned to "do AI right".
If any of you wonder why I remain such an Apple fanboy, this is the reason why. Apple's ability to execute remains as impressive as ever. As to whether they do actually deliver on this promise, well, we will just have to wait and see. But so far, kudos to Apple. 😊
Yes, Apple has two LLMs: one for on-device processing and one for the server side. Contrary to what a lot of people are saying, those models are quite good. The local one is state of the art, and the server one is not that far behind GPT-4.
I don't quite understand how Apple's system is set up.
So, does this mean that Apple have developed their own genAI framework and trained an LLM in 18 months? Or have OpenAI provided the ChatGPT framework to Apple, which they then run on Apple servers and train with their own LLMs? Is this going to be like Maps, where the Apple servers/LLMs start out quite limited (hence "world data" questions are sent to ChatGPT), but over time Apple will add more servers and train the LLMs with wider data sets so they can eventually operate an end-to-end Apple AI service?
- There is on-device processing for most tasks
- There is cloud processing for some heavy tasks on Apple-owned & operated servers
- Some requests on "world data" are optionally sent to ChatGPT (in a way that OpenAI can't build up a picture of your requests over time)
However, Gurman notes that OpenAI could profit from the deal by encouraging Apple users to subscribe to ChatGPT Plus for $20 a month. If these users sign up through an Apple device, Apple may also be in a position to receive a commission.
Article Link: Apple Reportedly Not Paying OpenAI to Use ChatGPT in iOS 18
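To put rough numbers on the commission scenario in the excerpt above: assuming Apple's standard App Store subscription terms (30% in year one, 15% thereafter) were to apply, which the report does not confirm, the split on a $20/month ChatGPT Plus subscription would look like the sketch below. The rates and the function names are assumptions; this is just the arithmetic, not anything from the article.

```swift
import Foundation

// Hypothetical revenue split on a $20/month ChatGPT Plus subscription,
// assuming Apple's standard 30% / 15% subscription commission applies.
let monthlyPrice = 20.0

func split(price: Double, appleRate: Double) -> (apple: Double, openAI: Double) {
    let apple = price * appleRate
    return (apple, price - apple)
}

let firstYear = split(price: monthlyPrice, appleRate: 0.30)     // (apple: 6.0, openAI: 14.0)
let afterYearOne = split(price: monthlyPrice, appleRate: 0.15)  // (apple: 3.0, openAI: 17.0)
print(firstYear, afterYearOne)
```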
And we all know that when you don't pay, you are the product… so… yeah.
OpenAI are definitely getting something in return. Just, perhaps not in cash.
And their data to train on.
Do you have a source for that? I am not trying to argue, just curious where they say this.
OpenAI isn't allowed by Apple to use the data to train their models.
That's a little bit cryptic. They won't store the data, but they could still use it? And if I log in with my OpenAI account, all the rules won't apply?
Privacy protections are built in for users who access ChatGPT: their IP addresses are obscured, and OpenAI won't store requests. ChatGPT's data-use policies apply for users who choose to connect their account.
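Reading that statement literally, the two modes could be modelled something like the sketch below. Everything here (type names, fields, the nil-IP convention) is assumed for illustration and is not how Apple or OpenAI actually implement the relay; it just restates the policy: anonymous requests with obscured IPs and no retention by default, and OpenAI's normal data-use policy only when you choose to connect your account.

```swift
import Foundation

// Illustrative model of the two modes described above; all names are assumptions.
struct OutgoingChatGPTRequest {
    let prompt: String
    let clientIP: String?      // nil: the relay has obscured the caller's address
    let accountToken: String?  // nil: anonymous, so requests can't be tied together
    let mayBeStored: Bool      // whether OpenAI's normal data-use policy applies
}

func buildRequest(prompt: String, linkedAccountToken: String?) -> OutgoingChatGPTRequest {
    if let token = linkedAccountToken {
        // User chose to sign in with their own ChatGPT account:
        // OpenAI's standard data-use policies apply to these requests.
        return OutgoingChatGPTRequest(prompt: prompt, clientIP: nil,
                                      accountToken: token, mayBeStored: true)
    }
    // Default, anonymous path: IP obscured, no account, request not retained.
    return OutgoingChatGPTRequest(prompt: prompt, clientIP: nil,
                                  accountToken: nil, mayBeStored: false)
}

let anonymous = buildRequest(prompt: "Plan a five-course dinner", linkedAccountToken: nil)
let linked = buildRequest(prompt: "Plan a five-course dinner", linkedAccountToken: "user-token")
print(anonymous.mayBeStored, linked.mayBeStored)  // false true
```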