I disagree. LLMs are getting dramatically smarter each year. There is no evidence as yet that LLM researchers are failing to maintain the hygiene of their training data.

There is a whole separate discussion of course that as the years go by and the majority of content is either generated by LLMs - or quality-checked by LLMs at the very least - how that will impact the training data being used. It's not about accuracy, which is easier to control, but to what extent we want training data to self-reinforce.

If neural networks can inadvertently develop behaviour remarkably reminiscent of human intelligence and reasoning, purely by sampling text excerpts of human activity, it suggests that we do not understand human intelligence.

Perhaps the core of human cognition fundamentally operates on similar principles under the hood - a widely distributed self-learning system that progressively distils the statistical patterns of our experiences into higher-level models of abstraction, reasoning and generalised intelligence.
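
To make the "statistical distillation" idea concrete, here's a toy sketch in Python (nothing remotely like a real transformer, just a bigram counter over a made-up corpus):

    from collections import Counter, defaultdict

    # Count how often each word follows each other word in a tiny corpus.
    corpus = "the cat sat on the mat the cat ate".split()
    follows = defaultdict(Counter)
    for prev, nxt in zip(corpus, corpus[1:]):
        follows[prev][nxt] += 1

    # "Prediction" is just picking the most common continuation seen so far.
    def predict(word):
        return follows[word].most_common(1)[0][0] if follows[word] else None

    print(predict("the"))  # -> "cat" (seen twice after "the", vs "mat" once)

Scale the corpus up to most of the internet and the statistics start to encode grammar, facts and, arguably, something that looks like reasoning.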



ChatGPT is free to use though. The free tier of service is what is being baked into Apple's devices, with the same privacy afforded to enterprise customers. If you want the full-fat GPT-4o model (with its higher context size and higher usage limits) you still need to pay $20/mo for ChatGPT Plus and link it to your Apple Account.
I’m not an expert who does this for a living, but the difference between training LLMs on labeled versus unlabeled data sets seems clearly defined and documented in the relevant literature.
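
For what it's worth, a rough sketch of that distinction, with made-up examples rather than anything from a real dataset:

    # Unlabeled (self-supervised pretraining): raw text; the only "label" is the next token.
    unlabeled = [
        "The Golden Gate Bridge opened in 1937.",
        "Photosynthesis converts light into chemical energy.",
    ]

    # Labeled (supervised fine-tuning): each input is paired with a human-written target.
    labeled = [
        {"prompt": "Summarize: The Golden Gate Bridge opened in 1937.", "target": "It opened in 1937."},
        {"prompt": "Is this review positive? 'Loved it.'", "target": "positive"},
    ]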
 
Let's assume OpenAI remains king of AI. If in the future there's a dispute and the partnership ends, which of the two companies stands to lose out the most?
 
Why does Apple need their own LLM? They never made a search engine and have done just fine.

I think you are underestimating the importance of AI. AI is not just for searching the web; AI can, or will be able to, do almost anything! If you cannot picture what that means, watch some science fiction. Apple needs an AI that works locally on the iPhone and is specialized for Apple's goals and priorities. It can reason and make important decisions to a certain extent, and it will only improve over time. AI driving a car is a good example: it is actually making life-and-death decisions, with not only your life but others' lives at stake! AI is not limited to web searches or driving cars; it can potentially do almost anything!
 
The next sentence says: "Users can also choose to connect their ChatGPT account, which means their data preferences will apply under ChatGPT’s policies." So as soon as I log in, those protections no longer apply?

Yes, except the IP address will still be obscured, but that doesn't really matter much, since OpenAI already knows who you are.

I know business customers can turn off training for their data, and I believe that if you turn off chat history in ChatGPT it won't use it for training. But it's up to OpenAI.
 
Could be a step towards some Apple subscription for AI, or it gets added into one of the existing subscriptions Apple has. And on OpenAI's end, this could be a play at becoming the name brand for AI, just as Google is the name for search engines. Eventually OpenAI capitalizes, but for now they are fighting to become the one name you know for AI.
 
From what I understand, ChatGPT requests within Siri and writing tools are not going to be stored. What about requests outside Siri and writing tools? Will they be stored?

These are, as announced, the only two ways in which Apple will send data to ChatGPT.

All other requests will come from you using the ChatGPT app, their website, or some third-party app you use that integrates with ChatGPT.

Those will follow OpenAI's policy, which means they'll be stored and used for training by default, I believe. What you can do to stop it from storing and using your data for training depends on whether you use it as a consumer, business, or API customer, paid or unpaid, etc.
 
Just curious what you mean by this. When you say “the best” do you mean people with more disposable income? People with better taste, of a higher class? Just trying to understand your definition of “the best”.

He probably means people willing to spend money, which is easier to do when you have more disposable income.
 
Let's assume OpenAI remains king of AI. If in the future there's a dispute and the partnership ends, which of the two companies stands to lose out the most?

If OpenAI didn’t need Apple to become the current king, why would they need Apple in the future to remain king? There is already a ChatGPT app on the Apple App Store; they don’t need Apple to be on the platform. (Did Microsoft Word die when Pages showed up? No.) OpenAI’s greater threat is bumbling their own upgrades over time.


If Apple could not find a deal with anybody, that would be bad. This “escape hatch”, a punt for requests beyond the limits of Apple Intelligence, could work just like the optional search engine setting in Safari/Spotlight. Apple doesn’t really need a deal with just one vendor; in fact, it is likely better for Apple to have competing vendors for that slot to keep the costs down.

There are some open-source models brewing. If those catch up to and pass OpenAI, then Apple being boat-anchored to OpenAI would be a problem.
 
What are the benefits of paying for ChatGPT Plus? Also, I wonder if there are any privacy implications in this deal. 🤔

The free tier became so much better when they released the latest version, which made the Plus subscription less valuable.

I would say faster answers, especially during busy times, and practically no limit on requests per day are the most important ones.

For just exploring and learning how and when to use ChatGPT, the free one is more than good enough.
 
I still want a third-party AI tool selection of "None" to be available.

Siri would still have to pop up a dialog box, an “I’m sorry Dave, I can’t do that”; just answer no (even if there's nowhere else to send the request). In the demos, every single hand-off out of Siri is supposed to pop up a dialog box. It isn't like you hit a button once and forever after everything gets shoveled off to ChatGPT promptless.


And Apple Intelligence is opt-in. You presumably get some classic Siri abilities by default, or maybe nothing. The current Siri can already be avoided.
 
Free promotion?

Apple loves getting money from Google but not giving OpenAI any?
It is likely the same arrangement as with Google: you become a preferred provider we send a large amount of traffic to, under an agreement where you share service revenue with us.

It's interesting, though, in that right now OpenAI's obvious revenue is subscriptions and corporate/API access. This could be an order-of-magnitude increase in ChatGPT traffic; they need to figure out how to monetize that.
 
I'm speculating, but maybe ChatGPT is making out by using Apple to train its ecosystem.
 
I don't quite understand how Apple's system is set up.

  • There is on-device processing for most tasks
  • There is cloud processing for some heavy tasks on Apple-owned & operated servers
  • Some requests on "world data" are optionally sent to ChatGPT (in a way that OpenAI can't build up a picture of your requests over time)
So, does this mean that Apple has developed its own genAI framework and trained an LLM in 18 months? Or has OpenAI provided the ChatGPT framework to Apple, which Apple then runs on its own servers and trains on its own data? Is this going to be like Maps, where the Apple servers/LLMs start out quite limited (hence "world data" questions are sent to ChatGPT), but over time Apple will add more servers and train the LLMs with wider data sets so they can eventually operate an end-to-end Apple AI service?

Yes, Apple has stated they have their own models on device and on their servers (which are Apple Silicon based). When it is determined that a request/query falls outside of Apple’s models’ scope, the user has the option of sending the request to a third-party model - currently only ChatGPT.
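
Purely as an illustration of that routing, something along these lines (the heuristics and names are my guesses, not Apple's actual implementation):

    def handle_request(query: str, chatgpt_enabled: bool, user_approves_handoff: bool) -> str:
        # Crude stand-ins for the real routing decisions (assumptions, not Apple's logic).
        needs_world_knowledge = "?" in query and "my" not in query.lower()
        needs_big_model = len(query.split()) > 50

        if not needs_world_knowledge and not needs_big_model:
            return "[answered by the on-device model]"
        if not needs_world_knowledge:
            return "[answered on Private Cloud Compute, Apple's own servers]"
        if chatgpt_enabled and user_approves_handoff:
            return "[handed off to ChatGPT, one approved request at a time]"
        return "Sorry, I can't help with that."

    print(handle_request("What year did the Roman Empire fall?", True, True))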

Apple did not specify if they plan to expand their own models, but I don’t see why they wouldn’t or couldn’t eventually. They did however mention that they are really only interested in areas where they can provide the biggest benefit to their users, and that’s within personal requests/tasks and Apple’s ecosystem.
 
AI driving a car is a good example: it is actually making life-and-death decisions, with not only your life but others' lives at stake! AI is not limited to web searches or driving cars; it can potentially do almost anything!

And you find this reassuring?! Or a good thing?! It has hardly proven to be either, so far.
 
I'm speculating, but maybe ChatGPT is making out by using Apple to train its ecosystem.

ChatGPT gets exposure and the possibility of growing their subscription base. The deal may have been that Apple doesn’t take a cut of ChatGPT’s subscription fees.
 
Doesn't that mean they are definitely training the chatbot on our queries? I mean, OpenAI is clearly paying for the server costs of handling what is probably going to be a LOT of queries from around the world. The only thing they could really gain is training data, which they are reported to be running out of. Wonder if Apple or OpenAI will explain what each gains from it, because I definitely thought OpenAI was paying them to be on Apple devices.
 
Apple did not specify if they plan to expand their own models, but I don’t see why they wouldn’t or couldn’t eventually. They did however mention that they are really only interested in areas where they can provide the biggest benefit to their users, and that’s within personal requests/tasks and Apple’s ecosystem.

Making models "as big as possible" pushes them toward servers doing the inference work. With reasonable sized models local on devices. Who pays for the electricity and HVAC to run that model? The users ( Apple's cost approximately $0.00) . The hardware costs for running that model? The users ( Apple's cost approximately $0.00 ). Sever building mortgage costs ... $0.00. Eventually, 1 Billion users and operational services overhead costs $0.00. . Apple doesn't want that? They want to pay more money out of their own pocket for that?

Incrementally bigger as base-level RAM capacity across systems goes up? Sure. But the notion that Apple is going to aggressively expand for expansion's sake (prioritize a bigger resource footprint) is probably off.

The Private Cloud Compute models are likely just bigger than what Apple's base RAM configurations can hold, not huge for its own sake. Pretty good chance a maxed-out Ultra SoC wouldn't touch those servers unless the RAM footprint of other local workloads is putting it under pressure. A factor there is Apple looking for margins on entry configurations along the RAM-capacity dimension.



If Apple's model is always on by default, then there would be less room for competing models in the local RAM pool. (It is opt-in, but folks may flip it on and then forget to turn it off...)
 
Doesn't that mean they are definitely training the chatbot on our queries? I mean, OpenAI is clearly paying for the server costs of handling what is probably going to be a LOT of queries from around the world. The only thing they could really gain is training data, which they are reported to be running out of.

There are two general piles of 'training' data. One is human-composed text/images/etc.; the way Apple has outlined this, OpenAI is not getting that per the agreement. The second is a 'correctness' assignment to a piece of text or a response from the system (did this result prove useful? Yes/No). To a large extent that really isn't user data being captured (OpenAI is posing the 'Yes' and 'No'; the user is just clicking).

That isn't necessarily directly training data. A response that said "The USA presidential election was rigged" would get different answers from different people. "Was the end result useful?" is, in more than a few cases, going to loop in biases rather than helpful correctness/categorical assignments. It's more about getting data on whether users are happy (or not) with what the results say.
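
To make that second pile concrete, a feedback record probably looks something like this (a made-up example, not OpenAI's actual schema):

    # Hypothetical feedback record: the user only contributes the thumbs-up/down click,
    # and that click captures preference/bias as much as it captures correctness.
    feedback_record = {
        "prompt_id": "abc123",            # made-up identifier
        "response_id": "def456",
        "user_signal": "thumbs_down",     # the only thing the user actually provided
        "timestamp": "2024-06-12T10:31:00Z",
    }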

Wonder if Apple or OpenAI will explain what each gains from it, because I definitely thought OpenAI was paying them to be on Apple devices.

Probably not. Because if Apple wrangled a better deal than other folks, then they don't want to tell those other folks they got it. [The US government usually demands that it get at least the best deal that you gave anyone else. So if Apple gets it for free, Uncle Sam has to get it for free also.]
 