
JulienCasey
macrumors newbie · Original poster · Sep 24, 2025 · California

Key points of the video:

  1. Privacy
  2. Ecosystem
  3. Upcoming products
Another thing the video didn't talk about: Apple uses TSMC to manufacture its chips, which include Neural Engines and are also used in AI servers. TSMC fabricates some of the smallest and most efficient chips in the world.
 
The future of devices is not the smartphone, so they likely will not. OpenAI is working with Jony Ive, and I personally consider this a major threat to Apple. Microsoft's CEO has also said recently that it keeps him up at night wondering whether Microsoft will still be relevant in x years.

I for one have owned every single iPhone since the first one. My last upgrade was the iPhone 15 Pro; I can't see a breakthrough in Apple's design or features anymore. The iPhone Air is nice. I will consider switching to Android when the new Pixel Fold comes out, using the Microsoft Launcher as the skin.
 
But Apple Intelligence can still be part of the future of devices.
 
Yes, they do. Large LLMs will not be sustainable because they will absorb so much slop that their results will be terrible. And their automation is something a smaller, local LLM could do, maybe even better.
The future will, in my mind, be small, personalised LLMs and AI tools that can communicate with other LLMs to provide the information needed.

For example, you ask Apple Intelligence for a recipe. Its own model will not have it yet, so Apple Intelligence will ask another LLM that does contain the recipe and then provide it to you.
We will not have large, monolithic LLMs like ChatGPT, but rather smaller, specialised LLMs with specific information. Maybe with a subscription model, so that they can provide the requested information.

Maybe you're doing research for school, but need information about a certain part of history. A library with a specialised LLM will have that information, but it is not freely available. You can add that subscription to Apple Intelligence, and Apple Intelligence will be able to access that information and temporarily store that information for research purposes.
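
To make the routing idea concrete, here is a minimal, purely hypothetical sketch in Python: a small local model answers what it can, and otherwise forwards the query to a specialised provider, but only if you hold a subscription to it. None of the class or provider names here (LocalAssistant, SpecialisedLLM, RecipeLLM, HistoryArchiveLLM) correspond to any real Apple or third-party API; they are made up for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class SpecialisedLLM:
    """A hypothetical specialised model, e.g. a recipe or history provider."""
    name: str
    topics: set[str]
    requires_subscription: bool = False

    def answer(self, query: str) -> str:
        # Stand-in for a real model call.
        return f"[{self.name}] answer to: {query!r}"

@dataclass
class LocalAssistant:
    """A small on-device model that routes queries it cannot answer itself."""
    known_topics: set[str]
    providers: list[SpecialisedLLM] = field(default_factory=list)
    subscriptions: set[str] = field(default_factory=set)

    def answer(self, query: str, topic: str) -> str:
        # 1. Answer locally if the topic is covered by the on-device model.
        if topic in self.known_topics:
            return f"[local] answer to: {query!r}"
        # 2. Otherwise look for a specialised provider covering the topic.
        for provider in self.providers:
            if topic in provider.topics:
                if provider.requires_subscription and provider.name not in self.subscriptions:
                    return f"{provider.name} covers this, but you need a subscription."
                return provider.answer(query)
        return "No local or remote source for this topic."

# Usage: a free recipe provider and a subscription-only history archive.
assistant = LocalAssistant(
    known_topics={"calendar", "mail"},
    providers=[
        SpecialisedLLM("RecipeLLM", {"recipes"}),
        SpecialisedLLM("HistoryArchiveLLM", {"history"}, requires_subscription=True),
    ],
    subscriptions={"HistoryArchiveLLM"},
)
print(assistant.answer("How do I make carbonara?", topic="recipes"))
print(assistant.answer("What caused the fall of Rome?", topic="history"))
```

The point is only the routing decision: local first, then a specialised source, gated by whether you have paid for access to it.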

I do not believe in the LLMs we have now, except maybe Gemini, because Google can use its search engine as a source. ChatGPT will fade because it is only a black box with nothing to refer to. Its use case is small, and Apple Intelligence will be able to do everything else in the future.
 
If Apple does eventually win, it will be because all of its competitors have run out of money or been acquired by larger companies. These are marathons, not short sprints, and who makes a splashy entrance early on isn't nearly as important as who has the stamina to be left standing when the dust has settled.
 