
MacRumors

Apple at WWDC 2024 will reveal a turbo-charged version of Siri powered by large language models (LLMs) that will debut in iOS 18, but some new cutting-edge generative AI features could be exclusive to iPhone 16 models, according to a new rumor.


Last month, Bloomberg reporter Mark Gurman revealed that Apple is developing a large array of features that use generative AI, including a "smarter version of Siri" and new LLM-based AI features that will be baked into iOS 18 and iPadOS 18.

Gurman said Apple was still debating whether to limit generative AI to on-device processing, deploy it via the cloud, or adopt a hybrid approach combining the two. He did not say whether some AI features would require specific hardware, or whether they would trickle down to all models capable of running iOS 18.

However, according to new information independently shared by the leaker @Tech_Reve, iOS 18 will bring the company's new LLM to millions of existing devices by using cloud-based AI, while new on-device AI features will likely remain exclusive to the iPhone 16.

In terms of iOS 18 features, Gurman's sources mention a revamped interaction between ‌Siri‌ and the Messages app, enabling Siri to field complex questions and auto-complete sentences more effectively. We may also see auto-generated Apple Music playlists and integration with productivity apps like Pages and Keynote, such as AI-assisted writing and slide deck creation. Where this patina of AI integration crosses over into new hardware requirements is still unknown.

Apple is designing new A-series chips for the iPhone 16 lineup, built on TSMC's latest N3E 3-nanometer node. Efficiency and performance improvements are of course expected, but there could be other benefits that feed into Apple's AI intentions. Notably, TSMC is the sole manufacturer for Nvidia's powerful H100 and A100 AI processors, the hardware that powers AI tools like ChatGPT and which also comprises the majority of AI data centers.

All models in the iPhone 16 series are also rumored to have an extra button that we don't know the purpose of yet. Internal versions of the iPhone 16 that Apple is working on include an extra capacitive button, known internally as the "Capture Button."

The button is located on the same side as the Power button, and is a capacitive button that is able to detect pressure and touch, providing haptic feedback when pressed. There has been no word yet on what this button might be used for, but it could conceivably have unforeseen practical AI applications.

Apple is said to be on course to spend $1 billion per year on AI research, with some of the company's biggest executive names overseeing development, including senior vice president of software engineering Craig Federighi, senior vice president of machine learning and AI strategy John Giannandrea, and senior vice president of services Eddy Cue.

Article Link: iPhone 16 Likely to Get Exclusive AI Features in iOS 18
 
If they do the cloud approach, I guarantee they will have some sort of “Siri+” subscription to access it.

You might be on to something, just like Hide My Email and Private Relay are iCloud+ exclusives. They can even spin it like: "If you truly only care about privacy, you can stick to basic Siri, but if you want advanced Siri functionality, you can use Siri in the Cloud, and for only X euros per month, your data will stay safe in the cloud!"
 
Of no interest to me personally. I don't recall the last time I used Siri, and I only use Alexa to turn lights on and off or to start playing music. Something about having to think of what to even say is too much work for me. I'd rather press a button somewhere.
In our family home our light switches on the wall have had close to 50 years of uptime. It's really a great system. The idea of buying more expensive bulbs with less reliability doesn't appeal to me.
 
Getting really annoying to hear about the amazing neural engine in every chip every year, then the next year be told that only this year’s incredible neural engine can actually run Siri. WTF has that neural engine been doing for years?
 
Of no interest to me personally. I don't recall the last time I used Siri, and I only use Alexa to turn lights on and off or to start playing music. Something about having to think of what to even say is too much work for me. I'd rather press a button somewhere.

This is mostly the fault of Siri being awful. We’ve basically been trained like dogs with shock collars to not even try. If Siri could take verbal input as flexibly as GPT-4, it would be so much easier.
 
Siri getting smarter? Don't hold your breath for this one.

Siri still to this day gives me cafés on the other side of the Earth when I'm looking for one nearby, and it cannot answer basic questions such as "How far is Portland, Oregon from Oregon City, Oregon?" (I'm not in the US, but that shouldn't matter for this question.)
 
There is nothing that signifies the complete lack of innovation in the iPhone line more than adding a new button each year.

The big selling point of the first iPhone was having a big screen and no buttons, and now they want to make a new button the star of the show.

I have the iPhone 14 Pro, and the UI for the new Action Button is just a joke. Could they be more grandiose about a fricking button?
 
I get much better information from ChatGPT than I do from Google these days, and I can ask it for information without limiting myself to keywords. If Apple's AI were like that, but more integrated with device functions as well, that would be great. Unfortunately, Siri has been, and likely will remain, the weak link: if it can't understand what I'm saying, it all becomes useless.
 
Getting really annoying to hear about the amazing neural engine in every chip every year, then the next year be told that only this year’s incredible neural engine can actually run Siri. WTF has that neural engine been doing for years?

You poor fool you must only have the octacore neural engine. This super advanced feature requires the 24 core neural engine pro max ultra
 
There is nothing that signifies the complete lack of innovation in the iPhone line more than adding a new button each year.

The big selling point of the first iPhone was having a big screen and no buttons, and now they want to make a new button the star of the show.
A rumour at this point.
 