
MacRumors

macrumors bot
Original poster
Apr 12, 2001
Apple is not expected to introduce its most significant Apple Intelligence features in September when ‌iOS 18‌ sees a public release. Instead, many will come alongside a Siri overhaul in a future ‌iOS 18‌ update that's set to be introduced in 2025.

[Image: apple-intelligence.jpg]

That doesn't mean Siri will not see any improvements when iOS 18 initially launches. According to Bloomberg's Mark Gurman, Siri in iOS 18 will still gain some "new bells and whistles" come September: a new interface that casts a glowing light around the edge of the screen, richer natural language capabilities that let Siri understand you even if you stumble over your words, deeper knowledge of Apple products to power on-device support, and a Type to Siri option for entering queries via text.

But according to Gurman, we will have to wait until next year to see Apple's most significant enhancements to Siri. An iOS 18 update in 2025 should bring with it the following improvements to Apple's virtual assistant:
  • Personal Context: Siri will have the ability to draw from your photos, calendar events, Messages, and other apps to inform responses to queries. Apple gave an example of a person asking when their mom's flight is landing, which Siri figured out based on recent text conversations and emails.
  • Semantic indexing: Apple Intelligence creates an on-device semantic index to store data retrieved from your emails, images, websites you visit, and apps you use.
  • App Control: ‌Siri will be able to‌ control all individual features in apps for the first time, expanding the range of functions the personal assistant can perform. ‌Siri‌ will be able to do things like open specific documents, move a note from one folder to another, delete an email, summarize an article, email a web link, and open a particular news site in Apple News.
  • On-screen awareness: With onscreen awareness, Siri will be able to understand and take action with users' content in more apps over time. For example, if a friend texts a user their new address in Messages, the receiver can say, "Add this address to his contact card."
Like these Siri improvements, Apple Intelligence features will initially support only American English, and support for additional languages and regions won't arrive until 2025.

AI Updates Coming Later in 2024

Gurman notes that some of the Mail app's new AI features won't be ready until "later in 2024." These include a redesign that can group emails into categories like newsletters, announcements, and shopping. Swift Assist, a new programming companion for Xcode that uses cloud-based AI models, is similarly not expected to arrive until later in the year. Siri integration with ChatGPT may also miss the initial release, though Gurman thinks it should be ready later in 2024, based on language Apple has used in its marketing materials.

Going by past release timelines, these features are likely to arrive in an iOS 18.1 or iOS 18.2 update sometime between October and December.

Despite the missing enhancements, the initial version of Apple Intelligence will still offer a plethora of features, according to Gurman. It will leverage AI to prioritize notifications and provide quick recaps of your alerts and text messages. The software will also have the capability to summarize webpages, voice memos, meeting notes, and emails. Additionally, it will introduce new writing tools, image generation, and custom AI-generated emojis called Genmoji.

Article Link: iOS 18: These Apple Intelligence Features Won't Be Ready Until 2025
 
So AI is iOS-only? (Yes, I know it's for iOS and for macOS, but it's very telling that the whole article forgets about the Mac — not a single mention of the Mac, while there's plenty about iOS. And then people criticize us Mac users for complaining "for no reason" when we say that Apple sidelines the Mac on purpose.)
 
It's not a matter of lazy software development. I'm sure Apple realized that all these AI enhancements and advancements perform poorly even on their state-of-the-art iPhone 15 Pro (Max), and that new silicon is needed to take better advantage of them.

How convenient. This will start the great "fragmentation," something Android went through a long time ago — it didn't end well.
 
When *I* stumble over words?

It's *Siri* that stumbles over words!

Is that Apple's line now, that I've been the one messing up all along?

I talk to it slowly, enunciating every word, like I'm teaching a foreign language, and it is getting worse and worse at understanding as time goes on.

The voice control system they had for placing calls even prior to Siri worked better than this.

I won't doxx names, but the person I call most often hands-free is constantly switched at the last second to another person with a COMPLETELY different name that sounds nothing alike. I can even see that it transcribes the name correctly, and then at the last second, let's wildcard it! Let's call some random person you haven't spoken to in 20 years for the hell of it! And I have to rush to hit End before the call goes through. It sometimes even tries calling me! It picks my own contact card to call.

They should have no aspirations other than getting it to do what they said it was supposed to do 14 years ago.
 
When you look at Android having this out at the start of the year, it really does look like Apple was asleep at the wheel, doesn't it? Perhaps it's time for younger blood there?
I think they just need to take more risks. Yes, some of the ideas are dumb, but things like Ping showed a level of forward thinking they haven't shown in a while. They'll continue to do well, and have their fans, but I personally don't get excited about these announcements anymore because it's clear they're going to have features delayed, or even not come out until the next iOS. This wouldn't have happened in the past, when they only announced features they already had available.
 

Apple's favorite thing is to blame their users when their technology fails.
 
Given that Apple is pushing for an on-device/hybrid model, I venture to guess we won't see anything substantial until at least 2026, and possibly 2027, after they are able to gather data on what users are doing with it and how it stresses the system. By then, they will have optimized the architecture and have more CPU headroom to work with.
 
Here come the obligatory “I’m willing to wait because Apple will do it right” comments. What a load of BS.

Really, there is no excuse for them being this late to the AI party. And who are we kidding, it will be riddled with bugs anyway when it launches.
 