

In his "Power On" newsletter, Bloomberg's Mark Gurman today provided an update on the status of Apple Intelligence and the plans for it in 2026.


Apple is still planning to roll out its revamped version of Siri around March of next year. The release should be accompanied by the release of a new smart home display product with speaker-base and wall-mount options. A new Apple TV and HomePod mini, which are set for launch soon, will also "help showcase" next year's new Siri and Apple Intelligence features.

The new version of Siri will apparently "lean" on Google's Gemini and include an AI-powered web search feature. Gurman warned "there's no guarantee users will embrace it, that it will work seamlessly or that it can undo years of damage to the Siri brand."

Apple is said to be paying Google to create a custom Gemini-based model that can run on its Private Cloud Compute servers to power Siri. Gurman clarified that this doesn't mean Siri will include Google services or Gemini features. Instead, Siri will simply be powered by a Gemini model in the background, enabling it to deliver the features that users expect with an Apple user interface.

Apple will preview iOS 27, macOS 27, watchOS 27, and its other operating systems at the annual Worldwide Developers Conference in June. The releases will apparently focus on major updates to Apple Intelligence and the company's broader AI strategy.

The company is also apparently still running into problems with the launch of Apple Intelligence in China. Despite partnerships with Chinese companies, Apple Intelligence in China remains mired in regulatory issues, and the launch is now a "rolling target."



Article Link: New Version of Siri to 'Lean' on Google Gemini
To paraphrase Tim Cook, "just buy your mother a Google Pixel." Apple Intelligence will be Google Gemini anyway.
 
Siri is so absolutely useless that I'd be up for anything that actually works.
I've had Siri turned off on all my devices for a long time now - don't miss her at all.
I still like Apple's hardware, but that's pretty much it, and I'm not Apple's test pilot for their Siri 'evolution'.
I use ChatGPT for my small AI needs - but I'm considering their cheapest subscription to see how useful it can really be for me.
 
I just hope its voice language understanding will be heaps better than Gemini's. I find ChatGPT a million times better than Gemini when it comes to natural conversations.
My guess is that Gemini's voice recognition/conversational understanding will continue to lag behind ChatGPT's for some time. Maybe Google will start working on improving this sooner than they'd planned, now that this deal with Apple has been worked out. One can dream.
 
Where is Google a default? Your browser, whichever it is? I'm sure you can set any engine as default.

What I'd like is for all of them to be gone from bloody Spotlight - I'm not looking for my files on the internet!
You can go into Settings->Search and toggle off Show Related Content.
I was thrilled when I found this.
Basically Spotlight will search your stuff but not do a Google search.
 
You can go into Settings->Search and toggle off Show Related Content.
I was thrilled when I found this.
Basically Spotlight will search your stuff but not do a Google search.

There is nothing like that in Spotlight settings on a Mac. I have 'Websites' and 'Other' (whatever that is) ticked off, but it's still plastering the Firefox shortcut at the top.

Apple also wants me to allow it to store all my queries on a Secret Secure Server to somehow 'improve search results' - yeah, no.
 
Disappointing part: why has this taken so long? This strategy was obvious and could have been brought to production a year ago. Why did Apple fail to build in-house model training capabilities so badly that it has to outsource model training to Google?
Could it be that Apple's stricter data collection policy backfired somewhat in AI training, while Google just takes everything and uses everything for training?
 
Could it be that Apple's stricter data collection policy backfired somewhat in AI training, while Google just takes everything and uses everything for training?

Training is one thing — Apple just does not have the data (hello, YouTube) — but that is not the point.

Why did the _decision_ to go with "plan B" take so long? Why was it not executed in parallel?
 
Training is one thing — Apple just does not have the data (hello, YouTube) — but that is not the point.

Why did the _decision_ to go with "plan B" take so long? Why was it not executed in parallel?
Could just be some mental block in Apple execs' minds pushing them to "must have in-house models or at least die trying" before conceding defeat and seeking external support.
 
You literally praised Steve Jobs as the one delivering surprises on stage and compared that to the recent lack of surprises under Tim Cook.

Talk about being amazed.
Maybe. But my main point is: if Steve Jobs were still alive today (which would have been entirely realistic if not for his health problems), would he still be able to live up to his legacy and continue to deliver? Just because he did the seemingly impossible before didn't automatically mean he could do the same again in the future.
 
Training is one thing — Apple just does not have the data (hello, YouTube) — but that is not the point.

Why did the _decision_ to go with "plan B" take so long? Why was it not executed in parallel?

Because it's a stupid, last-resort, gotta-do-something plan B. Or more of a plan F. To be sure, this wasn't what Apple had in mind. But they quickly discovered they lacked the intelligence to do AI on their own. This is simply a band-aid to cover the wound, in the hope that no one notices.
 
I honestly just want Siri to be able to do the basics well, and fully offline (for requests that fit that mold, of course).

I can't stand how often it pauses in what seems to be doing a network request for something that shouldn't require information from the cloud at all.
 
I honestly just want Siri to be able to do the basics well, and fully offline (for requests that fit that mold, of course).

I can't stand how often it pauses in what seems to be doing a network request for something that shouldn't require information from the cloud at all.

The bar for the basics (like robustly understanding what you asked) is quite high. It requires a multimodal LLM — a mere voice recognition model is not robust enough. One needs to understand the meaning of what you're saying, not just transcribe it word by word.

No one is able to do this on device yet. I believe this is close — maybe something is actually possible with the A19 Pro (faster prefill on inference, 12GB of RAM).
 
The bar for the basics (like robustly understanding what you asked) is quite high. It requires a multimodal LLM — a mere voice recognition model is not robust enough. One needs to understand the meaning of what you're saying, not just transcribe it word by word.

No one is able to do this on device yet. I believe this is close — maybe something is actually possible with the A19 Pro (faster prefill on inference, 12GB of RAM).


I'm talking about the basics it used to do better like 10 years ago.
 