
MacRumors

macrumors bot
Original poster


Apple plans to add an AI-powered web search tool to Siri next year, reports Bloomberg's Mark Gurman. The search tool will be an integrated Siri feature that will provide information on general search queries, similar to ChatGPT and Google's AI Overviews in search.


Apple is calling the search feature World Knowledge Answers internally, and some Apple executives apparently refer to it as an "answer engine." It will be limited to Siri to begin with, but Apple could add it to Spotlight search and Safari in the future. Apple has also considered creating a chatbot-like app for search, though it is not clear if that will happen.

Siri search will include an interface that supports text, photos, videos, and local points of interest, and Siri will be able to condense search results into a clear summary of the content.

The new Siri answer engine will be included in the Siri updates that Apple is introducing in 2026. Apple has been working on a smarter, more personalized version of Siri that was supposed to be introduced as part of iOS 18 earlier this year, but it was delayed.

Apple couldn't get the first-generation Siri architecture that it was using to work properly, so Siri needed to be rebuilt from the ground up using a second-generation architecture that relies on large language models. In August, Apple software engineering chief Craig Federighi said that the overhauled Siri architecture was giving Apple the results that it needed, and that Apple was now in a position to "not just deliver what we announced, but to deliver a much bigger upgrade than we envisioned."

The new Siri features are powered by three systems: a planner that interprets voice or text input, a search system that looks through the web and the user's device, and a summarizer that provides the final answer to the user.
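For anyone curious how a three-stage design like that might hang together, here is a minimal Swift sketch. Every type and function name below is invented for illustration; Apple hasn't published anything about how Siri is actually built, so treat this as a rough mental model of a planner, search, and summarizer pipeline, not a description of the real thing.

Code:
import Foundation

// Hypothetical result type: a snippet plus where it came from (web or on-device).
struct SearchResult {
    let source: String
    let snippet: String
}

// Turns raw voice or text input into a structured query.
protocol Planner {
    func plan(from input: String) -> String
}

// Looks through the web or the user's device for relevant content.
protocol Searcher {
    func search(query: String) async -> [SearchResult]
}

// Condenses the gathered results into a single answer for the user.
protocol Summarizer {
    func summarize(_ results: [SearchResult]) -> String
}

// Wires the three stages together: plan, search (web + on-device), summarize.
struct AnswerEngine {
    let planner: any Planner
    let webSearcher: any Searcher
    let deviceSearcher: any Searcher   // could be backed by on-device models only
    let summarizer: any Summarizer

    func answer(_ input: String) async -> String {
        let query = planner.plan(from: input)
        let webResults = await webSearcher.search(query: query)
        let localResults = await deviceSearcher.search(query: query)
        return summarizer.summarize(webResults + localResults)
    }
}

In a setup like this, the on-device searcher could be kept separate from whatever third-party model handles summarization, which is roughly the split the report describes for keeping user data away from outside models.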

Apple has been weighing whether to use its own large language models for these Siri components or to rely on an outside partner like OpenAI, Anthropic, or Google, and it sounds like progress has been made with Google. Apple and Google have apparently signed a formal agreement under which Apple will evaluate and test a custom Google-designed Gemini AI model that could power some of Siri's summarization features.

Apple is still testing models designed by Anthropic and its own in-house models for the planner functionality, though it could opt to use Google's AI models for that too. Apple's own Foundation Models will be used for searching user data, making sure customer data isn't processed using third-party models.

The upcoming LLM version of Siri is on track to launch as early as March 2026 in an iOS 26.4 update. Along with the world knowledge feature, Siri will have the personalization capabilities that were promised in iOS 18. Siri will be able to use personal information like emails and messages to answer questions and help users find what they're looking for, plus it will be able to interact with on-screen content and do more within apps.

Later in 2026, Siri will get a visual redesign and a built-in health feature that will be the backbone of a paid wellness subscription service.

Article Link: LLM Siri With 'World Knowledge' Search Feature Coming in Early 2026
 
Looking forward to seeing how they manage to deliver all this Apple Intelligence stuff on an iPhone 16 with just 8GB of RAM. They sold it on that promise, after all, and the iPhone 16 is supposedly “built from the ground up for Apple Intelligence” (lol, as if).
 
So this is how teens end up asking these things for advice on depression and then killing themselves. Not sure how Apple is going to handle that sort of case. Very tough problem to solve.
 
Siri is currently just stupid, absolutely unusable. And I think it's outrageous that Apple promised an AI solution for the launch of the iPhone 16 Pro (that was THE reason I bought it) and is only delivering it more than a year later... and surely only for English-speaking countries :(
 
The way they describe being so indecisive about which third parties to work with, or whether to do it in-house, sounds a lot like the panic over finding a successor to classic Mac OS in the late 1990s, when some thought Apple might purchase BeOS, development continued on the internal Copland project, and Apple eventually bought NeXT.

The difference in capability is really striking between my low-tier Samsung Galaxy A54, which updated itself to include Gemini, and even Apple's most expensive iPhones, as far as the utility of the built-in assistant goes.

I only realized the other day that my Galaxy A54, which I use as a side phone, had updated itself with Gemini. I was playing around with it, and it was quite impressive: immediate, fluid, intelligent, natural. I asked it to give me the news, and everything was localized and personalized. I asked Siri the same thing and got a response telling me to fix my settings in the Music app. It seems obvious that Tim Cook has made huge strategic mistakes, and he's paid so much money specifically for strategy; it's not like he's breaking his back building things. The whole reason anyone should be worth as much as he is comes down to seeing into complex situations on the horizon and making the right decisions about them, and he obviously hasn't.
 