MacRumors
macrumors bot
Original poster
Apple's software engineering chief Craig Federighi and marketing chief Greg Joswiak are on a media tour this week, following the WWDC 2025 keynote.

[Image: iOS 18 Siri Personal Context]

The latest interview comes from The Wall Street Journal's Joanna Stern, who sat down with Federighi and Joswiak to discuss Apple's delayed personalized Siri features.

Stern asked the executives if Apple had a working version of the more personalized Siri when the company demonstrated the features during its WWDC 2024 keynote.

According to Federighi, it did.

"We were filming real working software, with a real large language model, with real semantic search, that's what you saw," said Federighi.

"There's this narrative out there that it was demoware only," added Joswiak. "No."


Federighi gave the same explanation for the delay that he offered in another interview: the features didn't meet Apple's quality bar, and Apple is shifting Siri to a newer underlying architecture to overcome that.

Apple first announced the personalized Siri features during its WWDC 2024 keynote. The new capabilities will include better understanding of a user's personal context, on-screen awareness, and deeper per-app controls. For example, Apple showed an iPhone user asking Siri about their mother's flight and lunch reservation plans based on info from the Mail and Messages apps. Apple said it currently plans to release the features in 2026.

Article Link: Apple Says Personalized Siri Features Shown at WWDC Last Year Were 'Real' and 'Working'
 
It seems fairly believable that they were real and working at the time, in that one could ask a question and get a response. But very likely the answers were completely inaccurate. Think about it: Apple shipped notification summaries that would put completely made-up news headlines next to the logos of major news outlets. How bad must the working Siri model have been? Hey Siri, who is this guy I'm talking to? I've forgotten his name because I was distracted by my phone.

That’s George Washington, you met him last year at the snowboarding event in the 1956 summer Olympics in Paris.
 
They gave themselves nine months to figure it out for a spring 2025 release and couldn't close the gap. This strategy clearly worked for them in the past, and this time they ended up on the losing side of the bet that they could do it again. At least they are owning up to it, which is something a boatload of people said they wouldn't do.
 
I believe they have it in their lab, but when will the general public see it? Maybe WWDC 2026, or maybe they will wait until WWDC 2027, when it will be a huge iPhone year.

Whatever is holding it up is also holding back the release of Apple's smart home display.
 
I feel like this would be much easier to take seriously if it were only Joz doing the press tour…do they not realize Craig is a living meme at this point? 😂
 
I can buy that they had it working in tightly controlled environments with good data and thought they’d figure it out in a few months.

Now, given the state of Siri, why they thought they could figure it out in a few months is a whole other question.
That reminds me of the strategy of shooting an arrow through the problem: the developers drill through the most direct, basic permutation of the task to validate that it works in principle, sometimes long before actually producing a complete solution. Something like the sketch below.
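A tracer-bullet version of the flight demo might look something like this in Swift. Everything here is made up for illustration, not Apple's actual code: the full pipeline runs end to end on day one, but every stage is a stub you swap out later.

// "Arrow through the problem": wire one request end to end with stubbed
// stages, prove the path works in principle, then replace each stub with
// the real component. All names here are hypothetical.

protocol SemanticSearch { func lookup(_ query: String) -> [String] }
protocol LanguageModel { func answer(_ query: String, context: [String]) -> String }

// Stub stages: hard-coded results stand in for the real components.
struct StubSearch: SemanticSearch {
    func lookup(_ query: String) -> [String] {
        ["Mom's flight lands at 2:30 PM", "Lunch reservation at noon"]
    }
}

struct StubModel: LanguageModel {
    func answer(_ query: String, context: [String]) -> String {
        "From your mail: \(context.joined(separator: "; "))"
    }
}

// The tracer: the whole pipeline runs even though every stage is fake.
func assistant(_ query: String, search: SemanticSearch, model: LanguageModel) -> String {
    model.answer(query, context: search.lookup(query))
}

print(assistant("When does Mom land?", search: StubSearch(), model: StubModel()))

The catch, and apparently what bit Apple, is that the distance from "the arrow goes through" to shippable quality is the actual hard part.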
 
I love Joswiak's response to this:

"Again, it's important to realize our strategy is a little bit different than some other people, right? Our idea of Apple intelligence is using generative AI to be an enabling technology for features across our operating system — so much so that sometimes you're doing things you don't even realize that you're using Apple intelligence, or, you know, AI, to do them, and that's our goal.

Integrate it. There's no destination, there's no app called Apple Intelligence, which is different than a chat bot, which, again, what I think some people have kind of conflated a bit, like, 'Where's your chat bot?' We didn't do that.

What we decided was that we would give you access to one through ChatGPT, because, you know, we think that was the best one, but our idea is to integrate across the operating system, make it features that, you know, I certainly use every day."

Makes perfect sense: https://appleinsider.com/articles/2...in-the-ai-race-how-it-isnt-necessarily-behind
 
They say they couldn’t reconcile stapling the old version of Siri on top of the LLM updated one to get it working properly in the stated time frame, which basically dovetails with Gurman’s reporting on this. That’s why they basically are rebuilding Siri from scratch with an LLM.
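For what it's worth, the per-app hooks a rebuilt Siri would drive already exist publicly as the App Intents framework. Here's a minimal sketch of the kind of in-app action Siri could invoke; FlightStore is a made-up stand-in for an app's own data layer.

import AppIntents

// Hypothetical stand-in for an app's own data layer.
struct FlightStore {
    static let shared = FlightStore()
    func status(for flight: String) -> String { "on time" }
}

// A minimal App Intent: an action that Siri and Shortcuts can invoke.
struct CheckFlightIntent: AppIntent {
    static var title: LocalizedStringResource = "Check Flight"

    @Parameter(title: "Flight Number")
    var flightNumber: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        let status = FlightStore.shared.status(for: flightNumber)
        return .result(dialog: "Flight \(flightNumber) is \(status).")
    }
}

The unsolved part isn't exposing actions like this; it's the orchestration layer that reliably picks the right ones from personal context, which is what they're rebuilding.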
 
Yep. Apple sees these AI models as commodities, like internet search, and hopes to integrate and monetize them the way it did with Google. They just need to get their own offering close enough to par to maintain a moat around their platform/ecosystem.
 