
MacRumors

macrumors bot
Original poster
Apr 12, 2001


A new report from The Information today reveals much of the internal turmoil behind Apple Intelligence's revamped version of Siri.


Apple apparently weighed up multiple options for the backend of Apple Intelligence. One initial idea was to build both small and large language models, dubbed "Mini Mouse" and "Mighty Mouse," to run locally on iPhones and in the cloud, respectively. Siri's leadership then decided to go in a different direction and build a single large language model to handle all requests via the cloud, before a series of further technical pivots. The indecision and repeated changes in direction reportedly frustrated engineers and prompted some members of staff to leave Apple.

In addition to Apple's deeply ingrained stance on privacy, conflicting personalities within Apple contributed to the problems. More than half a dozen former employees who worked in Apple's AI and machine-learning group told The Information that poor leadership is to blame for its problems with execution, citing an overly relaxed culture, as well as a lack of ambition and appetite for taking risks when designing future versions of Siri.

Apple's AI/ML group has been dubbed "AIMLess" internally, while employees are said to refer to Siri as a "hot potato" that is continually passed between different teams with no significant improvements. There were also tensions over the higher pay, faster promotions, longer vacations, and shorter workdays enjoyed by colleagues in the AI group.

Apple AI chief John Giannandrea was apparently confident he could fix Siri with the right training data and better web-scraping for answers to general knowledge questions. Senior leaders didn't respond with a sense of urgency to the debut of ChatGPT in 2022; Giannandrea told employees that he didn't believe chatbots like ChatGPT added much value for users.

In 2023, Apple managers told engineers that they were forbidden from including models from other companies in final Apple products and could only use them to benchmark against their own models, but Apple's own models "didn't perform nearly as well as OpenAI's technology."

Meanwhile, Siri leader Robby Walker focused on "small wins" such as reducing wait times for Siri responses. One of Walker's pet projects was removing the "hey" from the "hey Siri" voice command used to invoke the assistant, which took over two years to achieve. He also shot down an effort from a team of engineers to use LLMs to give Siri more emotional sensitivity so it could detect and give appropriate responses to users in distress.

Apple started a project codenamed "Link" to develop voice commands to control apps and complete tasks on the Vision Pro, with plans to let users navigate the web and resize windows by voice alone, and to support commands from multiple people collaborating in a shared virtual space. Most of these features were dropped because the Siri team was unable to deliver them.

The report claims that the demo of Apple Intelligence's most impressive features at WWDC 2024, such as Siri pulling real-time flight data from a user's emails, surfacing a reminder about lunch plans from their messages, and plotting a route in Maps, was effectively fictitious. The demo apparently came as a surprise to members of the Siri team, who had never seen working versions of the capabilities.

The only feature from the WWDC demonstration that was activated on test devices was Apple Intelligence's pulsing, colorful ribbon around the edge of the display. The decision to showcase an artificial demonstration was a major departure from Apple's past behavior, where it would only show features and products at its events that were already working on test devices and that its marketing team had approved to ensure they could be released on schedule.

Some Apple employees are said to be optimistic that Craig Federighi and Mike Rockwell can turn Siri around. Federighi has apparently instructed Siri engineers to do "whatever it takes to build the best AI features," even if that means using open-source models from other companies in its software products as opposed to Apple's own models.

For more details on Apple's Siri debacle, see The Information's full report.

Article Link: Report Reveals Internal Chaos Behind Apple's Siri Failure
 
Apple AI chief John Giannandrea was apparently confident he could fix Siri with the right training data and better web-scraping for answers to general knowledge questions. Senior leaders didn't respond with a sense of urgency to the debut of ChatGPT in 2022; Giannandrea told employees that he didn't believe chatbots like ChatGPT added much value for users.
If that's true, Giannandrea should be fired.

Federighi has apparently instructed Siri engineers to do "whatever it takes to build the best AI features," even if that means using open-source models from other companies in its software products as opposed to Apple's own models.
Good.
 
This is more or less a version of what happened to the industry as a whole when OpenAI shook things up with ChatGPT, it would seem. Big tech in general all but ignored the popularity of LLMs until it was too late...all to focus on something that ended up being a niche - Vision Pro. Now, just to play catch-up, Google is trying to find a way to make its people work 60 hours a week. Hopefully Apple, on the other hand, solves its leadership problem.
 
Apple is currently experiencing a significant decline in its software quality. The recent issues with the iPhone 15 Pro, including overheating, are unacceptable; the phone should not have been released to the market in that state. As a long-time Apple user, I had never considered switching to Android until this year. The company's focus has shifted from delivering high-quality products to prioritizing extravagant features that it can't deliver, which is concerning.
 
Aside from quick search answers, small AI isn't really going to help most people. An LLM on a phone is also going to need several gigabytes of memory and run really, really slowly. Not exactly the most efficient use of power. It'll start to be useful once models reach the 100 GB+ memory range, where they can operate on your email inbox, social media, and files to become more of a personal assistant. We are very far away (decades?) from a local LLM being useful on a phone.

On the other hand, we could see MacBook Pros next year with 128GB memory as a BASE model, purely for local LLM.
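The memory figures above can be sanity-checked with simple arithmetic: weight storage is roughly parameter count times bits per weight. A minimal sketch (the function and the example figures are my own illustration, not from the thread, and ignore KV cache and activation memory):

```python
def model_memory_gb(params_billion: float, bits_per_weight: int) -> float:
    """Approximate GB needed just to hold an LLM's weights,
    given the parameter count (in billions) and quantization level."""
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

# A typical on-device-sized 7B model at 4-bit quantization:
# 7e9 params * 0.5 bytes each = 3.5 GB of weights alone,
# which lines up with "several gigabytes" on a phone.
print(round(model_memory_gb(7, 4), 1))  # prints 3.5
```

By the same arithmetic, a model filling 100 GB at 4-bit would be on the order of 200B parameters, which is why that commenter pegs the "personal assistant" tier well beyond current phone hardware.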
 
The decision to showcase an artificial demonstration was a major departure from Apple's past behavior, where it would only show features and products at its events that were already working on test devices and that its marketing team had approved to ensure they could be released on schedule.

What???? In the past, Apple would show demos of the next OS at WWDC built in MacroMind Director.
How old am I?
 
Is this Apple's Nokia/BlackBerry moment? The beginning of the end (a long end, no doubt) for Apple? Falling behind at a crucial moment such that it will never catch up, while other big tech companies and startups leave Apple in the dust? It has enough cash to buy its way out of this, but we haven't seen any good moves lately, only fumbles. What's going on?
 
Will someone with knowledge on the topic please educate me?

What properties exist in current open-source models that are at odds with Apple's privacy vision? Is there a theoretical way for Apple to employ a good, open-source model while simultaneously satisfying their own vision? The available models seem good - even a slightly knocked-down version would be better than what they're working with today.
 
Apple should relax their privacy stance with Siri and LLMs, and make it opt-in with informed disclosure. I'm fully in support of Apple's commitment to privacy and security but it just doesn't work with "AI" assistants and LLMs. This has been an issue with Siri for years and clearly hasn't been solved.

Also I've been saying for years that Giannandrea does nothing and needs to go—he's been there for six years and Siri hasn't meaningfully improved one bit (and has arguably gotten worse). This report makes that even clearer.

Apple's AI/ML group has been dubbed "AIMLess" internally

Incredible 😂
 
The move from live to fully prerecorded keynotes has enabled this. Pre-2020, if a feature wasn't in a usable state, it couldn't be shown because the demos were live. Now, they've gotten way too comfortable with the ability to literally fabricate features and say "We'll get it working later." There's no sense of urgency to get things done.

Go back to live keynotes.
 
Announcing a feature that didn’t even exist is still what gets me.
I have been a Mac person for decades. This blatant lie and misrepresentation sealed it for me. I'm going to transition to non-Apple products. And I don't care what the fanboys say. I was a professional programmer, unix, IBM... Stability and consistency are what I want. Apple is no longer it.
 
Apple is currently experiencing a significant decline in its software quality. The recent issues with the iPhone 15 Pro, including overheating, are unacceptable; the phone should not have been released to the market in that state. As a long-time Apple user, I had never considered switching to Android until this year. The company's focus has shifted from delivering high-quality products to prioritizing extravagant features that it can't deliver, which is concerning.
I made the switch this year and it's been awesome. I was in a very similar boat to you, fed up with Apple. My S25 Ultra "just works" like Apple products used to for me.
 