The foremost problem is that LLMs are judged purely on the illusion of competency. Apple and Google were well aware of the issues with next-token prediction from the mountains of data they have from their small language models (like autocomplete). It is impossible to know exactly what someone intends. Even the best models today require a lot of re-prompting to get satisfactory results, and only someone who already has an idea of what the correct response should be will know when they've gotten one. "Sounds true enough" isn't really a satisfactory answer for general queries and is outright unacceptable for taking real-world action. Natural language is simply not a sufficiently expressive user interface, but for some reason an alarmingly large portion of the public believes it can be.
 
I always read that Apple is behind on AI. I don't think they really are. Apple is making an effort to do something no one has done before, which is to create an LLM that keeps your interactions private without you having to subscribe to it. No one is really doing this yet, with the exception of companies paying large sums of money to keep their intellectual property from going public. When people ask an LLM for information and then share it with someone else, it ends up being searchable on Google. So if you're looking up health info on an LLM and share it with your spouse, the world will have access to it: your job, your insurance company, spammers, lawyers, literally everyone who can access Google. 🤯
 
Apple never truly invested in Siri and overlooked the real potential of voice control. Now, with ‘Apple Intelligence,’ the company seems to be playing catch-up—more a marketing move than a breakthrough. If it ends up leaning on Google behind the scenes, that’s clever branding on Apple’s part: let Google handle the data risks while Apple claims the intelligence.
 
I always read that Apple is behind on AI. I don't think they really are. Apple is making an effort to do something no one has done before, which is to create an LLM that keeps your interactions private without you having to subscribe to it. No one is really doing this yet, with the exception of companies paying large sums of money to keep their intellectual property from going public. When people ask an LLM for information and then share it with someone else, it ends up being searchable on Google. So if you're looking up health info on an LLM and share it with your spouse, the world will have access to it: your job, your insurance company, spammers, lawyers, literally everyone who can access Google. 🤯
If Apple had Google’s/OpenAI’s/Anthropic’s LLM technology, they could just run that on their own hardware (as they planned to do with server models and Private Cloud Compute) and protect user privacy. But they don’t have that technology (which is why they are considering partnering up), and they are behind in that respect. Regarding on-device models, Apple was never under the illusion that those could provide full-fledged LLM functionality.
 
All I want to know is will the new iPhones be built for the New Siri? Because my iPhone 16 Pro was built for Apple Intelligence and that’s kind of a baseline for me now. ;)
 
LLM Siri or not… Siri and AI will remain turned off on my devices.
Siri and Apple (Un)-Intelligence will remain unused and off on all my devices too!
Apple can take care of and raise their lost child without me.

I use ChatGPT/OpenAI for my AI needs now, without issues and with mental peace.
 
Apple is developing a new version of (z) that's supposed to be better than the existing (z) in every way. It will be smarter and able to do more, functioning like (leading competition) instead of a barely competent (aging version of itself).
Hope Apple delivers. Last time we saw statements like this was in the Copland era.
 
I don’t know what they are doing but honestly not what I would expect.

The main problem with Siri is that the voice recognition engine is very bad. Any AI is trained on different voices, accents and such, and they must be amalgamated into a single "English" language, so the user would not need to worry about which regional variant is enabled in settings. C'mon, in 3 months our friends from outer space are arriving on their 3I/ATLAS ship, and we still haven't figured out that English (UK), English (US) and English (AU) are divided only by accents and some minor local differences? Apple should have already figured out how to make Siri recognize all available accents and even respond playfully, letting the user feel that the machine finally gets through the language barrier.

Take OpenAI: their ChatGPT will even pick up the language of my browser or OS even if I ask in English, so the model follows the user and tries to be as comfortable as possible, while Siri attempts some weird "mental gymnastics" (or technically "algorithm gymnastics") and then says something like "sorry, I don't understand." But ChatGPT does!

Apple doesn't even need an LLM; they urgently need bugfixes and a general cleanup in their voice recognition department. Training on new data and amalgamating data sets isn't a hard task, at least not harder than reinventing something ChatGPT has already created.
 
Well, because of the delivery failures to date, whatever they use will be examined and scrutinised to the nth degree, that's a given. I'm a big fan of ChatGPT, but man does it have some ways to go as well. I would hope that whatever solution they roll out is not exclusive to the iPhone 17/18 and newer; this would be quite the match meeting the powder keg for iPhone 16 owners.
 
Well, because of the delivery failures to date, whatever they use will be examined and scrutinised to the nth degree, that's a given. I'm a big fan of ChatGPT, but man does it have some ways to go as well. I would hope that whatever solution they roll out is not exclusive to the iPhone 17/18 and newer; this would be quite the match meeting the powder keg for iPhone 16 owners.
I feel very confident that the iPhone 16 will run whatever they release for the next couple of years at least. Sure, the newer phones will run it a bit faster. Maybe in three years there will be one new feature that requires a newer phone than the 16, as that's how it's always been.
 
I think they should develop some new emojis first.

Seriously, if as of today they don’t even know if they will go with a partnership or use their own LLM, I don’t think we will see anything interesting with 26.4 either.

While the world had been using ChatGPT for almost a year, there was an article about Federighi saying that he had played with ChatGPT and found it interesting.
My grandpa could have figured that out a year earlier than he did, and without the million-dollar paycheck either.
 
AI, LLMs, etc. are in their infancy. We should not expect that the clustermesses of today are necessarily representative of where AI will be in even five years.
Maybe 3 years ago. Now they are a pretty mature tech.
They just SEEM pretty mature compared to three years ago. They are still definitely in their infancy -- see what three more years brings. It won't just be incremental improvements; the technology hasn't yet reached the point that all tech eventually reaches, where progress starts to level off.
 
What Apple is trying to do is extremely ambitious. The delays don't surprise me given the complexity of the effort and the non-deterministic nature of AI inputs and outputs. I also hope Apple goes all in on agentic AI, as this would truly move the needle for Apple customers.
 
What Apple is trying to do is extremely ambitious. The delays don't surprise me given the complexity of the effort and the non-deterministic nature of AI inputs and outputs. I also hope Apple goes all in on agentic AI, as this would truly move the needle for Apple customers.
What's ambitious about it? Apple is way behind. Google has a much better AI system and it's integrated in the phone. Apple has had years to fix Siri and be ahead of everyone else. Making Siri an advanced AI should've been the goal since 2012.
 