Apple still thinking about how to approach this AI thing, meanwhile everyone and their grandma is using ChatGPT, Gemini, Perplexity, and whatever else there is on a daily basis, for free, in real-world scenarios.

And I still can't get macOS to summarize an email if it's not in English despite all LLMs speaking every language known to Man, or to summarize a webpage or to correct the most basic, obvious typos.

Seriously, in an age where ChatGPT can understand what you mean no matter how badly written it is, why can't we have a keyboard that doesn't randomly correct "its" to "it's" for no reason?
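To make the complaint concrete: a context-free replacement rule cannot tell the possessive "its" from the contraction, so it "corrects" perfectly good text. The sketch below is an invented toy illustration, not Apple's actual autocorrect algorithm; the word lists and heuristics are made up:

```python
# Toy illustration of why context-free autocorrect mis-fires.
# This is an invented example, NOT Apple's actual algorithm.
import re

def naive_autocorrect(text: str) -> str:
    # Blindly rewrite every standalone "its" as "it's",
    # regardless of context.
    return re.sub(r"\bits\b", "it's", text)

def smarter_autocorrect(text: str) -> str:
    # Only expand "its" when the next word looks verb-like
    # (tiny invented list); possessive "its <noun>" is left alone.
    verb_like = {"been", "not", "raining", "going"}
    def fix(m):
        word = "it's" if m.group(2) in verb_like else m.group(1)
        return f"{word} {m.group(2)}"
    return re.sub(r"\b(its)\s+(\w+)", fix, text)

print(naive_autocorrect("The dog wagged its tail"))
# the possessive gets wrongly "corrected" to "it's tail"
print(smarter_autocorrect("The dog wagged its tail"))
# the context-aware rule leaves it alone
```

Even the "smarter" version is a crude heuristic; this is exactly the class of problem an LLM handles effortlessly and rule lists never fully will.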
 
What Siri? What Apple Intelligence? Siri is worse than it was at its debut with the iPhone 4S, years ago. It's proven; there are a bunch of videos about this.
 
  • Haha
Reactions: Lioness~
Google was just hit with $806 million in penalties over privacy issues. This is in addition to the $1.4 billion they paid to Texas in 2024. But don't worry, Apple says everything is safe!
I don't question Google's faults. I just don't see any possible scenario in which Google, by supplying Apple with an LLM running on Apple's infrastructure, could get access to any data sent through that model. This is a purely technical matter.
 
  • Like
Reactions: hans1972
“Let’s build Apple Silicon so we don’t have to pay Intel”
“Let’s build the C1 chip so we don’t have to pay Qualcomm”
“Let’s write Apple Maps so we don’t have to pay Google”
“Let’s pay Google to fix Siri”
Let's pay Google so we don't have to burn all that money in a dumpster fire like Google and everyone else is doing. There is no money in it. Just a money pit. Apple has been the only sensible company when it comes to the AI dumpster fire.
 
Apple often claims it wants to do everything on its own rather than rely on other companies' technology. The reality is different, for better or for worse. macOS is based on UNIX, Apple Silicon is fabricated by TSMC, the iPhone and other devices include parts from Samsung, LG, and other companies… to name but a few examples. And now, Apple is quietly turning to OpenAI and Google to save Siri.
In fact, Apple is like a fancy restaurant, with Tim Cook as the chef (no pun intended): he buys ingredients here and there, makes recipes that aren't new, and his menu doesn't have as many choices as some other restaurants, but he claims his meals are superior and eco-friendly. He's outstanding at pairing the food and wine as well as the courses. He's very good at advertising his restaurant. And although he hasn't reached the level of his legendary predecessor at the helm, he enjoys a strong base of regular patrons and unwavering enthusiasts.
 
  • Wow
  • Haha
Reactions: gusmula and Moukee
“Apple is said to be paying Google to create a custom Gemini-based model that can run on its Private Cloud Compute servers to power Siri.​

Gurman clarified that this doesn't mean Siri will include Google services or Gemini features.

Instead, Siri will simply be powered by a Gemini model in the background, enabling it to deliver the features that users expect with an Apple user interface.”​

The mere mention of Google is not reassuring. OK, let's say that Gurman is right — no Google services included. That doesn't mean there won't be embedded code along with Google mining of data, monetizing privacy, recording searches, creating user profiles, and so on and so forth.

“Private Cloud Compute” may or may not mean what we think it does, especially as Apple explores and expands its own advertising functions.

Let us hope that we are not rudely surprised. Let us also hope that turning off ALL AI functions is still 100% possible!
 
Maybe I don't have all the facts on this, but to me the real scandal here is Apple selling the iPhone 15 Pro as having the hardware to run local AI operations that included on-screen awareness... and now we are onto the iPhone 17, and there is still no on-screen awareness. There HAS to be a class action lawsuit forthcoming over that.
 
Maybe I don't have all the facts on this, but to me the real scandal here is Apple selling the iPhone 15 Pro as having the hardware to run local AI operations that included on-screen awareness... and now we are onto the iPhone 17, and there is still no on-screen awareness. There HAS to be a class action lawsuit forthcoming over that.

The iPhone 15 Pro was not marketed that way initially. It was only included as Apple-Intelligence-capable in the iPhone 16 marketing. There already is a class action suit against Apple on this. There's no need to pile on way after the fact.
 
“OK, let's say that Gurman is right — no Google services included. That doesn't mean there won't be embedded code along with Google mining of data, monetizing privacy, recording searches, creating user profiles, and so on and so forth.”
LLMs aren’t programs with arbitrary code, and they contain no embedded code; a model is just a set of weights. If Google provides Apple with a version of the Gemini model customized for Apple’s server hardware, then Google cannot do anything like that.
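A concrete way to see this: inference is a pure function of the weights and the input. The sketch below uses a tiny invented two-layer network (not Gemini, whose internals aren't public) to show that a "model" is just arrays of numbers, with no executable or network-touching code inside:

```python
# Toy illustration: a "model" is only arrays of numbers, and
# inference is a pure function of (weights, input). Nothing in
# the weights can phone home or run code on the host.
# Invented two-layer network, not any real Gemini architecture.
import numpy as np

rng = np.random.default_rng(0)
weights = {                      # the entire "model": just tensors
    "w1": rng.standard_normal((4, 8)),
    "w2": rng.standard_normal((8, 3)),
}

def forward(weights, x):
    # Deterministic arithmetic: same weights + same input
    # always produce the same output, with no side effects.
    h = np.maximum(x @ weights["w1"], 0.0)   # ReLU layer
    return h @ weights["w2"]

x = np.ones(4)
assert np.allclose(forward(weights, x), forward(weights, x))
```

Whoever controls the servers the weights run on controls the data; the weights themselves are inert, which is the poster's point about Apple's infrastructure.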
 
Ohh, haha. My experience with Gemini has been nothing but frustration, particularly with Gemini CLI. Of all of the LLMs I've tried, it's by far been the absolute worst by a huge margin. Good luck Siri!
 
  • Like
Reactions: Lioness~
  • Like
Reactions: NervousFish2
TL;DR: they’re going to fine-tune the Gemma 3 model, which makes sense, because it’s apparently by far the best when it comes to small models

 
Apple still thinking about how to approach this AI thing, meanwhile everyone and their grandma is using ChatGPT, Gemini, Perplexity, and whatever else there is on a daily basis, for free, in real-world scenarios.

And I still can't get macOS to summarize an email if it's not in English despite all LLMs speaking every language known to Man, or to summarize a webpage or to correct the most basic, obvious typos.

Look at the head of marketing and the head of software for the reason. Federighi over-promised and under-delivered because marketing wanted to ride the crest of the AI hype wave. Apple isn't playing catch-up to the market. They are playing catch-up to their own promises, promises no one required them to make.

Seriously, in an age where ChatGPT can understand what you mean no matter how badly written it is, why can't we have a keyboard that doesn't randomly correct "its" to "it's" for no reason?

Hear hear. I can make typos and write incoherent sentences on my own. I don't need Apple's help!
 
How is it possible that Apple could be on the catch-up (and, by default, the losing...) end of the AI revolution?
 
It's going to suck without Anthropic. I'm a programmer and we are the beta testers of the next step in AI: agentic use.

Most people nowadays use AI to find an answer to something, do some research, translate some text.

Agentic use is asking AI to do something for us. We programmers have AI edit files for us (edit code), hunt down bugs and fix them, search for files, write reports, etc.

And this is where Anthropic really is multiple steps ahead of anyone else.

Siri is an agentic tool first. We tell her to set our alarms, play some music for us, etc. Those tasks will suck with Google's model.

Just take a look at Google's Assistant. It sucks; many times it just shows on the screen what it should be doing instead of actually doing it.
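The distinction drawn above, answering versus acting, comes down to a tool-call loop: the model emits a structured action, and the host actually executes it instead of just displaying text. A minimal mock sketch, in which the tool names and the scripted "model" are invented for illustration:

```python
# Minimal mock of an agentic loop. The "model" here is a scripted
# stand-in for an LLM, and the tools are invented examples; a real
# assistant would dispatch to actual system APIs.
import json

def set_alarm(time: str) -> str:
    return f"alarm set for {time}"

def play_music(artist: str) -> str:
    return f"now playing {artist}"

TOOLS = {"set_alarm": set_alarm, "play_music": play_music}

def fake_model(request: str) -> str:
    # Stand-in for an LLM: maps a request to a tool call as JSON.
    if "alarm" in request:
        return json.dumps({"tool": "set_alarm", "args": {"time": "7:00"}})
    return json.dumps({"tool": "play_music", "args": {"artist": "Muse"}})

def agent(request: str) -> str:
    call = json.loads(fake_model(request))
    # The dispatch step is the "doing": a chat-only assistant stops
    # at showing the answer, an agent executes it.
    return TOOLS[call["tool"]](**call["args"])

print(agent("wake me up with an alarm"))   # alarm set for 7:00
```

The hard part, and where the poster argues Anthropic leads, is getting the model to emit the right structured call reliably, not the dispatch plumbing itself.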
 
They just don’t get it.

I've been following the advances in GenAI very closely, and there are tons of stuff happening outside of the big-name models (Google, OpenAI, Anthropic). Lots of work being done on SLMs (Small Language Models), domain-specific (specialized) models, new non-Transformer architectures for training models, etc.

It makes me wonder if Apple engineers have their heads stuck in the sand, or are following these advances, too? I truly believe Apple can come out ahead in the AI race by taking a unique approach, not just following what others are doing or offering. Announcements like this one tarnish that belief, though.

I still think it's smart that Apple is not rushing into things. Been far too many embarrassing and risky incidents for the big-name models regarding hallucinations, sycophancy, etc. Apple doesn't need that.
 
  • Like
Reactions: fatTribble
Every year that goes on I'm more and more motivated to leave the Apple ecosystem. A once great company has turned into the plodding and rent-extracting Microsoft of old.

If Apple can't develop an LLM on its own - and it should be embarrassed that it can't, given the gigantic resources and funding it has - then it should have bought a solution like Perplexity or Anthropic. Going with Google is honestly the worst option.

The hardware is still good, if overpriced and lacking innovation, but the software, including macOS Tahoe and the complete disaster that is Siri, is something only a hugely dominant company with a giant cash hoard and high margins on overpriced products can get away with, and even that has its limits.
 
Seems Gurman (and a few others…) need a reality check. Spotlight leaning on Google web search is no different to Siri leaning on Gemini.

If you’ve ever tried to use Google search on Android to search your phone, half the time it just pulls up web results instead of finding a contact or app. My point is that Spotlight is miles ahead of Google’s own on-device search function because it’s not a web-first service. I expect the same from Siri.
 
Every year that goes on I'm more and more motivated to leave the Apple ecosystem. A once great company has turned into the plodding and rent-extracting Microsoft of old.

If Apple can't develop an LLM on its own - and it should be embarrassed that it can't, given the gigantic resources and funding it has - then it should have bought a solution like Perplexity or Anthropic. Going with Google is honestly the worst option.

The hardware is still good, if overpriced and lacking innovation, but the software, including macOS Tahoe and the complete disaster that is Siri, is something only a hugely dominant company with a giant cash hoard and high margins on overpriced products can get away with, and even that has its limits.
But, and this is the million-dollar question: why does it need to?

They never built a web search engine and did absolutely fine. What difference does it make if they don’t have their own LLM?
 
I've been following the advances in GenAI very closely, and there are tons of stuff happening outside of the big-name models (Google, OpenAI, Anthropic). Lots of work being done on SLMs (Small Language Models), domain-specific (specialized) models, new non-Transformer architectures for training models, etc.

It makes me wonder if Apple engineers have their heads stuck in the sand, or are following these advances, too? I truly believe Apple can come out ahead in the AI race by taking a unique approach, not just following what others are doing or offering. Announcements like this one tarnish that belief, though.

I still think it's smart that Apple is not rushing into things. Been far too many embarrassing and risky incidents for the big-name models regarding hallucinations, sycophancy, etc. Apple doesn't need that.
The most exciting thing in AI right now is Gemma, likely the template going forwards once the dust has settled on the great LLM investment crash of 2026.

Making a super computer do super computer things isn’t very impressive. Making my phone do them without the internet is.
 