
MacRumors

macrumors bot
Original poster


Apple is considering a significant shift in how it operates Siri by potentially running its next-generation chatbot on Google's cloud infrastructure rather than entirely on its own Private Cloud Compute servers, according to Bloomberg's Mark Gurman.


In yesterday's report detailing Apple's plans to turn Siri into a chatbot in iOS 27, Gurman said that the company is in discussions with Google about hosting the forthcoming Siri chatbot on Google-owned servers powered by Tensor Processing Units (TPUs), a class of custom chips designed specifically for large-scale artificial intelligence workloads. The arrangement would mark a major departure from Apple's emphasis on processing user requests either directly on-device or through its own tightly controlled Private Cloud Compute infrastructure.

In a potential policy shift for Apple, the two partners are discussing hosting the chatbot directly on Google servers running powerful chips known as TPUs, or tensor processing units. The more immediate Siri update, in contrast, will operate on Apple's own Private Cloud Compute servers, which rely on high-end Mac chips for processing.

The near-term Siri improvements in iOS 26.4 are still expected to run on Apple's own Private Cloud Compute servers, which the company unveiled in 2024 as a privacy-focused alternative to on-device processing. Private Cloud Compute relies on Apple-designed servers built around high-end Mac chips, and Apple has positioned the system as one where user data is processed temporarily, never retained, and inaccessible even to Apple itself. Those claims have been central to Apple's public messaging around Apple Intelligence.

The more advanced Siri chatbot planned for the following major operating system update is expected to rely on a newer and more capable large language model developed by Google. This model is internally referred to as Apple Foundation Models version 11 and is comparable in capability to Google's latest Gemini models. Running such a model at scale may exceed the practical capacity of Apple's current Private Cloud Compute infrastructure, prompting the need to use Google's significantly larger, specialized cloud footprint and AI hardware.

The possibility of running Siri requests on Google servers does not necessarily mean Google would gain access to user data in a conventional sense. Apple already relies on third-party cloud providers, including Google, for parts of iCloud's infrastructure, while retaining control over encryption keys and data handling policies.
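
For readers wondering what "retaining control over encryption keys" looks like in practice, here is a minimal, purely illustrative Swift/CryptoKit sketch of client-held-key encryption. It is not Apple's actual iCloud or Private Cloud Compute implementation; the function names and the upload step are hypothetical placeholders. The idea is simply that data is sealed on the device with a key the storage provider never receives, so a third-party host only ever stores ciphertext.

import Foundation
import CryptoKit

// Purely illustrative: data is sealed on the device with a key the storage
// provider never receives, so a third-party host stores only ciphertext.
// Function names and the upload step are hypothetical.

func encryptForRemoteStorage(_ plaintext: Data, key: SymmetricKey) throws -> Data {
    // AES-GCM returns nonce + ciphertext + authentication tag as one blob.
    let sealedBox = try AES.GCM.seal(plaintext, using: key)
    guard let combined = sealedBox.combined else {
        throw CryptoKitError.incorrectParameterSize
    }
    return combined
}

func decryptFromRemoteStorage(_ blob: Data, key: SymmetricKey) throws -> Data {
    let sealedBox = try AES.GCM.SealedBox(combined: blob)
    return try AES.GCM.open(sealedBox, using: key)
}

do {
    // In a real system the key would live in the Keychain / Secure Enclave;
    // here it is simply generated in memory for demonstration.
    let key = SymmetricKey(size: .bits256)
    let message = Data("user request".utf8)

    let blob = try encryptForRemoteStorage(message, key: key)
    // uploadToThirdPartyCloud(blob)   // hypothetical network call; the host sees only ciphertext

    let roundTripped = try decryptFromRemoteStorage(blob, key: key)
    assert(roundTripped == message)
} catch {
    print("Encryption round trip failed: \(error)")
}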

Article Link: Apple's Siri Chatbot May Run on Google Servers
 


I don't need all this BS from Apple because I already run Gemini on my Apple devices....
 
Now that Apple admits that their chips are inferior for AI/ML/GPU workloads, maybe we'll get eGPU support back in macOS?

And maybe I'll sprout wings.

On the privacy side, I'm sure Apple is *keenly* aware of the perception this will create, and will engineer things so that it's safe.
 
I’m getting pretty tired of the name. But I get that it sort of needs to be unique. Doesn’t mean they couldn’t change it.
 
This is getting better and better all the time. People were defending Apple's decision to take so long with the next generation of Siri, saying "Apple is developing privacy-focused on-device AI, which is way harder than what everyone else is doing". Apple was also encouraging people to upgrade to new iPhones for Apple Intelligence hardware, for software that was nowhere near done. Now we are here, and are slowly seeing it's all just a dumpster fire.
 
Why would the company that cares about privacy run their chatbot on servers owned by the company that couldn’t care less about privacy?
Well, because business is business, and a breach of contract on a deal this major would easily run into the hundreds of billions of dollars. Apple can dictate to Google how it uses and handles user information from Apple's platform. Furthermore, given the number of AI companies that want Apple as a client, Google has shown it isn't willing to give up that much business.
 
This is getting better and better all the time. People were defending Apple's decision to take so long with the next generation of Siri, saying "Apple is developing privacy-focused on-device AI, which is way harder than what everyone else is doing". Apple was also encouraging people to upgrade to new iPhones for Apple Intelligence hardware, for software that was nowhere near done. Now we are here, and are slowly seeing it's all just a dumpster fire.

Hopefully, eventually, people will realize they should just stop defending a $4 trillion megacorp that increasingly can’t execute.
 