
MacRumors

macrumors bot
Original poster


Apple plans to power some of its upcoming iOS 18 features with data centers that use servers equipped with Apple silicon chips, reports Bloomberg’s Mark Gurman.


M2 Ultra chips will be behind some of the most advanced AI tasks that are planned for this year and beyond. Apple reportedly accelerated its server building plans in response to the popularity of ChatGPT and other AI products, and future servers could use next-generation M4 chips.

While some tasks will be done on-device, more intensive tasks like generating images and summarizing articles will require cloud connectivity. A more powerful version of Siri would also need cloud-based servers. Privacy has long been a concern of Apple’s, but the team working on servers says that Apple chips will inherently protect user privacy.

Gurman previously claimed that all of the coming iOS 18 features would run on-device, but it sounds like some capabilities coming this year will in fact use cloud servers. Apple plans to use its own servers for now, but in the future, it may also rely on servers from other companies.

Article Link: Apple to Power AI Features With M2 Ultra Servers
 
I don't see how any chip could "inherently protect user privacy" -- that is just nonsense. Privacy is primarily a function of the software running on the chip. While there may be features of a chip that could help protect privacy in a multi-tenant environment (like a cloud server), that would be at a very low level such as protecting memory from being read across processes or threads.
 
Seems solid to scale.

  • A14 Bionic (iPad 10): 11 TOPS (trillion operations per second)
  • A15 Bionic (iPhone SE/13/14/14 Plus, iPad mini 6): 15.8 TOPS
  • M2, M2 Pro, M2 Max (iPad Air, Vision Pro, MacBook Air, Mac mini, Mac Studio): 15.8 TOPS
  • A16 Bionic (iPhone 15/15 Plus): 17 TOPS
  • M3, M3 Pro, M3 Max (iMac, MacBook Air, MacBook Pro): 18 TOPS
  • M2 Ultra (Mac Studio, Mac Pro): 31.6 TOPS
  • A17 Pro (iPhone 15 Pro/Pro Max): 35 TOPS
  • M4 (iPad Pro 2024): 38 TOPS
 
These are for the NPU alone - the entire chip can perform massively more TOPS than that.
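For a rough sense of scale, the NPU figures in the list can be compared directly. A quick sketch (values copied from the list above; this is plain arithmetic, not a benchmark):

```python
# Neural-engine throughput figures from the list above (NPU only),
# in TOPS (trillions of operations per second).
npu_tops = {
    "A14 Bionic": 11.0,
    "A15 Bionic": 15.8,
    "M2/M2 Pro/M2 Max": 15.8,
    "A16 Bionic": 17.0,
    "M3/M3 Pro/M3 Max": 18.0,
    "M2 Ultra": 31.6,
    "A17 Pro": 35.0,
    "M4": 38.0,
}

# Relative NPU throughput, normalized to the oldest chip in the list.
baseline = npu_tops["A14 Bionic"]
for chip, tops in sorted(npu_tops.items(), key=lambda kv: kv[1]):
    print(f"{chip:18s} {tops:5.1f} TOPS  ({tops / baseline:.2f}x A14)")
```

Note that the M2 Ultra's 31.6 TOPS is exactly twice the M2's 15.8, as you'd expect from its two fused M2 Max dies.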
 
Apple plans to power some of its upcoming iOS 18 features with data centers that use servers equipped with Apple silicon chips, reports Bloomberg’s Mark Gurman.
Why pay attention to Bloomberg? This was a MacRumors story from Monday:
Apple has started building its own AI servers that use the M2 Ultra chip, Haitong analyst Jeff Pu reports.
 
Maybe just use Nvidia GPUs that are like 50 times faster and actually built for this type of workload?
Why pay Nvidia's 80% margin if you can use your own hardware?

You are also missing the obvious: the M2 Ultra is a niche product, and bigger chips (the so-called M1/M2 Extreme) were cancelled because of that.
Now Apple has an incentive to build those: not only can it sell them in the Mac Pro/Studio, it can also run its own servers on them at a massive cost saving compared to what AMD and Nvidia offer.

If this rumor is true, the big chips just got the green light.
 
"Handling AI features on devices will still be a big part of Apple’s AI strategy. But some of those capabilities will require its most recent chips, such as the A18 launched in last year’s iPhone and the M4 chip that debuted in the iPad Pro earlier this week. Those processors include significant upgrades to the so-called neural engine, the part of the chip that handles AI tasks."

Seems the latest on-device AI features will be locked to the newest chips, then.
 
Maybe just use Nvidia GPUs that are like 50 times faster and actually built for this type of workload?
I have an Nvidia 4090 workstation. My M1 Max runs stuff the 4090 can only dream of. Nvidia needs to up the 24 GB of VRAM or lower the prices of GPUs with more than 40 GB. If Apple can provide 256 GB of unified memory, I can lower my cloud costs (already low with the M1 Max).
Apple also needs to allow multiple Studios to form a cluster of GPUs and memory.
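The VRAM point can be made concrete with a back-of-the-envelope check of whether a model's weights even fit in memory. The 70B parameter count and fp16 precision below are illustrative assumptions, not figures from the thread:

```python
def weights_gb(params_billion: float, bytes_per_param: int = 2) -> float:
    """Approximate memory needed just for model weights, in GB."""
    return params_billion * 1e9 * bytes_per_param / 1e9

# A hypothetical 70B-parameter model at fp16 (2 bytes per parameter):
need = weights_gb(70)  # 140 GB for the weights alone
print(f"weights: {need:.0f} GB")
print("fits in 24 GB of 4090 VRAM?", need <= 24)    # False
print("fits in 256 GB unified?   ", need <= 256)    # True
```

This ignores activations, KV cache, and quantization, but it shows why large unified memory matters: the weights alone blow past a 24 GB card while fitting comfortably in a big unified-memory pool.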
 
Apple plans to power some of its upcoming iOS 18 features with data centers that use servers equipped with Apple silicon chips, reports Bloomberg’s Mark Gurman.
That's great and all, but Apple really needs to work on reducing its services outages.

It's been a month since the last one, so we're due for one soon.
 
Why pay Nvidia's 80% margin if you can use your own hardware?
Yup — I think that’s huge. The marginal cost to Apple of another M2 Ultra is just the fab cost they pay to TSMC. I could imagine it would be ten times more expensive to get the same computational power from Nvidia.

I also presume Apple knows very well how to optimize software for its own hardware. So it will have highly optimized software running on much cheaper hardware, and the scale supports all that effort.
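The "ten times" figure is speculation, but the comparison it implies is simple cost-per-throughput arithmetic. All the dollar amounts and the merchant-accelerator TOPS below are made-up placeholders for illustration; only the 31.6 TOPS comes from the list earlier in the thread:

```python
def cost_per_tops(unit_cost_usd: float, tops: float) -> float:
    """Dollars per TOPS of throughput for one accelerator."""
    return unit_cost_usd / tops

# Hypothetical numbers purely for illustration:
#  - an in-house M2 Ultra at an assumed fab cost, 31.6 NPU TOPS
#  - a merchant-market accelerator at an assumed marked-up price
in_house = cost_per_tops(unit_cost_usd=1_000, tops=31.6)
merchant = cost_per_tops(unit_cost_usd=20_000, tops=100.0)
print(f"in-house: ${in_house:.1f}/TOPS")
print(f"merchant: ${merchant:.1f}/TOPS")
print(f"ratio: {merchant / in_house:.1f}x")  # ~6.3x with these placeholders
```

The point is the structure of the comparison, not the numbers: paying fab cost instead of merchant-market margin changes the dollars-per-TOPS denominator dramatically at data-center scale.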
 
Yup — I think that’s huge. The marginal cost to Apple of another M2 Ultra is just the fab cost they pay to TSMC. I could imagine it would be ten times more expensive to get the same computational power from Nvidia.

I also presume Apple knows very well how to optimize software for its own hardware. So it will have highly optimized software running on much cheaper hardware, and the scale supports all that effort.
Apple also needs to maintain open-source libraries in the AI/ML space. What better way to keep developing MLX/Core ML than using them for its own AI needs?
 