


Apple is believed to be developing several technological innovations to mark the 20th anniversary of the iPhone, and one key technology it's considering is Mobile High Bandwidth Memory (HBM), according to a report from ETNews.

[Image: M3 chip series unified memory architecture]

HBM is a type of DRAM that stacks memory chips vertically and connects them via tiny vertical interconnects called Through-Silicon Vias (TSVs) to dramatically boost signal transmission speeds. It's primarily used in AI servers today, and is often referred to as AI memory due to its ability to support AI processing alongside GPUs.

Mobile HBM is what the term suggests – a variant of the technology for mobile devices that is designed to deliver very high data throughput while minimizing power consumption and the physical footprint of the RAM dies. Apple is looking to enhance on-device AI capabilities, and ETNews reports that connecting mobile HBM to the iPhone's GPU units is being considered as a strong candidate for achieving this goal.

The technology could be key for running massive AI models on-device, such as for large language model inference or advanced vision tasks, without draining battery or increasing latency.
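The reason memory bandwidth matters so much here: LLM text generation is typically memory-bandwidth-bound, since producing each token requires streaming essentially all of the model's weights from RAM. That puts a hard ceiling on decode speed of roughly bandwidth divided by model size. A back-of-the-envelope sketch (all bandwidth and model-size figures below are illustrative assumptions, not Apple specifications or confirmed mobile HBM numbers):

```python
# Rough upper bound on on-device LLM decode speed:
# tokens/sec ≈ memory bandwidth / bytes read per token (≈ model weight size).
# All numbers here are illustrative assumptions, not Apple specifications.

def max_tokens_per_sec(bandwidth_gb_s: float, model_size_gb: float) -> float:
    """Bandwidth-bound ceiling on decode throughput.

    Ignores compute time and KV-cache traffic, so real-world
    throughput will be lower than this ceiling.
    """
    return bandwidth_gb_s / model_size_gb

# Hypothetical 3B-parameter model quantized to 4 bits ≈ 1.5 GB of weights.
model_gb = 1.5

# Assumed bandwidths: a current LPDDR5X-class phone vs. a hypothetical
# mobile HBM configuration several times faster.
scenarios = [
    ("LPDDR5X-class (~60 GB/s)", 60.0),
    ("hypothetical mobile HBM (~250 GB/s)", 250.0),
]
for label, bw in scenarios:
    print(f"{label}: up to ~{max_tokens_per_sec(bw, model_gb):.0f} tokens/sec")
```

Under these assumed figures, the faster memory raises the theoretical ceiling from roughly 40 to roughly 167 tokens per second for the same model, which is why bandwidth (and not just capacity) is central to on-device AI.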

The report indicates that Apple may have already discussed its plans with major memory suppliers like Samsung Electronics and SK hynix, both of which are developing their own versions of mobile HBM.

Samsung is reportedly using a packaging approach called VCS (Vertical Cu-post Stack), while SK hynix is working on a method called VFO (Vertical wire Fan-Out). Both companies aim for mass production sometime after 2026.

As always though, there are manufacturing challenges. Mobile HBM is a lot more expensive to manufacture than current LPDDR memory. It could also face thermal constraints in slim devices like iPhones, and the 3D stacking and TSVs require highly sophisticated packaging and yield management.

If Apple does adopt this technology for its 2027 iPhone lineup, it would be yet another example of the company pushing the envelope for its 20th anniversary iPhone, which is rumored to feature a completely bezel-less display that curves around all four edges of the device.

Article Link: Report: 2027 iPhones Could Adopt Advanced AI Memory Technology
 
"The technology could be key for running massive AI models on-device, such as for large language model inference or advanced vision tasks, without draining battery"

When I see "massive AI models" and "without draining battery" in the same sentence I automatically default to my wait-and-see attitude.
Wouldn't running "massive AI models" be more about the amount of memory on-device? I mean you can have super-fast memory, but if you only have 256GB in a base model phone, how does that massive model fit on device? So we all need 2TB iPhones now?
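The capacity question in the comment above is fair, and the usual answer is quantization: a model's RAM footprint is roughly parameter count times bytes per parameter, so cutting bits per weight shrinks the model as much as adding RAM would. A quick sketch with illustrative parameter counts (not any confirmed Apple model):

```python
# Approximate weight footprint of a model: parameters × bytes per parameter.
# Quantization (fewer bits per weight) is how "massive" models fit in phone RAM.
# Parameter counts below are generic examples, not any confirmed Apple model.

def model_footprint_gb(params_billions: float, bits_per_param: int) -> float:
    """Return weight storage in decimal GB, ignoring KV cache and activations."""
    return params_billions * 1e9 * bits_per_param / 8 / 1e9

for params in (3, 7, 70):
    for bits in (16, 4):
        gb = model_footprint_gb(params, bits)
        print(f"{params}B params @ {bits}-bit: ~{gb:.1f} GB")
# A 70B model at 16-bit (~140 GB) clearly won't fit on a phone;
# a 7B model at 4-bit (~3.5 GB) already can.
```

So the model's weights live in RAM, not flash storage capacity per se, and a phone-sized model is one that fits in a few GB of RAM after quantization rather than one that needs a 2TB phone.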
 
The M series Max needs all the power it can get for graphics work, engineering, and AI. If they can pull it off, Macs will benefit as well. It will raise all boats.
 
"HBM is a type of DRAM that stacks memory chips vertically and connects them via tiny vertical interconnects called Through-Silicon Vias (TSVs) to dramatically boost signal transmission speeds. It's primarily used in AI servers today, and is often referred to as AI memory due to its ability to support AI processing alongside GPUs."

HBM existed for many, many years before modern AI — AMD's R9 Fury had it back in 2015. If it's referred to as "AI memory" at all, that's just marketing catching the trend; I work in the industry and have never heard it called that, lol.

Also, I'm more interested in HBM on Macs if Apple is expanding its use. (Some people misunderstand and think Macs already have it, thanks to a wording confusion: it's just LPDDR. Apple said "high bandwidth memory," descriptively, not High Bandwidth Memory, the technology.)
 
The next few years of iPhones are going to age rapidly given the steep technology curve we will be on if Apple wishes to run such AI models on device.
 
Wouldn't Apple actually need some sort of "AI" to use this exciting (seems to be a popular word here) technology?
Apple has been using, and in some areas leading, machine learning (especially image processing, segmentation, and classification), which is an AI technology.

What you meant was LLMs, which are another subset of machine learning.
 
I don’t know what some of those technical words are, but they make me thankful to live in a time where I’ll be able to see the outcome of all this years from now. When I read tech history books and articles, it always impresses me to read about something in the 70s or 80s that makes me say, ‘Hey cool, we do that today. How could they look so far ahead?’

It’s also a reminder that you don’t have to fully understand something to appreciate its potential.
 
Let's be real, half of this stuff mentioned lately won't arrive until 2029. They won't release so many things at once; they need to keep some people updating every single year. Like me (at least almost every year).
 
AI-memory - really?! 🤣

This is simply FASTER memory, as usual. The fact that AI improves with memory speed doesn't make it "AI memory"! Every memory-driven technology will advance with this. It's just slapping an AI sticker on normal technological progress.
 