Based on the A19 Pro slides it seems like we're in for some really good things for M5 Pro / M5 Max. It's a great time for this technology across the board. Unless you're Intel, lol.

The problem with Apple's strategy, though, is that they have been pushing for faster processors by drawing more power. That's not necessarily wrong, but it makes their devices hotter, less efficient, and bulkier. They have pushed for a thinner phone with the iPhone 17, but let's see if that doesn't sacrifice cooling too much.
 
Agreed, but if the cooling cannot keep up with it, it's wasted tech, especially in something as thin as the Air.

The A19 Pro in the iPhone Air seems to be more of a "race to sleep" mechanism than a computational barn burner. Getting things done quickly and going back to 'sleep' saves power. It is also a path to offload chips with defective GPU cores and/or USB 3.0 subsystems (and instability issues when pushed very hard).

If you want to light up all the cores to do something that takes substantial time, then it can be down-clocked. Clocked back to previous-generation (A16 / A17 / A18) rates, it performs 'good enough' while using less power than those previous generations did. Performing 'good enough' at lower power consumption is not a waste of technology, especially if you have to sacrifice 'everything' just to fit the limited-size battery the device's constraints impose. The system just isn't going to win any 'long drag race' spec benchmark competitions. Apple sells other models for that.
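A rough way to see why that down-clocking strategy pays off, as a sketch with invented numbers (assuming the textbook rule of thumb that dynamic power scales with frequency times voltage squared, and voltage scales roughly with frequency, so power goes roughly with the cube of clock speed; none of this is measured A19 data):

```python
# Sketch: energy to finish a fixed task at reduced clock speeds.
# Assumes dynamic power ~ f * V^2 with V roughly proportional to f,
# i.e. power ~ f^3. Rule-of-thumb scaling, not measured A19 data.

def energy_for_task(clock_ratio, peak_power_w, task_seconds_at_peak):
    """Energy (joules) to finish a fixed workload at a fraction of peak clock."""
    power = peak_power_w * clock_ratio ** 3        # cubic power scaling
    runtime = task_seconds_at_peak / clock_ratio   # slower clock, longer run
    return power * runtime

peak_w, task_s = 7.0, 10.0  # hypothetical peak package power and task time

for ratio in (1.0, 0.85, 0.7):
    print(f"{ratio:.0%} clock: {energy_for_task(ratio, peak_w, task_s):.1f} J")
# 100% clock: 70.0 J; 85% clock: 50.6 J; 70% clock: 34.3 J
```

Under those assumptions, giving up 15% of the clock saves almost 30% of the energy, which is exactly the 'good enough at lower power' trade described above.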

The iPhone Air is more likely a 'bridge' to the fold (an even more expensive iPhone option) than the baseline for the mainstream iPhone model over the long term.

There aren't any Macs that Apple has thermally crippled to the same degree as the iPhone Air. There is little evidence so far that this will be a problem for the M-series this generation, and it is almost certainly not a problem, with decent engineering, for an A19 Pro in something the size of a MacBook Air. (If Apple can mitigate the thermal problem within an iPhone Pro chassis, something substantially larger would be even less of an issue.)
 
The problem with Apple's strategy, though, is that they have been pushing for faster processors by drawing more power. That's not necessarily wrong, but it makes their devices hotter, less efficient, and bulkier. They have pushed for a thinner phone with the iPhone 17, but let's see if that doesn't sacrifice cooling too much.
Promised myself I’d stay out but just have to say this is incorrect. They very much are not “less efficient” by any sensible reading of that phrase. Power has gone up slightly, but by less than the increase in performance.

They use more power but complete the work more quickly. Therefore they use less energy. More efficient.
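A toy calculation to pin down that definition (the numbers are invented for illustration; the point is that efficiency here means energy per completed task, not instantaneous power draw):

```python
# Efficiency as energy per completed task: energy = power x time.
# Hypothetical figures, not measurements of any particular chip.

def energy_joules(power_w, seconds):
    return power_w * seconds

slow_chip = energy_joules(power_w=4.0, seconds=60)  # lower power, slower
fast_chip = energy_joules(power_w=5.5, seconds=35)  # higher power, faster

print(slow_chip)  # 240.0 J
print(fast_chip)  # 192.5 J -- draws more power, yet uses less energy per task
```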
 
Here's the truth: Apple had a LARGE lead and blew it. When AMD started making strides on power efficiency, most people here dismissed it saying X86 is a legacy architecture.

The problem is that they STILL make progress, and now they have the AI Max+ 395, which can have up to 128GB of memory that can be used not only for gaming (it runs some games smoothly at 4K!) but also for AI.

I’m not quite sure what you are saying here. Apple CPUs are still about 2x more power-efficient than ones from AMD.

Sure, this memory won't run AI as fast as an actual dedicated 3D card. But it can be used to run large language models at an OK speed with as little as 45W.

AMD has been developing GPUs for a while, and they have a lot of experience with unified memory solutions. That said, I don’t find current Radeons or the 395+ chip compelling. You make the compatibility argument, which is fair enough, but in every other regard the system is strictly inferior to what Apple is offering.

So sure, if you care about running Windows, maybe it’s a good product. I just don’t see a Mac user getting an AMD SoC just because it runs Windows.

And regarding local ML workloads - the M5 Max is coming with MXU accelerators that can likely reach RTX 5070 performance (desktop!) for at least some data formats. That's like having a Spark superchip in your laptop - one that can actually run everyday software.

The problem with Apple's strategy, though, is that they have been pushing for faster processors by drawing more power.

Partially, yes, but that's the name of the game these days. Just a few years ago, top desktop CPUs consumed 65W at peak - now that's the sustained power draw of enthusiast mobile chips.

Apple's advantage, however, is that they have much more headroom than everyone else. So while it's true that an M1 core would draw 4 watts at most while an M4 core can draw up to 5.5 watts, total system consumption is considerably lower than the competition's while delivering superior performance.

Another point is that extracting additional performance becomes more and more difficult. It's much easier for AMD to deliver 10% better IPC, because Apple is already way ahead in IPC. Not to mention that you run into diminishing returns very quickly - the M4 is considerably wider than the M1 and uses a lot of smart tricks to extract some additional performance from each core, but these are all situational improvements.
 
They use more power but complete the work more quickly. Therefore they use less energy. More efficient.

Except there are scenarios where there isn't a deterministic task to complete, e.g. gaming. And newer Macs running hotter in these scenarios is a known issue (see e.g. Linus's YouTube channel).
 
Except there are scenarios where there isn't a deterministic task to complete, e.g. gaming. And newer Macs running hotter in these scenarios is a known issue (see e.g. Linus's YouTube channel).
Yes but that doesn’t change the fact that calling it less efficient is incorrect.
 
Yes but that doesn’t change the fact that calling it less efficient is incorrect.

It IS less efficient. You are getting a more powerful CPU with more power and heat. Nothing wrong with that; Intel does it too. But in our context, an increase in efficiency means doing more with less power (which is what Apple markets with their Apple Silicon processors).
 
It IS less efficient. You are getting a more powerful CPU with more power and heat. Nothing wrong with that; Intel does it too. But in our context, an increase in efficiency means doing more with less power (which is what Apple markets with their Apple Silicon processors).
No. It uses more power, but it completes the task in proportionally less time than the increase in power. If you use all of that power for a set amount of time, then that power gets used more quickly, but presumably you are also getting better performance in the game or task you are doing. If you reduced the performance target to that of the lower-power device, you would then get longer-lasting use.

Doing more with less power is exactly what is happening. The M1 would take much longer than the M4 to complete the task, or would require much more power (if possible, depending on the curve) to achieve the same performance level.
 
an increase in efficiency means doing more with less power

Or doing more with the same amount of power. I'm not sure if Apple is marketing "doing more with less power". They use TSMC technology and TSMC always describes their process as more speed with the same power or the same speed with less power, never "more speed with less power".

"Compared with TSMC’s industry-leading N2 process, A14 will offer up to a 15% speed improvement at the same power, or up to a 30% power reduction at the same speed."
 
Or doing more with the same amount of power. I'm not sure Apple is marketing "doing more with less power". They use TSMC technology and TSMC always describes their process as more speed with the same power or the same speed with less power, never "more speed with less power".

"Compared with TSMC’s industry-leading N2 process, A14 will offer up to a 15% speed improvement at the same power, or up to a 30% power reduction at the same speed."

If you are doing more with the same amount of power, it also means you are doing more with less power, so we mean roughly the same thing.
 
No. It uses more power, but it completes the task in proportionally less time than the increase in power. If you use all of that power for a set amount of time, then that power gets used more quickly, but presumably you are also getting better performance in the game or task you are doing. If you reduced the performance target to that of the lower-power device, you would then get longer-lasting use.

I just told you that in many of those tasks, this is irrelevant because more power does NOT mean the task will get done faster. Gaming is one of them. The task will be done when the user finishes playing, which can be at any moment they wish.

I don't understand why you are trying to fly with this definition, as by this definition, ANY processor which is more powerful is also more efficient, as it "finishes tasks faster." And if we went by that definition, we could cherry pick and say that "Intel is more efficient than Apple," because it has processors which are much more powerful on its high range.

However, those processors tend to run very hot and give a much higher power bill, which is one of the reasons people go for quieter and cooler processors in the first place.
 
I just told you that in many of those tasks, this is irrelevant because more power does NOT mean the task will get done faster. Gaming is one of them.
You told me but you are incorrect so what you are telling me is irrelevant.

Are you saying that gaming on an M4 isn’t faster than gaming on an M1? In this context, where time is not variable, better performance is “faster”. Reduce the resolution or effects to the equivalent of an M1 and you will get longer gaming, because the M4 can match the M1 at lower power usage.
I don't understand why you are trying to fly with this definition, as by this definition, ANY processor which is more powerful is also more efficient, as it "finishes tasks faster."
If it finishes the task faster by a greater proportion than the proportionate increase in power, yes. It is more efficient.
And if we went by that definition, we could cherry pick and say that "Intel is more efficient than Apple," because it has processors which are much more powerful on its high range.
No, because the extra power they use to be faster is proportionally greater than the amount by which they are faster.
However, those processors tend to run very hot and give a much higher power bill, which is one of the reasons people go for quieter and cooler processors in the first place.
Yes, partly because they are less efficient.
 
If you are doing more with the same amount of power, it also means you are doing more with less power, so we mean roughly the same thing.

Not really. Let’s say the M4 can encode a video in 1 minute at 10W. Using the example above, a 15% speed improvement at the same power means the M5 could do it in 51s at 10W. A 30% power reduction at the same speed means the M5 could do it in 1 minute at 7W (saving battery). ”Doing more with less power” according to you would mean the M5 could do it in 51s at 7W. That hasn’t been the case, and it never can be unless TSMC can make such a CPU/GPU for Apple.
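Working the example above through numerically (using the quoted TSMC-style figures; the post's 51s reads "15% faster" as 15% less time, while dividing by 1.15 gives about 52s, but the conclusion is the same either way):

```python
# Energy per task for the three scenarios above.
# Baseline: hypothetical encode in 60 s at 10 W.
base_t, base_p = 60.0, 10.0

same_power = (base_t / 1.15) * base_p             # 15% faster at 10 W
same_speed = base_t * (base_p * 0.70)             # same 60 s at 30% less power
both_at_once = (base_t / 1.15) * (base_p * 0.70)  # the "more with less" case

print(f"baseline:   {base_t * base_p:.0f} J")  # 600 J
print(f"same power: {same_power:.0f} J")       # ~522 J
print(f"same speed: {same_speed:.0f} J")       # 420 J
print(f"both:       {both_at_once:.0f} J")     # ~365 J -- the combination
                                               # TSMC does not promise
```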
 
I might be reading too much into the iPhone 17/A19, but Apple is adding the vapor chamber to the phone for a reason - it's quite possible that the A19 runs warmer than its predecessor. The implication is that the M5 will be a hotter-running chip that will need better cooling technology. Again, that's a giant assumption that may not have any basis in reality.

A valid response, to a valid possibility!

Greg Joswiak's response to Mark Spoonauer, when he asked about the transition back to Al so soon after the extreme marketing buildup to the Ti construction previously, was . . . I can't say "priceless", as much of what he said about how the density of Al vs. Ti is better suited to thermal dissipation seemed (to my eye) legit.

Having owned a 2000 Volvo S40 for a number of years, I can definitely attest to the fact that--while Al does get hot--it truly dissipates heat at a faster rate than Fe.

My only direct experience with Ti is in the construction and composition of the Flexon frames I am wont to use for my eye-wear. I am 1000% in support of these, as I've ground frames into gravel with a large truck, and I was able to bend them back to original spec.

Seriously.

As for thermals, my Flexon frames feel just as warm as my older Al frames when left on the dashboard, in the sun....
 
Interesting discussion on the A19 Pro and how its technology may influence the M5 series chips. It’s always the Pro iPhones which seem to reveal this kind of technology first.

I do hope that Apple stays with its philosophy of pursuing performance-per-watt. With these latest generations, the M3 and M4 as well, there seems to have been more of a tendency towards running warmer, which may just be them pushing the envelope a little to exploit the headroom they gained by optimising the designs in this way.

We haven’t seen Johnny Srouji in the presentation videos in a while…
 
Except there are scenarios where there isn't a deterministic task to complete, e.g. gaming. And newer Macs running hotter in these scenarios is a known issue (see e.g. Linus's YouTube channel).

And that matters how? Gamers want more performance. If the laptop's cooling system is capable of handling it, running warmer is hardly a problem.
It IS less efficient. You are getting a more powerful CPU with more power and heat. Nothing wrong with that; Intel does it too. But in our context, an increase in efficiency means doing more with less power (which is what Apple markets with their Apple Silicon processors).

If you are doing more with the same amount of power, it also means you are doing more with less power, so we mean roughly the same thing.

You appear to be confusing power, heat, and efficiency. Efficiency is usually defined as energy spent to perform a task. If the time saving went up more than power usage, your system became more efficient. Also note that the relationship between power and performance is not linear, so you can’t just assume things. You have to measure them. You are welcome to produce some numbers (and methodology used to record them), which we can discuss.

By the way, going back to your example of AMD as a showcase of improving CPU efficiency: the A19 is 30% faster in code compilation* than the AMD flagship CPU cores, despite running at a 30% lower clock. I don't know the power draw of these CPUs off the top of my head; I'd estimate them at 5-7 watts and 20-30 watts, respectively. Hope you can see where I am going with this (see the sketch below).


*I picked code compilation because it's a typical complex CPU workload with a lot of conditional branches and complex data manipulation. AMD will have an advantage on some number-crunching workloads where their wider SIMD and higher cache bandwidth can contribute.
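Plugging in those guessed numbers (the wattages above are explicitly estimates, so treat this as nothing more than an illustration of the gap):

```python
# Back-of-envelope performance per watt using the estimated figures above.
# Performance normalized: A19 = 1.3 if the AMD flagship core = 1.0.
a19_perf, a19_w = 1.30, 6.0    # midpoint of the 5-7 W guess
amd_perf, amd_w = 1.00, 25.0   # midpoint of the 20-30 W guess

print(a19_perf / a19_w)  # ~0.217 perf/W
print(amd_perf / amd_w)  # 0.04 perf/W -- roughly a 5x gap under these guesses
```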
 
And that matters how? Gamers want more performance. If the laptop's cooling system is capable of handling it, running warmer is hardly a problem.
I agree, and while I prefer cooler running machines, gaming is such that you accept the heat.
And newer Macs running hotter in these scenarios is a known issue (see e.g. Linus's YouTube channel).
I'm going to say Macs running hotter is due more to Apple's fan curve than anything else. I had an M4 Pro Mini and it was hitting the 90C range while the fans were not ramping up enough to manage the heat. Apple has largely decided quiet operation is better than cool running. With Macs Fan Control, Mac owners can manually adjust the fan speeds, or even set up their own fan curves for the most part. I opted for a Mac with more robust cooling, i.e., the Studio.
 
I'm going to say Macs running hotter is due more to Apple's fan curve than anything else. I had an M4 Pro Mini and it was hitting the 90C range while the fans were not ramping up enough to manage the heat. Apple has largely decided quiet operation is better than cool running. With Macs Fan Control, Mac owners can manually adjust the fan speeds, or even set up their own fan curves for the most part. I opted for a Mac with more robust cooling, i.e., the Studio.

To add to this, this behavior has been a cornerstone of Apple's thermal management for more than 15 years. It's perfectly safe to operate modern chips at close to 100C, with no downsides, so why waste power running loud fans if you can just engineer the system to reach peak performance at peak safe temperature?
 
It’s perfectly safe to operate modern chips at close to 100C, with no downsides
I disagree with that; there's been enough written to show that constant high heat will affect electronic components' longevity. Spikes, or short bursts of 100C, can be accommodated, but constantly sitting at 100C is not good. Plus, consider the other components that are more susceptible: if the CPU or GPU is hitting 100C, then the VRMs, SSD, and various other components are getting too hot.

There's a reason why many PCs start throttling and work to keep temps from staying at 90C or above.

What is a normal temperature for a CPU?
However, while the occasional spikes of peak temperatures to 194°F (90°C) are okay in isolated bursts, prolonged exposure can impact performance, and even damage the longevity of the chipset over time.

Safe CPU Temperature Range: What Temp Should My CPU Be?

As a generalization that might help you identify a serious problem, a CPU core temperature that is consistently going over 45- to 50-degrees Celsius while idling is possibly a cause for concern and a temperature that is consistently going over 90- to 105-degrees Celsius while under full load is probably a cause for concern (depending on what CPU you have).
 
I disagree with that; there's been enough written to show that constant high heat will affect electronic components' longevity. Spikes, or short bursts of 100C, can be accommodated, but constantly sitting at 100C is not good. Plus, consider the other components that are more susceptible: if the CPU or GPU is hitting 100C, then the VRMs, SSD, and various other components are getting too hot.

There's a reason why many PCs start throttling and work to keep temps from staying at 90C or above.

What is a normal temperature for a CPU?


Safe CPU Temperature Range: What Temp Should My CPU Be?

CPU manufacturers say that temperatures up to 105C are safe and fully covered by warranty. There is academic research showing that, while running chips at higher temperatures does reduce their lifespan, you are still looking at 30+ years of expected life running at 100C 24/7. Finally, Apple has been running their chips hot since the Intel era, and they enjoy the reputation of being one of the most reliable computer brands out there.

So I'd trust these facts more than the unsubstantiated opinions of tech journalists and bloggers. And by the way, one number I've seen in the article you linked, 65C, has significance because that's the top safe operating temperature of some desktop CPU models when measured at the heat spreader. That temperature is lower than the actual core temperature (which, again, is 105C).
 
CPU manufacturers say that temperatures up to 105C
Keyword: CPU. There's more to computers than CPUs, and if the processor is hitting 100C consistently, then the odds are high the rest of the components are too hot.

It's funny, but only on Mac sites do we see arguments about how safe 100C is for computers, whereas most other people want their PCs running as cool as possible, i.e., sub-80C while gaming.

There is academic research showing that while running chips at higher temperatures does reduce the lifespan,
Please provide these research papers. I'm not doubting you; rather, I provided information that backs up my statement, and I'd like to see what's been said on the opposite side of things.
 
I'll be honest: googling, I found these two papers, and they seem to support that conclusion, but they're over my head, so I could be wrong.

Physically based models of electromigration: From Black’s equation to modern TCAD models
The Study of the Reliability of Complex Components during the Electromigration Process

Here's one study that seems to suggest constant heat is not good:
Impact of Temperature on Intel CPU Performance
For the average system, our rule of thumb at Puget Systems is that the CPU should run around 80-85 °C when put under full load for an extended period of time. We have found that this gives the CPU plenty of thermal headroom, does not greatly impact the CPU's lifespan, and keeps the system rock stable without overdoing it on cooling. Lower temperatures are, of course, better (within reason) but if you want a target to aim for, 80-85 °C is what we generally recommend.
 
I'm proof that a GPU can run for hours at around 106C, for years, with no issues. I also had a 295X GPU; that thing was running over 100C all the time - sometimes it even reset the system because it was hitting the 110C mark - but even after 8-9 years, no issues. And I don't think dGPUs are different from CPUs in this regard, even now that they are part of the same SoC.
 
I'll be honest: googling, I found these two papers, and they seem to support that conclusion, but they're over my head, so I could be wrong.

Physically based models of electromigration: From Black’s equation to modern TCAD models
The Study of the Reliability of Complex Components during the Electromigration Process

Here's one study that seems to suggest constant heat is not good:
Impact of Temperature on Intel CPU Performance

This article is a good summary, in particular Figure 3: https://www.arepa.com/media/zi2lbdvf/arepa-whitepaper-extending-life-expectancy.pdf

The reason why 105C is a magic number is that above it, processors start aging very fast. As you can see from the graph, the life expectancy at a continuous 105C is well over 100k hours (10+ years), but running at 120C already cuts the lifespan dramatically.

In practice, and with more complex semiconductors, all these numbers will of course be lower. But for consumer devices, an expected lifespan of 10-20 years is more than adequate.
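To put rough numbers on that, here is a sketch using the Arrhenius temperature factor from Black's equation (the model in the papers linked earlier), with an assumed activation energy of 0.8 eV; real values depend on the failure mechanism:

```python
import math

# Relative electromigration lifetime from the Arrhenius factor in
# Black's equation: MTTF ~ exp(Ea / (k * T)). The current-density term
# cancels when comparing the same chip at two temperatures.
K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant, eV/K
EA_EV = 0.8                # assumed activation energy; ~0.5-0.9 eV is typical

def lifetime_ratio(t_cool_c, t_hot_c):
    """How many times longer the part lasts at t_cool_c than at t_hot_c."""
    t_cool_k = t_cool_c + 273.15
    t_hot_k = t_hot_c + 273.15
    return math.exp(EA_EV / K_BOLTZMANN_EV * (1 / t_cool_k - 1 / t_hot_k))

print(f"{lifetime_ratio(105, 120):.1f}x")  # ~2.6x shorter life at 120C vs 105C
print(f"{lifetime_ratio(80, 105):.1f}x")   # ~5.7x; cooler running buys margin
```

Under this assumption, each additional 15C multiplies the aging rate by roughly 2.5-3x, which is why 105C is treated as the ceiling.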
 