The fan curve is much too timid, allowing spikes up to 90°C before it kicks in. However, the CPU doesn't cool immediately, so it throttles anyway.

This is rather strange since according to both external reviews and my own testing, the MBP doesn't throttle at all under sustained loads — the performance is very stable over practically indefinite time periods. Temperature is at 100C, sure, but there is nothing wrong with that, since this temperature is within normal operating specs. Maybe you got a lemon?

P.S. I never actually monitored clocks, since I don't believe they are even remotely useful. Instead, I was looking at the actual performance and whether it declines over time.
 
External reviews mention throttling. I haven't seen one that stress tested the laptop without it. In fact, there's an episode of some hardware YT channel where they put the MBP15 into a freezer while it's encoding to increase performance.
 
That depends on the definition of throttling, though. I'd argue that "throttling" only applies when a CPU runs below its specified base frequency. The turbo frequencies stated by Intel are a theoretical maximum, and from all the reviews I've read, there are almost no notebooks in production which can run a current Intel CPU at full turbo for extended periods of time. Compared to a big chunk of the direct competition, the actual performance of the MacBook Pro is very good.

Also, I think the fan curve is spot on. Most of my Macs run close to 100° C when I'm doing work on them. My 2012 Mac Mini even runs at a full 100° C for multiple weeks non-stop, and after six years there are still no problems with it. From what I've experienced, 100° C is a temperature CPUs can run at without problems. I'd rather take a quieter fan than a cooler CPU in most situations, to be honest.
 
External reviews mention throttling. I haven't seen one that stress tested the laptop without it. In fact, there's an episode of some hardware YT channel where they put the MBP15 into a freezer while it's encoding to increase performance.

I think the issue is that many people are a bit dramatic when talking about throttling. Almost any computer will get faster when put into a freezer (before condensation likely kills it). My criterion is whether performance takes a dip when you push the computer to the limit, and how that performance compares to similar machines. I didn't observe any slowdown over time running multiple heavy-duty computational processes at the same time, which to me means no throttling. Can the CPU potentially run faster if you put it under ideal conditions? Most certainly. Does it have any practical relevance? Most certainly not. Cooling is limited by ambient temperature anyway, so saying that a laptop throttles because it performs better in a freezer is similar to claiming that a car engine throttles because it can run faster on an empty straight race track than in city traffic...

Similarly, saying that the MBP throttles since it runs hotter is also a logical fallacy. It runs hotter since its cooling system is optimised to be efficient. Intel CPUs can run safely at 100C without any throttling, so a cooling system that maintains the max temperature at 80C is simply cooling "too much". If running the CPU at 100C gives you the same performance as running it at 80C, there is not much reason to avoid the higher temps.

Speaking of external reviews, the only reviewer I am aware of who does in-depth reviews of sustained performance is notebookcheck.com. Specifically, they run multiple instances of Cinebench in a loop and observe how the performance develops. You'll find this test in the "Processor" section. Here are some examples:

Dell XPS with i7-7700HQ — throttles (https://www.notebookcheck.net/Dell-XPS-15-9560-i7-7700HQ-UHD-Laptop-Review.200648.0.html)

Dell Precision 15" with i7-7820HQ — throttles (https://www.notebookcheck.net/Dell-Precision-3520-i7-7820HQ-M620M-Workstation-Review.224112.0.html)

Dell Precision 17" with Xeon — stable (https://www.notebookcheck.net/Dell-Precision-7720-Xeon-P5000-4K-Workstation-Review.279706.0.html)

MBP 2017 with i7-7700HQ - stable (https://www.notebookcheck.net/Apple-MacBook-Pro-15-2017-2-8-GHz-555-Laptop-Review.230096.0.html)
 
Yeah, because everybody needs a 60° GPU. All of my friends freak out for no reason if they see anything around 90... lmao.
 

The difference is the XPS 15 can run flat out when you undervolt it and repaste/take care of the VRMs.

"Similarly, saying that MBP throttles since it runs hotter is also a logical fallacy. It runs hotter since it cooling system is optimised to be efficient. Intel CPUs can run safely at 100C without any throttling, so a cooling system that maintains the max temperature at 80C is simply cooling "too much". If running the CPU at 100C gives you the same performance than running it on 80C, there is not much reason to avoid the higher temps."

I'm saying it throttles because it hits 100C and drops clocks to lower than my XPS 15 did in the same loads. Also, saying something is cooling "too much" is, frankly, insane.
 
I'm saying it throttles because it hits 100C and drops clocks to lower than my XPS 15 did in the same loads.

Then there is probably something wrong with your MBP. I did some more testing during a break, this time using Power Gadget. I ran 4 threads with an intensive numeric simulation. This is the status after approx. 15 minutes, at which point I stopped the test. As you can see, the temperature is stable at around 99C and the clocks fluctuate between 3.35 and 3.4 GHz, which is the maximum turbo boost for the i7-6920HQ with all four cores active. It doesn't really get any better than this.

Oh, BTW, this was with the laptop unplugged from the charger, so it was running entirely off the battery. It ate about 10% of the charge during the test :)

[Attached screenshot: Intel Power Gadget readout showing the temperature and clock readings during the test]
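For anyone who wants to reproduce this kind of check without Power Gadget, here is a minimal sketch of a sustained-load test in Python: four worker processes each hammer a simple numeric kernel and print their throughput every 30 seconds, so any decline over the 15-minute run shows up directly in the numbers. The kernel, interval, and duration are illustrative placeholders, not the exact workload from the screenshot above.

```python
# Minimal sketch of a sustained-load test: spawn 4 worker processes that
# each run a small numeric kernel in a loop and report iterations per
# second every 30 seconds. Flat per-interval throughput over 15+ minutes
# means the clocks are being sustained; a steady decline would indicate
# thermal throttling. Workload and parameters are illustrative only.
import multiprocessing as mp
import time

WORKERS = 4            # one per physical core
INTERVAL = 30.0        # seconds between throughput reports
DURATION = 15 * 60     # total test length in seconds

def numeric_kernel():
    # A small floating-point workload; any CPU-bound loop will do.
    s = 0.0
    for i in range(1, 200_000):
        s += (i ** 0.5) / (i + 1)
    return s

def worker(worker_id, stop_time):
    interval_start = time.time()
    iterations = 0
    while time.time() < stop_time:
        numeric_kernel()
        iterations += 1
        now = time.time()
        if now - interval_start >= INTERVAL:
            rate = iterations / (now - interval_start)
            print(f"worker {worker_id}: {rate:.2f} kernel runs/s")
            interval_start = now
            iterations = 0

if __name__ == "__main__":
    stop = time.time() + DURATION
    procs = [mp.Process(target=worker, args=(i, stop)) for i in range(WORKERS)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
```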


Also, saying something is cooling "too much" is, frankly, insane.

Cooling for cooling's sake doesn't have much point. One cools the CPU so that it can continue delivering high performance. As I showed above, the MBP's thermal design allows it to maintain full clocks, without dropping, under intensive sustained workloads. Cooling it down any further wouldn't increase the performance and can therefore be considered wasteful. Which again shows how insanely precise the engineering is. The cooling system is designed exactly so that the CPU hits its maximum safe temperature at its peak performance.

P.S. And yes, I agree that with such precise thermal engineering you will run out of headroom if your circumstances are less than ideal, e.g. if your ambient temperature is high or if you have to run heavy loads on both the CPU and the GPU. Then again, not many laptops will do better, save for high-end workstation and gaming machines (which have a very different engineering budget).

P.P.S. Wow, that's a large picture. Is there a way to scale it down? I can't find an appropriate option.
 
The difference is the XPS 15 can run flat out when you undervolt it and repaste/take care of the VRMs.

"Similarly, saying that MBP throttles since it runs hotter is also a logical fallacy. It runs hotter since it cooling system is optimised to be efficient. Intel CPUs can run safely at 100C without any throttling, so a cooling system that maintains the max temperature at 80C is simply cooling "too much". If running the CPU at 100C gives you the same performance than running it on 80C, there is not much reason to avoid the higher temps."

I'm saying it throttles because it hits 100C and drops clocks to lower than my XPS 15 did in the same loads. Also, saying something is cooling "too much" is, frankly, insane.

The other significant factor is the dGPU: once this is also fired up and running hard, another 35W of power needs to be dissipated by the MBP's cooling system. You're always better off having some headroom in the cooling system's TDP, as this allows for variance in ambient temperature and lower fan RPMs at a given load, and of course helps to rule out throttling. In general you want your notebook/desktop to run as cool as is practicable, for obvious reasons. Running at the absolute edge of the thermal envelope, there's only one place to go...

Q-6
 
Yeah, because everybody needs a 60° GPU. All of my friends freak out for no reason if they see anything around 90... lmao.

By the way, there is another reason behind these temperature myths that I just remembered. It has to do with the difference in max temperatures for desktop CPUs and mobile CPUs. Intel has historically specified two different max temperatures: TJunction (the temperature at the CPU die sensor), which is 100C, and TCase (the temperature at the CPU heat spreader), which is usually around 65-70C. The latter temperature is not relevant to mobile CPUs, since they don't have a heat spreader! And since most of the information about "proper" CPU temperatures comes from the desktop overclocking community, we of course have a discrepancy. In fact, 90C on a mobile CPU is more or less equivalent to 60C on a desktop CPU, since the temperature at the heat spreader is obviously lower...

P.S. Intel has stopped listing the TCase temperature altogether with Kaby Lake, so I guess these things will become simpler in the future.
 
A lot of chatter about CPUs in here but I wonder if they'll upgrade to a proper 10 bit display for HDR...

Did Apple ever change the display without a redesign? I mean they went from CCFL backlit displays to LED backlit displays when moving from the original MBP to the unibody design. The retina display, i.e. higher resolution + IPS panel, happened when moving to the retina form factor. The brighter backlight and larger color space happened when moving to the current form factor. So I'm afraid we'll have to wait for the next redesign, probably somewhere around 2020, to get a new display.
 
Did Apple ever change the display without a redesign? I mean they went from CCFL backlit displays to LED backlit displays when moving from the original MBP to the unibody design. The retina display, i.e. higher resolution + IPS panel, happened when moving to the retina form factor. The brighter backlight and larger color space happened when moving to the current form factor. So I'm afraid we'll have to wait for the next redesign, probably somewhere around 2020, to get a new display.

They actually switched to LED backlights with the Mid 2007 model, so well before the unibody redesign.
 
A lot of chatter about CPUs in here but I wonder if they'll upgrade to a proper 10 bit display for HDR...

HDR has nothing to do with 10-bit panels, which "only" give you more accurate color representation (less banding). For HDR, you'll need individual control over pixel brightness. Not likely to happen before micro-OLED.
 
HDR has nothing to do with 10-bit panels, which "only" give you more accurate color representation (less banding). For HDR, you'll need individual control over pixel brightness. Not likely to happen before micro-OLED.

You don't need OLED to be able to claim that your display supports HDR. In fact, a full 10 bit color space might be enough, depending on which HDR specification you want to claim.

https://en.wikipedia.org/wiki/High-dynamic-range_video
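To put a number on the banding point from above: the only thing extra bit depth buys you is more quantization steps per channel. A tiny sketch of the arithmetic (not tied to any particular HDR specification):

```python
# Quantize a smooth brightness ramp at 8 and 10 bits per channel and count
# how many distinct steps survive. More steps per channel means smaller
# jumps between adjacent shades, i.e. less visible banding.
def distinct_steps(bits, samples=100_000):
    levels = (1 << bits) - 1                      # 255 for 8-bit, 1023 for 10-bit
    ramp = (i / (samples - 1) for i in range(samples))
    return len({round(v * levels) for v in ramp})

print("8-bit steps per channel: ", distinct_steps(8))   # 256
print("10-bit steps per channel:", distinct_steps(10))  # 1024
```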
 
You don't need OLED to be able to claim that your display supports HDR. In fact, a full 10 bit color space might be enough, depending on which HDR specification you want to claim.

Yeah, but you know, that's where all the marketing speak starts. There is a distinction between "HDR" and an "HDR standard". I haven't looked at the exact math of it, but if I understand it correctly, what the HDR standards basically do is map colors to brightness, hence also the requirement to have 10 bits per channel in the data specification (it's a different question whether the actual panel needs to support 10-bit color, no idea what the standards say there). I suppose they must also be losing some color information, since if you want to encode both color and brightness, you have to make some sacrifices (but maybe they do it in a way where some colours simply can't be brighter than others, no idea). Anyway, by using such an encoded color/brightness pairing, one can describe an image where different parts have a vastly higher brightness contrast than "normal" images. But that's just a way of encoding stuff. If you want, an HDR standard is basically a "compression specification" for color and brightness.

What I am more concerned about is how these brightness contrasts are actually displayed. What you'd really want here, as mentioned before, is individual pixel brightness control. It seems that high-end TVs achieve this either by using OLED (where brightness can be controlled individually, even though the range is usually not great) or by using multiple zonal LED backlights instead of a single uniform one. And if you don't have the physical capability to control brightness, you can resort to an array of tricks such as adjusting overall brightness per frame (so that brighter scenes appear much brighter) and/or changing colors (tone mapping) to exploit the particularities of human color and brightness perception.

The latter tricks have been employed by games for quite some time now, since all you need is an appropriate shader. You don't actually need to support any of these HDR standards or have any special hardware. Still, I don't see how all this is more than a clever hack, similar to how we used to employ palettes to pretend we had a lot of colors (while in fact we had only 256 or fewer). "Real HDR" means a floating-point color channel specification with a proper dynamic range and hardware that can actually display the full range of brightness in each pixel individually. Once we have tech that allows every single pixel to go between 0 and 5000 nits (or more), we'd really have real HDR. And only little need for HDR standards :)
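To make the tone-mapping "trick" a bit more concrete, here is a toy sketch that squashes floating-point scene luminance into the 0..1 range a conventional panel can show, using the classic extended Reinhard curve. The 5000-nit white point and 100-nit paper white are just illustrative assumptions, not values from any particular standard.

```python
# Toy tone mapper: map scene luminance in nits to a 0..1 display value
# using the extended Reinhard operator, so that white_nits lands at 1.0.
def tone_map(nits, white_nits=5000.0, paper_white=100.0):
    l = nits / paper_white                 # luminance relative to "paper white"
    l_white = white_nits / paper_white     # the value that should map to 1.0
    return l * (1.0 + l / (l_white * l_white)) / (1.0 + l)

for nits in (1, 10, 100, 1000, 5000):
    print(f"{nits:>5} nits -> display value {tone_map(nits):.3f}")
```

The whole wide brightness range gets compressed into whatever the panel can actually show, which is exactly why it is a hack rather than real per-pixel HDR.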

P.S. It is very much possible that I completely misunderstood what the HDR standards do, so feel more than free to correct me if what I wrote is total nonsense (it often is :D)

P.P.S. And of course Apple already offers APIs to work with HDR (which they more humbly and adequately call EDR): https://developer.apple.com/library...OSX/WhatsNewInOSX/Articles/MacOSX10_11_2.html last item
 
Yeah, but you know, that's where all the marketing speak starts. There is a distinction between "HDR" and an "HDR standard". I haven't looked at the exact math of it, but if I understand it correctly, what the HDR standards basically do is map colors to brightness, hence also the requirement to have 10 bits per channel in the data specification (it's a different question whether the actual panel needs to support 10-bit color, no idea what the standards say there). I suppose they must also be losing some color information, since if you want to encode both color and brightness, you have to make some sacrifices (but maybe they do it in a way where some colours simply can't be brighter than others, no idea). Anyway, by using such an encoded color/brightness pairing, one can describe an image where different parts have a vastly higher brightness contrast than "normal" images. But that's just a way of encoding stuff. If you want, an HDR standard is basically a "compression specification" for color and brightness.

What I am more concerned about is how these brightness contrasts are actually displayed. What you'd really want here, as mentioned before, is individual pixel brightness control. It seems that high-end TVs achieve this either by using OLED (where brightness can be controlled individually, even though the range is usually not great) or by using multiple zonal LED backlights instead of a single uniform one. And if you don't have the physical capability to control brightness, you can resort to an array of tricks such as adjusting overall brightness per frame (so that brighter scenes appear much brighter) and/or changing colors (tone mapping) to exploit the particularities of human color and brightness perception.

The latter tricks have been employed by games for quite some time now, since all you need is an appropriate shader. You don't actually need to support any of these HDR standards or have any special hardware. Still, I don't see how all this is more than a clever hack, similar to how we used to employ palettes to pretend we had a lot of colors (while in fact we had only 256 or fewer). "Real HDR" means a floating-point color channel specification with a proper dynamic range and hardware that can actually display the full range of brightness in each pixel individually. Once we have tech that allows every single pixel to go between 0 and 5000 nits (or more), we'd really have real HDR. And only little need for HDR standards :)

P.S. It is very much possible that I completely misunderstood what the HDR standards do, so feel more than free to correct me if what I wrote is total nonsense (it often is :D)

P.P.S. And of course Apple already offers APIs to work with HDR (which they more humbly and adequately call EDR): https://developer.apple.com/library...OSX/WhatsNewInOSX/Articles/MacOSX10_11_2.html last item

I do not disagree with you at all. This is what HDR ideally should be. However, due to marketing and the low requirements for putting an "HDR" sticker on a box, there are tons of manufacturers who market cheap edge-lit LED 4K TVs and even some monitors with TN panels as "HDR". I've seen these "HDR" TVs in person, and to be honest, it's just marketing.

Obviously there is the other end of the spectrum, with some OLED TVs showing what is possible today, and they truly look stunning. The point I want to make is: a display is not any better just because it is "HDR certified" or whatever the marketing says. And I'm pretty sure Apple could find some creative way to market all of their displays as "HDR". But I don't think that's how Apple wants to market their products - if they promote a better screen, it usually actually is better.

And I do indeed think that the screens of the current MacBook Pros are some of the best screens on the market. From what I have seen, it's actually the brightest screen on any mainstream notebook (500 nits vs. 250 – 450 nits on most competing notebooks). I don't think we'll see 5000 nits anytime soon.

Oh, and one point to add on the full-LED with local dimming technique: I do still have an LG LX9500 TV. This was the highest end TV LG offered back in 2010, and it does indeed support local dimming with its full LED backlight. However, the zones defined are relatively large, so you do get quite a bit of bleeding around light sources in dark scenes. It's still a great TV, but full LED local dimming just isn't up to OLED when it comes to displaying dark scenes if the dimming zones are larger than one pixel.
 
My last MacBook Pro was stolen by some people in Missoula, Montana a couple of years ago, so I am looking to replace it this year.
 