
blackreplica (macrumors regular, Original poster) - Sep 28, 2010
I suppose we all know by now that the 2018 MBP has problems with heat dissipation. Since I use my MBP mostly on my desk in clamshell mode driving an external monitor, I started noticing slightly reduced performance compared to using it on its own, so I decided to measure the difference. Using DxO PhotoLab, I exported five 42-megapixel images with PRIME noise reduction (a heavy CPU test, requiring roughly a full minute per file), which pegs the CPU at full load until the export completes.

Summary of results in the attached image but for my use case:

- Slightly lower initial boost clock
- Lower sustained clock (dropping even below base clock after a while)
- Lower CPU package power draw, and consequently lower sustained temperatures, in clamshell mode (maybe because the dGPU is active with an external monitor?)
- An 8% increase in export times in clamshell mode
- Performance above base clock is therefore not guaranteed in clamshell mode, even on the i7 2.6 under a maxed CPU-only load

[Attached image: summary of benchmark results]


I did not expect that large a performance hit, so I thought I'd share it here. In real-world use there is a small but noticeable difference in snappiness and overall speed between using the laptop on its own and using it in clamshell mode with an external display, particularly under heavy workloads.

For the sake of completeness, I decided to run the same test with the screen open while connected to my external monitor. Giving it room to breathe does next to nothing: the same export finishes in 4m25s, a whole second faster than in clamshell mode. Clocks also drop below base clock after a while, just as they did in clamshell mode.
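For anyone who wants to repeat this kind of comparison without owning DxO PhotoLab, here's a rough Python stand-in. The workload below is arbitrary (it is not the OP's actual export job); the idea is just to time repeated CPU-bound batches so you can compare an open-lid run against a clamshell run, and to see whether later batches slow down as the machine heats up:

```python
import time

def cpu_batch(n=2_000_000):
    # Stand-in for one "export": a fixed amount of pure-CPU work.
    total = 0
    for i in range(n):
        total += i * i
    return total

def run_trial(batches=5):
    # Time each batch; rising times over the run suggest the
    # sustained clock is dropping (thermal throttling).
    times = []
    for _ in range(batches):
        start = time.perf_counter()
        cpu_batch()
        times.append(time.perf_counter() - start)
    return times

times = run_trial()
slowdown = times[-1] / times[0]  # > 1.0 means later batches ran slower
```

Run it once with the lid open and once in clamshell mode, then compare the total times and the last-vs-first ratio between the two runs. (On macOS, `sudo powermetrics --samplers cpu_power` alongside it will show the package power and frequency directly.)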
 
  • Like
Reactions: mh` and Zxxv
I think it's because of thermals. I did some simple tests with my 13" and found that on its own display, the CPU package uses 0.5W at idle, whereas in clamshell mode it uses at least 4W. That means roughly 10% of your total thermal budget goes into just running the display output.

I'm guessing it's because the Type-C ports don't connect directly to the GPU; they go to the CPU, which has to route the video signal to them, causing additional power draw.

Also, in clamshell mode the top case is insulated by the air gap between the screen and the keyboard, whereas with the laptop open the top case can also dissipate some heat. So for maximum performance, you actually want to run the machine open with no external monitors.

The problem could be worse on the 15" because not only does the dGPU need to be powered up, it also has no direct connection to the ports: the graphics have to be rendered on the dGPU, sent to the CPU, and then routed to the ports.
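The "display output eats thermal headroom" point above is simple arithmetic. A minimal sketch, assuming the 28W nominal TDP of the quad-core 13" chips (an assumption on my part; the idle draws are the rough measurements quoted above):

```python
def display_overhead_pct(idle_internal_w, idle_clamshell_w, tdp_w):
    """Share of the CPU's thermal budget consumed just by
    driving an external display."""
    return (idle_clamshell_w - idle_internal_w) / tdp_w * 100

# 13" figures from the measurements above: 0.5W idle on the internal
# display, ~4W idle in clamshell, against an assumed 28W TDP.
pct_13 = display_overhead_pct(0.5, 4.0, 28.0)
print(f"{pct_13:.1f}% of the thermal budget")  # 12.5%
```

With these numbers the share comes out at 12.5%, in the same ballpark as the ~10% quoted above; the exact figure depends on which TDP you count against.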
 
In case you are new to computers, clamshell mode has always caused thermal throttling in every laptop that is under 4cm thick and doesn’t have massive fans.

Try elevating the laptop or using a Book Arc type stand. People have used cooling methods like that on laptops for over a decade and that is not going to change for the next decade.
 
I think that both Apple and its customers are going to be happy when MacBooks finally have A class processors in them.

Apple obviously want to ship powerful processors that are power efficient, don't run hot, and allow for a sleek chassis - and Intel is the Achilles' heel here.
 
I never really understood the appeal of running a MBP in clamshell mode for more intensive workloads. Of all the Macs, the iMac is relatively good value for a complete system with a great screen, and it offers more performance. With an iMac for intensive tasks, a cheaper MBP then makes sense for work on the go. It also gives you some redundancy should one machine fail on you.
 
  • Like
Reactions: haruhiko
I never really understood the appeal of running a MBP in clamshell mode when workload is more intensive.
It's funny, I'm seeing lower temps when I use it for my work-related stuff (nothing demanding). I usually connect remotely to my work computer, but I'm seeing temps in the 47°C range in clamshell mode and about 57°C with the screen up. I was actually surprised to see those temps.

Feeling the screen after being in clamshell mode reveals that it's warm - not hot, but warm. I'm not sure about the long-term effects that will have, so I'm not sure I'll keep doing it.
 
For the sake of completeness, I decided to run the same test with the screen open while connected to my external monitor. Giving it room to breathe does next to nothing: the same export finishes in 4m25s, a whole second faster than in clamshell mode. Clocks also drop below base clock after a while, just as they did in clamshell mode.

I'll update the original post
 
  • Like
Reactions: airlied
For the sake of completeness, I decided to run the same test with the screen open while connected to my external monitor. Giving it room to breathe does next to nothing: the same export finishes in 4m25s, a whole second faster than in clamshell mode. Clocks also drop below base clock after a while, just as they did in clamshell mode.

I'll update the original post

So actually it's not clamshell mode that's causing the reduced performance, but just connecting it to an external display...
 
I think that both Apple and its customers are going to be happy when MacBooks finally have A class processors in them.

Apple obviously want to ship powerful processors that are power efficient, don't run hot, and allow for a sleek chassis - and Intel is the Achilles' heel here.

Don’t think Apple customers as a whole will ever be happy. The move to A class processors will likely bring teething problems and some specific performance challenges which will likely trigger a demand for movement back to Intel by a portion of the user base.
 
So actually it's not clamshell mode that's causing the reduced performance, but just connecting it to an external display...

Pretty much. Running the dGPU even on an idle desktop takes away from what little thermal headroom remains, such that the processor will throttle below base clock at full load.
 
Don’t think Apple customers as a whole will ever be happy. The move to A class processors will likely bring teething problems and some specific performance challenges which will likely trigger a demand for movement back to Intel by a portion of the user base.

With ARM's recently released roadmap, they are claiming that they're going to get to i5 (mobile) levels of performance by 2020, but at 5W. I suspect that we'll see the MacBook move over first, then the iMac, and that anything with 'pro' in the name will be last.
 
Don’t think Apple customers as a whole will ever be happy. The move to A class processors will likely bring teething problems and some specific performance challenges which will likely trigger a demand for movement back to Intel by a portion of the user base.

I think the burden falls more on Apple. Take the Mac mini, for example: they have had FOUR years to perfect and test the design. With the rumors of a new model, all I can say is that considering how much time they've had to get it right, the thing had better run like a Swiss watch.
 
I don't think we will see Apple transition to ARM processors, because they have a different instruction set that would make Boot Camp impossible. This would alienate the people who rely on being able to dual-boot Windows and macOS. Don't get me wrong, having ARM processors that can perform as well as Intel processors while using just 5W sounds amazing.
 
I don't think we will see Apple transition to ARM processors, because they have a different instruction set that would make Boot Camp impossible. This would alienate the people who rely on being able to dual-boot Windows and macOS. Don't get me wrong, having ARM processors that can perform as well as Intel processors while using just 5W sounds amazing.

I agree it likely won't be happening anytime soon... But Windows 10 does have an ARM variant.
 
I don't think we will see Apple transition to ARM processors, because they have a different instruction set that would make Boot Camp impossible. This would alienate the people who rely on being able to dual-boot Windows and macOS. Don't get me wrong, having ARM processors that can perform as well as Intel processors while using just 5W sounds amazing.

Windows is slowly heading towards running on ARM too - I mean it already does, it's just that (by all accounts) it's pretty slow, with a thin library of third-party apps ported to ARM.

I would suspect that Pro Macs are only going to start to make the transition around 2022 or so & by that time it’s likely that:

- Windows and many of its common 3rd party apps will be running great on ARM
- the ARM MacBook will almost certainly have launched by 2020, and people will be envious of its battery life and that it's '5G ready'
- 'marzipan' will become the standard way of making new Mac apps, and you can bet that Apple will make it seamless for marzipan Intel apps to compile to ARM.
- finally, Mac gaming will turn into either iPad marzipan apps and/or streaming.

Note: I suspect the next few years are going to see pc gaming using ultra high specced machines that draw insane amounts of power (even more than now). This is so far from where Apple is going.

So I think anyone buying a Mac and an eGPU to run PC games at ultra 5K settings will be buying a machine that just isn't designed to do that.

In short, I get people’s concerns. I just don’t think that this will be a problem in a few years.
 
Note: I suspect the next few years are going to see pc gaming using ultra high specced machines that draw insane amounts of power (even more than now). This is so far from where Apple is going.

And why would you think that? Your typical gaming CPU nowadays is 65W~95W, compared to 100W~130W in the past, and the just-announced RTX 2080 is down to 215W from 250W.

My gaming rig from 2013 is pushing 250W on the CPU and 300W on the GPU. I can get the same performance today with a 95W CPU and 150W GPU.
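The generational comparison in that post works out like this (the wattages are the poster's own figures, not independent measurements):

```python
old_cpu_w, old_gpu_w = 250, 300   # 2013 rig, per the post
new_cpu_w, new_gpu_w = 95, 150    # claimed modern equivalent

old_total = old_cpu_w + old_gpu_w                  # 550W
new_total = new_cpu_w + new_gpu_w                  # 245W
reduction_pct = (1 - new_total / old_total) * 100  # ~55% less power

print(f"{old_total}W -> {new_total}W ({reduction_pct:.0f}% reduction)")
```

So by those numbers, the same performance now draws well under half the power it did five years earlier.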
 
And why would you think that? Your typical gaming CPU nowadays is 65W~95W, compared to 100W~130W in the past, and the just-announced RTX 2080 is down to 215W from 250W.

My gaming rig from 2013 is pushing 250W on the CPU and 300W on the GPU. I can get the same performance today with a 95W CPU and 150W GPU.

Hi, I mean that Apple are seemingly focussed on making small devices with low power consumption & the typical gaming rig’s power consumption is way higher than that.

I’m not saying if this is good or bad - it just seems to be that Apple is focussed on thin/small forms & power/battery efficiency, that’s all.
 
Hi, I mean that Apple are seemingly focussed on making small devices with low power consumption & the typical gaming rig’s power consumption is way higher than that.
Coming from a Razer whose power consumption is higher, whose power brick is bigger, and which I couldn't charge over USB: I struggled to get 5 hours of battery on the Razer, whereas the MBP's battery life is much, much better.

Two different laptops that serve two different market segments and do different things.
 
  • Like
Reactions: bluecoast