
omvs

macrumors 6502
Original poster
May 15, 2011
495
20
As part of a machine shuffle, I'm debating a 15" rMBP, but I'd be using it with an external monitor at work, and I have concerns about the power consumption:

I've noticed my 2011 Mac mini (ATI 6630) gets noticeably hotter when an external monitor is connected, even if the machine isn't doing anything. iStat Menus seems to show this is all GPU power dissipation - rather than powering down, the GPU seems to stick at ~8 watts.

I did some further tests, and just hooking a monitor to the Thunderbolt port causes the power to go up (even with HDMI unconnected). Connecting via HDMI only gives lower power.

My 2011 iMac (ATI 6970) exhibits similar behavior, though the increase in power is more like 20+ watts - enough that I've made it easy to disconnect the monitor so the machine runs cooler when I don't need the second monitor.

Do the newer machines solve this issue? I'm hoping it was just a driver or hardware issue with that year's ATI cards, but if it still happens I might shy away from the MacBook - I'd prefer not to have the GPU working hard all the time...
 

ValSalva

macrumors 68040
Jun 26, 2009
3,783
259
Burpelson AFB
As part of a machine shuffle, I'm debating a 15" rMBP, but I'd be using it with an external monitor at work, and I have concerns about the power consumption:

I've noticed my 2011 Mac mini (ATI 6630) gets noticeably hotter when an external monitor is connected, even if the machine isn't doing anything. iStat Menus seems to show this is all GPU power dissipation - rather than powering down, the GPU seems to stick at ~8 watts.

I did some further tests, and just hooking a monitor to the Thunderbolt port causes the power to go up (even with HDMI unconnected). Connecting via HDMI only gives lower power.

My 2011 iMac (ATI 6970) exhibits similar behavior, though the increase in power is more like 20+ watts - enough that I've made it easy to disconnect the monitor so the machine runs cooler when I don't need the second monitor.

Do the newer machines solve this issue? I'm hoping it was just a driver or hardware issue with that year's ATI cards, but if it still happens I might shy away from the MacBook - I'd prefer not to have the GPU working hard all the time...

As far as the 15" rMBP is concerned, isn't the dGPU used when an external display is connected? I believe it's automatic. That would use more power and thus generate more heat.
 

NewishMacGuy

macrumors 6502a
Aug 2, 2007
636
0
While I have a 2011 uMBP, I believe the way they drive the ATD is the same: they both use the dGPU when attached to the ATD. I suppose you could use gfxCardStatus to switch to the iGPU if you wanted, which should reduce both power consumption and heat. Other than that, it will use more power and get hot.


 

ssmed

macrumors 6502a
Sep 28, 2009
875
413
UK
As far as the 15" rMBP is concerned, isn't the dGPU used when an external display is connected? I believe it's automatic. That would use more power and thus generate more heat.

You are correct (also for the previous cMBP). You can check in the 'About This Mac' System Report, which will show whether you are using the discrete card or not.
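You can also check from the terminal with `system_profiler SPDisplaysDataType`: whichever chipset has a `Displays:` section under it is the one actually driving the screens. A minimal Python sketch of that check - note the sample output below is illustrative, not captured from a real machine:

```python
import re

# Assumed/illustrative excerpt of `system_profiler SPDisplaysDataType`
# output on a 15" rMBP; real output varies by model and OS version.
SAMPLE = """\
Graphics/Displays:

    Intel HD Graphics 4000:

      Chipset Model: Intel HD Graphics 4000

    NVIDIA GeForce GT 650M:

      Chipset Model: NVIDIA GeForce GT 650M
      Displays:
        Color LCD:
          Main Display: Yes
"""

def active_gpus(report):
    """Return the chipset names that have a Displays section attached."""
    gpus, current = [], None
    for line in report.splitlines():
        m = re.match(r"\s+Chipset Model: (.+)", line)
        if m:
            current = m.group(1)
        elif re.match(r"\s+Displays:", line) and current:
            gpus.append(current)
    return gpus

print(active_gpus(SAMPLE))  # → ['NVIDIA GeForce GT 650M']
```

On the sample above, only the nVidia chip has displays attached, which matches what others are reporting about the rMBP with an external monitor.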
 

priitv8

macrumors 601
Jan 13, 2011
4,038
641
Estonia
A 15" rMBP will definitely switch to the dGPU (nVidia) when an external monitor gets connected (doesn't matter if TB or HDMI). One can force the machine to use the iGPU with gfxCardStatus, but as soon as you do, the external monitors go blank. I suspect the HDMI/DP connection is physically wired to the nVidia only, and the gMux that was used to switch the DP output between iGPU and dGPU in the 2008 MBP is no longer present in the system.
 
Last edited:

thundersteele

macrumors 68030
Oct 19, 2011
2,984
9
Switzerland
As others have said, connecting an external display activates the discrete GPU, which increases power consumption and the amount of heat generated, even when idle. I'm not sure if the nVidia GPUs do a better job, at least with idle temperatures, but my 2011 ATI GPU heats up the Mac quite a bit.

Is it a problem? When connected to an external display, power usually isn't a concern. However, if you don't use an external keyboard, I can see it being uncomfortable.
 

omvs

macrumors 6502
Original poster
May 15, 2011
495
20
Is it a problem? When connected to an external display, power usually isn't a concern. However, if you don't use an external keyboard, I can see it being uncomfortable.

I'm okay with it activating the discrete GPU, as long as it has a reasonable idle power. From what I can tell, the GPU goes to near-maximum dissipation on my iMac and Mac mini and stays there (it's similar power to running a game) - maybe they're just deactivating all the power-saving features. With the 6970 in the iMac, that's a lot of heat coming out, and I'm worried about decreasing the lifespan of the GPU. It does make the room noticeably hotter...

I'm assuming the GT 650M can dissipate a fair amount of power, and in a laptop package I wouldn't want to do that all the time. Maybe I need to borrow a machine, install iStat, and see how it behaves...
 

dusk007

macrumors 68040
Dec 5, 2009
3,411
104
Usually GPUs don't enter the lowest power states when an external display is connected, so they generally run a lot hotter even when they aren't doing anything.
I know my MBP 2010 runs quite a lot hotter when an external display is connected, even if in both cases the 330M is active and doing nothing but displaying static content.
With the 330M and no external display, I need to put at least a medium load on the CPU for the fans to ramp up significantly. With an external display, anything but a completely idle workload will push the fans into the upper gears.
With the Intel GPU running it is obviously coolest, and it takes a really high load to get noisy.

In general, though, you should only use an external display for extended periods with the notebook plugged in, so it doesn't really matter.
And compared to other household devices, a notebook needs very little power even with the dGPU active. The 27" Thunderbolt Display will consume 2-3 times as much unless the notebook is under heavy load. A bigger TV consumes way more. A stove uses more in one hour than the notebook will in 200 hours. And that's before comparing what a car needs idling for just one minute at a traffic light, or what one seat on a single flight to your holiday destination costs.
Seriously, if you are concerned about the environment, the notebook is not where you should focus your attention. Skip one holiday flight and you cannot be wasteful enough with your tech toys to ever make up for it; skip one drive to a diner and you can operate your toys for a year.
Buy a smaller TV and more efficient household appliances. Heating and AC are usually where optimization reaps the most benefit.
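The stove comparison holds up as rough arithmetic. Assuming ~20 W of extra draw for the dGPU (the figure from earlier in the thread) against a ~2 kW stove burner - both figures are ballpark assumptions:

```python
# Rough figures, taken from the thread / assumed for illustration.
EXTRA_GPU_W = 20    # extra draw with the dGPU active (OP's iMac estimate)
STOVE_W = 2000      # one stove burner, ballpark

# Hours the dGPU overhead could run on one stove-hour of energy:
gpu_hours_per_stove_hour = STOVE_W / EXTRA_GPU_W
print(gpu_hours_per_stove_hour)   # → 100.0

# Energy cost of a full 8-hour workday of dGPU overhead, in kWh:
workday_kwh = EXTRA_GPU_W * 8 / 1000
print(workday_kwh)                # → 0.16
```

So a whole workday of dGPU overhead is about 0.16 kWh - a few minutes of stove time.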
 

thundersteele

macrumors 68030
Oct 19, 2011
2,984
9
Switzerland
I'm okay with it activating the discrete GPU, as long as it has a reasonable idle power. From what I can tell, the GPU goes to near-maximum dissipation on my iMac and Mac mini and stays there (it's similar power to running a game) - maybe they're just deactivating all the power-saving features. With the 6970 in the iMac, that's a lot of heat coming out, and I'm worried about decreasing the lifespan of the GPU. It does make the room noticeably hotter...
I thought you mentioned something like 20 watts? That is certainly not enough to heat the room. The second screen might be producing more heat than the GPU.

I wouldn't worry about the GPU lifespan. It is designed for this kind of stress. It is like worrying that a ship might sink because you put it in the water. I still understand that the additional heat and fan noise is an unwanted effect.

I'm assuming the GT 650M can dissipate a fair amount of power, and in a laptop package I wouldn't want to do that all the time. Maybe I need to borrow a machine, install iStat, and see how it behaves...

It might be worth trying.

Another option would be to look for a machine without discrete GPU.
 

NewishMacGuy

macrumors 6502a
Aug 2, 2007
636
0
I thought you mentioned something like 20 watts? That is certainly not enough to heat the room...

If you can feel the heat radiating off of your MBP, it's heating up the room. If you don't run an AC and the room is smallish, it's noticeable over time.


 

Yahooligan

macrumors 6502a
Aug 7, 2011
965
114
Illinois
If you can feel the heat radiating off of your MBP, it's heating up the room. If you don't run an AC and the room is smallish, it's noticeable over time.



The ATD is going to put off much more heat than the MBP. That's like worrying about a candle heating up a room while you have a fire going in the fireplace. ;)

If the concern is power/heat, the external display is going to overshadow whatever little extra the GPU causes.
 

NewishMacGuy

macrumors 6502a
Aug 2, 2007
636
0
The ATD is going to put off much more heat than the MBP. That's like worrying about a candle heating up a room while you have a fire going in the fireplace. ;)

If the concern is power/heat, the external display is going to overshadow whatever little extra the GPU causes.

Not sure on that one. Just because something uses more power doesn't necessarily mean that it puts out more heat, just that it has the potential to do so. It really depends on the efficiency of the device (heat is wasted power for these devices).

I feel a lot more heat radiating from my MBP when the dGPU is fired up than I do from the ATD, especially when you consider that the ATD vents nearly all its heat out the bottom, whereas the entire uMBP case acts as a heat sink.


 

Yahooligan

macrumors 6502a
Aug 7, 2011
965
114
Illinois
Not sure on that one. Just because something uses more power doesn't necessarily mean that it puts out more heat, just that it has the potential to do so.

Power = heat; how the heat is dissipated doesn't change the amount of heat produced.

It really depends on the efficiency of the device (as heat is wasted power for these devices).

No, these are not mechanical devices where heat means wasted energy or reduced efficiency. These are electronic devices with no moving parts; if they consume 20 W of power, that 20 W ends up as heat.
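As a worked example of the power = heat point (using the ~20 W figure from earlier in the thread, purely for illustration), here's what 20 W of sustained dissipation comes to over an hour:

```python
# Essentially all electrical power drawn by a GPU or display ends up as heat.
POWER_W = 20                     # the ~20 W figure from earlier in the thread
SECONDS_PER_HOUR = 3600

joules_per_hour = POWER_W * SECONDS_PER_HOUR   # energy dissipated in 1 h
btu_per_hour = joules_per_hour / 1055.06       # 1 BTU ≈ 1055.06 J

print(joules_per_hour)           # → 72000
print(round(btu_per_hour, 1))    # → 68.2
```

About 68 BTU/h - real, but small next to a person (~250 BTU/h) or a space heater.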

I feel a lot more heat radiating from my MBP when the dGPU is fired up than I do from the ATD, especially when you consider that the ATD vents nearly all its heat out the bottom, whereas the entire uMBP case acts as a heat sink.



I have an ATD and it puts out MUCH more heat than my MBP. It doesn't matter how the heat is transferred away from the parts, that's just passive vs. active cooling.

I have a laser/IR thermometer.

Consider the surface area of both the MBP and the ATD. My MBP has much less surface area, and the hottest spot on it is just under 100F, limited to a small area. The rest of it is 85F-95F or so.

My ATD is much larger; the center of the display, which is the coolest spot, is 103F. The top of the display is 111F, the bottom 108F.

Heat radiates upward; I didn't even need to use my thermometer to know the ATD puts out more heat. All you have to do is place your hand above the top of the display and above the MBP. You tell me which is radiating more heat. ;)

The dGPU does not put out more heat than the ATD.

So, I stick by my statement. The ATD will overshadow whatever extra power/heat is generated by the GPU when connected to an ATD.
 

dusk007

macrumors 68040
Dec 5, 2009
3,411
104
Removing yourself from the room would also cut the heat radiated into it - by about 50-75 W. Don't be in the room and it is much better off.
Anyway, if the external screen is any bigger than 21", it is going to radiate more than the notebook unless the latter is under medium/heavy load.
 

omvs

macrumors 6502
Original poster
May 15, 2011
495
20
...
Anyway, if the external screen is any bigger than 21", it is going to radiate more than the notebook unless the latter is under medium/heavy load.

That's the problem - the GPU is effectively going into medium/heavy load for what seems like a stupid reason -- though perhaps the Thunderbolt interface is making this harder for some reason..

Sure, it *should* be able to handle it, but I think the GPU designers cut more corners than the CPU designers. I've had a few desktop GPUs that I had to *underclock* after 1-2 years to keep them stable. Add in fans that go bad, a GPU in a MacBook Pro that I cooked, and my knowledge of what goes on inside the chip (I do chip design by day), and I have plenty of reasons to be paranoid.

Anyway, sounds like I'll just have to get a machine and test.

Thx
 