
Quinoky

macrumors regular
Original poster
Sep 18, 2011
179
0
Groningen, Netherlands
Hi guys,

According to the developers' own FAQ, HandBrake should not utilize any of the GPU's processing abilities at all. However, when converting a film to be watched on my Apple TV 2, HandBrake forces the dedicated GPU to turn on, increasing the heat and noise output of the machine. If HandBrake does not require any of the GPU's capabilities, why is it forcing the dedicated GPU on anyway? Screenshot and MBP specs below.
 

Attachments

  • Screen Shot 2011-12-27 at 9.42.47 PM.jpg

thundersteele

macrumors 68030
Oct 19, 2011
2,984
8
Switzerland
I doubt it makes a difference as far as heat and noise are concerned. You could force the integrated GPU and see if it makes a difference. As long as there is no load on the GPU, it will not generate a notable amount of heat.

As to why it activates the dedicated GPU, I don't know. In the end it is OSX that decides which GPU to use, but I don't understand how it makes this decision.
 

macdudesir

macrumors 6502
Jan 16, 2011
357
76
Blacksburg, VA
I doubt it makes a difference as far as heat and noise are concerned. You could force the integrated GPU and see if it makes a difference. As long as there is no load on the GPU, it will not generate a notable amount of heat.

As to why it activates the dedicated GPU, I don't know. In the end it is OSX that decides which GPU to use, but I don't understand how it makes this decision.

It seems like it switches over any time the integrated GPU is used more than 5% lol. :rolleyes:
 

GuitarG20

macrumors 65816
Jun 3, 2011
1,020
1
Random programs call APIs in the OSX system that cause it to turn the dedicated GPU on. Many programs that don't need it do it anyway (see picture):

[Screenshot: Screen Shot 2011-11-02 at 3.45.37 PM.png]


Basically, if you don't want the dedicated GPU on, force integrated. Just about everything works fine with just the integrated (I only turn the dedicated one on for gaming).

----------

I doubt it makes a difference as far as heat and noise are concerned. You could force the integrated GPU and see if it makes a difference. As long as there is no load on the GPU, it will not generate a notable amount of heat.

My MBP always runs 5-10°C hotter (yes, even on the CPU) when I have the dedicated chipset on... I attribute it to the shared heatsink.
 

thundersteele

macrumors 68030
Oct 19, 2011
2,984
8
Switzerland
My MBP always runs 5-10°C hotter (yes, even on the CPU) when I have the dedicated chipset on... I attribute it to the shared heatsink.

You're right. I just ran an idle-vs-idle test: I forced the dedicated GPU on while browsing. Power consumption goes up by some 5 W, and the CPU temperature also increases by about 5°C.

Under load this should be different. If the GPU remains idle while the CPU is fully loaded, the temperature difference should be small, but maybe still measurable; in that case it's usually the CPU heating up the GPU through the heatsink. Finally, when both are under load, I guess the CPU temp will be a bit higher again, due to the shared heatsink, as you pointed out.
 

Quinoky

macrumors regular
Original poster
Sep 18, 2011
179
0
Groningen, Netherlands
Under load this should be different. If the GPU remains idle while the CPU is fully loaded, the temperature difference should be small, but maybe still measurable; in that case it's usually the CPU heating up the GPU through the heatsink. Finally, when both are under load, I guess the CPU temp will be a bit higher again, due to the shared heatsink, as you pointed out.

In my totally unscientific and probably immensely subjective little research, I've established that:

1. Under heavy CPU load, with the dedicated GPU turned on, fan speed remains at 6200RPM.

2. Under heavy CPU load, with the dedicated GPU turned off, fan speed turns up to 6200RPM, then slowly decreases to a stable ~5000RPM.

I think I can therefore conclude that the sustained higher RPM in case 1 means the machine is dealing with more heat output from the CPU and GPU combined than in case 2. This was all tested with a HandBrake conversion, forcing the GPU on/off with gfxCardStatus.
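For anyone repeating a test like this, one crude way to double-check which GPU is actually active at a given moment (besides glancing at the gfxCardStatus menu icon) is to see which GPU system_profiler lists a display under. Below is a rough Python sketch, assuming the usual SPDisplaysDataType output layout on a dual-GPU MacBook Pro; it only reads information and switches nothing.

```python
# Rough sketch: guess which GPU is currently driving a display by parsing
# "system_profiler SPDisplaysDataType". On dual-GPU MacBook Pros the active
# GPU is normally the one that has a display (and therefore a Resolution
# line) listed under it. Output layout assumptions are based on a 2011 MBP.
import subprocess

def gpu_display_map():
    out = subprocess.check_output(
        ["system_profiler", "SPDisplaysDataType"]).decode("utf-8", "replace")
    gpus, current = {}, None
    for line in out.splitlines():
        stripped = line.strip()
        if stripped.startswith("Chipset Model:"):
            current = stripped.split(":", 1)[1].strip()
            gpus[current] = False
        elif current and stripped.startswith("Resolution:"):
            gpus[current] = True  # this GPU has a display attached
    return gpus

if __name__ == "__main__":
    for gpu, has_display in gpu_display_map().items():
        state = "driving a display (active)" if has_display else "no display attached"
        print("%s: %s" % (gpu, state))
```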

I'll set gfxCardStatus to integrated only for the time being. I will manually turn the dedicated GPU on when needed, since the decisions the software is making seem to be rather unreliable.

Thanks for the help guys. :)
 

Merkyworks

macrumors 6502
Oct 14, 2008
375
36
I keep my dGPU turned off 95% of the time; I tell gfxCardStatus to stay on the iGPU. I have ripped numerous DVDs using HandBrake on the iGPU and not seen any issues while ripping or playing the encoded movie.
 

grahamnp

macrumors 6502a
Jun 4, 2008
969
4
It is not HandBrake that turns the GPU on. OSX does it when the app uses APIs that make use of the GPU.

It is not a perfect system. Use gfxCardStatus to manually turn the discrete GPU on and off.
 

GuitarG20

macrumors 65816
Jun 3, 2011
1,020
1
I'll set gfxCardStatus to integrated only for the time being. I will manually turn the dedicated GPU on when needed, since the decisions the software is making seem to be rather unreliable.

That's what most of us do. However, I take issue with the term "unreliable." IMHO it's completely reliable: when a program calls an API that OSX tags as needing the dGPU, that GPU turns on. It's probably done on the "err on the side of safety" principle, to make sure there's always enough power to run what the user needs.
 

Quinoky

macrumors regular
Original poster
Sep 18, 2011
179
0
Groningen, Netherlands
That's what most of us do. However, I take issue with the term "unreliable." IMHO it's completely reliable: when a program calls an API that OSX tags as needing the dGPU, that GPU turns on. It's probably done on the "err on the side of safety" principle, to make sure there's always enough power to run what the user needs.

You're right, but I still find it very confusing why HandBrake would utilize an API that relies on the dedicated GPU.
 

thundersteele

macrumors 68030
Oct 19, 2011
2,984
8
Switzerland
You're right, but I still find it very confusing why HandBrake would utilize an API that relies on the dedicated GPU.

The Chrome browser had this issue: it would immediately trigger a switch to the dedicated GPU. A recent update fixed it.

It's hard to point a finger and blame someone. HandBrake is probably written to make the best use of the available resources. For Chrome it was a big issue since it reduced battery life compared to Safari. I'm not sure if the HandBrake devs will care enough about this to modify the program.
 

GuitarG20

macrumors 65816
Jun 3, 2011
1,020
1
The Chrome browser had this issue: it would immediately trigger a switch to the dedicated GPU. A recent update fixed it.

It's hard to point a finger and blame someone. HandBrake is probably written to make the best use of the available resources. For Chrome it was a big issue since it reduced battery life compared to Safari. I'm not sure if the HandBrake devs will care enough about this to modify the program.

Whoa, you're right. I just realized that Chrome doesn't swap me to the dedicated GPU anymore. However, Sparrow, etc. still do :(
 

grahamnp

macrumors 6502a
Jun 4, 2008
969
4
You're right, but I still find it very confusing why HandBrake would utilize an API that relies on the dedicated GPU.

HandBrake works the way it works. The real question is why Apple decided that those particular APIs require the dGPU. I'm sure they had their reasons, but the issue seems to be present across many apps, so this method of GPU switching definitely needs some work.

At this point, I'm thinking the NVIDIA Optimus approach of saving application profiles would be nice for those users willing to tinker a bit.
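Just to make the idea concrete, here is a conceptual Python sketch of what such application profiles could look like on the user's side. The profile list is made up, and since gfxCardStatus has no documented command-line interface, the script only prints a reminder instead of switching anything itself.

```python
# Conceptual sketch of Optimus-style application profiles: poll the list of
# running apps via AppleScript/System Events and remind the user to switch
# gfxCardStatus when an app from the (hypothetical) profile list launches.
import subprocess
import time

DGPU_PROFILES = {"Steam", "Aperture", "Final Cut Pro"}  # example entries only

def running_apps():
    script = 'tell application "System Events" to get name of every process'
    out = subprocess.check_output(["osascript", "-e", script])
    return {name.strip() for name in out.decode("utf-8", "replace").split(",")}

warned = set()
while True:
    active = DGPU_PROFILES & running_apps()
    for app in active - warned:
        print("%s is running - consider switching gfxCardStatus to Discrete Only" % app)
    warned = active
    time.sleep(10)  # poll every 10 seconds
```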
 

thundersteele

macrumors 68030
Oct 19, 2011
2,984
8
Switzerland
I did a bit of digging into Google Chrome to find out how they managed to improve the behavior. It seems in the end they did need help from Apple to get rid of this issue; Apple probably told Google which OpenGL functions not to call.

In the end I think the current implementation of the automated graphics switching is too aggressive, at least for laptops.

I recently played a bit with GUI programming using Kivy, a platform-independent Python GUI framework. It is based on OpenGL, and of course even the dumbest "hello, world" application will fire up the dedicated GPU. I'll probably look a bit more into Cocoa/Obj-C next...
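For reference, the kind of program meant here is as small as the following Kivy sketch (the class name and label text are just an example). Kivy draws its whole UI through OpenGL, so merely opening this window is enough to make OSX flip a dual-GPU MacBook Pro to the dedicated GPU.

```python
# Minimal Kivy "hello, world" of the sort described above. Because Kivy
# renders everything with OpenGL, simply showing this window triggers the
# switch to the dedicated GPU on a dual-GPU MacBook Pro.
from kivy.app import App
from kivy.uix.label import Label

class HelloApp(App):
    def build(self):
        # the widget returned here becomes the root of the OpenGL-backed window
        return Label(text="Hello, world")

if __name__ == "__main__":
    HelloApp().run()
```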
 

Quinoky

macrumors regular
Original poster
Sep 18, 2011
179
0
Groningen, Netherlands
I recently played a bit with GUI programming using Kivy, a platform-independent Python GUI framework. It is based on OpenGL, and of course even the dumbest "hello, world" application will fire up the dedicated GPU. I'll probably look a bit more into Cocoa/Obj-C next...

Right, it's kind of "trigger happy", isn't it?

So I have an idea here: why not expand gfxCardStatus so that the user can decide for each dependency whether to trigger the GPU or not? Then you'd be able to save these preferences so that gfxCardStatus remembers when not to trigger the dGPU. I'd certainly be willing to pay for this "expansion" if it is possible at all.
 

GuitarG20

macrumors 65816
Jun 3, 2011
1,020
1
Right, it's kind of "trigger happy", isn't it?

So I have an idea here: why not expand gfxCardStatus so that the user can decide for each dependency whether to trigger the GPU or not? Then you'd be able to save these preferences so that gfxCardStatus remembers when not to trigger the dGPU. I'd certainly be willing to pay for this "expansion" if it is possible at all.

If you like that idea, you're welcome to reach out to the developer of the program. I've emailed him; I know he will reply and tell you what he thinks of your idea.
 

thundersteele

macrumors 68030
Oct 19, 2011
2,984
8
Switzerland
I hope that eventually Apple will optimize the switching behavior. The net effect of the current state is reduced battery life for most users. The forum community is a small subset of users with a much deeper understanding of what is going on with their machines.

Apple advertises 7 hours of wireless web browsing, which for most people does include watching a YouTube video here and there. If that brings the overall battery life down to 4-5 hours, this is clearly not satisfactory.

On the other hand, it should also give software developers something to think about. Using all available system resources might not be optimal for something as simple as a web browser. And I would be happy if games implemented a maximum frame rate setting - I'm perfectly fine with 30 fps, and I would welcome less fan noise while playing.
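A frame rate cap of that kind is conceptually very simple; the toy sketch below pads every iteration of a render loop out to the 1/30 s frame budget, so the GPU idles instead of drawing frames nobody needs. The render_frame stub is a placeholder, not any real engine API.

```python
# Toy frame limiter: cap a render loop at roughly 30 fps by sleeping away
# whatever is left of each 1/30 s frame budget. render_frame() stands in for
# a real game's update + draw work.
import time

TARGET_FPS = 30
FRAME_BUDGET = 1.0 / TARGET_FPS

def render_frame():
    time.sleep(0.005)  # placeholder workload

while True:
    start = time.time()
    render_frame()
    leftover = FRAME_BUDGET - (time.time() - start)
    if leftover > 0:
        time.sleep(leftover)  # idle instead of rendering extra frames
```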
 