
Starfyre

macrumors 68030
Original poster
Nov 7, 2010
A poster mentioned that if you disable the dGPU with gfxCardStatus and use integrated only, it will not work when connecting to a projector or external display. Is this accurate?
 
A poster mentioned that if you disable the dGPU with gfxCardStatus and use integrated only, it will not work when connecting to a projector or external display. Is this accurate?

No, it is 100% inaccurate - you can run external displays from iGPUs.
 
No, it is 100% inaccurate - you can run external displays from iGPUs.

You sure that's the case?

On my Haswell 15" w/ 750M, I set gfxCardStatus to iGPU. Then I connected the Thunderbolt Display, and no picture. I could still use the things attached to the Thunderbolt Display, but the screen didn't change.

However, when I turned it to dynamic switching, the Thunderbolt Display automatically showed the picture it was supposed to display.
When I try to switch to iGPU while I'm on my TBD, it won't allow it.
 
I think thebrick's explanation from the other thread--that Apple designers have only enabled the "monitor out" for the dGPU and not the iGPU--makes sense. (This is for machines with two GPUs, obviously.)

I've used two different cMBPs that behaved this way--a 2010 MBP17 and a 2011 MBP15. In both cases, if I happened to leave gfxCardStatus in "iGPU" mode and tried to connect to an external display, the display in question refused to detect the connection. The behaviour of the new retina machines seems consistent with this.

Obviously, an iGPU-only machine would never encounter this behaviour.
 
I can confirm that with my Iris Pro-only machine I can simultaneously output with the iGPU to the internal monitor, an HDMI monitor, and, through Thunderbolt, a DVI monitor, all at the same time. (I have not tried four, by using both Thunderbolt ports; I don't have that many screens in the house.)

So the iGPU is definitely capable of it (the Intel page even says so); it must be OS X causing the problems (if they really exist).
 
If you're using an MBP with two GPUs, it will always switch to the dGPU when plugged into an external display. Obviously it's a different story with the new iGPU-only MBPs.
 
Weird... slightly off topic, but on a lot of computers certain ports are wired to either the dGPU or the iGPU. Does anyone know if the HDMI port and both Thunderbolt ports are connected to both the dGPU and the iGPU, or is the HDMI port only connected to the iGPU? Curious, and it might help map out which ports to choose.
 
Weird... slightly off topic, but on a lot of computers certain ports are wired to either the dGPU or the iGPU. Does anyone know if the HDMI port and both Thunderbolt ports are connected to both the dGPU and the iGPU, or is the HDMI port only connected to the iGPU? Curious, and it might help map out which ports to choose.

Good question. In my experience, the iGPU was only used for the built-in display. No matter what port I used for video output, it would always switch to the dGPU (if one existed). It's probably safe to say that if you're using an MBP with a dGPU installed, you'll never see the iGPU powering an external display.
 
Unless Apple changed something this generation, the dGPU is the only one connected to the output ports. Anything you plug into a video port will trigger the dGPU (if there is one), whether you hook it up to a projector to show some PowerPoint slides, or just read a PDF or watch a movie on a bigger external screen.
It is wired so that it is physically impossible to use the iGPU in these situations. That is also why you couldn't use Optimus drivers in Windows without losing the external outputs, even if the iGPU showed up: Optimus makes the iGPU the sole video output handler.
It is not an OS X problem, it is a hardware problem. Stupid IMO, but somewhat intentional.

gfxCardStatus usually recognizes when an external screen is plugged in. At least mine does. Even if set to iGPU-only, it will automatically switch to dynamic switching or dGPU-only as soon as it detects the cable.
The problem is when you unplug again. Many applications that triggered the dGPU cannot simply switch back to the iGPU. You can sometimes force it, but the application performance is horrible and there seems to be some software fallback to prevent crashing. You essentially have to restart all the troubled apps if you unplug the projector and want to go back to iGPU-only.
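
If you want to see for yourself which GPU is driving which screens at any given moment, here is a rough sketch (not gfxCardStatus internals, just a parse of the stock system_profiler SPDisplaysDataType output, whose exact layout varies between OS X versions):

import subprocess

def gpus_driving_displays():
    # Each GPU section starts with a "Chipset Model:" line; every display
    # currently attached to that GPU contributes a "Resolution:" line below it.
    out = subprocess.run(
        ["system_profiler", "SPDisplaysDataType"],
        capture_output=True, text=True, check=True,
    ).stdout
    active, gpu = {}, None
    for line in out.splitlines():
        line = line.strip()
        if line.startswith("Chipset Model:"):
            gpu = line.split(":", 1)[1].strip()
            active[gpu] = []
        elif gpu and line.startswith("Resolution:"):
            active[gpu].append(line.split(":", 1)[1].strip())
    return active

for gpu, displays in gpus_driving_displays().items():
    print(gpu, "->", displays or "no displays attached")

On a dual-GPU machine with an external display plugged in, you should see the resolutions listed only under the dGPU's section, which matches what this thread describes.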
 
Unless Apple changed something this generation, the dGPU is the only one connected to the output ports. Anything you plug into a video port will trigger the dGPU (if there is one), whether you hook it up to a projector to show some PowerPoint slides, or just read a PDF or watch a movie on a bigger external screen.
It is wired so that it is physically impossible to use the iGPU in these situations. That is also why you couldn't use Optimus drivers in Windows without losing the external outputs, even if the iGPU showed up: Optimus makes the iGPU the sole video output handler.
It is not an OS X problem, it is a hardware problem. Stupid IMO, but somewhat intentional.

I'm not sure if utilizing the iGPU for output would be efficient anyway. At least not when you have dedicated graphics on board.
 
I'm not sure if utilizing the iGPU for output would be efficient anyway. At least not when you have dedicated graphics on board.
Actually, the dGPU is often overkill for running an external display, and I'd certainly use the iGPU for running PowerPoint on battery if I had the option.

It's nothing new: my soon-to-be-replaced 2010 15" MBP is wired that way, with only the dGPU connected to the output port. I didn't know for sure whether the 15" rMBP would be too, but I can't say I'm surprised.
 
I'm not sure if utilizing the iGPU for output would be efficient anyway. At least not when you have dedicated graphics on board.
It would be more efficient.
Just having the dGPU active adds some 5+ W. That doesn't sound like much, but it is if you consider that the average total power draw for 8 hours of battery life is about 11 W. The dGPU stays in a fairly low power state, clocked down to less than 200 MHz, so it is actually not all that fast anyway.
Pushing pixels alone isn't much of a problem for the iGPU, which could handle two 4K monitors if it had to. The iGPU can clock all the way down to its lowest power state and back up almost instantly. A dGPU in its lowest power state draws far more power, and it gains you nothing. Smartphone chips today can drive lots of pixels and they are still a long way from Iris Pro performance. Resolution alone really doesn't make a GPU sweat at all.
Driving the 2 GB of GDDR5 memory, the memory controller, and the dGPU even in a low power state is significantly less efficient. I am not sure if it is still the case, but Nvidia and AMD GPUs didn't enter their lowest power states with an external screen attached, even if there was no real work to do.
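
Rough numbers, to make that concrete (the 95 Wh battery capacity is my assumption for a 15" rMBP; the ~8 h runtime and ~5 W idle figure are from the paragraph above):

# Back-of-the-envelope battery math under the assumptions stated above.
BATTERY_WH = 95.0                   # assumed 15" rMBP battery capacity
base_draw = BATTERY_WH / 8.0        # ~11.9 W average, matching "about 11 W"
dgpu_idle = 5.0                     # extra watts with the dGPU merely awake

print(f"iGPU only:  {BATTERY_WH / base_draw:.1f} h")                # 8.0 h
print(f"dGPU awake: {BATTERY_WH / (base_draw + dgpu_idle):.1f} h")  # ~5.6 h

So an idling dGPU alone costs roughly 30% of the runtime, which lines up with the sub-4-hour Keynote figures reported later in this thread once screen brightness and real work are added on top.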

Apple didn't do it the other way around because Nvidia was obviously not willing to help them write an Optimus-style driver for OS X, and how difficult such a driver is can be seen in how long it took AMD to get their own working well. They say it saves a bit of power when the dGPU is active, but that makes almost no difference, and far more power is lost by unnecessarily running the dGPU far too often. Nvidia would probably have wanted serious money or guaranteed design wins for many generations, and could then have dictated higher prices because Apple would be locked in.
 
A poster mentioned that if you disable the dGPU with gfxCardStatus and use integrated only, it will not work when connecting to a projector or external display. Is this accurate?

It's an interesting question. But when you use an external display, you will probably have access to a power outlet, or you are giving a presentation, which doesn't really need the discrete GPU anyway.
 
That's why I will buy the 13" rMBP.

I have a 15" rMBP and I have tried not to use the dGPU when running on a projector, but it does not work.

Running Keynote for a presentation gives me less than 4 hours, just showing some slides. I have the laptop screen on very low brightness, but it does not make any big difference.

So I will buy the 13" rMBP and hope it will give a full day of lecturing without a power supply.
Perhaps it would be better with the new 13" MBA, but if you are used to Retina it is hard to settle for less (and of course I do not use Retina when doing presentations, but there are also other things I want to use the computer for :)
 
Unless Apple changed something this generation, the dGPU is the only one connected to the output ports. Anything you plug into a video port will trigger the dGPU (if there is one), whether you hook it up to a projector to show some PowerPoint slides, or just read a PDF or watch a movie on a bigger external screen.
It is wired so that it is physically impossible to use the iGPU in these situations. That is also why you couldn't use Optimus drivers in Windows without losing the external outputs, even if the iGPU showed up: Optimus makes the iGPU the sole video output handler.
It is not an OS X problem, it is a hardware problem. Stupid IMO, but somewhat intentional.

gfxCardStatus usually recognizes when an external screen is plugged in. At least mine does. Even if set to iGPU-only, it will automatically switch to dynamic switching or dGPU-only as soon as it detects the cable.
The problem is when you unplug again. Many applications that triggered the dGPU cannot simply switch back to the iGPU. You can sometimes force it, but the application performance is horrible and there seems to be some software fallback to prevent crashing. You essentially have to restart all the troubled apps if you unplug the projector and want to go back to iGPU-only.
Thanks! Yeah, I use and love gfxCardStatus; I switch to integrated-only a lot (i.e. when just browsing the web). Similar experiences with applications forcing the dGPU, though. VLC likes to do it, so I have to quit out of it before I switch to iGPU-only.
 
A poster mentioned that if you disable the dGPU with gfxCardStatus and use integrated only, it will not work when connecting to a projector or external display. Is this accurate?

Hey, so I'm not sure if this has already been answered or if this is relevant, but I also use gfxCardStatus and I can't use an external display on integrated-only. I just switch to Discrete Only and then it runs just fine, with no significant penalty to my battery life.
 
If you have two GPUs, the dGPU powers the external display whether you like it or not. If you have a single GPU and it's an iGPU, then of course that does the job.
 
... Similar experiences with applications forcing the dGPU, though. VLC likes to do it, so I have to quit out of it before I switch to iGPU-only.

I got annoyed that XQuartz.app would do the same thing. It appears that applications need a key present in their Info.plist to allow this. I used the following link to fix XQuartz, and a similar technique might work for VLC (or MPlayer, which seems to have the same issue):

https://forums.macrumors.com/threads/1321711/
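
For reference, the gist of the fix is adding the NSSupportsAutomaticGraphicsSwitching key to the app's Info.plist, which tells OS X the app can run on the integrated GPU. A minimal sketch in Python (the VLC path is just an example target; note that editing a signed app's Info.plist invalidates its code signature, and you may need to re-sign or run with elevated permissions):

import plistlib

# Hypothetical target; adjust for whichever app keeps forcing the dGPU.
APP_PLIST = "/Applications/VLC.app/Contents/Info.plist"

with open(APP_PLIST, "rb") as f:
    info = plistlib.load(f)

# Declare that this app supports automatic graphics switching,
# so launching it no longer wakes the discrete GPU.
info["NSSupportsAutomaticGraphicsSwitching"] = True

with open(APP_PLIST, "wb") as f:
    plistlib.dump(info, f)

print("Patched; relaunch the app for the change to take effect.")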
 