A poster mentioned that if you disable the dGPU with gfxCardStatus and use integrated only, it will not work when connecting to a projector or external display. Is this accurate?
No, it is 100% inaccurate - you can run external displays from iGPUs.
Weird... slightly off topic, but on a lot of computers certain ports are wired to the dGPU and others to the iGPU. Does anyone know if the HDMI port and both Thunderbolt ports are connected to both the dGPU and iGPU, or is the HDMI port only connected to the iGPU? Curious, and it might help map out which ports to choose.
Unless Apple changed something this generation, the dGPU is the only one connected to the output ports. Everything that you plug into a video port will trigger the dGPU (if there is one), whether you hook it up to a projector to show some PowerPoint slides or just read a PDF or watch a movie on a bigger external screen.
The hardware is wired so that using the iGPU in these situations is physically impossible. That is also why you couldn't use any Optimus drivers in Windows, even if the iGPU showed up, without losing the external outputs: Optimus makes the iGPU the sole video output handler.
It is not an OS X problem, it is a hardware problem. Stupid IMO, but somewhat intentional.
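If you want to check how your particular machine is wired, here is a rough sketch: see which GPU each attached display shows up under in `system_profiler SPDisplaysDataType`. The indentation assumptions below (GPU names at 4 spaces, display names at 8) are a guess about the plain-text layout, which can vary between OS X releases, so adjust as needed.

```python
# Sketch: list which GPU system_profiler reports each attached display under.
# Assumption: displays appear nested below the GPU that is driving them.
import subprocess

report = subprocess.run(
    ["system_profiler", "SPDisplaysDataType"],
    capture_output=True, text=True, check=True,
).stdout

current_gpu = None
in_displays = False
for line in report.splitlines():
    stripped = line.strip()
    if not stripped:
        continue
    indent = len(line) - len(line.lstrip(" "))
    if indent == 4 and stripped.endswith(":"):
        current_gpu = stripped[:-1]      # e.g. "NVIDIA GeForce GT 750M"
        in_displays = False
    elif stripped == "Displays:":
        in_displays = True
    elif in_displays and indent == 8 and stripped.endswith(":"):
        print(f"{current_gpu} -> {stripped[:-1]}")  # display under that GPU
```

Run it once with nothing plugged in and once with the projector or external screen attached; whichever GPU the external display lands under is the one wired to that port.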
Actually the dGPU is often overkill for running an external display, and I'd certainly use the iGPU for running PowerPoint on battery if I had the option.

I'm not sure if utilizing the iGPU for output would be efficient anyways. At least not when you have dedicated graphics on board.
It would be more efficient.
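For what it's worth, you can put a rough number on this yourself: sample the battery percentage over a fixed window in iGPU-only mode and again with the dGPU active, then compare the drain. `pmset -g batt` is a stock OS X command; the percentage parsing below is a loose assumption about its output format, so treat this as a sketch.

```python
# Rough battery-drain measurement: note the charge, wait, note it again.
# Run once in iGPU-only mode and once with the dGPU active, then compare.
import re
import subprocess
import time

def battery_percent() -> int:
    out = subprocess.run(["pmset", "-g", "batt"],
                         capture_output=True, text=True, check=True).stdout
    match = re.search(r"(\d+)%", out)   # assumes pmset prints e.g. "87%;"
    if match is None:
        raise RuntimeError("could not find a percentage in pmset output")
    return int(match.group(1))

WINDOW_MINUTES = 20   # long enough for a measurable drop on either GPU
start = battery_percent()
time.sleep(WINDOW_MINUTES * 60)
print(f"dropped {start - battery_percent()}% in {WINDOW_MINUTES} minutes")
```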
Thanks! Yeah, I use and love gfxCardStatus; I switch to integrated-only a lot (i.e. when just browsing the web). Similar experiences with applications forcing the dGPU, though. VLC likes to do it, so I have to quit out of it before I switch to iGPU-only.
gfxCardStatus usually recognizes when an external screen is plugged in. At least mine does. Even if it is set to iGPU-only, it will automatically switch to dynamic switching or dGPU-only as soon as it detects the cable.
The problem is when you unplug again. Many applications that triggered the dGPU cannot just switch back to the iGPU. You can force it sometimes, but the application performance is horrible, and there seems to be some software fallback to prevent crashing. You essentially have to restart all the troubled apps if you unplug the projector and want to go back to iGPU-only.
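If you end up doing that restart dance a lot, it's easy to script. This is just a sketch: the app list is hypothetical (VLC is the one example from this thread), and it simply asks each running app to quit via AppleScript before you flip gfxCardStatus back to integrated-only.

```python
# Sketch: politely quit the apps known to pin the dGPU so you can switch
# back to iGPU-only without the degraded-performance fallback.
import subprocess

DGPU_HOGS = ["VLC"]  # hypothetical list -- fill in your own offenders

def is_running(app: str) -> bool:
    # Ask System Events whether the app has a process; this avoids
    # AppleScript launching a closed app just to quit it.
    out = subprocess.run(
        ["osascript", "-e",
         f'tell application "System Events" to (name of processes) contains "{app}"'],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    return out == "true"

for app in DGPU_HOGS:
    if is_running(app):
        # Same as choosing Quit from the app's menu.
        subprocess.run(["osascript", "-e", f'quit app "{app}"'], check=True)
```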