
tgi

macrumors 65816
Original poster
Aug 29, 2012
1,331
330
Can someone explain how an iGPU and a dGPU work with one another? What determines which one is being used at a given time? If a computer has a dGPU, what's the point of also having an iGPU?
 

dusk007

macrumors 68040
Dec 5, 2009
3,412
104
dGPUs are generally more powerful, which means they need more power even when they aren't doing anything. They have their own memory and memory controller, which require quite a lot of power too. An iGPU is more efficient simply because it shares its memory bus with the CPU.
An iGPU is just better at saving power, but the shared memory bus limits the maximum performance an iGPU can provide, no matter how big they make the chip. That is why the new Intel Iris Pro 5200 has a big 128 MB L4 cache, to push the iGPU concept a bit further than would otherwise be possible.
GPUs are dumb number crunchers: they need to be fed enough data, and that is where an iGPU falls short.
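To see why the shared bus is the bottleneck, here is a back-of-envelope sketch. The numbers (dual-channel DDR3-1600 for the iGPU, 128-bit GDDR5 at 5000 MT/s for a midrange mobile dGPU) are illustrative assumptions, not exact specs for any particular chip:

```python
# Peak memory bandwidth: transfers/s * bytes per transfer * channels.
def ddr_bandwidth_gbs(transfer_rate_mt_s, bus_width_bits, channels=1):
    return transfer_rate_mt_s * 1e6 * (bus_width_bits / 8) * channels / 1e9

# iGPU: dual-channel DDR3-1600, 64-bit per channel -- shared with the CPU.
igpu_bw = ddr_bandwidth_gbs(1600, 64, channels=2)

# dGPU: GDDR5 at 5000 MT/s on its own dedicated 128-bit bus.
dgpu_bw = ddr_bandwidth_gbs(5000, 128)

print(f"iGPU shared bus: {igpu_bw:.1f} GB/s (minus whatever the CPU uses)")
print(f"dGPU dedicated:  {dgpu_bw:.1f} GB/s")
```

Even before the CPU takes its share, the dGPU's dedicated memory has roughly three times the bandwidth here, which is why a bigger iGPU alone wouldn't help much.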

When is which one active? That depends on the OS.
In general either can do anything, but for some things (like games) you want the faster dGPU. In Windows there is Optimus, a sort of virtual graphics driver that sits in front of the actual graphics drivers and decides which GPU to forward each piece of work the CPU requests to.
The driver automatically detects whether a given application needs lots of speed, and you can also configure it to your own liking. E.g. if you play an old shooter like Counter-Strike: Source, the iGPU will handle that just fine and keep the fans from going nuts.
The dGPU, if active, renders the frame and forwards it to the iGPU, which has access to the screen for display. Therefore with Optimus the dGPU can be completely shut off, but the iGPU always has to at least forward the framebuffer. In exchange it can switch hot (while an application is running), because the application only ever talks to the Optimus driver and never really needs to know which GPU is active.
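How expensive is that framebuffer forwarding? A rough calculation with assumed numbers (a Full HD 32-bit framebuffer at 60 fps; real drivers may batch or optimize the copies):

```python
# Rough cost of Optimus copying the dGPU's finished frames to the iGPU
# over PCIe. Illustrative assumptions, not measured driver behavior.
width, height = 1920, 1080   # Full HD framebuffer
bytes_per_pixel = 4          # 32-bit RGBA
fps = 60

frame_bytes = width * height * bytes_per_pixel
copy_gb_s = frame_bytes * fps / 1e9

print(f"One frame: {frame_bytes / 1e6:.1f} MB")
print(f"Copy load at {fps} fps: {copy_gb_s:.2f} GB/s")
```

That works out to about 0.5 GB/s, small next to the roughly 8 GB/s per direction of a PCIe 2.0 x16 link, which is why the forwarding overhead is barely noticeable.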

In OSX it works somewhat differently. There is a chip that switches the display output between the two GPUs, so the iGPU can shut down completely, which saves a little bit of power while the dGPU is active. But it makes all the driver work more complicated, as there is no decent abstraction between the application and the GPU drivers.
Therefore OSX generally switches on the dGPU whenever an application loads, at launch, a framework that could even in theory demand lots of processing power. Application developers have to either avoid those frameworks completely (generally a bad idea, because they are what makes a graphical user interface fast and allows a few fancy animations) or implement and mark separate code paths that the iGPU can/should use. Basically, the application has to be properly programmed so that it doesn't crash while the GPUs switch. As a result, in OSX the dGPU often stays active even when the cause is only a badly programmed or simply idle application in the background.
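For what it's worth, the "properly programmed" part on the OS X side is exposed through a documented Info.plist key: an app that has been written to survive a GPU switch declares it, and launching that app then no longer forces the dGPU on. A minimal fragment (the key is real; whether a given app can safely set it depends on its code):

```xml
<!-- Info.plist fragment: declares the app works on either GPU,
     so launching it need not wake the dGPU. -->
<key>NSSupportsAutomaticGraphicsSwitching</key>
<true/>
```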
E.g. iPhoto is open but doing nothing other than importing photos, a process that requires zero GPU prowess.
In OSX the only real help is gfxCardStatus, which lets you force the Intel GPU and make the MBP act as if it didn't have a dGPU at all. The problem remains that you cannot switch willy-nilly back and forth while applications that think they must force the dGPU are active. Also, an external monitor always requires the dGPU, because the external video ports are wired only to the dGPU.

The dGPU and iGPU never actually work with one another, except on AMD systems with both an AMD CPU and GPU, which allow some asymmetric CrossFire; none of that is possible with GPUs from different vendors (Intel + Nvidia). It is always either/or. Optimus (or AMD's Dynamic Switchable Graphics, which is the same thing for AMD GPUs) only forwards the framebuffer, which puts a small bit of extra load on the PCIe connection and requires that the iGPU not shut down entirely.

The point of an iGPU is simply power savings. They are plenty fast enough for movies (even multiple Full HD streams), fast enough for all the standard GUIs, and usually also for older games. They don't need to waste 10 W just powering their own memory; since CPUs don't need all of their memory bandwidth anyway, iGPUs take what they need from there.
Generally iGPUs are fast enough for 90% of what notebooks are used for, and being able to switch the dGPU off lets manufacturers put in as fast a dGPU as the cooling system allows, without turning the machine into an immobile, heavy brick that no longer deserves to be called a notebook.

In German we call it an "eierlegende Wollmilchsau" (an egg-laying wool-milk-sow), or, as Americans say, a Swiss Army knife. If you couldn't shut the dGPU down, you would have to choose between a mobile notebook with long battery life and one that can play games but lasts only 3 hours if you're lucky.
 

tgi

macrumors 65816
Original poster
Aug 29, 2012
1,331
330
dusk007, thank you so much for taking the time to write your post. You write very well, and it was very informative. Cheers.
 

hajime

macrumors 604
Jul 23, 2007
7,832
1,266
In Windows there is Optimus which is a sort of virtual graphics driver that sits in front of the actual graphics drivers and decides to which GPU it shall forward any work that the CPU says it needs done.
The driver automatically detects if a given Application needs lots of speed and you can also set it to your own liking.

Which versions of Windows support this? Does Windows 7 64-bit have this feature?
 

yjchua95

macrumors 604
Apr 23, 2011
6,725
233
GVA, KUL, MEL (current), ZQN
Which versions of Windows support this? Does Windows 7 64-bit have this feature?

Windows has this feature as long as you install NVIDIA's Optimus-enabled drivers.

Optimus will not work on Macs in Boot Camp, because Optimus requires the iGPU to be active. In Macs with a dGPU, the iGPU is disabled by the EFI under Boot Camp, and there is no way to change that.
 