Here's a quote from the Engadget MBP 2010 review:
Basically, Optimus turns on the GPU if it's needed, and then runs both the Intel graphics and the discrete card simultaneously, pushing the GPU-produced imagery through the Intel chip before it hits your screen. Apple's solution actually switches fully between the cards seamlessly, with the Intel graphics on only in a power-sipping mode but not in use at all for rendering when the NVIDIA GPU is in play. The other big difference is that Optimus detects its necessity based on a cloud-stored whitelist of apps that NVIDIA has, which could potentially become out of date or at least have difficulty in keeping up with app releases (though users get the flexibility of manually enabling apps). Meanwhile, Apple's solution is based on deeper OS-level stuff, with OS X figuring out what sort of technologies an app is going to call on (like OpenGL, for instance) and turning on the GPU accordingly.
So Optimus has a list of apps that it gets from an NVIDIA server to determine when to turn the discrete GPU on/off? And it leaves the integrated GPU on all the time, whether it's being used or not? If that's true, then it really does sound like the Apple solution is much better. Is anyone here familiar with the details of how Optimus works who can confirm this?
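To make sure I'm reading the review right, here's how I picture the difference in decision logic. This is just a rough Python sketch for discussion: every name in it is made up, and it's obviously not how either driver is actually implemented.

```python
# Rough sketch of the two decision policies as I understand them from the review.
# All names here are invented for illustration; neither NVIDIA nor Apple
# publishes their switching logic like this.

# Optimus-style: consult a profile list of known apps (kept up to date by
# NVIDIA), with the discrete GPU's output always routed back through the
# integrated chip to reach the display.
def optimus_should_use_discrete(app_name, nvidia_whitelist, user_overrides):
    if app_name in user_overrides:           # users can manually enable/disable apps
        return user_overrides[app_name]
    return app_name in nvidia_whitelist      # otherwise fall back to NVIDIA's list

# Apple-style (per the review): look at which technologies the app calls on,
# e.g. OpenGL, rather than matching the app's name against a list.
GPU_HEAVY_FRAMEWORKS = {"OpenGL", "Core Animation"}   # OpenGL is the review's example;
                                                      # Core Animation is just my guess

def apple_should_use_discrete(frameworks_used_by_app):
    # Switch fully over to the discrete GPU as soon as a GPU-heavy framework is in play.
    return bool(GPU_HEAVY_FRAMEWORKS & frameworks_used_by_app)

# For example:
#   optimus_should_use_discrete("game.exe", {"game.exe"}, {})  -> True
#   apple_should_use_discrete({"AppKit", "OpenGL"})            -> True
```

If that's roughly right, the whitelist lookup is the part that worries me, since it depends on NVIDIA keeping the list current, while Apple's approach would catch any app that touches the relevant frameworks.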