AnandTech just posted an article about nVidia's new "Optimus" switchable graphics platform. It takes all of the work of switching between integrated and discrete graphics and moves it from hardware into software. It would obviously require a rewrite of the drivers in OS X, but hopefully nVidia has been working with Apple on this, since they've now implemented it on Windows.

In short: previous switchable-graphics systems required a physical electronic switch on the motherboard, plus a hard switch in the graphics drivers, which meant a logout in OS X and something similar in Windows. (At best, on Windows, you could switch when launching a 3D application, but that required disabling the currently active video driver, which meant you couldn't already be running another 3D app at the same time.) Optimus instead leaves the integrated GPU in control of the display: rather than taking over completely, the discrete GPU just does the rendering 'blind' and pushes the finished frames over to the integrated GPU for display, using 'overlay' techniques that are already available. (In short: "DUH! Why didn't we think of this earlier?" I had a Matrox m3D back in 1998 that did this.)

The major advantage is that you get all the power savings of the integrated graphics 95% of the time, but when you launch an app that would benefit from a discrete GPU (the nVidia drivers monitor for this), it loads just that app onto the discrete GPU. (Or multiple apps, as required, just as if it were running 'natively'.) According to nVidia, there is very little overhead to doing it this way, as the main extra bandwidth is GPU-to-system, which is usually low-traffic anyway. (And PCIe is bi-directional, so this 'extra' data flow doesn't take away from the system-to-GPU bandwidth.)
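To make the data flow concrete, here's a toy sketch of the render-offload pattern described above. This is purely illustrative pseudocode in Python; every class and function name here is my own invention, not any real driver API. It just models the key idea: the integrated GPU always owns the display, and the discrete GPU's finished frames are copied back over the (normally quiet) GPU-to-system direction of the PCIe link.

```python
class IntegratedGPU:
    """Always owns the display; scans out whatever is in its framebuffer."""
    def __init__(self):
        self.framebuffer = None

    def display(self):
        return self.framebuffer


class DiscreteGPU:
    """Renders 'blind' into its own memory; never touches the display."""
    def __init__(self):
        self.powered_on = False

    def render(self, scene):
        self.powered_on = True          # only spins up when actually needed
        return f"frame({scene})"        # stand-in for a rendered frame


def copy_over_pcie(frame):
    """Stand-in for copying the finished frame to the integrated GPU.

    This is the 'extra' bandwidth mentioned above; it flows in the
    GPU-to-system direction, which is usually low-traffic anyway.
    """
    return frame


igp = IntegratedGPU()
dgpu = DiscreteGPU()

# Normal, low-power operation: integrated graphics does everything.
igp.framebuffer = "desktop"
assert not dgpu.powered_on

# A 3D-heavy app launches: the driver routes its rendering to the
# discrete GPU, then overlays the finished frames for display.
frame = dgpu.render("game scene")
igp.framebuffer = copy_over_pcie(frame)
print(igp.display())
```

The point of the sketch is that no hardware mux and no display handoff ever happens: the integrated GPU's scan-out path is the only one, and the discrete GPU is just a compute resource feeding frames into it.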
While at present I greatly prefer ATI's mobile GPU offerings (and it doesn't look like nVidia will have anything based on a new architecture in the mobile space for a while), even nVidia's current offerings would be plenty for the Air (and the 'plain' MacBook) when used this way. Apple could throw in a higher-wattage GPU, knowing it wouldn't be used as often. (Heck, they could even make it so that on battery, the integrated graphics handles more of the tasks that would switch to the discrete GPU on wall power, or add a slider in the energy settings to let you pick how often the discrete GPU kicks in.) For the Pro line, I'd still prefer to see solely discrete GPUs, preferably from ATI.