I know that NVIDIA ships Optimus technology, which is supposed to power off the NVIDIA chip when it is not in use (what about the VRAM?), but I have never found concrete data on the temperature and battery-life differences between models that have only a CPU with integrated graphics and models with a discrete NVIDIA GPU plus Optimus where the NVIDIA chip is switched off. In theory, the two should be equivalent.

My engineering background makes me slightly skeptical that the temperature and power draw are identical in two such systems when the discrete chip is "off". Are there really no leakage currents? Are both the NVIDIA chip and its VRAM completely powered down, i.e. with no supply voltage applied to either?

Does anyone know of an actual test, or at least a qualitative evaluation (e.g. fans kicking in more often at idle, worse battery life)? Note that I'm interested in the case where the NVIDIA GPU is kept "off"; if it is left on "dynamic", the outcome is obvious.

Thanks in advance, Alex.
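For what it's worth, on Linux one can at least see whether the kernel believes the discrete GPU has been runtime-suspended by reading the PCI device's power state from sysfs. This is only a sketch of how I'd check it, assuming a typical laptop where the NVIDIA device shows up under /sys/bus/pci/devices (0x10de is NVIDIA's PCI vendor ID); "suspended" here means the kernel has put the device into a low-power PCI state, which still doesn't by itself prove zero supply voltage to the chip or VRAM:

```shell
# List the runtime power-management state of any NVIDIA PCI device.
# 0x10de is NVIDIA's PCI vendor ID; paths are standard Linux sysfs.
for dev in /sys/bus/pci/devices/*; do
  vendor=$(cat "$dev/vendor" 2>/dev/null)
  if [ "$vendor" = "0x10de" ]; then
    # "suspended" = runtime-suspended (low-power state), "active" = powered
    echo "$dev: $(cat "$dev/power/runtime_status")"
  fi
done
echo "done"
```

On machines with bbswitch or the proprietary driver's runtime PM, this typically reads "suspended" when Optimus has turned the chip off, but measuring actual power draw would still require a wattmeter or battery-discharge comparison.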