I really hope my machine doesn't die, but I have to admit the GPU always runs hot. The diode temp is around 55-60°C in Mac OS, but 75°C in Windows 2008 (doing nothing). I don't remember ATi's mobile parts running that hot at idle. And of course, NVIDIA hasn't actually done anything about this: they quietly acknowledged that all of our machines (and all the other x86 laptops out there with contemporary NVIDIA GPUs) are ticking time bombs, and they aren't going to do a bloody thing about it. I really wish Apple had gone with ATi, but I think those days ended with the AMD takeover (Intel would be pissed).
Maybe NVIDIA should stop releasing a bigger, hotter, 1000-watt, jet-turbine-cooled, big-mo-fo-ho-pro GPU model every other week and focus on getting the chips and drivers they already ship to actually work, instead of just theoretically working. In general, computer companies need to learn that when chips run hotter than holy hell, a "cross your fingers and hope nothing blows up" approach is not the way to go. Intel learned that with the Pentium 4 and its 115W thermal output, and Microsoft learned it with the Xbox 360's "Red Ring of Death."