Heat doesn't make a good energy source...
chaos86 said: all this talk of too much heat being released... why not look on the bright side.
why don't we make a tiny power plant inside each computer, just above the G5, the battery, and any other major heat-producing components, and capture the heat to turn it into more energy? like a car's turbocharger, but for a computer. (think different)
It is VERY inefficient to convert heat into electricity; otherwise we would be using the world's deserts as power plants every summer! Generally a very large temperature difference is required, on the order of a few hundred degrees. Take a look at this: http://www.hi-z.com/websit03.htm. It's a solid-state thermoelectric device that converts heat directly into electricity, but it needs roughly a 200°C temperature difference across it for maximum output (typical consumer computers have far less than that). Even then it tops out at about 14W at only 1.6V (about 8A). So even if a computer somehow had that large a temperature difference, the recovered energy wouldn't amount to much compared with the cost of the system.
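A quick back-of-envelope sketch of the argument above. The 14 W / 1.6 V rating and the ~200°C difference come from the post; the specific hot- and cold-side temperatures for the module, and the CPU numbers, are illustrative assumptions, not measured values. Even the theoretical Carnot ceiling shows why a computer's small temperature delta recovers almost nothing (and real thermoelectrics capture only a few percent of the heat, far below Carnot):

```python
def carnot_efficiency(t_hot_c, t_cold_c):
    """Theoretical upper bound on heat-to-electricity conversion (Carnot limit).

    Temperatures are given in Celsius and converted to Kelvin.
    """
    t_hot_k = t_hot_c + 273.15
    t_cold_k = t_cold_c + 273.15
    return 1.0 - t_cold_k / t_hot_k

# Assumed operating point for a ~200 degC-difference module: 230 degC hot side,
# 30 degC cold side. These exact temperatures are illustrative assumptions.
eta_module = carnot_efficiency(230, 30)

# A typical CPU scenario (assumed): ~70 degC die, ~30 degC case air.
eta_cpu = carnot_efficiency(70, 30)

# Rated electrical output: 14 W at 1.6 V implies 14 / 1.6 = 8.75 A.
power_w = 1.6 * (14 / 1.6)

print(f"Carnot ceiling at 230/30 degC: {eta_module:.0%}")  # ~40%, never reached in practice
print(f"Carnot ceiling at 70/30 degC:  {eta_cpu:.0%}")     # ~12% ceiling for a CPU-sized delta
print(f"Rated output: {power_w:.1f} W")
```

With a real thermoelectric delivering only a small fraction of even the CPU-delta ceiling, the watt or two recovered wouldn't dent a desktop's power draw, which is the post's point.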
Way to Think Different, though.