If you're in a warm room, for instance, you'll have much lower performance, since it requires the differential to work. Of course, maybe the information available isn't wholly accurate, but that's my understanding based on the description.

And therein lies the failure of this idea as a simplifying concept:

When do you need the fan on? When the processor heats up.

Do you want the fan blowing harder or softer when the room is warmer? Harder.

In other words, if I'm sitting out in the cool evening air, I hardly need the fan going at all as the coolness of the air is doing just fine pulling the heat from the CPU. If I'm sitting in 100-degree weather then that fan better be buzzing like a bee to get enough air past the heat sink to effect a suitable heat transfer.

This works in just the opposite way: In the cold air, there's a huge differential, so the fan is going full bore, annoying me and all my peace-and-quiet-loving neighbors. In the warm air, it slows to a crawl as the amount of electricity generated approaches the lower limit of sustaining power for the fan. Then it stops. Then my laptop heats up rapidly and the processor dies.

So, you need two additional controls: a bleed for cases when this extra cooling is not necessary, and a backup fan for when it isn't sufficient.

So, we haven't been able to simplify the problem at all, and instead are gaining the (very slight) power savings from not having to run this fan off our battery power (directly) in a mid-temp room. Seems like the R&D and per-unit costs put into this circuitry could be more wisely spent eking a few more milliwatts from the existing circuitry ...
 
Efficiencies better than a thermocouple?

"If you're in a warm room, for instance, you'll have much lower performance, since it requires the differential to work. Of course, maybe the information available isn't wholly accurate, but that's my understanding based on the description."

If the chip operates at a relatively high temperature, a differential shouldn't be hard to maintain. For example, with the cell operating at 600 degrees, a room-temperature swing of plus or minus 10 degrees hardly changes the differential at all.

The article hinted at efficiency between twenty and thirty percent. Wow. This would be a huge leap above thermocouple efficiency, such as in radioisotope thermoelectric generators (RTGs), which supposedly are only three to seven percent efficient.

The applications for this are huge, and heat sources are readily available. I, for one, would prefer an alcohol-powered cell over an RTG in my computer any day. That whole radiation poisoning thing could ruin my bowling average.
 
Wow! What a great concept.

Pretty much like hybrid cars getting power back when they brake.

That it generates its own electricity, yes; the same principle, no. Under light-to-normal braking, the electric motor can act as a generator, harnessing the kinetic energy of the moving wheels; the brake pads are not involved. It's called regenerative braking. The brake pads only come into play when you brake hard.

Too bad that's not really anything laptops can do ... unless we can attach a generator to the hard drive and use it to charge the battery when it spins down.

It's an interesting concept to use heat ... I wonder how they will do it. No steam engine here.
 
... sooo, a thermocouple on a chip? Thermocouples have horrendous efficiency. I don't see how such a chip in an enclosed environment (like a laptop motherboard) can achieve enough of a thermal gradient to produce enough current to be useful.

I dunno, I'm skeptical.

Skeptical you should be, but these aren't really thermocouples. The same physical principle applies, but thermocouples are really only for temperature measurement. These are thermoelectric coolers. See here.

If you want to drive the temperature change yourself, you need a high current. But if you want to generate electricity from them, just connect them into a circuit without any power supply: wire a fan's power terminals across the TEC, put one side of the TEC on a hot chip or a cup of tea to set up the delta T (temperature difference), and the fan will start spinning!


Dan :)

While what you're saying is true in principle, I seriously doubt the practicality of what you're suggesting. TECs are moderately efficient at converting electricity into a temperature differential (or being used as a heat pump), but their efficiency in the other mode of operation (Seebeck effect) is very, very low (typ. < 5%). If you take a chip-sized (~ 1 cm^2) TEC, connect it between a hot processor core at 100 C and ambient temperature at 25 C, you will not have enough power to turn a computer fan at any modest speed. Furthermore, even if you could harvest that electricity and store it, the added energy would be less than 0.1% of a typical laptop battery. :rolleyes:

If you wanted to use a larger TEC module (say 16 cm^2) on top of the 80 C CPU case, then the added energy would be less than 1%.

Estimates based on info here.
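If anyone wants to sanity-check that estimate, here's a minimal sketch using the standard matched-load formula for a Seebeck generator, P = (S * dT)^2 / (4 * R). The Seebeck coefficient and internal resistance below are assumed values for a typical small TEC module, not figures from the article or from Eneco:

```python
# Rough upper bound on power harvested from a small TEC used as a
# Seebeck generator, via the matched-load formula P = (S*dT)^2 / (4*R).
# S and R are assumed typical-module values, not measured figures.

S = 0.02    # V/K, assumed module Seebeck coefficient
R = 2.0     # ohms, assumed module internal resistance
dT = 75.0   # K: 100 C core against a 25 C room

V_open = S * dT                  # open-circuit voltage
P_max = V_open ** 2 / (4 * R)    # best case, into a matched load

print(f"open-circuit voltage: {V_open:.2f} V")   # 1.50 V
print(f"max harvested power:  {P_max:.3f} W")    # ~0.28 W
```

A typical laptop fan draws a watt or two, so roughly a quarter of a watt under these assumptions is well short of that, consistent with the skepticism above.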
 
No thermoelectric device is 30% efficient

First post!

I'm working on my Ph.D. in applied physics, and our lab researches thermoelectric materials. There is no known material yet that is 30% efficient, as the article claims. The most common material, used in refrigerators (Bi2Te3), is only ~5% efficient. A lot of money is being poured into this field to increase the efficiency, but so far the enhancement has been incremental.

Companies like General Motors want to use the Seebeck effect of thermoelectric materials to convert waste heat in automobiles back into electricity; roughly 60% of the energy from gas is wasted as heat. NASA has been using thermoelectrics for years in RTGs to power deep space probes.

I'd love one to be in my laptop, but the efficiency is being overstated here. It's suspicious. And the dT from the chip to the air would be small, so very little voltage could be generated. Although, I heard a rumor that some companies were looking into powering a liquid-metal (InGa) coolant system with the computer's own waste heat. Who knows.
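For reference, the standard textbook expression for the maximum efficiency of a thermoelectric generator in terms of the figure of merit ZT, sketched in Python. ZT ~ 1 is the commonly quoted value for Bi2Te3 near room temperature; the hot- and cold-side temperatures are illustrative laptop-like numbers, not values from the article:

```python
from math import sqrt

def te_efficiency(T_hot, T_cold, ZT):
    """Maximum thermoelectric generator efficiency (temperatures in kelvin)."""
    carnot = (T_hot - T_cold) / T_hot
    m = sqrt(1 + ZT)   # in practice ZT is evaluated at the mean temperature
    return carnot * (m - 1) / (m + T_cold / T_hot)

# Bi2Te3-like material (ZT ~ 1) with a 350 K hot side and a 300 K room
eff = te_efficiency(T_hot=350.0, T_cold=300.0, ZT=1.0)
print(f"max efficiency: {eff * 100:.1f}%")   # ~2.6%
```

Even the ~5% figure quoted for refrigeration materials needs a bigger dT than a laptop offers; at chip-to-air differentials the formula lands well under that, which is exactly the small-dT point.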
 
Right, and that's one of the concerns that folks have. But if Apple somehow manages to integrate one of these into a heatsink and put it right on a CPU's surface, there will no doubt be a difference between the surface temp of the CPU and the other side of the chip.

At how much of a loss in heatsink effectiveness? Heatsinks and heatsink gel are formulated to allow very high rates of heat transfer. You can't just stick a little piece of silicon in there (which will undoubtedly act as an insulator relative to the heat sink path) and expect the heat to still leave the processor.
 
For example, with the cell operating at 600 degrees, a room-temperature swing of plus or minus 10 degrees hardly changes the differential at all.
My PB ... quite a hot computer ... runs 140 F at the GPU. I'd say you're pushing no more than 150 at the CPU, and no more than 170 on any laptop. 170 F = 76 C, 90 F = 32 C, 80 F = 26 C. Delta-T = 76 - 32 = 44 C in the warm room, or 76 - 26 = 50 C in the cool one: a 12% difference. Yes ... quite unnoticeable :rolleyes: . That's if it's a direct relationship; if it's a secondary or tertiary relationship, well, then you're looking at a huge difference being created.

I don't know where you got 600 :rolleyes: or negligible ... but ...
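Redoing that arithmetic without the intermediate rounding, as a quick sketch (the 170/90/80 F figures are the guesses from the post above):

```python
# Convert the guessed temperatures and compare the chip-to-room
# delta-T in a warm (90 F) versus cool (80 F) room.

def f_to_c(f):
    return (f - 32) * 5 / 9

chip = f_to_c(170)           # ~76.7 C
dT_warm = chip - f_to_c(90)  # ~44.4 C
dT_cool = chip - f_to_c(80)  # ~50.0 C

change = (dT_cool - dT_warm) / dT_cool
print(f"{dT_warm:.1f} C vs {dT_cool:.1f} C: {change * 100:.0f}% smaller")  # ~11% smaller
```

Unrounded it comes out near 11% rather than 12%, which doesn't change the argument.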
 
"600 degrees" :eek: ...degrees what?
600 Fahrenheit.... nah.... nothing gets that hot.
600 Kelvin. whats that like 40 degress celsius. Nope.... 330 Celsius. :eek: Wow thats a lot

But not as much as 600 CELSIUS :eek: :eek: :eek: :eek:

Maybe he invented his own temperature scale. :rolleyes:


600 F: roughly the melting point of cadmium.
600 K: the melting point of lead.
600 C: approaching the melting point of aluminum, 660 C (so that's why Apple switched from Titanium).
 
Maybe 600 Rankine? That's only 140F, so at least a reasonable Earth-bound/non-vaporizing-your-skin temperature ...
 
Nothing new. I remember playing around with this as a science experiment in the early 80's. Dissimilar metals sandwiched together--put one end in ice, the other in hot coffee--voilà! Current sufficient to make a fan turn. Hook up the same device to a battery, and one side gets slightly colder while the other gets slightly warmer.

The obvious problem is that the system requires isolated extremes of temperature to do anything. After an hour of use, my MBP's lower case is uniformly warm. Once a thermoelectric device is at a uniform temperature, it ceases to work, if you are using it as a way to convert heat into electricity. If you power the device with electricity in order to cool a laptop CPU, then the other half of the device will be throwing out MORE heat--which the singed hairs on your upper thighs will attest to when you are using your system as a "laptop".

The only time it would work with any effectiveness would be if you took your room-temp cold MBP and, immediately after start-up, launched a huge Photoshop render that pounded on the CPUs. At least for a while, the temp differential would give you some electricity back.
 
This works in just the opposite way: In the cold air, there's a huge differential, so the fan is going full bore, annoying me and all my peace-and-quiet-loving neighbors. In the warm air, it slows to a crawl as the amount of electricity generated approaches the lower limit of sustaining power for the fan. Then it stops. Then my laptop heats up rapidly and the processor dies.
This isn't a replacement for fans to control temperature--it's simply an attempt to put 'waste' heat to use. Obviously the normal array of heatsinks and fans would still exist to manage the temperatures. There's no conceivable implementation in which your computer would be harmed by the application of this additional device. The fan would hardly be necessary in the cold air, given that the temperature gradient would already be optimized.

So, we haven't been able to simplify the problem at all, and instead are gaining the (very slight) power savings from not having to run this fan off our battery power (directly) in a mid-temp room.
Well, it's not that outrageous. If it adds minimal cost and extends battery life 10% (not unreasonable with some refinement), that could easily equate to 15 minutes with current batteries. The cooling system itself is not affected, and obviously the benefit is greatest with a heavy CPU load, which in turn would maximize its impact on intensive operations which shorten battery life. In other words, this could partially offset the battery time lost by intensive computing, making it a worthwhile investment for professionals on the move.
 
Cool-- forget about laptops, we can use these to delay the end of the universe! All energy eventually becomes heat. This little guy takes some of it and makes it electricity-- which eventually becomes heat. Then this little guy takes some of it and makes it electricity-- which eventually becomes heat. Then this little guy takes some of it and makes it electricity-- which eventually becomes heat. Then this little guy...
 
In the business world, you need to be able to make a good impression. If you have a flashy website and nothing behind it, you're going nowhere. If you have good substance but poor presentation of it, you can still succeed, but it can be a lot harder than if you've got it presented well.

Sitting down for an hour with GoLive would provide them with a much better front door to the world. Starting a tech company is hard, but it's easier if you excel in all areas of your business. And yes, publicity is one of those areas.

It's not that they didn't take the time; it's just that your website has to look like that if you're going to comply with every W3C and CSS standard. :D
 
Thermodynamics makes this idea unlikely.

As a mechanical engineer, I'm not exactly cynical about this application of Eneco's technology, but I remain very, very skeptical. With such a relatively small temperature difference, I would say it is very unlikely that such a device would be economically feasible. A quick visit to Eneco's site shows me that they don't even have lab data for temperature differences of less than 100 deg C!

They obfuscate the issue of efficiency by referring to the Carnot efficiency to inflate the numbers to the uninitiated. Sadi Carnot showed that an ideal heat engine that operated between two infinite reservoirs at temperatures, T(hot) and T(cold) would have an efficiency of ( T(hot)-T(cold) ) / T(hot), and the temperatures have to be on an absolute scale like Kelvin or Rankine. The "Carnot efficiency" compares the performance of the system in question to this ideal heat engine.

If you ran your chip at a very warm 90 deg C (363 K) and could dump the heat to your 25 deg C (298 K) room, your ideal efficiency would be about 18%! This means that for every 5 W of heat you dissipate from the chip, you get a little less than 1 W of electric power. Something with an impressive-sounding 50% Carnot efficiency would really have a measly 9% real efficiency.

Unless Eneco sells these things very cheaply and makes them very small, I can't see Apple going through the trouble and expense of adding them to their portables for such a small benefit in recycled power. I remain skeptical, yet open-minded.
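For anyone who wants to check those numbers, here they are worked out; nothing below is specific to Eneco's device, it's just the Carnot bound from the post:

```python
# Carnot limit for a 90 C chip dumping heat to a 25 C room, and what a
# "50% of Carnot" device would actually deliver.

def carnot_efficiency(T_hot_K, T_cold_K):
    return (T_hot_K - T_cold_K) / T_hot_K

ideal = carnot_efficiency(363.0, 298.0)
print(f"Carnot limit: {ideal * 100:.0f}%")                 # 18%
print(f"per 5 W of heat: {5 * ideal:.1f} W electric")      # ~0.9 W
print(f"at 50% of Carnot: {0.5 * ideal * 100:.0f}% real")  # 9%
```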
 
Wouldn't using the "extra" electricity to power fans to decrease heat lead to less "extra" electricity???? :rolleyes: I hope they really think this through - and I'm sure they will. Of course powering fans isn't the only use for electricity.

LOL!! Good one. They'll cool the chip which will produce less electricity which will slow the fans and produce more heat which will make more electricity to speed up fans to cool the chip which will....:confused: :eek:
 
"A quick visit to Eneco's site shows me that they don't even have lab data for temperature differences of less than 100 deg C!"

Finding efficiency data for temperatures below 100C would be important since the max junction temperature for most processors is below that. Power supply devices max out at about 150C. You just can't get hotter than that and expect silicon to function as a semiconductor.

If the Intel chips burn 100W, then 9% conversion efficiency would generate 9W of electricity. In absolute terms, that's not too bad. You can do a lot with 9W. If you have a 5 hour battery life now, and can use these on all the major power sinks, you'd get 5.5 hours of battery life.

(Those are big "if"s, but putting them in bold seemed a bit too cynical...)

Interesting, but not earth shattering yet... If this became widespread though and we could cut world energy consumption by 10%-- that would be a big deal. Personally, I think there's more to be gained in cars (hotter and less efficient to begin with) than computers, but who knows.
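The battery arithmetic above, as a first-order sketch (this assumes recovered power simply offsets the draw; the 9% recovery figure is the post's hypothetical, not a measured number):

```python
# If a fraction of the machine's power draw is recovered and fed back,
# runtime stretches by a factor of 1 / (1 - fraction).

def extended_runtime(base_hours, recovered_fraction):
    return base_hours / (1 - recovered_fraction)

print(f"{extended_runtime(5.0, 0.09):.2f} h")  # ~5.49 h, about half an hour extra
```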
 
Wouldn't using the "extra" electricity to power fans to decrease heat lead to less "extra" electricity???? :rolleyes: I hope they really think this through - and I'm sure they will. Of course powering fans isn't the only use for electricity.

No, it would lead to a greater heat differential between CPU (max Tj stays constant) and the heat sink (getting colder with forced air flow) which would lead to more electricity generated. Under constant CPU load (thus a reduced die temperature because of cooling) the differential would stay the same and the electricity generated would stay the same.
 