
emiljan

macrumors 6502
Jan 25, 2010
330
0
Michigan
Whatever the TDP might be, I still think there's a higher chance of Apple going with an Nvidia chip for the next upgrade. Seems they don't want to jump off the Nvidia bandwagon, even if switching would mean better graphics performance for the MacBook line.

If they stay with Nvidia, chances are the MBPs will get a discrete Nvidia 400 series chip along with an Intel IGP.
 

fs454

macrumors 68000
Dec 7, 2007
1,979
1,825
Los Angeles / Boston
I don't understand why they're sticking with nVidia all this time. Apple and consumers got screwed harder than ever with the whole 8600M GT issue, and I feel like if any company had reason to complain about another company lacking innovation the way nVidia does, it'd be Apple.

Rebranded video cards over and over and over again while ATI/AMD innovates.
 

Nein01

macrumors 6502
Dec 1, 2009
307
1
Germany
Sweet! Plenty of time between now and April for Apple to build the perfect line of MacBook Pros. I hope they don't disappoint us.
 

iLog.Genius

macrumors 601
Feb 24, 2009
4,908
452
Toronto, Ontario
fs454 said:
I don't understand why they're sticking with nVidia all this time. Apple and consumers got screwed harder than ever with the whole 8600M GT issue, and I feel like if any company had reason to complain about another company lacking innovation the way nVidia does, it'd be Apple.

Rebranded video cards over and over and over again while ATI/AMD innovates.

Anybody know if there's a contract between Apple and nVidia, or does Apple just pick which chip manufacturer to use for their next lineup? From what I've read from people who follow AMD (ATI) and nVidia chips, AMD (ATI) has come a long way since the days Apple used the X1600 and now stacks up well against nVidia in just about every way. A lot of those users also believe AMD (ATI) would actually be the better choice, so I'm wondering what's keeping Apple from using them. Of course Apple is very concerned with their profit margins, but going by what's available to desktop users, AMD (ATI) seems to be cheaper than what nVidia is offering.
 

Erasmus

macrumors 68030
Jun 22, 2006
2,756
298
Australia
According to the spec sheets, the 5830 is still faster than the 6570, which is kind of disappointing.

Maybe we can have two 6570's in Crossfire, and a 3D capable screen?

(Yes, that was a joke... But also wishful thinking!)

(EDIT: OK, maybe not a 3D screen. Unless Apple can get one with a decent resolution.)
 
Last edited:

aimbdd

macrumors 6502a
Dec 10, 2008
625
63
East Cost
AMD processor? I would skip it. AMD GPU? Awesome! I hope they do go for it... AMD has really caught up GPU-wise.
 

mark28

macrumors 68000
Jan 29, 2010
1,632
2
There don't seem to be official TDPs from AMD, but NotebookCheck is reporting 11-30W for the 6570M. That should be suitable for the MBP.

The 6370M is rated at 8-15W and would thus be suitable for the 13" MBP if an iX CPU is used.

At idle, the 6570M is pretty close to the power consumption of the 9400M, if those numbers are true. I suppose you could throw such a card in the 13"?

(The 9400M uses 10.8W at idle, I believe.)
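
If anyone wants to play with those numbers, here's a quick back-of-envelope sketch using only the figures mentioned in this thread; the 13" power budget in it is a completely made-up placeholder for illustration, not anything Apple has published.

Code:
# Back-of-envelope check using only figures quoted in this thread
# (NotebookCheck's reported TDP ranges, plus the claimed 9400M idle draw).
# The 13" budget below is a made-up illustrative number, not an Apple spec.
reported_tdp_w = {
    "6570M": (11, 30),
    "6370M": (8, 15),
}
claimed_9400m_idle_w = 10.8        # figure claimed above in the thread
hypothetical_13in_budget_w = 15    # assumption for illustration only

for part, (low, high) in reported_tdp_w.items():
    fits = high <= hypothetical_13in_budget_w
    print(f"{part}: {low}-{high} W reported, fits a {hypothetical_13in_budget_w} W budget: {fits}")

print(f"Claimed 9400M idle draw for comparison: {claimed_9400m_idle_w} W")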
 

Erasmus

macrumors 68030
Jun 22, 2006
2,756
298
Australia
mark28 said:
At idle, the 6570M is pretty close to the power consumption of the 9400M, if those numbers are true. I suppose you could throw such a card in the 13"?

(The 9400M uses 10.8W at idle, I believe.)

Seems very high. I'd be more inclined to believe the 9400M uses 11W under full load. It is a low-power integrated card, after all.
 

mark28

macrumors 68000
Jan 29, 2010
1,632
2
Erasmus said:
Seems very high. I'd be more inclined to believe the 9400M uses 11W under full load. It is a low-power integrated card, after all.

http://www.tomshardware.com/reviews/nvidia-ion-atom,2153-10.html

I believe its maximum is 20 watts.

TDP doesn't equal the maximum power a CPU or GPU consumes. We have seen that with the i5 vs. the i7: both are rated with a TDP of 35W, yet the i7 consumes more energy than the i5. (It has to do with how many watts of cooling are necessary, I believe, but I'll have to look up exactly what it means.)
 

Erasmus

macrumors 68030
Jun 22, 2006
2,756
298
Australia
mark28 said:
http://www.tomshardware.com/reviews/nvidia-ion-atom,2153-10.html

I believe its maximum is 20 watts.

TDP doesn't equal the maximum power a CPU or GPU consumes. We have seen that with the i5 vs. the i7: both are rated with a TDP of 35W, yet the i7 consumes more energy than the i5. (It has to do with how many watts of cooling are necessary, I believe, but I'll have to look up exactly what it means.)

That link says the TDP of the 9400M is 12W. And TDP is the predicted real-world maximum power draw.

And it's 11W for the entire Ion platform.
 

mark28

macrumors 68000
Jan 29, 2010
1,632
2
Erasmus said:
That link says the TDP of the 9400M is 12W. And TDP is the predicted real-world maximum power draw.

And it's 11W for the entire Ion platform.

http://www.notebookcheck.net/Review-Intel-Core-i3-i5-i7-Processors-Arrandale.25085.0.html

Check the benchmarks here.

The i7-620M uses 64.7 watts under maximum load, while it only has a TDP of 35W.

TDP != maximum power.

Edit: I looked it up. TDP refers to the amount of heat generated, in watts, under load. It's not the same as power output.

So while the ATI 6500 might be just as energy efficient at idle as the 9400M, it could run hotter than the 9400M. We'll have to wait for more data.
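
To put a number on that gap, here's a trivial calculation with the figures from that review as quoted above (their measurements, not mine, and note the load figure may be whole-notebook draw rather than the CPU alone).

Code:
# Measured maximum power vs. rated TDP, using the i7-620M figures quoted above.
# The 64.7 W number comes from the linked NotebookCheck review and may be
# whole-system consumption, which is part of why it exceeds the CPU's TDP.
tdp_w = 35.0
measured_max_w = 64.7

ratio = measured_max_w / tdp_w
print(f"Measured max is {ratio:.2f}x the 35 W TDP "
      f"({measured_max_w - tdp_w:.1f} W above it).")
# -> roughly 1.85x, about 30 W over the rating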
 
Last edited:

Erasmus

macrumors 68030
Jun 22, 2006
2,756
298
Australia
mark28 said:
The i7-620M uses 64.7 watts under maximum load, while it only has a TDP of 35W.

The i7s do fancy power management stuff, like Turbo Boost. If it's using 65W and the cooling system (which is what the TDP is aimed at) only deals with 35W, then the CPU gets hotter until, presumably, something happens: the CPU switches off Turbo Boost, or the user does something less power hungry.

The 9400M has no such technology, and if the cooling system were not capable of removing its full TDP of heat, it would shut down.
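
By "something happens" I just mean behaviour along these lines; the numbers in this little sketch are invented purely to illustrate the idea, not measured from any real chip.

Code:
# Toy model of the behaviour described above: if sustained draw exceeds what
# the cooler (sized for the TDP) can remove, temperature climbs until the
# chip backs off. Every number here is invented for illustration only.
cooling_capacity_w = 35.0        # cooler sized to the rated TDP
draw_w = 65.0                    # turbo-boosted power draw
temp_c, throttle_temp_c = 70.0, 100.0
degrees_per_excess_watt = 0.05   # made-up thermal constant per time step

for step in range(200):
    temp_c += (draw_w - cooling_capacity_w) * degrees_per_excess_watt
    if temp_c >= throttle_temp_c:
        draw_w = cooling_capacity_w   # "turbo off": fall back to TDP-level draw
        print(f"Step {step}: reached {temp_c:.0f} C, backing off to {draw_w} W")
        break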

mark28 said:
Edit: I looked it up. TDP refers to the amount of heat generated, in watts, under load. It's not the same as power output.

Heat Generated = Power Output.

Where else does the energy go?

mark28 said:
So while the ATI 6500 might be just as energy efficient at idle as the 9400M, it could run hotter than the 9400M. We'll have to wait for more data.

Clearly, as we have no reliable TDP data at all.

I am seriously confused as to how the points you are making are supposed to prove that the idle power draw of the 9400M is 11W. It's not 11W. It's a maximum of about 12W. You want to know how I know this? Some batteries for old 15" MBPs are 60Wh batteries. If the GPU alone drew 11W at idle, the battery would be drained in about five and a half hours. That clearly isn't the case, since you can get around 5 hours of real-world usage with the CPU, display, and everything else drawing power as well.
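
The arithmetic, in case anyone wants to check it (60Wh is the old 15" battery; the 11W is just the disputed figure from above):

Code:
# Battery sanity check for the claimed 11 W idle draw of the 9400M.
battery_wh = 60.0          # old 15" MBP battery capacity
claimed_idle_w = 11.0      # the disputed idle figure

hours_on_gpu_alone = battery_wh / claimed_idle_w
print(f"{claimed_idle_w} W from the GPU alone would drain {battery_wh} Wh "
      f"in {hours_on_gpu_alone:.1f} hours")   # ~5.5 h, before the CPU,
                                              # display, etc. draw anything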
 
Last edited:

mark28

macrumors 68000
Jan 29, 2010
1,632
2
Erasmus said:
The i7s do fancy power management stuff, like Turbo Boost. If it's using 65W and the cooling system (which is what the TDP is aimed at) only deals with 35W, then the CPU gets hotter until, presumably, something happens: the CPU switches off Turbo Boost, or the user does something less power hungry.

The 9400M has no such technology, and if the cooling system were not capable of removing its full TDP of heat, it would shut down.



Erasmus said:
Heat Generated = Power Output.

Where else does the energy go?



Erasmus said:
Clearly, as we have no reliable TDP data at all.

I am seriously confused as to how the points you are making are supposed to prove that the idle power draw of the 9400M is 11W. It's not 11W. It's a maximum of about 12W.

That is nonsense. The i3 doesn't have Turbo Boost, and it reaches a power output of 46.4W under maximum load. Believe me, TDP does not equal maximum power.

The 9400M has a TDP of 12W, and you're making the case that 12W is its maximum power output. That's false because, like I said, TDP != maximum power. I showed you some benchmarks; I'll let the numbers speak for themselves.
 

Erasmus

macrumors 68030
Jun 22, 2006
2,756
298
Australia
mark28 said:
That is nonsense. The i3 doesn't have Turbo Boost, and it reaches a power output of 46.4W under maximum load. Believe me, TDP does not equal maximum power.

The 9400M has a TDP of 12W, and you're making the case that 12W is its maximum power output. That's false because, like I said, TDP != maximum power. I showed you some benchmarks; I'll let the numbers speak for themselves.

The POINT I'm MAKING is that the idle power draw of the 9400M isn't anywhere near 11W, as you stated earlier.

And TDP is the maximum amount of average power that a component is likely to draw in real world scenarios. Computer manufacturers use the TDP to design cooling solutions. Just because some benchmarker runs some stupid program to push components to the very edge doesn't mean that this is ever likely to happen for more than a few seconds at a time in real world scenarios.

If computer manufacturers design cooling systems to the TDP (as they do), but the components really draw more power than that for sustained periods of time (as you seem to be suggesting), then either things shut down or things break.
 

Hellhammer

Moderator emeritus
Original poster
Dec 10, 2008
22,164
582
Finland
Remember that the 9400M is the chipset, so that 12W covers the whole chipset, not just the GPU. The HM55 used in the current 15" and 17" MBPs, for example, draws another 3.5W.

FYI:

CPU-World said:
The Thermal Design Power (TDP) is the average maximum power a processor can dissipate while running commercially available software. TDP is primarily used as a guideline for manufacturers of thermal solutions (heatsinks/fans, etc) which tells them how much heat their solution should dissipate. TDP is not the maximum power the CPU may generate - there may be periods of time when the CPU dissipates more power than designed, in which case either the CPU temperature will rise closer to the maximum, or special CPU circuitry will activate and add idle cycles or reduce CPU frequency with the intent of reducing the amount of generated power.

TDP is usually 20% - 30% lower than the CPU maximum power dissipation.

http://www.cpu-world.com/Glossary/T/Thermal_Design_Power_(TDP).html
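
Taking that 20%-30% figure at face value, here's what it would imply for a 35W TDP part like the mobile i5/i7 discussed above (just arithmetic on the quote, nothing official):

Code:
# Implied maximum power dissipation for a 35 W TDP part, if TDP really is
# "20% - 30% lower than the CPU maximum power dissipation" as quoted above.
tdp_w = 35.0
for fraction_lower in (0.20, 0.30):
    implied_max_w = tdp_w / (1.0 - fraction_lower)
    print(f"TDP {fraction_lower:.0%} below max -> implied max ~ {implied_max_w:.1f} W")
# -> roughly 44-50 W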

I have no idea why you are arguing about this, as it has nothing to do with the topic...
 