I did some more digging and found this: packaging options.

Image

But do you really think Apple will allow us 32GB of RAM in the rMBP? 16GB right now is $200.

If anything, the cMBP will unofficially support it.

DDR4 allows for higher-density RAM chips, meaning more RAM. But yes, it's going to get there; how soon, I don't know.

That RAM is going to be good for iGPUs, since it gives more bandwidth and starts at 2133MHz; in comparison, DDR3 starts at 800MHz, though we usually only saw the 1066MHz variant.

That's to say I don't have much hope for DDR3-equipped machines coming with 32GB.

Thanks for the charts; they're confirming what Tom said about the BGA package and GT3. I really like what I see there.
 
Native DDR4 support isn't coming until Skylake. TDP actually goes up across the board on Haswell, so don't expect any efficiency gains, either.

The big gains in PPW will come with Broadwell. Maxwell (Nvidia architecture) is supposed to be very efficient as well. I think the rMBP will really come into its own in 2014.
 

You mean 2015, if you are expecting Skylake.

Some Broadwell parts will receive DDR4; it's still stated that DDR4 will enter the market this year and next.
 

I meant 2014 as I'm not terribly concerned about DDR4 at this point. The 13" rMBP would probably benefit the most from DDR4 as it relies solely on integrated graphics. My understanding is that DDR4 support will come with Haswell-EX (server) but won't be in consumer chips until Skylake.
 
There is a reason the X230 comes with a 90W PSU when you equip it with the i7 3520M: it's simple, it sucks up more than 65W under load. And the system as a whole is actually more power efficient than what Apple produces. You can guess how much that CPU consumes.

I'm sorry to say this but... please show me a source that has actually measured the i7 3520M in the Lenovo X230 (just the CPU) to consume 65W under load. If that was the case, at max load, the laptop wouldn't be able to last more than an hour. In fact, I highly suspect that it wouldn't even last 30 minutes.

Anyway, I think Lenovo can choose to include a charger that can provide more power for any reason at all. Maybe the more powerful charger has more current headroom for faster charging. It doesn't have to be related to the power consumption of the machine.

For instance, Apple can most definitely bundle 85W MagSafe with their MacBook Air.

Another problem here is that, according to Intel, that rise in TDP was due to some functions of the PCH being moved into the CPU, lowering the consumption and TDP of the machine itself and raising it a little on the CPU side.

Yeah, that's the voltage regulator being integrated into the CPU.

But only a small number of them are integrated, and the only reason that's happening is because Intel wants to squash once and for all any attempt at modifying the electrical characteristics of their control chipset.

Once upon a time (last year really), it was possible for people to overclock their CPUs (this is done mostly on desktops, but it's not that uncommon on laptops), and if they needed more headroom, they could ask the motherboard to provide more voltage. Not anymore. Intel has now integrated the voltage regulator into the CPU, thus the CPU itself controls how much voltage it is fed.

Anyway, even if that is taken into account, it still means the HD 4600 is on par with the HD 4000 in power consumption. I'm sure you know of the law of diminishing returns when performance scales up... so I'm quite certain HD 5200 will increase power consumption (and TDP) by a good amount.

An i7 3740QM can use more than 55W under load; I don't want to know how much the 3920XM uses.

TDP is the worst case scenario. I don't think it'll get any worse than that. I think you're confusing laptop parts with desktop.

Here's the kicker: you can actually guesstimate power consumption of the whole laptop and then subtract components accordingly from its battery specs.

For instance, take the Retina MacBook Pro. The battery is rated at 95WHr.

That means that if I'm getting 6 hours out of my Retina MacBook Pro from full charge, then it's consuming about 15W on average. Considering the CPU and chipset use about 10W, I'd assume the screen to use up 5W on average.

Under load (both the CPU and GPU are stressed to their max running a 3D simulation under Windows), I have seen estimates as low as 2 hours. That translates to approximately 47.5W, which is to be expected as the GPU is expected to be 15W and the CPU takes up about 27.5W.

Honestly, you have to take battery specs into account as well. At 55W for just the CPU, and at the "rumored" 30W power consumption for the GT 650M, the Retina MacBook Pro probably won't last over 45 minutes under max load, but that's not the case at all.

You also have to take into account that if the chassis (casing) isn't good enough to constantly dissipate max TDP, the CPU isn't likely to keep running at that frequency. I'm sure you know the CPU and GPU throttle down if they overheat...

There are other components to take into account as well. It's not like the CPU is the only thing that's working in your MacBook after all.

If you don't believe me, you can take a multimeter to your own MacBook and do some measurements. I'm sure you'll be pleasantly surprised at how low the computer measures even under load.
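The back-of-the-envelope battery math above can be sketched out. All the numbers (95WHr, the 6-hour and 2-hour runtimes, a 15W GPU, ~5W for screen and the rest) are the rough figures from this post, not measurements:

```python
# Rough power estimates implied by battery capacity and observed runtime.
# All numbers are the ballpark figures quoted in the post, not measurements.

def avg_draw_watts(capacity_wh, runtime_hours):
    """Average system draw implied by draining a full battery over a runtime."""
    return capacity_wh / runtime_hours

BATTERY_WH = 95  # Retina MacBook Pro battery rating

light_load = avg_draw_watts(BATTERY_WH, 6)  # ~15.8W average over 6 hours
heavy_load = avg_draw_watts(BATTERY_WH, 2)  # 47.5W average over 2 hours

# Subtract the assumed GPU (15W) and screen/other (~5W) budgets
# to see roughly what is left for the CPU under load:
cpu_under_load = heavy_load - 15 - 5

print(round(light_load, 1))      # 15.8
print(round(heavy_load, 1))      # 47.5
print(round(cpu_under_load, 1))  # 27.5
```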

BTW, according to Tom's, there are more quads that will come with GT3, so yes, it doesn't show. There is also very little need for the highest quads to have a powerful iGPU (by Intel standards); they are usually coupled with a dGPU, not to mention that it would drive the CPU price higher than it already is. For example, the 3840QM costs more than $500; quads start at around $300, dual cores at around half of that.

I don't find quad-core parts with GT3 impossible.

I'm just saying... it's not a guarantee that it'll have GT3 when it's quad-core... because we obviously have mainstream high-end Haswell parts (MX and MQ chips) without GT3.

In fact, looking at one of the lists posted earlier, I'm sure it's clear that GT3 will only be available to certain configurations, most of which are coupled with ULV or more efficient CPUs in order to offset the power consumption and TDP cost that GT3 induces.

It's not like Intel can just make GT3 so much faster than HD 4000 without increasing power consumption after all...
 
I'm sorry to say this but... please show me a source that has actually measured the i7 3520M in the Lenovo X230 (just the CPU) to consume 65W under load. If that was the case, at max load, the laptop wouldn't be able to last more than an hour. In fact, I highly suspect that it wouldn't even last 30 minutes.

Anyway, I think Lenovo can choose to include a charger that can provide more power for any reason at all. Maybe the more powerful charger has more current headroom for faster charging. It doesn't have to be related to the power consumption of the machine.

For instance, Apple can most definitely bundle 85W MagSafe with their MacBook Air.

Just take a trip to the Lenovo forums: Lenovo forcibly throttles the CPU when on battery (if I'm not mistaken, to 800MHz); that's why the 24h battery life is possible. And I'm not confusing anything; it's you who insists on the premise that TDP = power consumption, or your delicate way of saying they are very close.



Yeah, that's the voltage regulator being integrated into the CPU.

But only a small number of them are integrated, and the only reason that's happening is because Intel wants to squash once and for all any attempt at modifying the electrical characteristics of their control chipset.

Once upon a time (last year really), it was possible for people to overclock their CPUs (this is done mostly on desktops, but it's not that uncommon on laptops), and if they needed more headroom, they could ask the motherboard to provide more voltage. Not anymore. Intel has now integrated the voltage regulator into the CPU, thus the CPU itself controls how much voltage it is fed.

Anyway, even if that is taken into account, it still means the HD 4600 is on par with the HD 4000 in power consumption. I'm sure you know of the law of diminishing returns when performance scales up... so I'm quite certain HD 5200 will increase power consumption (and TDP) by a good amount.

Yes, it's expected, but then again, wanna take bets on how much? Not to mention, the only CPUs we know have had their TDP bumped up are, guess what, the ones leaked in that slide that all feature the HD 4600!

You know you can shave off a little CPU power and make the chip cooler while still staying in the 47W or 37W TDP envelope. The 3740QM and the 3840QM do run hotter, yet Intel still says 45W.

TDP is the worst case scenario.
for heat that needs to be dissipated
I don't think it'll get any worse than that. I think you're confusing laptop parts with desktop.

Here's the kicker: you can actually guesstimate power consumption of the whole laptop and then subtract components accordingly from its battery specs.

For instance, take the Retina MacBook Pro. The battery is rated at 95WHr.

That means that if I'm getting 6 hours out of my Retina MacBook Pro from full charge, then it's consuming about 15W on average. Considering the CPU and chipset use about 10W, I'd assume the screen to use up 5W on average.

Under load (both the CPU and GPU are stressed to their max running a 3D simulation under Windows), I have seen estimates as low as 2 hours. That translates to approximately 47.5W, which is to be expected as the GPU is expected to be 15W and the CPU takes up about 27.5W.

Honestly, you have to take battery specs into account as well. At 55W for just the CPU, and at the "rumored" 30W power consumption for the GT 650M, the Retina MacBook Pro probably won't last over 45 minutes under max load, but that's not the case at all.

You also have to take into account that if the chassis (casing) isn't good enough to constantly dissipate max TDP, the CPU isn't likely to keep running at that frequency. I'm sure you know the CPU and GPU throttle down if they overheat...

There are other components to take into account as well. It's not like the CPU is the only thing that's working in your MacBook after all.

If you don't believe me, you can take a multimeter to your own MacBook and do some measurements. I'm sure you'll be pleasantly surprised at how low the computer measures even under load.

I can "guesstimate" that the maximum power consumption of the rMBP is going to be 90W, which is what the charger can provide. I can also take an M14x and "guesstimate" that it's going to pull 20W more, because that's what the 650M + quad i7 pull.

I'm not even going to address the battery guesstimation; that's just wrong on so many levels.



I don't find quad-core parts with GT3 impossible.

I'm just saying... it's not a guarantee that it'll have GT3 when it's quad-core... because we obviously have mainstream high-end Haswell parts (MX and MQ chips) without GT3.

In fact, looking at one of the lists posted earlier, I'm sure it's clear that GT3 will only be available to certain configurations, most of which are coupled with ULV or more efficient CPUs in order to offset the power consumption and TDP cost that GT3 induces.

It's not like Intel can just make GT3 so much faster than HD 4000 without increasing power consumption after all...

They can lower the clocks; that's exactly what they did with the 35W quads they introduced last year.

You see the problem? You get one table that you trust and another that you don't, while claiming that Intel leaks are usually what they are: good info.
 
Just take a trip to the Lenovo forums: Lenovo forcibly throttles the CPU when on battery (if I'm not mistaken, to 800MHz); that's why the 24h battery life is possible. And I'm not confusing anything; it's you who insists on the premise that TDP = power consumption, or your delicate way of saying they are very close.

Yeah, but that doesn't mean the CPU reaches 65W as you suggested.

Yes, it's expected, but then again, wanna take bets on how much? Not to mention, the only CPUs we know have had their TDP bumped up are, guess what, the ones leaked in that slide that all feature the HD 4600!

Yeah, I'd bet 5200 is 5-10W more.

You know you can shave off a little CPU power and make the chip cooler while still staying in the 47W or 37W TDP envelope. The 3740QM and the 3840QM do run hotter, yet Intel still says 45W.

45W doesn't necessarily translate directly to "hot" or "cool" depending on the condition.

for heat that needs to be dissipated

And also power consumption.

Read:

Both Intel and Advanced Micro Devices (AMD) have defined TDP as the maximum power consumption for thermally significant periods running worst-case non-synthetic workloads

Source: http://en.wikipedia.org/wiki/CPU_power_dissipation

I can "guesstimate" that the maximum power consumption of the rMBP is going to be 90W, which is what the charger can provide. I can also take an M14x and "guesstimate" that it's going to pull 20W more, because that's what the 650M + quad i7 pull.

I'm not even going to address the battery guesstimation; that's just wrong on so many levels.

Yeah? Wanna try looking at some actual measurements then?

http://negergy.com.au/blogs/news-reviews/5842594-macbook-pro-laptop-power-consumption-review

Like I said, I actually took a multimeter to the thing and measured it to make sure those guesstimations weren't wrong.

But if you insist that somehow a 95WHr battery can power a 90W CPU + GPU combo for 7 hours straight without batting an eye, then... sure, have it your way.

They can lower the clocks; that's exactly what they did with the 35W quads they introduced last year.

You see the problem? You get one table that you trust and another that you don't, while claiming that Intel leaks are usually what they are: good info.

Yes, but here's the point:

Not every quad-core mobile CPU will have GT3 (HD 5200).

Is that clear enough?

I'm not sure why I'm having to clarify these points in the first place. People are getting too defensive about Haswell these days.
 
You know we aren't too defensive, and the point of this thread is to clarify what to expect.



The basis of your arguments is where the problem lies:

1) TDP = power consumption

The problem here is that Intel doesn't define TDP as power consumption, nor does AMD. Only Nvidia does that.

There was a problem with last year's MBP 15: basically, they were starved for power, and what happens is that under load the battery has to be used. Much like my MBP 13, which has to use the battery under load.

Also, the M14x uses the same hardware as the rMBP 15; if that thing can function with a smaller charger, how does it manage? Or even the MBP 15? That's magic.

2) Take a look at notebookcheck's review of last year's MBP 15, or, like me, put that thing really under load; you will see that the battery gets drained. Now do a logic test: if it's getting drained, it must need power; where does it get it?

http://www.notebookcheck.net/Review-Apple-MacBook-Pro-15-Late-2011-2-4-GHz-6770M-glare.66918.0.html

3) I don't know why you stated that the 95WHr has anything to do with the power drawn by the hardware under load. Have you tried gaming on battery? It gets throttled and you will drain your battery in less than the specified time. I don't even know why you said that.

And BTW, the 650M has a 45W TDP, and that's its power consumption.
 
1) Intel DOES define TDP as power consumption.

http://www.intel.com/content/www/us...ces-xeon-measuring-processor-power-paper.html

Intel said:
The thermal design power is the maximum power a processor can draw for a thermally significant period while running commercially useful software

2) I did put mine under load.

3) Same as above. No throttling after the last firmware update.

Here's proof:

Image


I get the full 3.1GHz processor Turbo Boost and my GPU stays overclocked at 1GHz core / 1.6GHz VRAM. If you are throttled, then something is wrong.
 
1) It doesn't. Did you read what you just linked? It's at the bottom of page 2.

2) I said specifically that the 3840QM and 3740QM, along with other higher-performing i7 CPUs, specifically the i7 3520M, do consume more power. Have you read the threads where people with CPUs higher than the 3615QM and the 3635QM do have throttling issues under load? I remember you there; or my memory must be playing tricks. I tried to differentiate between those products. Perhaps I should've made it clearer. But yes, TDP is not power consumption.

3) I can't test it anymore; I sold my rMBP 15. I'm still keeping my 2011 MBP 13, and yes, it's the base model; it still draws power from the battery under load.

Again, only Nvidia uses TDP = power consumption.
 
Well, there really is not much power difference between the two i5 and i7 dual cores.

Image


Test system:

Intel Ivy-Bridge-CPUs
Intel HM77 chipset
8 GByte DDR3 RAM (1333MHz)
Nvidia GeForce GTX 670M
Intel-SSD 320 series (80 GByte)
17.3“ FullHD LED display
Windows 7 Home Premium 64 Bit

Even a system with a 17.3-inch display and a 670M can't draw 65 watts behind the power adapter (actually less, because of power adapter inefficiencies).

The 90-watt vs. 65-watt difference is there for some other reason.
 
Generally speaking, coming from a base 2011 13in MBA, what kind of performance increase can I expect by upgrading to a Haswell 13in rMBP?
 
Well, there really is not much power difference between the two i5 and i7 dual cores.

Image

Test system:

Intel Ivy-Bridge-CPUs
Intel HM77 chipset
8 GByte DDR3 RAM (1333MHz)
Nvidia GeForce GTX 670M
Intel-SSD 320 series (80 GByte)
17.3“ FullHD LED display
Windows 7 Home Premium 64 Bit

Even a system with a 17.3-inch display and a 670M can't draw 65 watts behind the power adapter (actually less, because of power adapter inefficiencies).

The 90-watt vs. 65-watt difference is there for some other reason.

You mean a 75W Nvidia GPU consumes less power than its spec?

Also:
Image

That's sheer magic from whatever OEM that is; Apple should learn from them. That's the MBP 15 2011.

here is the rmbp 15

http://www.notebookcheck.net/Review-Apple-MacBook-Pro-15-Retina-2-3-GHz-Mid-2012.78959.0.html

base model

here is the rmbp 13

base model

http://www.notebookcheck.net/Review-Apple-MacBook-Pro-13-Retina-2-5-GHz-Late-2012.84584.0.html



here is the m18x r1 2011 2x 6970m

http://www.notebookcheck.net/Review-Alienware-M18x-Notebook.55194.0.html

A 55W CPU + 2x 100W GPUs, and they consume 307W? Witchcraft!

m18x r1 2011 with 580m

http://www.notebookcheck.net/Review-Alienware-M18x-GTX-580M-SLI-2920XM-Notebook.61220.0.html

Again, 305W? Not possible!

m17x r4 7970m

http://www.notebookcheck.net/Alienware-M17x-R4-Notebook-Review.75292.0.html

It can't be! I must be dreaming!

msi gt70 with 670m

http://www.notebookcheck.net/Review-MSI-GT70-Notebook.74077.0.html

45W + 75W = 220W? Hardly possible.

g75v

http://www.notebookcheck.net/Review-Asus-G75V-Notebook.73636.0.html

45W + 75W = 170W?

And BTW Cirus, you got the last nail in the coffin: that's the review of the power consumption of the CPUs alone.
 
Well, there really is not much power difference between the two i5 and i7 dual cores.

Image

Test system:

Intel Ivy-Bridge-CPUs
Intel HM77 chipset
8 GByte DDR3 RAM (1333MHz)
Nvidia GeForce GTX 670M
Intel-SSD 320 series (80 GByte)
17.3“ FullHD LED display
Windows 7 Home Premium 64 Bit

Even a system with a 17.3-inch display and a 670M can't draw 65 watts behind the power adapter (actually less, because of power adapter inefficiencies).

The 90-watt vs. 65-watt difference is there for some other reason.

These are all CPU intensive tasks. So why would you expect the 670M to be drawing much power at all? Fire up a GPU benchmark instead, and it will be a different story.

The CPU can draw 45W. I know my 2011 CPU can, by using Intel's own CPU power monitoring tool. Around 35W for my 6750M GPU if it's actually being used, and an extra 10-20 W for the rest of the system. Not hard to add some numbers together and see that the power consumption will be well over 85W. And the overclocked 650M in the RMBP draws even more than my 6750M.

TDP is nothing more than a guideline to help computer manufacturers to design their cooling systems. If a component has a TDP of X Watts, that doesn't mean it will always draw that amount; obviously if a component is not being fully utilised, its power draw will be minimal. Neither does it mean it can't draw more; Intel's Turbo Boost technology blows that idea to pieces.



----------

Generally speaking, coming from a base 2011 13in MBA, what kind of performance increase can I expect by upgrading to a Haswell 13in rMBP?

Probably ~20% CPU speed boost (my guess), much better graphics performance (I'm hoping Apple tries to use HD5200 across the board), and of course the retina screen. Then just a bunch of smaller things like a bigger and faster SSD, more and faster RAM, etc.
 
It's around:

3-5W mobo

3-6W screen

1-2W RAM

1-3W HDD

That totals 8W to 16W.

And I'm being very generous, using desktop numbers for the mobo and the RAM.
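Summing those ranges as a quick check (same per-component guesses as above):

```python
# Per-component power budget guesses (low, high) in watts, from the list above.
components = {
    "mobo":   (3, 5),
    "screen": (3, 6),
    "ram":    (1, 2),
    "hdd":    (1, 3),
}

low_total = sum(lo for lo, _ in components.values())
high_total = sum(hi for _, hi in components.values())

print(low_total, high_total)  # 8 16
```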
 
And losses through the battery, and the power connection. I know my notebook gets hot around the Magsafe port.

I must admit, I don't even know what the argument here is. Intel clearly design their CPUs to fit within whatever power envelope is provided (within reason), which is the whole point of Turbo Boost. Use a better heat sink, the CPU draws more power, and runs faster. Use a worse heatsink, the CPU throttles down to compensate.

On the GPU side, Apple picks a GPU with a TDP around where it needs to be, and then tinkers with the clock speeds to "optimise" its performance and power draw.
 
You mean a 75W Nvidia GPU consumes less power than its spec?

Also:
Image

That's sheer magic from whatever OEM that is; Apple should learn from them. That's the MBP 15 2011.

here is the rmbp 15

http://www.notebookcheck.net/Review-Apple-MacBook-Pro-15-Retina-2-3-GHz-Mid-2012.78959.0.html

base model

here is the rmbp 13

base model

http://www.notebookcheck.net/Review-Apple-MacBook-Pro-13-Retina-2-5-GHz-Late-2012.84584.0.html



here is the m18x r1 2011 2x 6970m

http://www.notebookcheck.net/Review-Alienware-M18x-Notebook.55194.0.html

A 55W CPU + 2x 100W GPUs, and they consume 307W? Witchcraft!

m18x r1 2011 with 580m

http://www.notebookcheck.net/Review-Alienware-M18x-GTX-580M-SLI-2920XM-Notebook.61220.0.html

Again, 305W? Not possible!

m17x r4 7970m

http://www.notebookcheck.net/Alienware-M17x-R4-Notebook-Review.75292.0.html

It can't be! I must be dreaming!

msi gt70 with 670m

http://www.notebookcheck.net/Review-MSI-GT70-Notebook.74077.0.html

45W + 75W = 220W? Hardly possible.

g75v

http://www.notebookcheck.net/Review-Asus-G75V-Notebook.73636.0.html

45W + 75W = 170W?

And BTW Cirus, you got the last nail in the coffin: that's the review of the power consumption of the CPUs alone.

I'm not quite reading you. Yes, this is a CPU-only test (the GPU is not running). Is the only difference between the two Lenovo laptops the CPU? If so, then looking at the CPU alone is perfectly relevant.

Please also note that the TDP of mobile graphics cards is not accurate (and often not published). For instance, Dell lists the TDP of the 660M at 75 watts, which it clearly is not (that's more than the 62 watts of the higher-clocked desktop chip). The 7970M consumes 10 more watts under load than the 680M despite both having a 100-watt TDP.

Also, any desktop GPU pretty much blows its power budget on FurMark; desktop Fermi was pretty bad.

These are all CPU intensive tasks. So why would you expect the 670M to be drawing much power at all? Fire up a GPU benchmark instead, and it will be a different story.

The CPU can draw 45W. I know my 2011 CPU can, by using Intel's own CPU power monitoring tool. Around 35W for my 6750M GPU if it's actually being used, and an extra 10-20 W for the rest of the system. Not hard to add some numbers together and see that the power consumption will be well over 85W. And the overclocked 650M in the RMBP draws even more than my 6750M.

TDP is nothing more than a guideline to help computer manufacturers to design their cooling systems. If a component has a TDP of X Watts, that doesn't mean it will always draw that amount; obviously if a component is not being fully utilised, its power draw will be minimal. Neither does it mean it can't draw more; Intel's Turbo Boost technology blows that idea to pieces.



----------



Probably ~20% CPU speed boost (my guess), much better graphics performance (I'm hoping Apple tries to use HD5200 across the board), and of course the retina screen. Then just a bunch of smaller things like a bigger and faster SSD, more and faster RAM, etc.

The GPU, while not drawing much power, is still drawing some power. 20 watts at idle is still fairly high for a 17-inch 1080p laptop. Optimus improves battery life, but you can still gain some by deactivating it completely. For instance, the GT70 from the notebookcheck review uses around 12-23 watts at idle with the same GPU, where 12 watts is sitting at the desktop and 23 watts is surfing the web. The test system here seems to have a higher idle power usage.

Look at ultrabooks if you don't believe me. Simply having a 630m in the system under optimus decreases battery life slightly.

However, our measurement values also include losses at voltage converters and power adapters, which makes such an analysis more difficult.

That's the power consumption for the whole notebook, not just the CPU, while running CPU-only tasks.

I don't think you realize that notebookcheck measures the power consumption before the power adapter. This is the total power going into the device, not the total power the device is using. At 80% efficiency, sending 80 watts to the notebook itself requires 100 watts at the wall. And unfortunately, power adapters differ between notebooks.
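To make the adapter-loss point concrete, here's a tiny sketch; the 80% efficiency is the illustrative figure from above, not a measured one:

```python
# Wall-side draw implied by a given device-side power, for an assumed
# adapter efficiency. notebookcheck-style measurements are taken at the
# wall, so they overstate what the notebook itself consumes.

def wall_draw_watts(device_watts, efficiency=0.80):
    """Power measured at the outlet = power delivered to the notebook / efficiency."""
    return device_watts / efficiency

print(round(wall_draw_watts(80)))  # 100
```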

The purpose of my post was to say this was not true.

There is a reason the X230 comes with a 90W PSU when you equip it with the i7 3520M: it's simple, it sucks up more than 65W under load. And the system as a whole is actually more power efficient than what Apple produces. You can guess how much that CPU consumes.
 
Cirus, you are, in a way, extremely right.

My point in posting those things was to show that TDP is different from power consumption for Intel and AMD CPUs/GPUs; as you point out, the 7970M draws more power than the 680M despite them having the same TDP.

However, there is a clear reason my battery still gets drained under load: to provide power to the whole system. So indeed, while I was generous in that estimate of power consumption for the whole device, the point still stands that those CPUs consume more power than their stated TDP, and a 65W adapter is still insufficient to power it under load; that's why the X230 comes with a 90W adapter when configured with i7 CPUs.

I wasn't saying that the 35W 3520M draws more than 65W; I should have written that much better. My mistake.

It was a very circular argument revolving around TDP = power consumption; a lot of things got lost in translation.

One thing I forgot to mention: while mobile GPUs don't have specified TDPs the way desktop ones do, the 660M is still an underclocked, higher-binned 650, and that card draws more than 75W, since it has an 8-pin connector.

The 670MX and the 675MX are in that same 75W spot. But that doesn't matter for Nvidia, since for them TDP = power consumption.

Another thing: the PCIe interface some notebooks use is the MXM 3.0b or c variant, and that is still limited to 100W of power consumption. Cards such as the 480M have reached that.

The TDP of the 650M is 45W, and that's probably why nobody believes the TDP of the 660M; most people think it's in the 60W range. That's exactly one of the reasons I only used power consumption figures from the tried and true parts we know: the 6970M, 670M/570M, 580M, 7970M. Those we know for certain.
 
Yeah, but look at those figures again and you'll see that under load, each of those CPUs consumes about the same amount of power as its TDP indicates.

And the 660M is a mobile GPU. What 8-pin connector are you talking about?
 
Cirus, you are, in a way, extremely right.

My point in posting those things was to show that TDP is different from power consumption for Intel and AMD CPUs/GPUs; as you point out, the 7970M draws more power than the 680M despite them having the same TDP.

However, there is a clear reason my battery still gets drained under load: to provide power to the whole system. So indeed, while I was generous in that estimate of power consumption for the whole device, the point still stands that those CPUs consume more power than their stated TDP, and a 65W adapter is still insufficient to power it under load; that's why the X230 comes with a 90W adapter when configured with i7 CPUs.

I wasn't saying that the 35W 3520M draws more than 65W; I should have written that much better. My mistake.

It was a very circular argument revolving around TDP = power consumption; a lot of things got lost in translation.

One thing I forgot to mention: while mobile GPUs don't have specified TDPs the way desktop ones do, the 660M is still an underclocked, higher-binned 650, and that card draws more than 75W, since it has an 8-pin connector.

The 670MX and the 675MX are in that same 75W spot. But that doesn't matter for Nvidia, since for them TDP = power consumption.

Another thing: the PCIe interface some notebooks use is the MXM 3.0b or c variant, and that is still limited to 100W of power consumption. Cards such as the 480M have reached that.

The TDP of the 650M is 45W, and that's probably why nobody believes the TDP of the 660M; most people think it's in the 60W range. That's exactly one of the reasons I only used power consumption figures from the tried and true parts we know: the 6970M, 670M/570M, 580M, 7970M. Those we know for certain.

The TDP of the 660M is not anywhere close to 60 watts. The TDP of the desktop 650 is 62 watts, as specified by Nvidia.

The 650M probably does not need anywhere near 45 watts. I say this because the Retina MacBook Pro, with a 45-watt i7 and a 650M, can still run fine on an 85-watt power adapter for the majority of cases.

Generally speaking, at normal load (games or 3DMark) an i7 quad and a 650M use about 75-90 watts behind the power adapter for the whole system (under a stress test it's over 100-120). Many notebookcheck reviews show this. Obviously this is less than 45 watts + 45 watts + the rest of the system at load.

Also, I think, but am not quite sure, that temperature complaints for the rMBP 15 have decreased with Kepler and Ivy Bridge, and the throttling/using-battery issue seems to have decreased.
 
The TDP of the 660M is not anywhere close to 60 watts. The TDP of the desktop 650 is 62 watts, as specified by Nvidia.

The 650M probably does not need anywhere near 45 watts. I say this because the Retina MacBook Pro, with a 45-watt i7 and a 650M, can still run fine on an 85-watt power adapter for the majority of cases.

Generally speaking, at normal load (games or 3DMark) an i7 quad and a 650M use about 75-90 watts behind the power adapter for the whole system (under a stress test it's over 100-120). Many notebookcheck reviews show this. Obviously this is less than 45 watts + 45 watts + the rest of the system at load.

Also, I think, but am not quite sure, that temperature complaints for the rMBP 15 have decreased with Kepler and Ivy Bridge, and the throttling/using-battery issue seems to have decreased.

They haven't decreased. The problem here is that this is (probably) the last generation of the product and people want it to be special. The other thing is that, in terms of heat, the 6770M is one hot chip compared to the 650M.

The battery-drain issue still abounds, even on the 13" models when under stress, and unfortunately some of the things I do put mine at that level of stress. Take the Asus U500: it has the same specs as the rMBP 15, including the form factor. Heat is not the problem there (that's one good thing about Asus PCs), yet it throttles. The charger is a 90 W one, and under load it throttles the CPU back to pre-Turbo-Boost levels, or sometimes even lower. One fair warning here: they use the 35 W quads.

The Samsung Series 7 is the same story with the 8870M (which has exactly the same TDP as the 650M, although it's 30-50% more powerful): it throttles the CPU, again not because of heat, despite the anorexically thin design, but because it uses a 90 W charger.

When you measure the M14x with its 120 W charger, it's actually drawing more than 90 W, so why not put a sufficiently beefy charger in those other models?
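The charger-headroom argument above reduces to simple arithmetic: if system draw under load exceeds what the adapter can supply, the CPU gets throttled (or the battery drains) even though temperatures are fine. The wattages below are illustrative assumptions based on the machines discussed in this thread:

```python
# Sketch of the charger-headroom argument from this thread.
def power_shortfall(adapter_w, system_draw_w):
    """Watts the system needs beyond what the adapter provides (0 if none)."""
    return max(0, system_draw_w - adapter_w)

# Samsung Series 7 / Asus U500 style machine: 90 W charger, but a
# 35-45 W CPU + 45 W-class GPU + rest of system can peak past 100 W.
print(power_shortfall(adapter_w=90, system_draw_w=110))   # 20 W short -> throttle

# Alienware M14x: 120 W charger comfortably covers a >90 W peak draw.
print(power_shortfall(adapter_w=120, system_draw_w=95))   # 0 W short -> no throttle
```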

And you know I don't mean Furmark + Prime; I mean games here, and not the abusive things I do to my PCs, poor things. When you get into a lot of AAA titles like BF3 or Shogun 2, we find that these things are quite heavy on the hardware and will make you throttle down for lack of power. The temps on the GPUs stay in the safe 70-80 °C range, with CPU temps close to 80-90 °C, which is way different from getting outright close to Tjmax.

The desktop 650, despite its TDP of 64 W, which is 11 W lower than what the PCIe slot alone can provide, still comes with a 6-pin connector (I said 8-pin before; that was my memory playing tricks on me). If my memory isn't playing tricks again, the problem here is that Nvidia publishes TDP figures based only on the core and not the entire board, since TDP for them = power consumption. That would explain the existence of the power connector: a 6-pin is way too much just to gain 11 W of headroom on such a mid-range GPU.
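The connector argument can be made concrete with the PCIe power-delivery budget: the x16 slot supplies up to 75 W, and each 6-pin auxiliary connector adds another 75 W. A quick sketch using the 64 W TDP figure quoted above:

```python
# Power-delivery budget for a desktop card under the PCIe spec.
SLOT_W = 75       # watts available from the x16 slot itself
SIX_PIN_W = 75    # watts per 6-pin auxiliary connector

def board_budget(n_six_pin):
    """Total power available to the board with n 6-pin connectors."""
    return SLOT_W + n_six_pin * SIX_PIN_W

gtx650_tdp = 64   # W, Nvidia's figure as quoted in this thread

# With no connector, the slot alone already covers the stated TDP...
assert gtx650_tdp <= board_budget(0)

# ...yet the card ships with a 6-pin, giving a 150 W budget. That much
# headroom is what suggests the 64 W figure covers the core rather than
# the whole board (or is simply very conservative).
print(board_budget(1) - gtx650_tdp)   # 86
```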
 

I'm not disagreeing with you. There is a difference between an i5 dual core plus the rest of the system using 90 watts, and an i7 quad + 650M using 90 watts. The Lenovo has only a standard-voltage dual-core i7, and so will consume much less than 90 watts (probably less than 60).

Also note that the 8870M is more efficient than the 650M, though under the testing conditions it was much less than 30% more powerful than the 650M (because of drivers). Better drivers that increase GPU utilization to higher levels may also use more power.

I don't know what you want me to say about the desktop 650. Just because it has a 6-pin connector does not mean it needs the power.

[Chart: Power.png — power consumption comparison]

It uses less power than a 7750, which does not have a 6-pin connector and gets all its power from the PCI Express slot. Also note that putting a load on the GPU also puts a load on the CPU (minor, but it's there) to keep the GPU fed with data, as well as on the RAM, motherboard, etc. And desktop systems are not particularly efficient compared with mobile ones.
 
That's the thing: Lenovo wouldn't put the 90 W adapter in the 3520M and the earlier model if it wasn't needed, and as we know, 65 W isn't enough to supply power when the 35 W chip is under load; I ran into that several times when I was playing games.

I'm still not saying that the 35 W CPU uses more than 65 W on its own, but the whole system does.
 

That's unusual, but possible.
 