There is a reason the X230 comes with a 90W PSU when you equip it with the i7-3520M: it's simple, it draws more than 65W under load. And the system as a whole is actually more power efficient than what Apple produces. You can guess how much that CPU consumes.
I'm sorry to say this but... please show me a source that has actually measured the i7-3520M in the Lenovo X230 (just the CPU) consuming 65W under load. If that were the case, the laptop wouldn't last more than an hour at max load. In fact, I highly suspect it wouldn't even last 30 minutes.
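To put a rough number on that, here's a minimal back-of-the-envelope sketch; the 63 Wh figure is my assumption for the X230's standard 6-cell battery, so swap in your own pack's spec if it differs.

```python
# Back-of-the-envelope runtime check: battery energy (Wh) / average draw (W).
# The 63 Wh capacity is an assumption for the X230's standard 6-cell pack,
# not a measured figure.

def runtime_hours(battery_wh: float, avg_draw_w: float) -> float:
    """Estimated runtime at a constant average power draw."""
    return battery_wh / avg_draw_w

if __name__ == "__main__":
    battery_wh = 63.0   # assumed 6-cell capacity
    cpu_only_w = 65.0   # the claimed CPU-only draw under load
    # ~0.97 h -- and that's before the screen, chipset, RAM, SSD and Wi-Fi
    # draw a single watt.
    print(f"{runtime_hours(battery_wh, cpu_only_w):.2f} h")
```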
Anyway, I think Lenovo can choose to bundle a higher-wattage charger for any reason at all. Maybe the more powerful charger has more current headroom for faster charging. It doesn't have to be related to the power consumption of the machine.
For instance, Apple can most definitely bundle an 85W MagSafe adapter with their MacBook Air.
Another problem here is that, according to Intel, that rise in TDP was due to some of the functions of the PCH being moved into the CPU, lowering the consumption and TDP of the platform itself while raising the CPU's a little.
Yeah, that's the voltage regulator being integrated into the CPU.
But only a small number of them are integrated, and the only reason that's happening is that Intel wants to squash, once and for all, any attempt at modifying the electrical characteristics of its chipset.
Once upon a time (last year really), it was possible for people to overclock their CPUs (this is done mostly on desktops, but it's not that uncommon on laptops), and if they needed more headroom, they could ask the motherboard to provide more voltage. Not anymore. Intel has now integrated the voltage regulator into the CPU, thus the CPU itself controls how much voltage it is fed.
Anyway, even if that is taken into account, it still means the HD 4600 is on par with the HD 4000 in power consumption. I'm sure you know about the law of diminishing returns as performance scales up... so I'm quite certain the HD 5200 will increase power consumption (and TDP) by a good amount.
An i7-3740QM can use up more than 55W under load; I don't want to know how much the 3920XM uses.
TDP is the worst-case scenario. I don't think it'll get any worse than that. I think you're confusing laptop parts with desktop ones.
Here's the kicker: you can actually guesstimate the power consumption of the whole laptop from its battery specs and runtime, and then work backwards to the individual components.
For instance, take the Retina MacBook Pro. The battery is rated at 95 Wh.
That means that if I'm getting 6 hours out of it on a full charge, it's consuming roughly 15-16W on average. With the CPU and chipset using about 10W, I'd assume the screen takes up around 5W on average.
Under load (both the CPU and GPU stressed to their max running a 3D simulation under Windows), I have seen estimates as low as 2 hours. That translates to approximately 47.5W, which makes sense if the GPU draws around 15W and the CPU takes up about 27.5W.
Honestly, you have to take battery specs into account as well. At 55W for just the CPU, and at the "rumored" 30W power consumption for the GT 650M, the Retina MacBook Pro probably won't last over 45 minutes under max load, but that's not the case at all.
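Here's a minimal sketch of that battery math, assuming only the 95 Wh spec above and the rough per-component estimates thrown around in this thread (none of them measured values):

```python
# Power-budget arithmetic for the Retina MacBook Pro numbers above.
# 95 Wh is the battery spec quoted earlier; the component wattages are the
# rough estimates from this discussion, not measurements.

BATTERY_WH = 95.0

def avg_draw_w(battery_wh: float, hours: float) -> float:
    """Average system draw implied by a full-charge runtime."""
    return battery_wh / hours

def runtime_hours(battery_wh: float, draw_w: float) -> float:
    """Runtime implied by a constant average draw."""
    return battery_wh / draw_w

if __name__ == "__main__":
    # ~6 h on a charge implies roughly 16 W average for the whole system.
    print(f"light use: {avg_draw_w(BATTERY_WH, 6):.1f} W average")

    # ~2 h on a charge implies about 47.5 W average.
    print(f"max load:  {avg_draw_w(BATTERY_WH, 2):.1f} W average")

    # Sanity check of the claimed figures: 55 W for the CPU plus the
    # "rumored" 30 W for the GT 650M would drain the pack in just over an
    # hour -- before the screen, SSD, RAM or Wi-Fi draw anything.
    print(f"55 W + 30 W: {runtime_hours(BATTERY_WH, 55 + 30):.2f} h")
```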
You also have to take into account that if the chassis (casing) can't continuously dissipate the max TDP, the CPU isn't likely to keep running at that frequency. I'm sure you know the CPU and GPU throttle down if they overheat...
There are other components to take into account as well. It's not like the CPU is the only thing that's working in your MacBook after all.
If you don't believe me, take a multimeter to your own MacBook and do some measurements. I'm sure you'll be pleasantly surprised at how little power it draws, even under load.
BTW, according to Tom's, there are more quads coming with GT3, so yes, it doesn't show up yet. There is also very little need for the highest-end quads to have a powerful iGPU (by Intel standards); they are usually paired with a dGPU, not to mention that it would push the CPU price even higher than it already is. For example, the 3840QM costs more than $500, quads start at around $300, and dual-cores at about half that.
I don't find quad-core parts with GT3 impossible.
I'm just saying... it's not a guarantee that it'll have GT3 when it's quad-core... because we obviously have mainstream high-end Haswell parts (MX and MQ chips) without GT3.
In fact, looking at one of the lists posted earlier, I'm sure it's clear that GT3 will only be available on certain configurations, most of them paired with ULV or otherwise more efficient CPUs in order to offset the power consumption and TDP cost that GT3 brings.
It's not like Intel can just make GT3 so much faster than HD 4000 without increasing power consumption after all...