OK, let's try to make some progress.

As far as I can tell there are exactly three questions:

1) Are there alternatives to the Engadget test that looked at battery life of both systems in a controlled setting?

2) How was the Engadget test done? During the time it took each of the two configurations to drain its battery, did they a) perform exactly the same number of tasks, or b) keep their processors busy for the same fraction of the time? If the answer is a), the 2.6 GHz processor is roughly 20% less efficient than the 2.3 GHz processor. If the answer is b), the 2.6 GHz processor is only about 10% less efficient, because it can perform roughly 10% more tasks in a given amount of time.

3) How much less efficient do processors typically get as their maximum clock speed goes up? Efficiency here is a measure of how much of the input power is converted into work, i.e. bang per buck. Obviously the 2.6 GHz processor will consume more power in a given amount of time (that doesn't necessarily make it less efficient); the question is whether it consumes much more than 10% extra power compared to the 2.3 GHz processor. The answer would help us determine whether something odd is going on with these 2.6 GHz systems that makes them less efficient than they should be. (A quick back-of-the-envelope sketch of the arithmetic behind question 2 follows below.)
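
To make question 2 concrete, here is a minimal sketch of the arithmetic with made-up runtimes (these are not Engadget's figures): under interpretation a) the relative efficiency is just the runtime ratio, while under b) the 2.6 GHz machine's extra throughput offsets part of the gap.

```python
# Hypothetical illustration of question 2. The runtimes below are invented
# for the sake of the arithmetic -- they are NOT Engadget's published figures.
runtime_23 = 6.0       # assumed hours to drain the battery, 2.3 GHz config
runtime_26 = 5.0       # assumed hours to drain the battery, 2.6 GHz config

# a) A fixed-rate task loop: work done is proportional to time on battery,
#    and both machines burn one full battery, so relative efficiency is just
#    the runtime ratio.
print("a)", round(runtime_26 / runtime_23, 2))                   # 0.83 -> ~17% less efficient

# b) Both CPUs are busy the whole time: the 2.6 GHz part gets ~13% more
#    work done per hour (2.6 / 2.3), which claws back most of the gap.
print("b)", round((runtime_26 * 2.6) / (runtime_23 * 2.3), 2))   # 0.94 -> ~6% less efficient
```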

Let us know if you have any answers to any of these questions. Thanks!

EDIT: monksealpup, thank you for the informative response. I had missed it previously. Good point also that, with the other components consuming a lot of power, the difference at the CPU level would have to be much more than 20%.
 
How many computer science engineers in this room?

How many googled to find out the answer?

Don't get downvoted for spreading the light XD
 
This intellectual debate is most annoying. If you know something to be true why try to convince someone else? You are not getting paid for this so let them stay uninformed and ignorant so that you may take advantage of them in the future.
Agreed. Monksealpup has the best explanation. I'm staying out of this because I suspect I'm being trolled.
 
Ummm, I pointed out it's flops per cycle, actually. If GHz were the only factor in speed then 1.8 GHz would still perform like crap! It's how many flops per second that matters! Sure, the higher the GHz, the more potential it has to process more instructions... but power is not the determination of speed, only the minimum determination. A 1.8 GHz proc running 3 gigaflops per second is faster than a 2.6 GHz proc running 2.6 gigaflops per second. (These numbers have nothing to do with actual statistics; since everyone is so damn literal here, I have to preface that.)

Just so everyone is clear, hertz (Hz) is not a measurement of power. Hz by definition is cycles per second.

FLOPS (Floating-Point Operations Per Second) is not a measurement of power either. It is a measurement of how much work can be done per second.

Watts is a measurement of power. Both the 2.3 GHz CPU and the 2.6 GHz CPU have a maximum TDP of 45 watts.

There really shouldn't be a noticeable difference in battery life between the two CPUs.
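
As a toy illustration of the throughput point in the quoted post (the flops-per-cycle figures below are invented for illustration and do not describe any real chip):

```python
# Toy version of the "GHz is not speed" point above.
def peak_gflops(clock_ghz: float, flops_per_cycle: int) -> float:
    """Peak single-core throughput: cycles/second * flops/cycle."""
    return clock_ghz * flops_per_cycle

wide_slow = peak_gflops(1.8, 8)    # 1.8 GHz, 8 flops/cycle -> 14.4 GFLOPS
narrow_fast = peak_gflops(2.6, 4)  # 2.6 GHz, 4 flops/cycle -> 10.4 GFLOPS

print(wide_slow, narrow_fast)      # the lower-clocked core does more work per second
```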
 
Also worth noting that the CPUs should downclock themselves pretty aggressively when not running intensive applications. They'd probably be at the same frequency at idle.

I know in the desktop world, some CPUs have lower default voltages than others depending on how they tested at the factory. Perhaps this particular 2.3 sample happened to be one with a lower than typical required voltage.
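
If it helps, the usual first-order model for CPU switching power is P ≈ C·V²·f, so even a small difference in factory voltage shows up more than linearly. A rough sketch with invented voltages (these are not real Apple or Intel numbers):

```python
# First-order CMOS switching-power model: P is roughly proportional to V^2 * f.
# The voltages here are made up to illustrate factory (binning) variation.
def relative_dynamic_power(volts: float, freq_ghz: float) -> float:
    return volts ** 2 * freq_ghz   # arbitrary units; the capacitance term cancels in ratios

typical = relative_dynamic_power(1.00, 2.3)
lucky_low_voltage = relative_dynamic_power(0.95, 2.3)

print(round(lucky_low_voltage / typical, 2))   # ~0.9 -> ~10% less switching power at the same clock
```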
 
Instead of arguing, shouldn't we ask the community members who actually have the 2.3 and 2.6 rMBPs? Getting their battery-time readings would be more accurate than one person's account based on less than a few weeks of observation.
 
Instead of arguing, shouldn't we ask the community members who actually have the 2.3 and 2.6 rMBPs? Getting their battery-time readings would be more accurate than one person's account based on less than a few weeks of observation.

The problem with this is that people have vastly different usage patterns (display brightness, CPU-intensiveness of applications, GPU-intensiveness of applications, etc, etc, etc).

The only way to control for these variables is to put both laptops side by side and let them go through the same task list for the duration of the test. AnandTech has such a well-defined task list, but unfortunately they only tested the 2.3 GHz model...
 
The problem with this is that people have vastly different usage patterns (display brightness, CPU-intensiveness of applications, GPU-intensiveness of applications, etc, etc, etc).

The only way to control for these variables is to put both laptops side by side and let them go through the same task list for the duration of the test. AnandTech has such a well-defined task list, but unfortunately they only tested the 2.3 GHz model...

Actually, the biggest issue is that the type of people who respond to these polls won't reflect an unbiased sample - the type of people who would respond to such a request would be people who have particularly strong views on the topic. And people tend to have stronger views when they're negative. Not to mention that the type of people who get 2.6 may respond to a poll completely differently than people who have a 2.3.

Variation in a sample is fine - it averages out. However, biased variation like we would see on any sort of poll/questionnaire on this thread would not reflect a general opinion whatsoever.

Not to say that it would hurt to see the opinions of various owners; we just couldn't even pretend to draw anything near a conclusion from such a sample. This fact-less theorycrafting would probably be more accurate.
 
That's what I was trying to hint at. For example, it is perfectly conceivable that people getting the 2.6 GHz option spend more time running CPU-heavy applications than those getting the 2.3 GHz option...

Actually, the biggest issue is that the type of people who respond to these polls won't reflect an unbiased sample - the type of people who would respond to such a request would be people who have particularly strong views on the topic. And people tend to have stronger views when they're negative. Not to mention that the type of people who get 2.6 may respond to a poll completely differently than people who have a 2.3.

Variation in a sample is fine - it averages out. However, biased variation like we would see on any sort of poll/questionnaire on this thread would not reflect a general opinion whatsoever.

Not to say that it would hurt to see the opinions of various owners; we just couldn't even pretend to draw anything near a conclusion from such a sample. This fact-less theorycrafting would probably be more accurate.
 
Wow this thread is amazing.
Hz=cycles per second. Anyways...

Aside from the 2.6 GHz processor, there's also the 512 GB SSD vs the 256 GB SSD in the mid-range rMBP. Does anyone know the manufacturers' specifications for the power consumption of the drives? Perhaps the 512 is drawing more watts (or should I say GHz's LOL :cool:) and that's also contributing to the increased power usage compared to the base model? A quick Google for 512 GB SSD power consumption brought me to an AnandTech review showing some drives use considerably more than others: http://www.anandtech.com/show/5147/the-ocz-octane-review-512gb/7

Also, have any other third parties corroborated Engadget's findings? It's possible the 2.6 model they had may have had a slightly weaker than average battery from the factory, or the 2.3 had a slightly stronger than average battery. A sampling of more than just two machines might provide a fairer comparison, as I'd be surprised if the chips weren't able to scale their frequency down well below 2.3 / 2.6 GHz when not being used. I had an old P4 running Ubuntu that I was able to scale the frequency down on; I don't see why not on a brand new i7...

A nice test, if someone were brave enough, would be to run the 2.6 model with the 256 GB SSD swapped in from the 2.3 to see how the battery life fares compared to the stock drive.
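
To put rough numbers on how much an SSD difference could plausibly matter, here's a back-of-the-envelope sketch; every draw figure in it is an assumption for illustration, not data from a spec sheet or the review linked above.

```python
# Back-of-the-envelope bound on how much a hungrier SSD could cost in battery life.
battery_wh = 95.0        # rMBP battery capacity (Wh)
base_draw_w = 10.0       # assumed average light-use system draw, excluding the SSD
ssd_256_w = 0.5          # assumed average draw of the 256 GB drive
ssd_512_w = 1.5          # assumed average draw of the 512 GB drive

hours_256 = battery_wh / (base_draw_w + ssd_256_w)
hours_512 = battery_wh / (base_draw_w + ssd_512_w)
print(round(hours_256, 1), round(hours_512, 1))   # ~9.0 vs ~8.3 hours for a 1 W difference
```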
 
You are not getting paid for this so let them stay uninformed and ignorant so that you may take advantage of them in the future.
ROFL. Like your thinking :)

----------

It's possible the 2.6 model they had may have had a slightly weaker than average battery from the factory, or the 2.3 had a slightly stronger than average battery.
A sample size of one is never really indicative of anything. For some reason, it seems to be enough to proclaim some sort of result on the Internet though, which people then start spreading around pretty quickly.
 
Maximum TDP of 45 watts.

Obviously the 2.3 GHz model isn't pulling as much power as the 2.6 GHz.

Isn't this obvious?

Thing is, Intel CPUs typically reduce the multiplier when the CPU isn't doing anything particularly intensive, at least in Windows. So at idle, both CPUs should be running at the same speed. Two possibilities I can think of: the 2.3 was a "factory freak" that happened to do fine on lower voltages than typical, or the 512 GB SSD takes a lot more power than the 256 GB one.
 
I'd really be interested to see if other 2.3'ers can get 9+ hours of battery life doing light tasks.
 
Maximum TDP of 45 watts.

Obviously the 2.3 GHz model isn't pulling as much power as the 2.6 GHz.

Isn't this obvious?

They're rated for the same maximum TDP. There shouldn't be a huge difference in current draw.

There are a lot of other factors at play here (e.g. usage patterns). I doubt that a small boost in CPU frequency would cause a major difference in battery life.
 
I'd really be interested to see if other 2.3'ers can get 9+ hours of battery life doing light tasks.



Surfing MacRumors, it's possible. My brightness was set six notches from maximum.
 

Attachment: Screen Shot 2012-07-16 at 11.15.39 AM.png
I don't have any expertise in this stuff, but I do have a question. If the 2.6 GHz chip never needs to operate at above 2.3 GHz, will the battery life be the same? Do these chips automatically downclock when the full operating speed isn't needed?
 
Maximum TDP of 45 watts.

Obviously the 2.3 GHz model isn't pulling as much power as the 2.6 GHz.

Isn't this obvious?

Intel, like any modern chip manufacturer, bins its chips: the low-leakage chips are sold as the high-end models. This means that the 2.6 GHz chip may well need LESS VOLTAGE at the same frequency than the 2.3 GHz chip.

If you remember back to high school physics, and electrical power:

P = VI

You don't pull power, you pull current, you generate power through the flow of current at a given voltage.

Therefore it is likely that the 2.6 GHz part runs at a lower voltage, and therefore has to dissipate less power at a given frequency, than the 2.3 GHz CPU. However, it is clocked nominally higher, and the higher clock speed means it draws more current. Ultimately, across a large enough sample I am quite sure the 2.6 GHz draws no more power to perform a given workload than the 2.3 GHz model.

In more concrete terms, think of the 2.6 GHz as running at 0.9 V and the 2.3 GHz at 1.0 V. The 2.6 GHz is about 10% faster in both base speed and turbo speed, so one could reasonably assume its current draw is ~10% higher: say the 2.6 GHz draws 45 A and the 2.3 GHz draws 40 A. From the aforementioned equation one can see that the overall power is similar, and since a CPU is not a mechanical part, for all intents and purposes all of that power is dissipated as heat, which is consistent with the 45 W TDP of the quad-core chips.

SSDs have minimal power draw (<10 W), and since the MBP has a 95 Wh battery, it is reasonable to assume that the difference between a MacBook with no drive at all and a very power-hungry SSD is ~10% at the very most. The Samsung 830 draws about 5 W at peak (512 GB model).
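
Running the post's hypothetical operating points through P = VI (a quick check using the illustrative 0.9 V / 45 A and 1.0 V / 40 A figures above, not measured values):

```python
# The post's hypothetical operating points, run through P = V * I.
p_26ghz = 0.9 * 45   # 2.6 GHz part: 0.9 V at 45 A -> 40.5 W
p_23ghz = 1.0 * 40   # 2.3 GHz part: 1.0 V at 40 A -> 40.0 W

print(p_26ghz, p_23ghz)    # 40.5 40.0 -- both comfortably under the 45 W TDP
print(p_26ghz / p_23ghz)   # 1.0125 -> barely 1% apart, nowhere near a 20% battery-life gap
```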
 
Maximum TDP of 45 watts.

Obviously the 2.3 GHz model isn't pulling as much power as the 2.6 GHz.

Isn't this obvious?

TDP is how much heat (in watts) needs to be dissipated for the CPU to work properly. Will people ever learn that?
 
TDP is how much heat (in watts) needs to be dissipated for the CPU to work properly. Will people ever learn that?

When conservation of energy ceases to hold and TDP stops being a good marker for the overall power consumption of a CPU compared to other CPUs of a similar architecture?

:p
 
Wow, some really good and informative responses! Food for thought! It would be awesome if anybody else could do an independent test from the Engadget one.
 