All I read into this is that Apple paid Consumer Reports a lot of money to "retest." Ahem...
 
A battery is going to discharge at a consistent rate, so running the exact same test three times in a row, as CR indicated they did, and getting significantly divergent results would indicate to me, as a computer programmer, that something abnormal was going on. Batteries don't change their discharge rate to that extent, and computer programs don't produce different results unless there is a bug (software or otherwise). I would have investigated further before writing the article, especially since CR changed a non-user-facing setting to conduct their test.

Here is a quote from CR: "in a series of three consecutive tests, the 13-inch model with the Touch Bar ran for 16 hours in the first trial, 12.75 hours in the second, and just 3.75 hours in the third."
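To put rough numbers on that spread, here is a minimal sketch dividing an assumed ~49.2 Wh battery capacity for the 13-inch Touch Bar model (Apple's published figure, but treat it as an assumption here) by each quoted runtime, to get the average draw each result implies:

```python
# Implied average power draw for each of CR's quoted runtimes, assuming
# the 13-inch Touch Bar model's ~49.2 Wh battery (an assumption here).
CAPACITY_WH = 49.2

for hours in (16.0, 12.75, 3.75):
    watts = CAPACITY_WH / hours
    print(f"{hours:5.2f} h runtime -> ~{watts:4.1f} W average draw")

# ~3 W vs ~13 W for "the exact same test" is a huge spread --
# consistent with a software bug, not normal battery behavior.
```

Roughly 3 W in one trial and 13 W in another, for the same scripted workload, is the kind of gap that points at software, not the battery.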
Thanks for the comments, but I didn't see the physics law you referenced. Not trying to be a shill, but I have a technical background and don't remember running across a law like you mentioned. Of course, I'm a senior now and may have forgotten it.
One other thing: I have seen batteries change their discharge rate, significantly at times, depending on temperature, state of charge, and age. Given these were new batteries, I think we can take age off the table. Or maybe these batteries weren't assembled correctly, like the recent issue with the iPhone batteries.
 
Good for Consumer Reports, whose rigorous and documented testing methodology brought to light an issue that had been reported anecdotally by users. Apple may say that "people don't use that setting," but in fact a bug was brought to light.

Consumer Reports isn't perfect, and their perspective on products doesn't always match mine (especially when it comes to computers and AV equipment). But they are disciplined, they document their work, and they are willing to retest when new information comes to light or changes are made.
 
I have no idea what "Energy Impact" means in technical terms.
And yes, "System Profiler" is still in macOS.

As for Safari, no, I didn't activate Flash Player. I only browse typical websites, and YouTube has been maybe only 15 minutes since I unplugged the charger.
If you go to System Profiler and choose Power, it can show the wattage being consumed (or at least it used to show this). I'd be interested to see how high the wattage is with and without Safari loaded. You can press Command+R to refresh the wattage calculation.

There may also be other applications that can give this statistic.
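For instance, here is a minimal sketch of pulling the instantaneous draw from the battery controller by parsing `ioreg` output; the AppleSmartBattery keys used here are present on most portable Macs, but treat the exact key names and units as assumptions:

```python
# Sketch: estimate instantaneous battery draw on a portable Mac by
# parsing `ioreg` output. Assumed units: Amperage in mA (negative
# while discharging), Voltage in mV.
import re
import subprocess

def battery_watts() -> float:
    out = subprocess.run(
        ["ioreg", "-rn", "AppleSmartBattery"],
        capture_output=True, text=True, check=True,
    ).stdout
    amps_ma = int(re.search(r'"Amperage" = (-?\d+)', out).group(1))
    if amps_ma > 2**62:  # some machines report two's complement as unsigned
        amps_ma -= 2**64
    volts_mv = int(re.search(r'"Voltage" = (\d+)', out).group(1))
    return abs(amps_ma) / 1000 * volts_mv / 1000

if __name__ == "__main__":
    print(f"~{battery_watts():.1f} W")
```

Run it unplugged; on AC power the Amperage reading reflects charging rather than load.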

Certainly, 3 hours just using Safari doesn't sound good at all. I cannot understand how Consumer Reports got 15 hours with a Safari web-browsing test, unless the webpages you are using are much more demanding for some reason, or there is a bug.
 
Cool. Now they just need to fix the keyboard and the GPU issues, put some decent ports back in there, raise the RAM limit to 32GB (ideally 64GB), improve the storage options, and offer a matte screen option, and they will be where the competition was a year ago!!
 
If using a repeatable process to test similar but not the same products would cause you to flunk, then I would suggest the problem lies with your school and not the experiment.

It does not matter if they go into Safari and change a setting, so long as they go into every browser on every machine they test and make the same change. That's what makes the test worthwhile. It isn't Consumer Reports' responsibility to make sure that Apple gives them bug-free software before running their tests.

When Consumer Reports tests cars, do they disable traction control and anti-lock braking systems on the cars before testing for performance, handling and braking? Do they add a thousand pounds to a McLaren P1 before testing it to compensate for all that lightweight carbon fiber body work? That is effectively what they're doing on these computer tests. Their intent is to even out the playing field, but they're actually intentionally defeating performance engineering before testing for performance. Seems like that would benefit less efficient and less well-designed machines, and particularly hurt Apple, which designs hardware and software together.

It's like the reverse of the VW scandal. VW surreptitiously programmed their cars to function differently during emissions testing in order to yield atypical results. Here, Consumer Reports is the one altering functions and then reporting on atypical results.
 
A battery is going to discharge at a consistent rate, so running the exact same test three times in a row, as CR indicated they did, and getting significantly divergent results would indicate to me, as a computer programmer, that something abnormal was going on.

Being professionally involved in testing various types of products, I have similar concerns about CR's tests.

1) Tests should be repeatable, with the same result under the same conditions. CR's tests are not: the variance was very large in the first series. This indicates a fault in the product or a problem with the test methodology.

Now that Apple has fixed the bug, are the results consistent? I have not seen any mention of the variance in the new tests, so to me it is unclear.

2) In testing you would look for "expected behavior." We know that switching off the cache should lead to higher power consumption compared to normal use, so it is very peculiar that CR gets much longer battery times than Apple. We also know that Chrome generally consumes more power than Safari. Why is CR getting the opposite results?

The overall impression is that CR's test methodology cannot be trusted.

On a side note, the "browser tests" seem to be more sensitive to which pages are loaded and at what frequency. For comparison purposes, the "play movie in iTunes" test may be the better one.
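For what it's worth, point (1) is easy to mechanize. A minimal sketch of the kind of repeatability gate I'd expect a test lab to apply (the 10% threshold is an illustrative assumption, not an industry standard):

```python
# Flag a battery-test series whose run-to-run spread is too large to
# publish. The 10% coefficient-of-variation threshold is an assumption.
from statistics import mean, stdev

def is_repeatable(runtimes_h, max_cv=0.10):
    cv = stdev(runtimes_h) / mean(runtimes_h)
    return cv <= max_cv

print(is_repeatable([16.0, 12.75, 3.75]))  # False: CV ~ 0.59 -> investigate
print(is_repeatable([10.1, 9.8, 10.0]))    # True:  CV ~ 0.015 -> publishable
```

By that standard, the original series should never have produced a headline number, only an investigation.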
 
It can be to the right consumer.

As someone who would be carrying a laptop around wherever I go, why wouldn't I want it to be lighter and more portable?

You could say "then get a MacBook Air if you want a thin and light laptop". I would reply "I could, but then I would be compromising power for portability." What if I want both?

What Apple has done here is try to make the tradeoffs more palatable for us average consumers by creating a product which offers the best of both worlds. In the past, I would never have considered a 15" MBP due to its size and weight alone. Now, the 15" MBP is both powerful and more portable than before. The drawbacks don't really bother or affect me.

I am currently still using an 11" MBA, but the new MBP's weight feels reasonable enough that I am actually tempted to pick one up as my next replacement laptop if and when my current Air bites the dust.

I too use a (2013) 11" Air (8GB) and am thinking of buying a new MBP. I know I do not need much power, and if I ever did, we have an iMac at home (27" 5K with 32GB). I bought my oldest daughter a 2015 13" rMBP (16GB) last year and two new 13" (16GB) Touch Bar MBP laptops for Christmas (two younger kids). I really appreciate the screens on the Retina machines vs. my Air. I actually wonder if the MacBook would be good for me, since I use it mostly for spreadsheets and email.
 
When Consumer Reports tests cars, do they disable traction control and anti-lock braking systems on the cars before testing for performance, handling and braking? Do they add a thousand pounds to a McLaren P1 before testing it to compensate for all that lightweight carbon fiber body work? That is effectively what they're doing on these computer tests
No, those are false equivalences. In both cases (and others as well) they set out a test plan and apply it equally to all vendors they test in the category. I'm not sure why people have such a hard time grasping this. Just because it isn't how you would choose to test doesn't make it invalid.
 
Here is why the 15+ hour test results are useless: they are UNREACHABLE!

Why would I read a review saying it gets 15 hours of battery life when I get around 5-7? People read these things to decide whether to buy them, you know, to actually USE the computer. I do not want to know that Laptop A can survive 3,000 hours if it does nothing. I want to know real-life battery results: how long does it last while USING IT compared to other laptops?
 
Thanks for the comments, but I didn't see the physics law you referenced. Not trying to be a shill, but I have a technical background and don't remember running across a law like you mentioned. Of course, I'm a senior now and may have forgotten it.
One other thing: I have seen batteries change their discharge rate, significantly at times, depending on temperature, state of charge, and age. Given these were new batteries, I think we can take age off the table. Or maybe these batteries weren't assembled correctly, like the recent issue with the iPhone batteries.
The point I was attempting to make, but I guess failed to, is that the test made no sense. You cannot, under the circumstances they described, get 4 hours one time and 19 the next. A CPU cannot suddenly require less electricity to do the exact same task (again, everything else being equal) unless there is a software bug. I guess it would be the law of physics that governs the amount of energy a particular task requires (I'm sure there is one). ;)
Being professionally involved in testing various types of products, I have similar concerns about CR's tests.

1) Tests should be repeatable, with the same result under the same conditions. CR's tests are not: the variance was very large in the first series. This indicates a fault in the product or a problem with the test methodology.

Now that Apple has fixed the bug, are the results consistent? I have not seen any mention of the variance in the new tests, so to me it is unclear.

2) In testing you would look for "expected behavior." We know that switching off the cache should lead to higher power consumption compared to normal use, so it is very peculiar that CR gets much longer battery times than Apple. We also know that Chrome generally consumes more power than Safari. Why is CR getting the opposite results?

The overall impression is that CR's test methodology cannot be trusted.

On a side note, the "browser tests" seem to be more sensitive to which pages are loaded and at what frequency. For comparison purposes, the "play movie in iTunes" test may be the better one.
CR is secretive about their testing methodology, as they should be, because they don't want manufacturers gaming the system. Because of this, they are asking us, as consumers, to take a huge leap of faith in accepting their results. For example, do they regularly update their tests to reflect the ever-changing ways we use computers? So when they show test results that are completely illogical, I feel they owe us a higher standard of due diligence.
 
Yeah, I don't get how this is possible. 15-18 hours? I can't imagine the machine will even run that long at idle with the screen off.

But that's what they observed, isn't it? Could the battery in the MacBook Pro, along with all of the other optimizations in hardware and software, really be that good? We will see.
Here is why the 15+ hour test results are useless: they are UNREACHABLE!

Why would I read a review saying it gets 15 hours of battery life when I get around 5-7? People read these things to decide whether to buy them, you know, to actually USE the computer. I do not want to know that Laptop A can survive 3,000 hours if it does nothing. I want to know real-life battery results: how long does it last while USING IT compared to other laptops?

Exactly. When I first got my 2013 MacBook Air, I was easily getting 12 hours out of a single charge. Now, 3 years later, it barely lasts 3 hours. Is the battery any worse? No. I just have a LOT more stuff running on my machine now. My typical work day has 12+ apps running at once, plus many background processes and menu extras. There's no chance of me getting the advertised battery life in real-world usage.
 
The bug, fixed by Apple in macOS Sierra 10.12.3 beta 3, is not one the average user will encounter as most people don't turn off the Safari caching option ...


However, it is safe to say many customers have experienced abnormal battery results, both highs and lows, and that if actually doing much of anything, they will quickly fall outside the finely tuned conditions behind Apple's optimum battery figures.
 
I find it so funny that a well-known tech site sent out an email asking Consumer Reports to retest, and they replied no, they're confident in their test and stand by their findings. Then Apple contacts them and all of a sudden they are willing to retest, and when they do, the numbers are off the charts. It just sounds so fishy to me, and the numbers are unrealistic for the average consumer. Isn't that what their business is all about, the consumer? 15 hours on the 15"? Come on now!
 
Here is why the 15+ hour test results are useless: they are UNREACHABLE!

Why would I read a review saying it gets 15 hours of battery life when I get around 5-7? People read these things to decide whether to buy them, you know, to actually USE the computer. I do not want to know that Laptop A can survive 3,000 hours if it does nothing. I want to know real-life battery results: how long does it last while USING IT compared to other laptops?
Yep, it isn't a useful measure in and of itself; the real value is how it compares against other laptops. I think any rational person wouldn't expect 15 hours of battery life under typical usage. However, if under the same conditions Laptop ABC rated 10 hours, then I'd expect that the MacBook Pro likely has better battery management and capacity. See the difference?
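If it helps, the arithmetic behind that comparison is just a ratio. A hedged sketch, assuming your workload drains batteries roughly proportionally across machines (a rough assumption, since workloads differ):

```python
# Estimate your real-world hours on laptop B from your real-world hours
# on laptop A and both machines' rated hours. Assumes proportional scaling.
def estimated_hours(my_real_h, my_rated_h, other_rated_h):
    return my_real_h * (other_rated_h / my_rated_h)

# e.g. I get 6 real hours on a machine rated 10 h; a machine rated 15 h
# under the same test would then suggest roughly 9 real hours for me.
print(estimated_hours(6.0, 10.0, 15.0))  # 9.0
```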
I find it so funny that a well-known tech site sent out an email asking Consumer Reports to retest, and they replied no, they're confident in their test and stand by their findings.
To be fair, we are seeing one side of the conversation, where they get the opportunity to edit and position their side. Besides, it makes sense: a small-time blog like 9to5 doesn't have much standing in this case. Apple eventually reached out to CR, and they no doubt had a lot of discussions about it. Quite a difference.
 
However, it is safe to say many customers have experienced abnormal battery results, both highs and lows

What is abnormal? My understanding:

Several users are seeing results that align with Apple's specification.

We have some bugs that lead to abnormal results:
- The cache-related bug CR bumped into
- The dGPU seems to be active when it should not be, as reported by several forum members

We have situations where high power consumption should be expected, i.e. this is normal, not abnormal:
- Powering external equipment while on battery
- Running heavy computational or graphics loads
- Start-up effects when initially syncing, Spotlight ...

I believe there is something more here, but it is difficult to separate the "noise" from the real issues. Many posts just report poor results without more information or deeper investigation.

If we want to help Apple and other forum members solve this, we should do our best to provide objective and detailed information.
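As a starting point, here's a minimal sketch of the kind of snapshot worth attaching to a battery complaint; both `pmset` subcommands exist in stock macOS:

```python
# Gather objective detail before reporting poor battery life: current
# charge state plus the power assertions that can keep a Mac from idling.
import subprocess

def power_snapshot() -> str:
    batt = subprocess.run(["pmset", "-g", "batt"],
                          capture_output=True, text=True).stdout
    asserts = subprocess.run(["pmset", "-g", "assertions"],
                             capture_output=True, text=True).stdout
    return batt + "\n" + asserts

print(power_snapshot())  # paste the output (minus anything private) into a post
```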

/M
 
No, those are false equivalences. In both cases (and others as well) they set out a test plan and apply it equally to all vendors they test in the category. I'm not sure why people have such a hard time grasping this. Just because it isn't how you would choose to test doesn't make it invalid.

Not at all. Apple designs hardware and software together, unlike any other vendor. So when seeking something like optimized battery life, hardware design decisions are made in conjunction with what can be accomplished through software design. Unlike Dell or HP, they're not saddled with an off-the-shelf OS that can't be fine-tuned to a specific set of hardware specifications.

Presumably Apple had a desired target for average battery life and a desired target for form factor and weight on the new MBP. (They could make one that will run for 48 hours on a charge if they were willing to sell a two-inch-thick machine that weighs a few pounds more...) So, they made the battery as small as possible by achieving efficiency through software design, including things like optimizing browser cache, managing screen brightness, etc. Their claims for battery life are then made based on common real-world usage. Consumer Reports, however, created a non-real-world stress test and disabled software-based power management, to make things more "equal" among all vendors in the category. The manufacturer that uses brute force to achieve battery life by making a thick, heavy device with a huge battery will win that test every time, even if in real-world usage their battery life is no better than an MBP's. That's your false equivalence.
 
I find it so funny that a well-known tech site sent out an email asking Consumer Reports to retest, and they replied no, they're confident in their test and stand by their findings. Then Apple contacts them and all of a sudden they are willing to retest, and when they do, the numbers are off the charts. It just sounds so fishy to me, and the numbers are unrealistic for the average consumer. Isn't that what their business is all about, the consumer? 15 hours on the 15"? Come on now!

The CR test is a very low-power test. They keep screen brightness low and basically just do some repetitive web surfing. I'm not surprised that there are power users (or more likely gamers) who are blowing through their battery much more quickly.
 
Not at all. Apple designs hardware and software together, unlike any other vendor.
Actually, Microsoft basically does the same now. It doesn't really matter, though. From the consumer's perspective, they really don't care who made what. They're buying a widget and want to know what to expect. What Apple says or does or claims is rather irrelevant. The actual number from the test CR does isn't very helpful on its own, as I said elsewhere. But by keeping a consistent methodology that's repeatable across brands, it helps the consumer make a comparison across brands -- and that's what the consumer typically does.
 
Good for Consumer Reports, whose rigorous and documented testing methodology brought to light an issue that had been reported anecdotally by users. Apple may say that "people don't use that setting," but in fact a bug was brought to light.

Consumer Reports isn't perfect, and their perspective on products doesn't always match mine (especially when it comes to computers and AV equipment). But they are disciplined, they document their work, and they are willing to retest when new information comes to light or changes are made.


You are misreporting what happened. CR did not bring to light "an issue that had been reported anecdotally by users." Please amend your post. Consumer Reports' testing methodology (and I am a subscriber) had nothing to do with how users actually use their laptops, so the obscure and intermittent bug that impacted their testing wouldn't have affected consumers' battery life.
 
Yeah - I think it is pretty disappointing. The 15" MBPro could have stayed pretty much the same size/weight and nobody would have cared, because there would have been a new 14" ultra-thin model to rave about if you needed ultra portability. Apple missed a trick, IMO.

The worst part is that there is NO WAY IN HELL that the MBPro will ever get thicker again. Apple will NEVER release a MBPro that is even 1 mm thicker.
This means MBPro users are now stuck: battery capacity and ports can only get worse from here on in the Pro model (unless there is a new tech change).

It seems the MBP is more powerful than many people here claim. AppleInsider tested the 2016 15" MBP against the 2015 model and found that with more demanding workloads the 2016 model performs more than 50% better than the 2015 model.
http://appleinsider.com/articles/16...acbook-pro-with-touch-bar-vs-2015-macbook-pro

"In photo editing, both machines edited smoothly using Adobe Lightroom, but the latest 15-inch MacBook was about 8% faster in converting 50 edited raw photos to JPEG. That's not a big difference, as Lightroom, along with most photo editing apps, is more dependent on the processor and generally doesn't max out the machine unless you're applying a series of filters.

For video editing we see much more of a difference. The latest MacBook Pro is on average about 50% faster, with heavier tasks showing the bigger improvements. Rendering a 5-minute 1080p video with LUTs and film grain applied was 52% faster, and the same project in 4K was 54% faster.

A much heavier project with multiple scaled 4K clips with effects was 94% faster. Along with these speed improvements, the 2016 MacBook Pro runs cooler and quieter."
 