So what's been the consensus with daily usage? I get around 10 hours of usage time on the 6s Plus with the Samsung chip, without using battery saver.

I noticed the battery life wasn't as good as my old 6 Plus, but figured that was because the battery is about 10% smaller. Like I said, though, screen-on time has still been good. Is anyone getting horrible usage times on the 6s Plus, like 5-6 hours?
 
Pretty much the same as my 6 Plus: about 10% an hour of screen-on time, sometimes a little more.
 

Attachments

  • image.png (246.1 KB)
It seems to me that the Geekbench battery test is some kind of under-stress test, since real-life battery life is way longer than what the test shows. Wouldn't it be possible that Samsung chips consume less energy in normal usage and only get less efficient under stress? All the tests I've seen so far show that Samsung chips are less efficient under stress, but who runs benchmark apps or renders 4K video all day long? After all, real-life performance isn't all about heavy-duty efficiency.
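Just to make that argument concrete, here's a minimal sketch with made-up power figures (none of these are measured A9 numbers): if one chip idles slightly lower but draws more under sustained load, it can still come out even or ahead over a whole day, because a typical day is mostly standby and light use.

Code:
# Hypothetical, illustrative power figures in watts -- NOT measured A9 values.
chips = {
    "Chip A": {"idle": 0.030, "light": 0.35, "heavy": 1.6},
    "Chip B": {"idle": 0.035, "light": 0.38, "heavy": 1.4},
}

# A rough "typical day": mostly standby, a few hours of light use, little heavy load.
hours = {"idle": 20.0, "light": 3.5, "heavy": 0.5}

for name, power in chips.items():
    # Total energy over the day in watt-hours.
    wh = sum(power[state] * hours[state] for state in hours)
    print(f"{name}: {wh:.2f} Wh/day")

A stress test only exercises the "heavy" column, so the chip that loses there can still tie or win across a full day of normal use.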
 
The A9 processor, regardless of whether it's made by TSMC or Samsung, has the same FinFET architecture. People trying to infer the superiority of one chip or the other from the nm figures are just confused.

The 14nm vs 16nm labels do not correspond to any physical structure on the chips. They are approximate marketing terms, with each foundry measuring its process differently.

The standard metric for density is the M1 (metal 1) pitch. The leading FinFET foundries' processes, Intel's 14nm, Samsung's 14nm, and TSMC's 16nm, all have the same M1 pitch of 64nm, which indicates that the density of all three processes is the same.

Regardless, to the end user the transistor density is irrelevant.
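As a rough illustration of the pitch/density point (a common first-order heuristic, not a precise foundry metric): if you treat areal density as scaling with 1 / pitch², equal minimum metal pitches give equal relative density regardless of what the marketing name says. The 64nm figure is the one quoted above; the 80nm entry is a made-up comparison point.

Code:
# First-order heuristic: areal density scales roughly like 1 / (pitch ** 2).
# 64 nm comes from the post above; the 80 nm process is purely hypothetical.
pitches_nm = {
    "Intel 14nm": 64,
    "Samsung 14nm": 64,
    "TSMC 16nm": 64,
    "hypothetical older process": 80,
}

baseline_nm = 64
for process, pitch in pitches_nm.items():
    relative_density = (baseline_nm / pitch) ** 2
    print(f"{process}: {relative_density:.2f}x relative density")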
 
The Samsung one is OK, especially when idling.

This is with about 1.5 hours of screen-on time and 1.5 hours of streaming Bluetooth audio to the car.
 

Attachments

  • 2015-10-06 15.40.32.png (140.7 KB)
The A9 processor, regardless of whether it's made by TSMC or Samsung, has the same FinFET architecture. People trying to infer the superiority of one chip or the other from the nm figures are just confused.

The 14nm vs 16nm labels do not correspond to any physical structure on the chips. They are approximate marketing terms, with each foundry measuring its process differently.

The standard metric for density is the M1 (metal 1) pitch. The leading FinFET foundries' processes, Intel's 14nm, Samsung's 14nm, and TSMC's 16nm, all have the same M1 pitch of 64nm, which indicates that the density of all three processes is the same.

Regardless, to the end user the transistor density is irrelevant.

Then why is the Samsung one smaller in physical, measurable size?

EDIT: And I did read somewhere that the TSMC one is FF+ instead of FF, while the Samsung one is FF. I have NO IDEA what this means in reality, but there seem to be differences.
https://mobile.twitter.com/JoshuaHo96/status/650598118908497920
 
Wouldn't it be possible that Samsung chips consume less energy in normal usage and only get less efficient under stress? All the tests I've seen so far show that Samsung chips are less efficient under stress, but who runs benchmark apps or renders 4K video all day long? After all, real-life performance isn't all about heavy-duty efficiency.

It wouldn't surprise me at all if this turns out to be the case.
 
The 14nm vs 16nm labels do not correspond to any physical structure on the chips. They are approximate marketing terms, with each foundry measuring its process differently.
Well, to be technical, 14nm and 16nm do correlate to the physical structure of the chips, very directly. 14nm is a smaller manufacturing process, which leads to smaller physical structures on the chip itself.

Usually a decrease in the CPU manufacturing process results in higher performance and lower power consumption, but that's not always the case.
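For what it's worth, the usual first-order explanation is the textbook CMOS dynamic-power relation P ≈ α·C·V²·f: a shrink reduces switched capacitance and usually allows a lower supply voltage, and the V² term is what really pays off. A tiny sketch with placeholder numbers (not real A9 figures):

Code:
# Textbook CMOS dynamic power: P ~= alpha * C * V^2 * f
# All values below are illustrative placeholders, not real A9 figures.
def dynamic_power(alpha, cap_farads, volts, freq_hz):
    return alpha * cap_farads * volts ** 2 * freq_hz

old = dynamic_power(alpha=0.1, cap_farads=1.00e-9, volts=1.0, freq_hz=1.8e9)
new = dynamic_power(alpha=0.1, cap_farads=0.85e-9, volts=0.9, freq_hz=1.8e9)

print(f"relative dynamic power after a modest shrink: {new / old:.2f}x")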
 
It seems to me that the Geekbench battery test is some kind of under-stress test since real-life battery life is way longer than what's shown in the tests. Wouldn't it be possible that samsung chips consume less energy in normal usage and only get less efficient when under stress? All the tests I've seen so far are how samsung chips are less efficient when under stress but who would be running benchmark apps or rendering 4k videos all day long? After all real-life performance isn't all about heavy duty efficiency.

I think that this is a great idea. Ideally we should post usage time/standby time and chip to see if there is a difference in real use.
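If people do post their numbers, even something as simple as the sketch below would work for a rough comparison: average the reported drain rate (% of battery per hour of usage) per chip. The sample entries are invented placeholders, not readings from this thread.

Code:
# Invented sample reports: (chip, battery_percent_used, usage_hours).
reports = [
    ("Samsung", 55, 5.5),
    ("Samsung", 62, 6.0),
    ("TSMC", 50, 5.5),
    ("TSMC", 58, 6.2),
]

rates_by_chip = {}
for chip, pct_used, hours in reports:
    rates_by_chip.setdefault(chip, []).append(pct_used / hours)

for chip, rates in rates_by_chip.items():
    avg = sum(rates) / len(rates)
    print(f"{chip}: {avg:.1f} %/hour of usage on average")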
 
I think that this is a great idea. Ideally we should post usage time/standby time and chip to see if there is a difference in real use.

Or maybe we should ask Primate Labs to change the Geekbench battery test to show the same result for both chips? Since they are identical.
 
Did anyone run these benchmarks on 9.1 beta?
According to [1] there have been quite a few jumps (at least on the version number front) between 9.0-9.0.2 and 9.1.

Curious to see whether they tweaked stuff there.


[1] 4.1 Version List
 