Y'all are so anal. I mean, I understand, but is it really that big of a deal? It's a phone in 2015; why isn't everyone just happy to have a new iPhone? You pay two rent payments for your phone and expect it to be perfect? Two measly rent checks out of your 80+ years of life.
 
What app did you run to burn down the phone batteries? And was the first test with the 4G signal on and the second without?

Phones:
iPhone 5s (Space Gray) Verizon - Reference point for those familiar
iPhone 6s (Silver) Sim-Free TSMC
iPhone 6s (Space Gray) T-Mobile Samsung
All were restored to factory settings and set up as a new iPhone on iOS 9.0.2.
Once the phones were set up, the network settings were reset and Wi-Fi was turned off.
All setup questions, including location tracking, were answered the same way on each phone (no).
Room temperature: 78˚F (25.6˚C)

Charging was monitored with the Battery Doctor app, and all phones were left on the charger until they reached 100% and the trickle charge was complete (approximately 40 minutes after reaching 100%). This also allowed the phones to cool off after the charging phase.

At this point there are 3 installed apps:
1. GeekBench v3.4.0 (After 6s support update)
2. Liram Device Info Lite v3.2.4
3. Battery Doctor v2.4

Brightness on all phones was set to 100%. All apps were quit before testing began.

OSnap, running on an iPhone 6 Plus with its camera protruding through a shoebox, was used to take a picture every 5 minutes so I could record the shutdown times while I slept.

In this particular test, there was only a one-interval difference (5 minutes) between the TSMC and the Samsung phones shutting off, which means these results can't be relied upon to show that either chip is more efficient than the other.

View attachment 591976
View attachment 591977

Sources of error:
The two devices are slightly different models (one T-Mobile, the other SIM-free).
Other parts, such as the battery and screen, could differ between the phones.
Possibly different production runs: the Samsung was received at launch; the TSMC was received on October 12th.

At 15 minutes, I found that automatic screen brightness had been left on, so the two 6s's were slightly below 100% brightness. I fixed this by disabling the setting on both phones at the same time and returning them to 100% brightness.

Future Study:
In order for this to be conclusive, more results are needed.
It would be useful to have thermal information through something like a FLIR camera.
It might make more sense to do a dim-screen test.
It would be a good idea to plot power usage over time using a power supply and an ammeter to get an accurate power draw and see whether either phone draws more energy.

Conclusion:
In this test, there was only a 2% difference between the TSMC and Samsung (±2%, since the 5-minute checking interval is itself around 2% of the runtime). In this particular case study, the 5-minute difference wasn't significant enough to call either 6s more efficient than the other.

EDIT: It looks like I messed up some of the data entry. I fixed the time scale.

PART 2:
After the above test, the same test was repeated in airplane mode with the brightness turned all the way down, on the same two phones as above.

The test achieved the following:
View attachment 592079
View attachment 592080

Given these findings, and with the Samsung lasting 72% of the time that the TSMC did, it could be concluded that there is a difference between these two chips.
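
For anyone who wants to sanity-check the arithmetic, here's a minimal Python sketch. The runtimes (roughly 205 vs. 200 minutes for Part 1 and 325 vs. 235 for Part 2) are the values used in the percent-difference calculations later in this thread, not a separately posted data table:

```python
# Percent difference between two runtimes, relative to their mean,
# plus the resolution imposed by the 5-minute OSnap photo interval.
def pct_diff(a, b):
    return abs(a - b) / ((a + b) / 2) * 100

INTERVAL = 5  # minutes between photos

# Part 1: full brightness, no SIM (runtimes taken from the later posts)
a, b = 205, 200
print(f"Part 1: {pct_diff(a, b):.1f}% apart, "
      f"interval resolution ~{INTERVAL / min(a, b) * 100:.1f}%")

# Part 2: dim screen, airplane mode (TSMC ~325 min, Samsung ~235 min)
tsmc, samsung = 325, 235
print(f"Part 2: {pct_diff(tsmc, samsung):.1f}% apart; "
      f"Samsung lasted {samsung / tsmc * 100:.0f}% as long as the TSMC")
```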

Overall Conclusion (WARNING: GRAND GENERALIZATIONS HERE)
With the screen at full brightness and the phones out of airplane mode, the system load almost completely masked the difference between processors. Using iFixit's list of ICs, the following parts wouldn't have contributed during these tests, because they were (for the most part) turned off or unused during my test:

Toshiba THGBX5G7D2KLFXG 16 GB 19 nm NAND Flash
Universal Scientific Industrial 339S00043 Wi-Fi Module
NXP 66V10 NFC Controller
Apple/Cirrus Logic 338S00105 Audio IC
RF Micro Devices RF5150 Antenna Switch
Apple/Cirrus Logic 338S1285 Audio IC
Qualcomm WTR3925 Radio Frequency Transceiver

It would be my grand assumption that when all of these circuits are running or in standby, the power difference between the CPUs would be even further masked, and the 2-3% difference could be considered relatively accurate.

Further study (and, by the look of it, what's on the MacRumors front-page blog):
Confirming that the CPU difference would be further masked by network services, by streaming from a service such as Netflix
Trying 3D gaming on the two devices
Some of the suggestions from Part 1

OVERALL CONCLUSION:
I would still be happy with a Samsung-chipped iPhone 6s
 
You didn't clearly say what apps or tests you ran in your two tests. I can only tell that you turned the screen to 100% in the first test and to the lowest setting in the second; I have no idea how you burned down the battery.
 
I can definitely understand the concern. I would suggest, though, that if you are unimpressed with the device after a few days over battery issues, it might be better to go with one of the Plus models (where the battery is going to last you a day and then some with either chip) rather than exchanging until you get a TSMC to see a noticeable difference. Glad to help!

I really think it will be fine, but I agree about going to the Plus if the battery doesn't last. I didn't plan to play the return game till I got the TSMC chip.

By the way, it took longer for one of the pseudo-sophisticates to join the thread than I would have thought. ;)
 
If someone is shelling out a month's pay for an iPhone, they have bigger issues than whether their battery life is 4 hours or 6 hours...

Although I guess if you're going to be homeless, it's best to make sure you have a long battery life, since it might be a long time in between charging sessions.

They might be developers ;] Don't be so quick to judge...

which can be spread out for up to 30 months
Buying a phone with monthly rates is just sad. Also after 24 months you'll still be paying for an outdated device...
 
Now that is what you would not call a newbie post!

Thanks for taking the time to do it all and looking forward to your follow up posts.
 
Can someone explain what app or script he actually used in the tests to compare? He never said.
 
The test seems to lack validity without disclosing which app is using the battery. Is it a CPU-heavy app or a light one?

The second test, with a 30% battery deficit for the Samsung, is alarming, and proves the SoC difference as stated.

This pressure point could potentially be triggered by CPU-heavy usage in the real world. Not everybody is Mr. Average. More tests are needed; thus far, the signs are not boding well for Samsung iPhones.
 
I must say the result is absolutely weird.

I know that the screen and communication module draw much power and would mitigate the influence of the CPU, but that still can't explain why the two devices are so close in Part 1, since the battery life of the TSMC version in Part 2 is only 150% longer. The performance difference of the Samsung A9 is too large to be "masked" by the screen and network module.

If the test procedure of Part 2 is correct and the result can be trusted, one can easily calculate that the Samsung version's battery life in Part 1 should have been 168 minutes, which is 84% of what the TSMC version did. But it wasn't.

From what we've seen in these two tests, it's clear that there is some "trigger" in the Samsung version that draws a drastically larger amount of power in certain circumstances. And it seems that trigger is screen brightness, airplane mode, or both.

Here is my speculation:

1. There is a "cheating" mode in the Samsung A9 designed for benchmark scoring; or

2. There is something wrong with the Geekbench test, such as a wrongly estimated percentage of CPU usage. It could be caused by Apple's software (iOS) or firmware; that is, the iOS system API may report a higher CPU-usage percentage than the real figure on the TSMC version. This phenomenon could be mitigated by task switching, such as Bluetooth or cellular/LTE networking events.
 
The test seems to lack validity without disclosing which app is using the battery. Is it a CPU-heavy app or a light one?

Can someone explain what app or script he actually used in the tests to compare? He never said.

He indicated in the post that he was using Geekbench 3.4. His test result was on par with Ars Technica's Geekbench result, so we can assume that Ars Technica used similar settings for testing.

But the Part 1 result leaves me quite confused. If we calculate the battery drain rate based on the TSMC version, the Samsung version should have lasted only 168 minutes in Part 1. But it didn't.
 
He indicated in the post that he was using Geekbench 3.4. His test result was on par with Ars Technica's Geekbench result, so we can assume that Ars Technica used similar settings for testing.

But the Part 1 result leaves me quite confused. If we calculate the battery drain rate based on the TSMC version, the Samsung version should have lasted only 168 minutes in Part 1. But it didn't.

Thanks for the explanation. More "real life" tests with some simple script would be more helpful, like a Facebook browsing test in an endless loop.
 
Thanks for the explanation. More "real life" tests with some simple script would be more helpful, like a Facebook browsing test in an endless loop.

Ars Technica's Wi-Fi browsing test was run under exactly the condition you suggest, and the difference between the Samsung and TSMC versions was insignificant.
 
You didn't clearly say what apps or tests you ran in your two tests. I can only tell that you turned the screen to 100% in the first test and to the lowest setting in the second; I have no idea how you burned down the battery.

Haha! I guess I never explicitly stated that it was GeekBench, but it was GeekBench v3.4.0 (After 6s support update) from the list of apps.

The test seems to lack validity without disclosing which app is using the battery. Is it a CPU-heavy app or a light one?

The second test, with a 30% battery deficit for the Samsung, is alarming, and proves the SoC difference as stated.

This pressure point could potentially be triggered by CPU-heavy usage in the real world. Not everybody is Mr. Average. More tests are needed; thus far, the signs are not boding well for Samsung iPhones.

Again, I think it was pretty plain that it was the GeekBench app (the only installed benchmarking app). I would argue that while you could potentially see a difference, the difference between the two chips pales in comparison to the power usage of the Wi-Fi chip, the LTE chips, or the screen.

He indicated in the post that he was using Geekbench 3.4. His test result was on par with Ars Technica's Geekbench result, so we can assume that Ars Technica used similar settings for testing.

But the Part 1 result leaves me quite confused. If we calculate the battery drain rate based on the TSMC version, the Samsung version should have lasted only 168 minutes in Part 1. But it didn't.

Interesting counter-theory. I'll test my masking theory. (I'll leave this here for the humor of the incorrect math, but please take it as just that. XD)

First, let’s look at the initial values and their percent difference between Samsung/TSMC:
(205-200)/((205+200)/2)*100 = 2.5% for Brightness+No Sim
(325-235)/((325+235)/2)*100 = 32.1% for Dim screen + Airplane Mode

Give the systems arbitrary power-usage values for Part 2 that represent the same percent difference:
10000 - Samsung system
7232 - TSMC system
This would represent a percent difference relatively similar to the test:
(10000-7232)/((10000+7232)/2)*100 = 32.1%

Now we can find the relative power effect of the brightness, i.e., how much shared load would be needed to bring us down to a 2.5% difference and reproduce the Part 1 results strictly through masking:
((x+10000)-(x+7232))/(((x+10000)+(x+7232))/2) = .025
simple addition/subtraction simplification:
2768/(x + 8616) = .025
Solving for x gives us 102104.

Now that we have the power value needed to mask such a big difference, we can add the additional power to each system. This would be going from Part 2's low-power to Part 1's high-power.
10000(Sammy)+102104(SIM/Bright)=112104 - Samsung System full brightness SIM-free wireless
7232(TSMC)+102104(SIM/Bright)=109336 - TSMC System full brightness SIM-free wireless
(112104-109336)/((112104 + 109336)/2)*100 = 2.5%

Moving from the second test to the first should show the same % jump for each processor, since the only difference is the SIM/brightness. I can use this to check whether my calculations so far are completely bogus:
10000(Samsung)
112104(Samsung SIM/Bright)
(112104-10000)/((112104 + 10000)/2)*100 = 167.2%

And they are.
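
Here's the same lump-sum algebra as a quick Python sketch (10000 and 7232 are the arbitrary Part 2 values from above); the TSMC jump at the end, 175.2%, isn't shown above but is the implied comparison that breaks the model:

```python
# Lump-sum masking model: add one shared load x to both systems and
# see whether it reproduces both tests' percent differences.
samsung, tsmc = 10000, 7232   # arbitrary Part 2 power values from above

def pct_diff(a, b):
    return abs(a - b) / ((a + b) / 2) * 100

# Solve (samsung - tsmc) / (x + (samsung + tsmc)/2) = 0.025 for x:
x = (samsung - tsmc) / 0.025 - (samsung + tsmc) / 2
print(f"masking load x = {x:.0f}")                # 102104

print(f"{pct_diff(samsung + x, tsmc + x):.1f}%")  # 2.5 -- matches Part 1

# Sanity check: the Part 2 -> Part 1 jump should be the same % for both
# chips if only the shared load changed. It isn't, so the model fails:
print(f"Samsung jump: {pct_diff(samsung + x, samsung):.1f}%")  # 167.2
print(f"TSMC jump:    {pct_diff(tsmc + x, tsmc):.1f}%")        # 175.2
```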



Bonus: here's an example image pulled from OSnap for the first test. It doesn't show the "No SIM" indicator on the far-left phone, as I couldn't fit everything in the frame, but that phone was on No SIM as well.
IMG_0332.jpg


And a picture of the setup when I had the iPhone 5 taking the pictures for Part 2:
IMG_0390.jpg
 
Haha! I guess I never explicitly stated that it was GeekBench, but it was GeekBench v3.4.0 (After 6s support update) from the list of apps. [...]

Cool, can you do some "real life" tests, like Facebook app browsing as I mentioned above?
 
Cool, can you do some "real life" tests, like Facebook app browsing as I mentioned above?
As I said before, I've already set my phone up as a normal, non-restored phone, and at this point I'd rather just spend time enjoying it.

I will, however, re-try the power consumption theory as a ∆ rather than a lump sum. Those calculations are pretty bogus, as the comparison really should be of d%/dt rather than a full lump.
 
As I said before, I've already set my phone up as a normal, non-restored phone, and at this point I'd rather just spend time enjoying it.

I will, however, re-try the power consumption theory as a ∆ rather than a lump sum. Those calculations are pretty bogus, as the comparison really should be of d%/dt rather than a full lump.

I think even if you use the phone, as long as you test with the same number of apps running and the same settings, the error would be super small.
 
I think it's funny people always post about how they wait for the refined S cycle phone to upgrade because all the bugs from the numbered version will be worked out. Hopefully those people shut up now.
 
Interesting counter-theory. I’ll test my masking theory:

Wow, you make it too complicated. My calculation is simple (though inaccurate):

Based on part 2, suppose that:
- Battery = 320
- TSMC CPU power drain-rate = 1


Calculate:

- Samsung CPU power drain-rate = 320 / 235 = 1.36

Back in Part 1 we have another power-drain source, "screen & LTE", whose drain rate is assumed to be identical on both devices. From the TSMC's performance in Part 1, we can calculate the screen & LTE drain rate:

- 320 / ( TSMC drain-rate + screen&LTE drain-rate) = 200, so screen&LTE drain rate = 0.6

Now we can calculate the estimated battery life of Sammy version if we leave screen & LTE on:

- 320 / (Sammy drain-rate + screen&LTE drain-rate) = 320 / (1.36 + 0.6) = 163 (minutes)

That's what the Samsung version was supposed to be.
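
The same model in a few lines of Python, if you want to play with the numbers (320 and 1 are just the arbitrary normalizations chosen above):

```python
# Simple drain-rate model: normalize the battery to 320 units and the
# TSMC CPU's Part 2 drain rate to 1 unit per minute.
BATTERY = 320.0
tsmc_cpu = 1.0                          # by definition: 320 / 320 min

samsung_cpu = BATTERY / 235             # ~1.36, from Samsung's Part 2 runtime

# Part 1: back out the shared screen+LTE load from TSMC's ~200-minute run.
screen_lte = BATTERY / 200 - tsmc_cpu   # 0.6

# Predicted Samsung Part 1 runtime under this model:
print(f"{BATTERY / (samsung_cpu + screen_lte):.0f} minutes")  # ~163, vs the ~200+ observed
```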


I trust your test result, so there must be something wrong. Either something triggered the Sammy A9 to operate in "cheat mode", or Geekbench is bugged.

I suspect the Geekbench test is questionable, since it measures battery life by keeping CPU usage at the same level. Please note that the test condition is NOT "same tasks, different CPU usage" but "different tasks, same CPU usage." If Geekbench reads CPU usage wrongly, the test would be seriously biased.

For example, if the iOS system API reports a higher percentage of CPU usage than the real figure on the TSMC version (due to a firmware bug or something), the test result would show Sammy's CPU as more power-consuming.

I'd say the test should be based on "same tasks", not "same CPU usage", on the two devices. But I don't know how.
 
So I would argue that the reported battery % is a bit arbitrary, but let's pretend it's exact for a few minutes:
To get the average % difference, I took a ∆ across all of the values, ignoring the 100% and 0% measurements, as the times at which they switch values are even more arbitrary.
∆t = 1 minute (the average ∆t/5)
∆%/∆t TSMC:
(98-3)/(15-200) = -.513513514

∆%/∆t Samsung:
(99-1)/(10-200) = -.515789474

∆%/∆t TSMC Low:
(99-1)/(20-320) = -.326666667

∆%/∆t Samsung Low:
(98-1)/(15-230) = -.451162791

∆%/∆t TSMC -∆%/∆t TSMC Low
(-.513513514)-(-.326666667) = -.186846847 for screen and No SIM

∆%/∆t Samsung -∆%/∆t Samsung Low
(-.515789474)-(-.451162791) = -.064626683 for screen and No SIM

Either way, the ∆% difference doesn't in any way support the masking theory, so something else has to be going on here.

EDIT:
I forgot: the TSMC is the SIM-free model, so it supports band 30. I wonder if this has anything to do with it. (Kind of wish I still had access to the other phone to do another test with airplane mode off and low brightness.)
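
If anyone wants to re-run the arithmetic, here's the same ∆%/∆t calculation as a small Python sketch (the %/time pairs are the readings quoted above):

```python
# Average drain rate in %/min between two OSnap battery readings.
def rate(pct_start, t_start, pct_end, t_end):
    return (pct_end - pct_start) / (t_end - t_start)

tsmc        = rate(98, 15, 3, 200)   # -0.514 %/min (bright, No SIM)
samsung     = rate(99, 10, 1, 200)   # -0.516 %/min (bright, No SIM)
tsmc_low    = rate(99, 20, 1, 320)   # -0.327 %/min (dim, airplane)
samsung_low = rate(98, 15, 1, 230)   # -0.451 %/min (dim, airplane)

# Extra drain attributable to the screen and No SIM, per chip. If the
# masking theory held, these two numbers would be about equal:
print(f"TSMC:    {tsmc - tsmc_low:+.3f} %/min")       # -0.187
print(f"Samsung: {samsung - samsung_low:+.3f} %/min")  # -0.065
```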
 
- 320 / (Sammy drain-rate + screen&LTE drain-rate) = 320 / (1.36 + 0.6) = 163 (minutes)

That's what the Samsung version was supposed to be.

Taking it a step further down my trail: if the same ∆ were applied to the Samsung as to the TSMC when adding the brightness and No SIM, we'd have the following:
∆%/∆t Samsung+(∆%/∆t TSMC -∆%/∆t TSMC Low)
-.638009638
So if I assumed that it was linear from 100% down to 0% after 10 minutes I'd get:
100+(156*-.638009638) = .470496472 (close enough - I'm tired and sig-figs are already messed up)

166 minutes for a Samsung phone with the TSMC's screen/No SIM difference. It looks like we're pretty much in agreement.
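
In Python, continuing the sketch above (-0.451 is the Samsung low-power rate and -0.187 the TSMC screen/No SIM delta from the earlier post):

```python
# Give the Samsung the TSMC's screen/No-SIM delta and extrapolate
# linearly from 100% at t = 10 min down to empty.
samsung_low = -0.451162791            # %/min, dim screen + airplane mode
tsmc_delta  = -0.186846847            # %/min, TSMC's screen/No-SIM cost

projected = samsung_low + tsmc_delta  # ~ -0.638 %/min
print(f"~{10 + 100 / -projected:.0f} minutes")  # ~167, close to the 166 above
```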
 
Taking it a step further down my trail: if the same ∆ were applied to the Samsung as to the TSMC when adding the brightness and No SIM, we'd have the following:
∆%/∆t Samsung+(∆%/∆t TSMC -∆%/∆t TSMC Low)
-.638009638
So if I assumed that it was linear from 100% down to 0% after 10 minutes I'd get:
100+(156*-.638009638) = .470496472 (close enough - I'm tired and sig-figs are already messed up)

166 minutes for a Samsung phone with the TSMC's screen/No SIM difference. It looks like we're pretty much in agreement.

Your calculation would be more accurate, but 163 or 166, it still doesn't fit the test result. That's why I suspect there are different operation modes in the Samsung (or TSMC) CPU triggered by screen/No SIM.

I have a feeling it will turn out to be a bug (or feature) in Apple's firmware: either misinterpreting CPU usage, or failing to tell the CPU to switch into low-power (or high-power) mode on one side.
 
Your calculation would be more accurate, but 163 or 166, it still doesn't fit the test result. That's why I suspect there are different operation modes in the Samsung (or TSMC) CPU triggered by screen/No SIM.

I have a feeling it will turn out to be a bug (or feature) in Apple's firmware: either misinterpreting CPU usage, or failing to tell the CPU to switch into low-power (or high-power) mode on one side.

*puts on tin foil hat* First, Apple will wait for most of the devices sold to be past their return period and then they make the Samsung more efficient than the TSMC with a bug fix. :p

That makes some sense, especially since the GeekBench test keeps the CPU at about 50-60% usage the whole time rather than full pedal-to-the-metal 100%, so task switching, timer coalescing, and such would play a part.
 