What you are lacking here is the common sense that everyone else is applying to these observations. A 5s plugged into either charger shows no more than a 6W draw. A 6 draws the same amount on the 1A charger, screen on or off. On the 2.1/2.4A chargers it draws 6W screen off and 8-10W screen on. What is the most likely thing happening there? That the charger is suddenly wasting that extra power, only with that one device and only when the screen is on? No. The most likely scenario is that the device uses more power with the screen on, and the new models have been smartly updated so they can charge at the full rate even when in use IF the extra power is available. With the 6+ having a 60% larger battery and drawing 10W screen off, again, what's the most likely scenario? That it is able to charge faster when the power is available. Sure, I'd like to see these hypotheses tested further, but at this point it seems pretty logical. A heck of a lot more logical than your counterpoint that these chargers are suddenly behaving erratically, but only when plugged into a 6 or 6+!
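As a sanity check on those wall numbers, here's a back-of-the-envelope sketch of what they'd imply for charge times, assuming the published 11.1 Wh (2915 mAh) 6 Plus battery and a guessed ~85% wall-to-battery efficiency:

# Rough full-charge time implied by wall wattage (idealized; real
# charging tapers near full, so actual times run longer).
BATTERY_WH = 11.1    # iPhone 6 Plus: 2915 mAh at 3.82 V (published spec)
EFFICIENCY = 0.85    # assumed wall-to-battery efficiency, not measured

for wall_watts in (5.0, 10.0):
    hours = BATTERY_WH / (wall_watts * EFFICIENCY)
    print(f"{wall_watts:.0f}W at the wall -> ~{hours:.1f} h to full")

That works out to roughly 2.6 hours at 5W versus 1.3 hours at 10W, which is consistent with the faster-charging interpretation.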

An assumption is not a conclusion. Period. End of story.
 
An assumption is not a conclusion. Period. End of story.

Oh get over yourself. To make this crystal clear for everyone here, your position is that the iPhone 6, and only the iPhone 6, reveals a 'mystery' vampire draw on higher-power chargers?

Do I have that right?

Oh, and I should mention that this mystery draw is variable depending on how hard the iPhone is working!! :eek:
 
I can confirm that with a 12W charger you will charge your iPhone 6 in 2 hours from 0%.

I had the same result with my Anker portable charger, which has a max output of 15W.

1810 mAh / 2 h = 905 mA

0.905 A * 5 V = 4.525 W
 
I had the same result with my Anker portable charger, which has a max output of 15W.

1810 mAh / 2 h = 905 mA

0.905 A * 5 V = 4.525 W

No, your calculations don't mean anything.
For one, the battery is 3.8V, not the 5V you multiplied by. Secondly, charging a battery is never 100% efficient.
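To make that correction concrete, here's a sketch of the energy math, assuming the published 1810 mAh / 3.82 V iPhone 6 battery and a guessed ~85% charging efficiency:

# Capacity is stored at battery voltage (~3.8 V), not the 5 V USB bus,
# and losses mean the wall supplies more energy than the cell stores.
CAPACITY_AH = 1.810    # iPhone 6 battery (published spec)
BATTERY_V = 3.82       # nominal cell voltage
EFFICIENCY = 0.85      # assumed; real-world losses vary

energy_stored_wh = CAPACITY_AH * BATTERY_V            # ~6.9 Wh in the cell
energy_from_wall_wh = energy_stored_wh / EFFICIENCY   # ~8.1 Wh drawn
print(f"~{energy_from_wall_wh / 2.0:.1f} W average over a 2 h charge")

So a 2-hour charge implies roughly 4W average at the wall, with the draw well above that average early on and below it near full.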
 
I never said he put a resistor between the charging pins. Quit making things up (that's stooping even lower than before). The simple fact that you have trouble understanding simple logic shows that you're incapable of any form of human discussion.

BTW, OS X doesn't report that 2100 mA is consumed. It says that's what was requested. A device doesn't necessarily have to use that much. Reading a number on a screen is NOT the same as taking a measurement. Tossing that "evidence" out the window now.

And "maH" is a capacity, not a current. EE 101.

You mention placing a resistor on the charging pins in your very first sentence. And why would the phones request 2100 mA if they were incapable of using it? No other iPhone to date has done so. The only devices to request that much current in the past were iPads, which do indeed draw that much current under most conditions (obviously current draw would be reduced if the battery is nearly charged and the iPad is in standby). In fact, when using Kill A Watt-type devices, the iPads show similar current draw to the new iPhones.

mAh was a typo/brain fart on my part. I think everyone else knew what I meant. I am well aware of the difference between current and capacity.


No, your calculations don't mean anything.
For one, the battery is 3.8V, not the 5V you multiplied by. Secondly, charging a battery is never 100% efficient.

Also worth noting that the current drawn isn't constant for batteries. Typically they draw more current when depleted and essentially trickle-charge when near the max. So if a battery averages 5W from 0-100%, chances are it's drawing a fair bit more than 5W when at 0%.
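That taper is the standard constant-current/constant-voltage (CC/CV) profile for lithium batteries. A toy illustration of the shape (numbers invented, not measured from an iPhone):

# Toy CC/CV profile: constant current until a knee in state of charge,
# then current falls off as the cell voltage limit is held.
def charge_current(soc, max_amps=1.5, cv_knee=0.75):
    """Approximate charge current (A) at state of charge soc (0.0-1.0)."""
    if soc < cv_knee:
        return max_amps                              # constant-current phase
    return max_amps * (1.0 - soc) / (1.0 - cv_knee)  # rough CV taper

for soc in (0.0, 0.5, 0.8, 0.9, 0.99):
    print(f"{soc:4.0%} charged -> ~{charge_current(soc):.2f} A")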
 
You mention placing a resistor on the charging pins in your very first sentence. And why would the phones request 2100 mA if they were incapable of using it? No other iPhone to date has done so. The only devices to request that much current in the past were iPads, which do indeed draw that much current under most conditions (obviously current draw would be reduced if the battery is nearly charged and the iPad is in standby). In fact, when using Kill A Watt-type devices, the iPads show similar current draw to the new iPhones.

mAh was a typo/brain fart on my part. I think everyone else knew what I meant. I am well aware of the difference between current and capacity.

Also worth noting that the current drawn isn't constant for batteries. Typically they draw more current when depleted and essentially trickle-charge when near the max. So if a battery averages 5W from 0-100%, chances are it's drawing a fair bit more than 5W when at 0%.

I am not sure how the iPhone signals to the Mac that it wants 2.1A, but it may be that there is no means of signaling an intermediate value like 1.5A or 1.2A. It's not unreasonable to signal for more power and consume less. After all, the battery in the iPhone 6 Plus is still not that large relative to an iPad, so I think it's unlikely for it to charge at the full 2.1A.
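For what it's worth, the current levels Apple chargers themselves advertise (via fixed voltage levels on the USB D+/D- data lines, per widely published reverse engineering; this is separate from whatever the phone tells a Mac during USB enumeration) come in coarse steps with nothing between 1A and 2.1A, which fits the no-intermediate-value guess:

# The only advertised steps on Apple chargers, per common reports (the
# exact D+/D- voltage-to-level mapping is omitted since accounts vary).
ADVERTISED_AMPS = (0.5, 1.0, 2.1, 2.4)   # 2.5W, 5W, 10W, 12W chargers
for amps in ADVERTISED_AMPS:
    print(f"{amps} A -> {amps * 5:.1f} W at 5 V")

So requesting 2.1A and drawing less is plausible: a device wanting, say, 1.5A has no 1.5A code point to ask for.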
 
I tested my iPhone 6 Plus again from about 53% charge using a genuine 12W charger and lightning cable.

Screen off - 1.5A
Screen on running Zen Garden - 1.7A

I'm not sure why my previous result only showed 1.3A when charging from around 60% the last time. Perhaps just bad luck or a glitch.

Proof attached.
 

Attachments

  • IMG_0080.JPG
  • IMG_0081.JPG
I tested my iPhone 6 Plus again from about 53% charge using a genuine 12W charger and lightning cable.

Screen off - 1.5A
Screen on running Zen Garden - 1.7A

I'm not sure why my previous result only showed 1.3A when charging from around 60% the last time. Perhaps just bad luck or a glitch.

Proof attached.

What's that device called that you're using? I'd like to get one to tinker with. I'm also wondering how much parasitic draw it introduces; that might be documented in the literature or on the device itself somewhere.
 
The problem is a combination of assuming (bad) and the generally gullible Internet population jumping on the assumption and running with it.

I'll say this again—measuring twice the wattage at the wall does NOT alone prove that a battery is charging twice as fast. The measurement is taken too far upstream from the battery, and there is no control. For example, what if the iPhone 6 could truly USE more power but not necessarily CHARGE at that higher rate? Was it off with one charger and on with the other? Did the tester know for sure it wasn't backing up to iCloud at the time or doing other intensive background work? Devices CAN split the current between charging and usage, and a simple current measurement at the wall does not take that into account.
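To put numbers on that split (all values invented for illustration):

# Wall watts alone don't give the battery's charge rate: the budget is
# split between system load and the charger. Numbers are hypothetical.
wall_watts = 10.0
efficiency = 0.85           # assumed adapter + charger losses
system_load_watts = 3.0     # screen on, CPU busy (made up)

into_device = wall_watts * efficiency
print(f"~{into_device - system_load_watts:.1f} W charging, screen on")
print(f"~{into_device - 0.5:.1f} W charging, screen off (0.5 W load)")

Same 10W at the wall, two very different charge rates.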

The wall-current test is technically inconclusive. Someone posted an actual charge time—now THAT'S getting closer to a true test. At least that person had the intelligence to measure the end result and not the power drawn from the wall.

The problem is that people these days are too lazy to actually read and understand anything. They like to see someone's post and make assumptions instead of thinking, processing, and testing. That's why we have things sensationalized and blown out of proportion like the bending issue.

Also, my questioning of the test does not imply that I don't believe that the result is true (it's indeed very likely that the new phone can take the current). I just know that the test was done with a conclusion already in mind (which is completely backwards and biased). People just need to better understand testing and make sure they know first before disseminating information based on assumptions.

It's just that simple.
 
The problem is a combination of assuming (bad) and the generally gullible Internet population jumping on the assumption and running with it.

I'll say this again—measuring twice the wattage at the wall does NOT alone prove that a battery is charging twice as fast. The measurement is taken too far upstream from the battery, and there is no control. For example, what if the iPhone 6 could truly USE more power but not necessarily CHARGE at that higher rate? Was it off with one charger and on with the other? Did the tester know for sure it wasn't backing up to iCloud at the time or doing other intensive background work? Devices CAN split the current between charging and usage, and a simple current measurement at the wall does not take that into account.

The wall-current test is technically inconclusive. Someone posted an actual charge time—now THAT'S getting closer to a true test. At least that person had the intelligence to measure the end result and not the power drawn from the wall.

The problem is that people these days are too lazy to actually read and understand anything. They like to see someone's post and make assumptions instead of thinking, processing, and testing. That's why we have things sensationalized and blown out of proportion like the bending issue.

Also, my questioning of the test does not imply that I don't believe that the result is true (it's indeed very likely that the new phone can take the current). I just know that the test was done with a conclusion already in mind (which is completely backwards and biased). People just need to better understand testing and make sure they know first before disseminating information based on assumptions.

It's just that simple.

Strawman arguments. You are arguing that because a piece of information is imperfect, it must be completely useless. That is a logical fallacy and is certainly not applicable here. It is perfectly reasonable to draw some conclusions by measuring at the wall.

----------

What's that device called that you're using? I'd like to get one to tinker with. I'm also wondering how much parasitic draw it introduces; that might be documented in the literature or on the device itself somewhere.

Charger Doctor (easy to find on Google or Amazon).
 
The problem is a combination of assuming (bad) and the generally gullible Internet population jumping on the assumption and running with it.

I'll say this again—measuring twice the wattage at the wall does NOT alone prove that a battery is charging twice as fast. The measurement is taken too far upstream from the battery, and there is no control. For example, what if the iPhone 6 could truly USE more power but not necessarily CHARGE at that higher rate? Was it off with one charger and on with the other? Did the tester know for sure it wasn't backing up to iCloud at the time or doing other intensive background work? Devices CAN split the current between charging and usage, and a simple current measurement at the wall does not take that into account.

The wall-current test is technically inconclusive. Someone posted an actual charge time—now THAT'S getting closer to a true test. At least that person had the intelligence to measure the end result and not the power drawn from the wall.

The problem is that people these days are too lazy to actually read and understand anything. They like to see someone's post and make assumptions instead of thinking, processing, and testing. That's why we have things sensationalized and blown out of proportion like the bending issue.

Also, my questioning of the test does not imply that I don't believe that the result is true (it's indeed very likely that the new phone can take the current). I just know that the test was done with a conclusion already in mind (which is completely backwards and biased). People just need to better understand testing and make sure they know first before disseminating information based on assumptions.

It's just that simple.

I don't think you've read through my posts on this issue very thoroughly. I've had no preconceived notions and haven't assumed anything. As far as the instrument allows, the tests have been fairly controlled. I realize that some of the discussion has been in another thread, so perhaps you haven't seen it. My post is #99 here: https://forums.macrumors.com/threads/1787094/
 
Also, my questioning of the test does not imply that I don't believe that the result is true (it's indeed very likely that the new phone can take the current). I just know that the test was done with a conclusion already in mind (which is completely backwards and biased). People just need to better understand testing and make sure they know first before disseminating information based on assumptions.

It's just that simple.

You actually don't know that because there were no conclusions made until after the observation phase. In fact, bringing the iPhone 5 into the fold was to eliminate baseless conclusions. I find your comment strange coming from someone who's complaining about assumptions. You managed to be all of ironic, ignorant and hypocritical in the span of a single sentence. Impressive. What's not so impressive is the lack of deductive reasoning skills you've showcased here.
 
OK, so I tested the charge rate from around 28% on my iPhone 6 Plus, and now it's charging at around 1.8A. I haven't been trying to test this extensively or I would have seen this sooner, but it looks like the 6 Plus is following a charge rate curve of some sort that covers the whole range of battery charge levels and doesn't just taper off at the end.
 
I only use my iPad chargers to charge the phones. I have since the 4s, with no problems at all and always a speedier charging time.
 
I only use my iPad chargers to charge the phones. I have since the 4s, with no problems at all and always a speedier charging time.

I'm not sure that's actually (technically) possible, as all non-6 iPhones restrict the current they draw regardless of what charger they're attached to (in your case, 10/12W iPad chargers).

Placebo?
 
I'm not sure that's actually (technically) possible, as all non-6 iPhones restrict the current they draw regardless of what charger they're attached to (in your case, 10/12W iPad chargers).

Placebo?

Possibly, but it does for the 6 Plus.
 
So has anyone tested with the phone drained and turned off? That should give you a good idea of the max draw used for charging.
 
So has anyone tested with the phone drained and turned off? That should give you a good idea of the max draw used for charging.

I have not, but you don't necessarily get higher consumption with the battery fully drained.
 
I have not, but you don't necessarily get higher consumption with the battery fully drained.

I think he's just saying that with the phone turned off all the power will be going to the battery rather than being split between battery and device.

It's a good idea to look at.
 
I have not, but you don't necessarily get higher consumption with the battery fully drained.

Well, from playing with my iPhone 5 today: it pulls a max of ~0.94A (using a USB power meter) while off and fully drained to the point of auto-shutdown. Once it powered back on, it went up to 1A. Once it booted, I powered it back off again. Leaving it off, the current draw slowly goes down as the battery charges. The numbers are the same no matter which charger I used.

My 6+ should be here sometime (:mad:) this week. I'll try the same out with it and see what it does.
 
Got to test this today, finally. The iPhone 5 and the iPhone 6 Plus both pull 1A on a standard charger. No surprise there. On an iPad charger, the 5 pulls ~1A, the 6 Plus pulls ~1.5A, and my gen 1 iPad pulls a full 2A. These numbers are with all devices at ~10% battery and powered off. All measured with a USB power meter, so there's no parasitic power draw in the metering.

So yes, it appears that the 6 plus can probably charge faster off of a 2A charging port. The iPhone 5, however, should not.
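Taking those measured currents at face value, a rough comparison of implied charge times (sketch only: assumes 5V at the meter, a guessed ~85% efficiency, and ignores the end-of-charge taper, so real times run longer):

# Implied full-charge times from the measured input currents above.
BATTERIES_WH = {"iPhone 5": 5.45, "iPhone 6 Plus": 11.1}  # published specs
MEASURED_AMPS = {"iPhone 5": 1.0, "iPhone 6 Plus": 1.5}   # from this post

for name, wh in BATTERIES_WH.items():
    watts_into_battery = MEASURED_AMPS[name] * 5.0 * 0.85
    print(f"{name}: ~{wh / watts_into_battery:.1f} h to full (idealized)")

That's roughly 1.3 hours for the 5 and 1.7 hours for the 6 Plus under these idealized assumptions.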
 