It's the cable, not the charger.
I posted this over at LifeHacker but figured I'd post it here as well:
I've been testing this for several days with an ammeter, and I have yet to get my iPhone 6 to draw more than 1A off a 2.1A charger (I even tried a 2.4A charger, with no noticeable difference). So I don't think what's happening is that the iPhone can suddenly draw more than 1A. Instead, I think a 1A charger feeding thin 28AWG cable wires simply can't deliver a full 1A to the phone.
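To put rough numbers on that, here's a back-of-the-envelope sketch. The inputs are my assumptions for illustration, not measurements: ~0.21 ohm/m for 28AWG copper and a 1 m cable.

```
# Back-of-the-envelope I*R drop across a Lightning cable's power wires.
# Assumed values, not measured: 28AWG copper at ~0.21 ohm/m, a 1 m cable,
# current flowing out on V+ and back on GND (so 2 m of wire total).
OHMS_PER_M_28AWG = 0.21
CABLE_LENGTH_M = 1.0
CURRENT_A = 1.0

round_trip_resistance = OHMS_PER_M_28AWG * CABLE_LENGTH_M * 2
voltage_drop = CURRENT_A * round_trip_resistance  # V = I * R

print(f"Round-trip resistance: {round_trip_resistance:.2f} ohm")
print(f"Drop at {CURRENT_A:.1f} A: {voltage_drop:.2f} V out of 5 V")
# ~0.42 V lost in the cable. A charger already at its rated limit
# can't raise its output to compensate, so the phone backs off.
```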
To test this hypothesis, I got my 6 down to just below 20% and put it on a 2.1A charger for 2-3 minutes. Once the screen shut off, the draw settled at a steady 1A (± 0.05A). I then switched to a 1A charger, and the draw dropped to 0.9A (± 0.03A). To be sure this wasn't a function of the Apple Lightning cable's length, I repeated the test with a short 4" Lightning cable from Amazon and got effectively the same results.
In sum, what appears to be happening is that the 1A charger can't deliver its full rated current because of, well, physics: the cable's wire resistance costs some voltage (V = IR, as in the sketch above), and a charger running right at its limit has no headroom to make up for it. Attach a stronger power source and that loss is effectively eliminated. In other words, with a 2.1A charger your iPhone should charge about 10% faster. (This also means there's no risk of damage to the battery, since the iPhone's charging circuit appears to cap the input at 1A.)
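The "about 10%" is just the ratio of the two steady readings from my test:

```
# Relative difference between the two measured steady-state draws.
draw_on_2_1a = 1.0  # A, measured on the 2.1A charger
draw_on_1a = 0.9    # A, measured on the 1A charger

speedup = (draw_on_2_1a - draw_on_1a) / draw_on_1a
print(f"{speedup:.0%} more charging current")  # ~11%, i.e. roughly 10% faster
```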
EDIT: I tried several other Lightning cables from different manufacturers. Aside from a few that were less efficient than I expected, none ever gave a reading of more than 1A.
EDIT 2: It's worth noting that if you're charging while using the iPhone, a 2.1A power source will charge dramatically faster, since it can supply the extra current for the screen, radios, etc. on top of the battery's charging current. I saw draws as high as 1.5A when I woke the screen (still on the lock screen).
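Here's a toy model of why the gap widens with the screen on. The 0.5A system-load number is made up for illustration; only the ~1A battery cap comes from my measurements.

```
# Toy model: the charger covers the system load (screen, radios) first;
# whatever is left, up to the phone's ~1A cap, goes to the battery.
def battery_current(charger_limit_a, system_load_a, battery_cap_a=1.0):
    return max(0.0, min(battery_cap_a, charger_limit_a - system_load_a))

for limit in (1.0, 2.1):
    amps = battery_current(limit, system_load_a=0.5)  # assumed 0.5A load
    print(f"{limit}A charger, screen on: {amps:.2f}A to the battery")
# 1.0A charger -> 0.50A to the battery; 2.1A charger -> the full 1.00A.
```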