It seems counterintuitive to me that more/faster power won’t generate more heat, but I’m no electrical expert. If the math/physics says it’s so then that’s great!

Edit- I just thought of something. Is it that the total heat remains the same? Since the charging is faster the heat dissipation lasts for a shorter period but perhaps it’s hotter for an equivalent total heat output?
No, the heat per unit time remains the same if the current remains the same. Assuming the resistance of the coils facilitating the power transfer stays constant, a shorter charge time means less total heat dissipated overall.
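A quick back-of-envelope calculation illustrates the point. The resistance and current values below are made-up illustrative numbers, not measurements of any real charger:

```python
# Heat per unit time follows P = I^2 * R, so at a fixed current the
# dissipation rate doesn't change; only the duration does.
R = 0.5            # coil resistance in ohms (assumed, illustrative)
I = 1.0            # charging current in amps (assumed, illustrative)
P = I ** 2 * R     # heat dissipated per second, in watts

slow_t = 2 * 3600  # a 2-hour charge, in seconds
fast_t = 1 * 3600  # a 1-hour charge at the same current

print(P)           # 0.5 W either way
print(P * slow_t)  # 3600.0 J of total heat over the slow charge
print(P * fast_t)  # 1800.0 J: halving the charge time halves total heat
```

So if the current (and hence the dissipation rate) is held constant, finishing faster strictly reduces the total heat dumped into the phone.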

EDIT: I did some additional fact checking for my own knowledge. I realized that the aforementioned "handwaving" had to do with the battery charging itself (which is on a different circuit). Lithium cells typically charge at a roughly fixed voltage, so charging from a higher supply would require stepping down from e.g. 15V to 5V, which increases current on the battery side and with it the heat dissipation.

It seems the solution is simple, and OPPO is already doing it (see https://www.oppo.com/en/newsroom/stories/airvooc-world-leading-wireless-charging/): put the batteries in series. In the case of 15V charging, assuming each lithium cell charges at 5V, three cells in series can accept the full 15V directly, with no step-down and no additional heat losses.
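The win from the series arrangement can be sketched numerically. The 15 W transfer figure and the parasitic resistance below are assumptions for illustration, not OPPO's actual specs:

```python
# Conduction loss scales as I^2 * R, so accepting the same power at
# triple the voltage (three 5 V cells in series) cuts current to a
# third and resistive heating to a ninth.
R = 0.3                  # assumed parasitic resistance in ohms
power = 15.0             # watts delivered (assumed figure)

# Single 5 V cell: the full power must flow at 5 V.
I_single = power / 5.0   # 3 A
loss_single = I_single ** 2 * R

# Three 5 V cells in series accept 15 V directly, no step-down.
I_series = power / 15.0  # 1 A
loss_series = I_series ** 2 * R

ratio = loss_single / loss_series
print(round(ratio, 6))   # 9.0: a third of the current, a ninth of the heat
```

The actual resistance cancels out of the ratio, which is why the series trick helps regardless of the specific hardware.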

There's definitely a bit of engineering effort required, but tl;dr: it is possible to keep the heat profile down.
 
My friend and I go back and forth on this. He's a huge fan of wireless charging; I never use it. I don't see the advantage of the huge magnetic connector over the tiny USB-C connector, but everyone is different and it's good to have options.
My iPad Pro 2018's USB-C port is broken. MagSafe is better in this respect because it avoids wearing out the port hardware.
 