True, but also
Heat is energy - that's an absolutely fundamental rule of physics! Temperature (how hot something feels) is not "heat" if you use the words in their correct scientific sense.
The question here is how much of that "power × time" energy actually gets stored in the battery, and how much gets wasted as heat. If you had a perfectly efficient charger then, yes, twice the power would charge it in half the time - same energy. With a less efficient charger, not only will some of that energy be wasted as heat, but the rate of heat wastage will quite possibly rise disproportionately to the increase in power. If you imagine a "real" charger as an imaginary "perfect" charger in series with a load of R ohms representing the inefficiency:
Power to the whole charger = voltage × current
Power wasted in load = current² × R (comes from "V = IR" and "Power = VI")
...so, say, doubling the power by doubling the current quadruples the rate of heat wastage, which outstrips the halving of the charging time.
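To make that concrete, here's a rough back-of-the-envelope sketch in Python of that "perfect charger plus series resistance" model. The numbers (0.5 Ω of loss resistance, 5 V at the battery, 10 Wh to store, 1 A baseline) are made up purely for illustration, not measurements of any real charger:

```python
# Toy model: a "perfect" charger in series with a loss resistance R.
# All numbers are illustrative assumptions, not real charger specs.

def charge_stats(current_a, battery_v=5.0, r_loss_ohm=0.5, energy_wh=10.0):
    """Return (charge time in hours, wasted energy in Wh, efficiency)."""
    stored_power_w = battery_v * current_a          # power actually going into the battery
    wasted_power_w = current_a ** 2 * r_loss_ohm    # I² × R heating in the loss resistor
    hours = energy_wh / stored_power_w              # time to store the fixed amount of energy
    wasted_wh = wasted_power_w * hours
    efficiency = stored_power_w / (stored_power_w + wasted_power_w)
    return hours, wasted_wh, efficiency

for amps in (1.0, 2.0):
    h, waste, eff = charge_stats(amps)
    print(f"{amps:.0f} A: {h:.1f} h, {waste:.2f} Wh wasted, {eff:.0%} efficient")

# 1 A: 2.0 h, 1.00 Wh wasted, 91% efficient
# 2 A: 1.0 h, 2.00 Wh wasted, 83% efficient
```

So in this toy model, doubling the current does halve the charging time, but it doubles the total energy thrown away as heat and drops the overall efficiency.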
The actual effect will depend on the workings of the charger and whether the extra power actually comes from raising the current, raising the voltage, or (more likely) a bit of both - chargers are complicated, but ultimately you're starting from a fixed 110 V supply. (USB-C Power Delivery offers several different voltages.) Generating a stronger EM field for wireless charging is also going to need more current through the coil and/or more turns on the coil (= more resistive heating either way).
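The same toy series-resistance model (again with made-up numbers, and ignoring everything else going on inside a real charger) shows why the current-vs-voltage distinction matters: pushing more power by raising the voltage rather than the current keeps the I²R loss down.

```python
# Same toy model: compare two ways of doubling the delivered power -
# double the current, or double the voltage at the same current.
R = 0.5  # made-up loss resistance in ohms

def waste_fraction(volts, amps, r=R):
    """Fraction of total power lost as I² × R heat in this toy model."""
    delivered = volts * amps
    wasted = amps ** 2 * r
    return wasted / (delivered + wasted)

print(f"baseline  5 V, 1 A: {waste_fraction(5, 1):.1%} wasted")
print(f"double I  5 V, 2 A: {waste_fraction(5, 2):.1%} wasted")
print(f"double V 10 V, 1 A: {waste_fraction(10, 1):.1%} wasted")

# baseline  5 V, 1 A: 9.1% wasted
# double I  5 V, 2 A: 16.7% wasted
# double V 10 V, 1 A: 4.8% wasted
```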
So you'd need a lot more information to be sure, but it's very plausible that a more powerful charger will waste a higher proportion of energy.