Sigh. No, it's not. It's a percentage, not an absolute. 0.1% is 0.1%.
For one mW, yes. But we're not talking about a single milliwatt difference here. Being generous to the advantages of scale, let's say the 110mW operating consumption of 2Gb of RAM increases by only 50mW when doubling to 4Gb. In any other context, that would be a trivial difference.
But here, that extra 50mW increases the RAM share of the A4's power budget by 10%, which works out to roughly a 2.5% difference in overall battery life. Fifteen minutes.
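If you want the arithmetic spelled out, here's the back-of-envelope in Python. The ~500mW A4-package budget and ~2W whole-device draw are my own assumptions, back-solved from the 10% and 2.5% ratios above; 10 hours is the advertised iPad battery life.

```python
# Back-of-envelope: what an extra 50mW of RAM costs in battery life.
# Assumed figures (not from any datasheet): ~500mW for the A4 package
# and ~2W average whole-device draw, chosen to match the 10% / 2.5%
# ratios above; 10h is the advertised iPad battery life.

ram_delta_mw = 50.0       # extra draw going from 2Gb to 4Gb (generous)
a4_package_mw = 500.0     # assumed A4 package budget (CPU + GPU + RAM)
device_total_mw = 2000.0  # assumed average whole-device draw
battery_hours = 10.0      # advertised battery life

package_increase = ram_delta_mw / a4_package_mw  # 0.10 -> 10%
battery_hit = ram_delta_mw / device_total_mw     # 0.025 -> 2.5%
minutes_lost = battery_hit * battery_hours * 60  # 15 minutes

print(f"{package_increase:.0%} package increase, "
      f"{battery_hit:.1%} battery hit, {minutes_lost:.0f} min lost")
```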
As I said, the CPU power consumption fluctuates by more than that.
...which is built directly into the power-consumption averages that manufacturers specify. Notice the RAM figures: operating consumption 55mW; peak consumption 85mW. As this illustrates, power consumption at idle matters more to battery life, since all electronics spend most of their time waiting. And since the idle power consumption of RAM is strongly correlated with the amount of RAM installed, that is precisely the figure at stake here.
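To make that concrete, here's a duty-cycle-weighted average. The 55mW and 85mW figures are the module numbers quoted above (taking the 55mW operating figure as the idle-dominated baseline); the 10% active duty cycle is my assumption, purely for illustration.

```python
# Duty-cycle-weighted average draw for one RAM module.
# 55mW (idle-dominated operating) and 85mW (peak) are the figures
# quoted above; the 10% active duty cycle is an assumption.

idle_mw, peak_mw = 55.0, 85.0
active_fraction = 0.10  # assumed: device is busy ~10% of the time

avg_mw = active_fraction * peak_mw + (1 - active_fraction) * idle_mw
print(f"average draw: {avg_mw:.1f} mW")  # 58.0 mW, dominated by idle
```

Even a generous 10% active duty cycle barely moves the average off the idle figure, which is exactly why idle consumption dominates battery-life math.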
Don't presume to tell me my mindset. It's an engineering mindset.
At the moment, it's just an absurdist mindset, perpetuating an argument that does not contradict any established point.
Your "engineering" mindset ignores the reality of engineering on this scale. You have presented exactly nothing to refute the uncontroversial point that considerations other than component cost (again, a price difference near as makes no difference in the final product) were involved in this decision, and that Apple's A4 customizations shaved off amounts of power from the ARM core less than the difference of 1Gb vs. 2Gb RAM modules. If RAM power consumption, inter alia, were insignificant, then so too would be Apple's efforts, since the stock Cortex A8 SoC power consumption is only about double the RAM.
Surely you're bright enough to understand that.
I'll ignore the rest of your message since it's more of the same.
Convenient.