The graph is measuring CPU performance as it ramps up in power to its maximum constraint. However, off the bat the M2 starts off with a higher power rating in consumption. Call it idle or call it start of test. The M2 is consuming more power.
Also, wrong, the M1 != M2 in terms of power and performance. The M2 is starting off at a higher relative performance vs the M1 and also starts off 1W above it as well. Not sure how you can equate both of these?
Frankly, you have completely misunderstood this graph (and all perf/W graphs).
It is not a CPU "ramping up", there is no "start of test", and it is not "idle". Every single test there is at 100% CPU load. This is not one test. This is 20+ tests, run at different TDPs.
The M2 does not start at a "higher power rating". Apple simply did not include the lower power-draw data points for the M2. Just like the M1, Intel Alder Lake, AMD Zen 3, Qualcomm, Arm, NUVIA, etc., the M2 idles in milliwatts, and after ~500mW (see chart below) the CPU is out of idle and running computations.
There is not a modern laptop CPU on Earth that idles at 1W or more. 1W is compute/stress-test power draw. Idle power draw is always measured in milliwatts.
To make this plainly obvious:

M1
1W - Apple refused to share (somewhere between 0% and 40%)
2W - 40% perf
3W - 50% perf
4W - 70% perf

M2
1W - Apple refused to share (somewhere between 0% and 65%)
2W - Apple refused to share (somewhere between 0% and 65%)
3W - 65% perf
4W - 75% perf
5W - 85% perf
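A quick sanity check on those numbers (a rough Python sketch; the percentages are just the approximate chart readings listed above, not official figures):

```python
# Approximate perf/W points read off Apple's chart (values quoted above).
# Performance is relative (%); the missing low-power points were not shared.
m1 = {2: 40, 3: 50, 4: 70}   # watts -> relative perf
m2 = {3: 65, 4: 75, 5: 85}

# At the wattages both chips share, M2 delivers more perf per watt.
for w in sorted(m1.keys() & m2.keys()):
    print(f"{w}W: M1 {m1[w] / w:.1f} %/W vs M2 {m2[w] / w:.1f} %/W")
```

Even without the hidden sub-1W region, the overlapping 3W and 4W points are enough to compare the two curves directly, which is the whole point of a perf/W chart.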
They idle in milliwatts. That is the base power draw: all power added above idle goes directly to compute (powering the CPU cores, the IMC, the caches, the NoC, the I/O, etc.).
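To put rough numbers on that base-power point (a back-of-the-envelope sketch; the ~500mW idle figure is just the order of magnitude mentioned above, not a measured value):

```python
# Assumed idle draw (~500 mW, order of magnitude only) vs a 4 W load reading.
idle_mw = 500
load_mw = 4000

# Nearly all of the measured load power is going to compute, not idle overhead.
compute_mw = load_mw - idle_mw
compute_fraction = compute_mw / load_mw
print(f"{compute_fraction:.0%} of the 4W reading is spent on compute")
```

This is why the milliwatt idle floor matters: at stress-test wattages the idle contribution is almost a rounding error.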
A more complete perf/W graph example (note the Apple A9). Note how the performance flattens at very low power draw: CPU firms don't like to show this low-power part because it makes their product look "not as great".
The removal is not a technical point; they have the data. All modern CPUs can scale under 1W. Any removal of data is pure marketing.
//
This kind of omission is not unique to Apple, either.
Intel has done the same thing to Apple. It's pure marketing shenanigans. Think of it as a semiconductor's version of a beauty filter: there are imperfections and "straight-looking curves" at very low power draw, so they just cut that part out.