Moore may have been speaking strictly about CPUs, since that's what his employer, Intel, was manufacturing, but the observation has been applied to GPUs, RAM, and storage since I started working in IT in the '90s.
Your assertion was: "the corollary to Moore's law states you won't really notice a bump in specs (whether GPU, CPU, RAM, storage, etc.) until it doubles."
Moore's was: "The complexity for minimum component costs has increased at a rate of roughly a factor of two per year."
There are several steps between complexity and performance implications. And the doubling he spoke of (component count per chip at minimum cost) is something completely different from the one you are speaking of (spec bumps a user can perceive).
Even before the days of Intel and Moore this was known, whether or not anyone invoked a "law". Commodore didn't make a C72 or a C96.
Moore's observation predated Commodore's computers.
And your "C72 or a C96" part is funny, because Commodore almost did launch a C65.
If what you're talking about is "RAM always doubles", that isn't true either. The M3 Pro MacBook Pro starts at 18 GiB RAM, not 16 or 32. The current iPhone 15 has 6 GiB RAM, not 4 or 8.
But, sure, in some cases having RAM come in powers of two can be beneficial, due to techniques such as dual-channel, which wants matched capacities on both channels.
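To make the dual-channel point concrete, here is a minimal sketch of channel interleaving (a toy model, not any real memory controller): consecutive cache lines alternate between two channels, so sequential accesses draw on both channels' bandwidth at once. That striping is simplest when both channels have the same capacity, which is one reason matched, power-of-two module pairs are common.

    # Toy model of dual-channel interleaving (illustrative only, not real hardware).
    LINE_SIZE = 64  # bytes per cache line; 64 is a typical value, assumed here

    def channel_for(addr: int) -> int:
        # Use the address bit just above the cache-line offset to pick a channel,
        # so consecutive cache lines alternate between channel 0 and channel 1.
        return (addr // LINE_SIZE) % 2

    # Sequential addresses land on alternating channels:
    for addr in range(0, 8 * LINE_SIZE, LINE_SIZE):
        print(f"addr {addr:#06x} -> channel {channel_for(addr)}")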
Why do you think Apple sells memory and storage in factors of 2?
They often do not.
And in any case, if they did, Moore's Law wouldn't be why.
Anyway, you don't have to believe it. Maybe you have some skill that lets you perceive the difference between an 8 GB machine and a 9 GB one, but typical users and the mass market don't.
Powers of two being commonplace in computing is unrelated to consumer perceptions.
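To spell out why they're commonplace: memory is addressed with binary address bits, and k bits select exactly 2**k locations, so individual chips naturally come in power-of-two capacities even when the shipped total (like the 18 GiB above) doesn't. A quick sketch of that arithmetic:

    # k binary address bits can select exactly 2**k distinct locations,
    # which is why individual memory chips come in power-of-two capacities.
    for k in (33, 34, 35):
        print(f"{k} address bits -> {2**k // 2**30} GiB addressable")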