I agree, mostly, except that it used to be true that computers gained roughly 50% in performance every year.
Then we hit about 3 GHz, and physics pretty much hit a brick wall. Speed improvements now come from parallelism and pipelining, and those only go so far for most workloads.
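To put a rough number on "only go so far": Amdahl's law is the usual way to show it. This is my illustration, not something from the original comment, and the 90% parallel fraction is an assumed figure, not a measurement.

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n)
# where p is the parallelizable fraction of the work and n is the core count.

def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Upper bound on speedup for a workload that is only partly parallel."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# Assume 90% of the work can be parallelized (illustrative, not measured).
for cores in (2, 4, 8, 16, 64, 1024):
    print(f"{cores:>5} cores: {amdahl_speedup(0.90, cores):5.2f}x speedup")

# Even with 90% of the work parallelizable, the speedup never exceeds 10x,
# no matter how many cores you throw at it.
```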
Think about it for a minute. We're talking about machines that execute multiple instructions in the time it takes light to travel about 3 inches. The CPU die isn't much smaller than an inch across, has a ridiculous number of wires tangled inside it, and needs a little time for signal changes to settle down from one transition to the next.
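A quick back-of-envelope check of that figure, assuming a 4 GHz clock (my number, not the original's):

```python
# How far does light travel during one clock cycle of a ~4 GHz CPU?

C = 299_792_458          # speed of light in m/s
CLOCK_HZ = 4e9           # assumed 4 GHz clock

cycle_time_s = 1 / CLOCK_HZ              # ~0.25 ns per cycle
distance_m = C * cycle_time_s            # ~0.075 m
distance_inches = distance_m / 0.0254    # ~2.95 inches

print(f"One cycle: {cycle_time_s * 1e9:.2f} ns")
print(f"Light travels ~{distance_inches:.1f} inches per cycle")

# A superscalar core retires several instructions per cycle, so
# "multiple instructions in the time light travels ~3 inches" checks out.
```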
Yes, if we throw out the standard chemistry that the entire industry is built on, we can make some improvements. That still leaves us not far from the theoretical limits of switch physics (though there's still room for some pretty awesome reductions in power consumption).
To make the same kind of computing progress I've seen over 30+ years, we'll have to throw out not just silicon, but deterministic electrical physics.