What he had to do to create an image of computational power progression was to continuously reduce applicability. As you move to vector processing, to multicore vector processing, to GPUs, to NPUs, you can demonstrate a nice progression in computing "power" or operations per second. What you are also doing is reducing the applicability of the technology to ever more limited use cases.
There is a certain amount of co-evolution going on, of course - if computing power is only growing significantly in one small area of computing, software guys are going to see if they can leverage that for at least a portion of their code. That's better than seeing almost no progress at all, but it suffers not only from the narrowing applicability of the underlying hardware resource, but also from the mathematical fact of Amdahl's law.
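(As a rough reminder of why that bites: Amdahl's law says that if a fraction p of your workload can use the accelerator, and that part is sped up by a factor s, the overall speedup is 1 / ((1 - p) + p/s). So even an infinitely fast accelerator applied to 50% of your code caps the whole program at 2x.)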
Bottom line - us oldtimers have already seen the rate of progress slow down tremendously, and there is nothing really indicating that this "pushing your way up an exponential rise" won't be the future as well. Smaller improvements, slower, at higher cost.
To what extent this constitutes a problem depends on where your interest in the industry lies.
Excellent post, thank you.
It's interesting to see where software is going. Like you, my first few computers were not internet connected, so everything was an app, and there was no such thing as a browser.
Now for most non-content-creation specialists, by far the heaviest app they run is the browser and the various tasks they do in it. It's remarkable to see how many individual apps now exist in both app form and as an in-browser window (mostly to cater for individual preferences and slightly different use-cases).
(Side note 1: Kind of a return to Jobs's original vision of an app-less device, everything running through the browser. It was the wrong time for the iPhone & thankfully he abandoned that vision. But it looks to be coming somewhat true in disguised form.)
(Side note 2: Outside the office, it looks like most of the world's computing is now on single-window devices - also a callback to computing's early days.)
Heavyweight specialist apps still exist for content creation, film editing, CAD (if it still exists on Mac OS), etc. But these are the specific use cases that benefit from the specialised on-chip hardware Apple is putting in. Nothing new about that - Intel & AMD have done it for ages, especially for video codecs.
What it points to is that there's likely nothing left in general purpose mass-market computing (*with one exception) that really pushes an M1 chip. Numerous reviews of the M1 machines have commented on how hard it is to make them chug. So from this point forward, what's left for Apple M-series chips? More on that later.
The exception? Games. All devices - desktop, laptop & mobile - struggle with high-powered gaming, especially in 4K / VR.
(Side note 3: It's been a weird journey to see iOS become probably the biggest gaming platform in the world, while Mac gaming remains dead & will probably stay dead for the foreseeable future.)
How is that being solved? Cloud gaming. Instead of spending $1500+ on an RTX 3080 GPU, subscribing to a service like GeForce Now's 'RTX3090 Tier' gets you an RTX 3080 level of service on all your devices for $150-odd a year. I have a sub and it is excellent on cable, but it improved dramatically when cheap gigabit fibre came to my street a few months ago with 1ms latency. I cannot see myself ever spending $1000 on a big GPU again. It's still early days for cloud gaming and it's not suitable for or accessible to everyone, but it's been a big change in the last 12 months.
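(Back-of-the-envelope, using those same numbers: at roughly $150 a year, it would take around ten years of subscription to match the $1500 up-front cost of the card - and the card would likely want replacing well before then.)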
So what does the future hold for Apple's M-series chips? General mass computing is solved. Gaming is close to being solved by cloud gaming. Specialised content creation workflows are being boosted by specialist hardware paths in the chips. They always need more power, but they are coming close to being effectively solved.
The only 'hard' domains I can see left are power / battery consumption, AI, and VR.
VR: I have a VR headset and like it, but it is far, far from becoming a mass market general purpose tool. Apple has a long way to go on this. Check back in 5-10 years.
AI: Apple is pumping huge resources into on-board AI. I studied AI many moons ago & have forgotten all of it; AI today is night and day compared to back then. I type in 'cat' on my phone and it gives me all my personal photos that have cats in them. Fookin magic. General purpose AI that arranges my day for me & does my shopping for me is far from reality (but there are apps & frameworks that try to do it). Check back in 5-10 years.
Power / battery consumption: The M-series chips are a huge jump forward, but still a long way to go. I want an Apple Watch / iPhone / MacBook that can last a whole week without charging. They will come but will need insanely powerful, insanely power-sipping M-series chips with insane software and hardware integration. Apple's on the right path here. But check back in 5-10 years.