But really we haven't come that far. If we took ideas from today back to 1999 we could reproduce what we have now without too much effort. The storage would be smaller, the screens worse and the battery life worse, but it wouldn't be a whole lot different.
This is an interesting and provocative take and really got me thinking about the changes I've seen in 30 years of programming. I'll offer up some thoughts, though more in the spirit of discussion than argument.
In addition to what you listed above, networks have gotten so much faster, more ubiquitous, and more reliable. This has enabled so much. And computational horsepower has incrementally inched up as cost (and power) per unit has trended down. It's wild that I can do what I do on relatively cheap commodity hardware. 25 years ago I had an expensive Sparc 20 (or RS6000 ... this really is memory lane) on my desk and a frame relay drop. Now it's a three-year-old entry-level MBA literally wherever I happen to be. I think we might be blasé about writing and testing SQL stored procs against a million-row table while flying at 38,000' over the Pacific ... and then pushing the code to Github.
I also think that Apple was pretty forward-thinking with early OS X. The framework approach doesn't seem revolutionary today, and there was never (for me anyway) one big moment, but little by little frameworks like CoreData came into play and really changed how devs wrote code. Apple did a lot to get the abstractions correct and was able to sell them to developers. Functionality like what CoreData provides used to be developed ad hoc, per application. What we have now requires a ton less churn and has given many folks the ability to develop and sell excellent apps. This sort of thing is largely invisible but matters a great deal.
Both of these things are a natural progression and were probably inevitable, but point-in-time snapshots 25 years apart do seem to show that we've made incredible progress. (And that really doesn't take into account how cheap it all is. That Sparc 20 I mentioned was at minimum a $12k purchase in 1995 dollars.)
All that said, I still use a terminal emulator with the same commands that I typed into the VT100 in my dorm room. (I had two of them -- and a blazingly fast 2400 baud modem -- because they'd overheat and I'd have to swap them out every so often.) And I still sling Erlang, C, and SQL in Emacs; all this has changed remarkably little. Though I do think it's interesting that literally anyone with a laptop can now do the same. This really wasn't the case when I started.
I spent some time this year halfway around the world and had a couple of experiences with Apple gear that really wowed me. (I'm not claiming these things are Apple-only; I don't have experience with Windows or Android.)
First, I was able to take a picture of a Thai word I didn't know, highlight it, and have it translated, instantly and on my phone. It seems commonplace now, but consider how many cumulative hours went into making this possible: academic research, programming, hardware development, network infrastructure build-out, and so on. There's a reductionist view that says there's nothing particularly groundbreaking here, but consider that we actually spent all those hours (and money) and didn't stop when it got hard (or expensive).
The second moment I was struck by how far we've come was watching a TV show over FaceTime with a friend back in the US. It's pretty wild not only that this is possible but that resources were dedicated to making it possible. To be sure, there was videoconferencing back in the 90s, but it required expensive specialized gear in the hard-to-book conference room and didn't work very well. Now we can do this from pretty much whatever device is at hand, at no incremental cost. And this isn't so that we can work harder or make more money (though that too); it's so we can watch TV with friends 12k miles away. This, to me anyway, is astounding.
I can't help but feel that it takes one type of person to build something (and Jobs was certainly that) and another, like Cook, to iterate on it and maintain a vision, even if that lacks the big paradigm-shifting moments.