Ah, well, back years ago when the Mac was Apple's main product, Apple would indeed put out updates or new models every year.
But the Mac is no longer Apple's breadwinner; the iPhone is. And, as such, the iPhone is now what gets updates or new models every year.
Nowadays, Mac hardware is no longer competitive, because Apple just isn't bothering to compete. That is what gets fans up in arms here...
There'd be a lot less whinging if Apple did a yearly minor change as I suggested - a price cut, or a basic spec boost (e.g. RAM, storage, warranty). No engineering required, just a change in SKU. They could plan this into the product lifespan at introduction. It'd be like a Mac Mini S.
Phone buyers in the UK expect a different product every year at the very least. Consumers won't accept anything less frequent, and Apple have to listen, as the iPhone is their cash cow now.
Macs aren't their cash cow, so our voices aren't going to carry as much weight. The overall speed increase between generations of Intel chips has been in single digits ever since Sandy Bridge, and the performance-per-dollar improvement between generations isn't that impressive either.
Apple rightly identifies the GPU as the next part that will distinguish Macs between generations, but neglects to put one in most of their range - and even then only out-of-date ones, chosen, it would seem, because AMD discounts them to chase market share. This is what caused the Oculus Rift guys to claim that Apple don't make a 'decent' PC. But then Apple launches a Mac Pro with sub-optimal, out-of-date GPUs and abandons it for three years. It releases iMacs which need graphics cards to drive the 5K Retina screens, yet despite effectively annual updates those stay out of date too.
You can only really argue that Apple are interested in GPUs for their compute ability (e.g. video rendering) and not their ability to play Overwatch on a Mac; even the ability to run a 5K panel over a single USB-C cable isn't going to move Apple.
Obviously Tim Cook has latterly said that VR (virtual reality) isn't as interesting as AR (augmented reality), but AR requires a camera (and therefore he's only talking about iPads or iPhones). We shouldn't expect a top-of-the-line GPU in a standalone Mac any time soon. Iris graphics will continue to be adequate for Apple's purposes.
Here's a nice parallel with Sony. The PS4 launched in winter 2013 and has only just been effectively obsoleted by the launch of a 4K version. During that time Sony introduced a version with a bigger hard drive, and retailers added bundles with various games, but prices have always drifted downwards, as consumer electronics prices usually do. The UK price at launch was apparently £349; years later it has dropped closer to £249 with games included.
The 4K version has been announced with uprated graphics (they call it the PS4 Pro - oh the irony!), and there's a slim version of the original PS4 available; people will generally expect the old model to continue its decline in price.
The user base grows as people jump on the bandwagon at steadily decreasing prices, and Sony get their cut of the games (like Apple get their 30% from the App Store).
So Sony haven't bothered fiddling with the specs too much; they just let the price slide to keep interest up, because they get their cut from the games.