... I have worked in studio environments as well as in my own for the last decade... I know these basics
But if we apply that yardstick, one should never update, because what works now will work forever until it becomes obsolete... or until redesigned workstations take over, which circles us back to the reason all this talk about a Mac Pro is taking place... so "it just works" doesn't ring true if what worked earlier is no longer an option.
No. You're confusing yourself. Hardware is rarely, if ever, going to be faulty coming out of a factory. If it's going to fail, it's going to fail within its initial hours of operation. Software, OS updates and drivers are very different. You can upgrade hardware, and unless it's faulty, it's going to work the first time and will continue to work until it dies of old age, so to speak.
I'm referring to people, mainly Windows people, who'll update graphics, sound, and third-party hardware drivers any chance they get without sitting back and reading up on any bug reports people have submitted. People know macOS is buggy and the quality has come down in recent years, but they'll still update and then spend anywhere from a few hours to a week troubleshooting and restoring from an old image if they can't fix or roll back the update.
With Windows users, the major issues I've seen are sound drivers causing crashes, and bad graphics drivers that cripple performance or introduce visual bugs. And up until a few years ago, SSDs on Windows had an issue with mobos that supported hot plug-and-play: SSDs would sometimes turn off and crash the system. Later updates addressed these issues. No damage to the SSDs, but it made people grumpy.
I've brought up the rarity of faulty modern hardware before elsewhere, and someone linked me to an article about Sony burning through Mac Pros for some Marvel movie, which I assume wouldn't have been the case if they'd used a cluster. The article, IIRC, made it seem as if sections were edited and rendered to final product on single Mac Pros.
Yeah, the graphics processor on the nPro is iffy, but so is whatever Sony did.
Looping back now. If macOS Sierra is working just fine for you and your NLE is working smoothly without a single hiccup, why dick around with updating macOS to High Sierra when it reaches gold status? It's still going to be buggy for some, or cause major issues that didn't pop up with testers.
Wait 2-3 months and let others suffer before your workload, your stress levels and, most importantly, your business do. Now do you understand why I said what I said?
FWIW, I recently updated my graphics cards' drivers after having not done so for 3 years, because nearly every release between the time I bought them and just two days ago was filled with bugs that either crippled performance or broke various software, ranging from games to video editors.
To me, buying a high-end workstation or building one myself means I want to control when I update. I'll let others take the plunge first, so I know what I may encounter down the road and, most importantly, whether I can roll back or restore an image if things go south.
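The routine above (check what an update offers without installing, and make sure you have something to restore from first) can be sketched as a small shell script. This is a minimal sketch, not a recommendation; it assumes the macOS `softwareupdate` and `tmutil` command-line tools of the Sierra era, and the function name `pre_update_checklist` is my own invention.

```shell
#!/bin/sh
# Hedged sketch of a "control when you update" routine:
# look before you leap, and have an image to fall back on.
pre_update_checklist() {
    # 1. List available updates WITHOUT installing anything.
    if command -v softwareupdate >/dev/null 2>&1; then
        softwareupdate --list
    else
        echo "softwareupdate not available on this system"
    fi
    # 2. Kick off a Time Machine backup so there is something to restore
    #    if the update goes south.
    if command -v tmutil >/dev/null 2>&1; then
        tmutil startbackup
    else
        echo "tmutil not available on this system"
    fi
}

pre_update_checklist
```

On a non-macOS box the function just reports that the tools are missing; the point is the order of operations, not the specific commands, and the same shape works with whatever backup tooling you actually use.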
Edit: Not completely related, but I use two Dell IPS monitors on my self-built Windows-based workstation. In the time I've had these monitors (bought at around 1200 each, IIRC), I've put 30,096 hours on them according to their information page. They've had software and/or driver updates from Dell and Microsoft. Never installed them. It works. Color is accurate. Backlight is in great shape because I run them at 40-45% brightness 90% of the time.