How come Microsoft seems to get its development techniques all wrong?
This is how Microsoft did it:
Windows 1.x > Windows 3.1
Generally just a GUI change, still based on DOS. New graphics and sound engine...
Windows 3.1 > Windows 95
Generally just a GUI change and some new apps, yet everything people were used to changed.
Windows 95 > 98
Generally just new things like automatic device detection (didn't that work well?) and, WOW, they can integrate HTML into Explorer! How useful! Everyone needs that!
Windows 98 > ME
Bill Gates' kids changed the bits they thought needed changing.
Windows ME > 2000
The GUI is still the same as 6 years ago. Oh no, all the buttons have moved around, and no one knows where anything is for another 6 months.
Windows 2000 > XP
A new, ugly GUI! It's Windows 2000 with a bit of NT. Great. Windows is no longer based on DOS (15 years after Windows first came along).
Why can't Microsoft get people used to something in an OS and just upgrade the bits that need upgrading? Apple got it right in the first place with System 1, and every OS up to OS 9 changed only the things that needed changing (colour introduced, better app support, better integration, etc.) until OS X, when they moved to a different but recognisable and understandable GUI with completely different innards.
So altogether Apple have really had three main OSes: System 1, System 7 and OS X. They all had similar features, the Finder worked off the same principles (okay, maybe System 1 was a bit weird), and the basic idea behind the OS remained the same: one menu bar at the top which changes with the app you're in.
Windows seems to have changed so much on the surface over the years that most people who use XP wouldn't be able to use 3.1, or even 95.
If you gave someone who was used to OS X a System 1-based computer, I bet they would be able to use it with no real difficulties...