I agreed with everything you wrote, except this.
Here's some background: The 386 is the chip that added 32-bit support and a proper implementation of protected mode. (The 286 had a gimped protected-ish mode.) Before that, chips ran in real mode. Protected mode is what allows segregating processes from each other, and from the O/S. It also keeps processes from using certain privileged instructions, doing I/O directly, or handling interrupts themselves. This forces processes to go through the operating system.
32-bit support in the 386 is fully backwards compatible. An instruction is assumed to be 16-bit unless it has a special prefix byte to bump it up to 32-bit, and there's also a default-size bit set per code segment. When the operating system does a context switch from a new 32-bit app to an old 16-bit app, all it has to do is run the old app out of a code segment with that default bit set to 16-bit, and everything works properly.
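To make the toggle concrete, here's a tiny sketch of that rule in Python (the function name `operand_size` is just illustrative, not any real API; the prefix byte in question is 0x66):

```python
# Sketch of the x86 operand-size rule: the code segment's default
# (16-bit or 32-bit) picks the base size, and a 0x66 prefix on an
# individual instruction toggles *away* from that default.
def operand_size(default_is_32bit: bool, has_66_prefix: bool) -> int:
    # Prefix XOR 32-bit-default: the prefix flips whichever
    # default the current code segment established.
    return 32 if (default_is_32bit != has_66_prefix) else 16

# In a 16-bit code segment:
#   B8 01 00          -> mov ax, 1   (16-bit, no prefix)
#   66 B8 01 00 00 00 -> mov eax, 1  (same opcode, widened by 0x66)
```

This is why old 16-bit code runs unmodified: its unprefixed instructions decode exactly as they always did, and the prefix is purely opt-in.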
There is a third mode the CPU can be in, beyond real and protected, called V86 or Virtual 8086. A process in V86 mode sees the system as if it were in real mode, but the sensitive things it does get trapped by the O/S and mapped to their protected-mode counterparts. This is what happened in Windows 95 when old MS-DOS games were run in a "DOS box".
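A toy sketch of that trap-and-emulate loop (hypothetical function and field names; a real V86 monitor like Windows 95's does far more, including proper instruction decoding):

```python
# Toy sketch of a V86 monitor: a sensitive instruction (CLI, STI,
# INT n, ...) raises a fault instead of executing, and the O/S
# emulates it against a *virtual* machine state rather than letting
# the DOS program touch the real hardware.
def handle_v86_fault(opcode: int, vm: dict) -> None:
    if opcode == 0xFA:              # CLI: don't disable real interrupts,
        vm["virtual_IF"] = 0        # just clear the VM's virtual flag
    elif opcode == 0xFB:            # STI
        vm["virtual_IF"] = 1
    elif opcode == 0xCD:            # INT n: would be reflected into the
        vm["pending_int"] = True    # VM's own real-mode interrupt table
    else:
        raise NotImplementedError("unhandled sensitive instruction")
    # Advance past the faulting instruction (simplified: a real
    # monitor decodes the actual instruction length).
    vm["ip"] += 1
```

The DOS program never notices: from its point of view it "disabled interrupts", while the protected-mode O/S kept full control the whole time.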
So, the CPU specifically supports backwards compatibility of old applications inside a new O/S.