You're saying Apple had very specific hardware in 2006 to do a different version of the same UI stuff they've been doing for years. Somehow Stage Manager/Shrinkydink needed a different set of hardware than Exposé, Mission Control, the App Switcher, etc.? Not a chance. It's moving windows around, the same thing they've been doing for eternity. There's no way an ancient desktop chip has something magical for moving windows that Apple's A-series chips don't.
Never heard of Shrinkydink, but all of the things you're talking about are desktop processors running desktop OSes. The A-series was never designed to be a desktop processor; it was built to run iPhones and older iPads, so Apple left out everything iOS didn't need. Why would you waste millions of transistors of valuable die space on a feature you never plan to use? Nobody puts circuitry on a chip they don't plan to use; it's too expensive and pointless. The reason I think it was virtual memory swap is that desktop OSes have paged memory out to disk for decades, while iPhones have never done it, so Apple never bothered to put the controllers for it on the A-series chips.
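To make that concrete, here's a toy Python sketch (my own illustration; the sizes are made up and this has nothing to do with Apple's actual code) of what swap buys you. On a desktop OS, a process can touch more memory than the machine physically has and the kernel quietly pages the excess out to disk; on an OS with no swap, the same program just gets killed when RAM runs out.

```python
import mmap

GIB = 1024 ** 3

# Ask for an anonymous mapping bigger than a hypothetical 4 GB iPad's RAM.
buf = mmap.mmap(-1, 6 * GIB)

# Touch every page so the kernel actually has to back it with RAM or swap.
PAGE = 4096
for offset in range(0, len(buf), PAGE):
    buf[offset] = 1

print("survived: the kernel paged the excess out to disk")
```

Run that on a desktop with 4 GB of RAM and it finishes (slowly, because the kernel is swapping); run it somewhere with no swap and the process gets killed partway through the loop.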
Apple isn't like Intel, AMD, or Qualcomm, either, who have to make generic processors for a wide audience. Apple's hardware team makes chips for one customer, and the requirements are exact: support features A, B, and C for iOS, but skip D and E because iOS doesn't need them. So Apple's engineers put circuitry for A, B, and C into their SoCs. But D is needed on a desktop processor, so the M1 was born with all the circuitry a desktop needs, hence the initial M1-only requirement. Now Apple has figured out a way to make Stage Manager work using circuitry already on the A12X/Z chips, which lack the required controllers. It won't work as well, but it still works "acceptably". If I had to guess, they experimented with algorithms that run on the main cores rather than on an actual virtual memory controller. The main cores are general purpose, which is where the phrase "jack of all trades, master of none" applies.
Put it this way: have you ever mined crypto? Less common cryptocurrencies can still be mined on a graphics card, but not bitcoin or dogecoin, though they could be back when their difficulty was low. Originally, bitcoin could be mined on a cheap CPU. The difficulty has gotten so high that CPUs and GPUs are now useless; they can still run the algorithm, but it would take decades to mine a single coin. Mining those coins requires ASICs (application-specific integrated circuits), chips designed for one purpose and one purpose only. They can't do much of anything else, but for that one task they are hyper-optimized.

A controller is the same idea. A memory controller, for instance, is designed to move data in and out of memory; the design is specific to that task, and you can't use a memory controller to calculate pi to the five-billionth digit. That's what the general-purpose cores are for, or the dedicated math co-processors some computers had years back. Whatever was present on the M1 but not on the A12Z forced Apple to find an alternative that could run SM in an acceptable manner, i.e. not complete dog s**t. General-purpose cores are designed to do all sorts of things, but none of them as well as a dedicated ASIC.
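If you want to see the gap for yourself, here's a toy Python benchmark (my own sketch, not real mining code; the header below is a zero-filled stand-in) that does bitcoin-style double SHA-256 on a general-purpose core:

```python
import hashlib
import time

header = b"\x00" * 76   # stand-in for the first 76 bytes of a block header
trials = 200_000

start = time.perf_counter()
for nonce in range(trials):
    # Bitcoin's proof of work: SHA-256 applied twice to the 80-byte header.
    hashlib.sha256(
        hashlib.sha256(header + nonce.to_bytes(4, "little")).digest()
    ).digest()
elapsed = time.perf_counter() - start

print(f"~{trials / elapsed:,.0f} double-hashes/sec on this CPU")
```

A typical laptop core manages somewhere in the hundreds of thousands of hashes per second, while a current mining ASIC does on the order of 100 trillion. That ratio, roughly a hundred million to one, is the difference between general-purpose silicon and a dedicated circuit.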
As I said, if the circuitry wasn't required for iOS, Apple didn't include it on the A12X/Z. Apparently whatever SM needed wasn't on those chips, so they had to find a workaround. It doesn't matter that windows have been movable for thirty years. For one thing, you're assuming it has to do with windows, when it's more likely whatever lets the device run four apps simultaneously; for SM, all four need to stay awake. That's why I think it had to do with virtual memory. I could be wrong, but that's an educated guess as a programmer. The M1 has all the stuff a desktop OS needs, so it can run eight apps simultaneously and easily. Apparently this workaround has its own limitations.
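Some rough back-of-the-envelope math on why four awake apps is a squeeze without swap. Every number below is my own guess; none of this is published by Apple:

```python
configs = [
    # (chip, RAM in GB for a typical config, apps SM keeps awake)
    ("A12X/Z", 4, 4),   # most A12X/Z iPad Pros (high-storage models have 6)
    ("M1",     8, 8),   # base M1 iPads (external display brings it to 8 apps)
]

OS_RESERVE = 1.5   # GB iPadOS roughly keeps for itself (guess)
PER_APP    = 0.75  # GB working set of a heavyweight app (guess)

for chip, ram, apps in configs:
    free, need = ram - OS_RESERVE, apps * PER_APP
    verdict = "fits" if free >= need else "does not fit without paging"
    print(f"{chip}: {free:.1f} GB free vs {need:.1f} GB needed -> {verdict}")
```

With those (guessed) numbers, the A12X/Z comes up short for four live apps while the M1 clears eight, and on top of the bigger RAM the M1 models get actual swap, so they have headroom the older chips simply don't.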