Yeah, the artists... I'm aware, but that's a niche category, as is neuroscience. There are very few apps in all these areas combined. Apple's preparation for transitions is nothing special; the main effort is with app developers. It's manageable when the app/driver ecosystem is as limited as macOS's. The Windows ecosystem is orders of magnitude bigger.
But ultimately, what matters for app developers is how hard supporting the new architecture is. That, firstly, depends on what their development tools are. One of the reasons it took a while for the big players to recompile for Intel Macs is that many of them were using CodeWarrior, and CodeWarrior got abandoned rather than updated to produce Intel binaries. The same big players, having been pulled much more tightly into Xcode, Cocoa, and the rest of Apple's development tools since the Intel transition, were much, much quicker to move to Apple Silicon a decade later.
Then you have the completely lazy developers. I don't think it's a coincidence that the overwhelming majority of non-Apple Silicon-native software remaining today (at least on my MBP) appears to be written in Electron. I doubt it's hard to make a Universal 2 Electron app, but when your judgment is so impaired that you think Electron is a good idea, it doesn't surprise me that you don't see a problem with your app running in emulation. Even if making a Universal 2 version of their app would take them 20 minutes, those guys WILL NOT do it until macOS 16 or whatever version it is drops Rosetta 2, or at least throws a nasty, nasty dialog box before opening the app in Rosetta 2.
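As an aside, checking whether a given Mac app actually shipped Universal 2 takes one command, and with electron-builder the universal build is close to a one-liner. This is a sketch: the `MyApp` path is a hypothetical example, and the electron-builder flag assumes a project that already builds cleanly for a single architecture.

```shell
# macOS: list the architectures inside an app's main binary.
# "x86_64 arm64" = Universal 2; "x86_64" alone = runs under Rosetta 2.
lipo -archs "/Applications/MyApp.app/Contents/MacOS/MyApp"

# electron-builder: produce a universal (x64 + arm64) mac build
npx electron-builder --mac --universal
```

Which is roughly why the 20-minute estimate above isn't much of an exaggeration for apps without native-code dependencies.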
The big difference is not app developers so much as a chicken-and-egg problem in Windowsland. In Windowsland, neither Intel nor Microsoft sells the hardware. Dell, Lenovo, HP, etc. do - and they are the ones who decide which chips they'll buy from Intel and which software they'll license from Microsoft. Their interest isn't in some big theoretical transition with theoretical benefits - their interest is in competing with each other, largely on price, to sell something that meets the immediate expectations of customers, i.e. something that runs the current/previous version of Windows and is compatible with the software customers care about.
For example:
- the response to Vista, Windows 8, Windows 10 (in the early days) and Windows 11 (today) by Lenovo, Dell and HP was, at least in the business world, to buy a licence for the current version of Windows, then use the downgrade rights to pre-install the older version that business customers prefer.
- Dell/Lenovo/HP/etc. sabotaged the Windows 8 vision of migrating Windows to a touch platform. They took the view that their customers were unwilling to pay extra for a touch screen, and to the extent that Microsoft's newest OS ran poorly on a non-touch-screen machine, that wasn't their problem - they were going to keep shipping tens of millions of touch-screen-less systems. And a decade after Windows 8, touch on Windows remains largely a joke.
- Intel wanted to migrate the PC world from legacy BIOS to UEFI... and it took until, oh, 2015 or 2016 for most systems to ship with full UEFI boot enabled, at least on the business side. People wanted Windows 7 downgrade systems, and Windows 7 is simpler to boot via BIOS (I'm pretty sure you CAN boot it via UEFI if you try hard enough), so the OEMs enabled the UEFI CSM backwards-compatibility mode, partitioned the boot drive MBR, and off they went. Oh, and if you wanted to migrate from BIOS/MBR to UEFI/GPT, at least around the time Windows 10 came out, you had to do a full repartition of the drive followed, obviously, by reinstalling Windows from scratch. Microsoft later added a tool to convert drives to GPT... but originally, you couldn't do an in-place upgrade from 7 installed in BIOS/MBR mode to 10 without leaving 10 in BIOS/MBR mode after the upgrade.
- VGA ports. I think Intel wanted to get rid of VGA ports for years, but again, every PC OEM kept sticking VGA ports on things, offering the lowest tier of docking stations as VGA-only, etc. (And every cheap PC monitor manufacturer largely only put VGA ports on their cheap models.) Finally, Intel took out the ability to output VGA from their on-processor GPUs... and I'm pretty sure that HP, at least, found a way to continue offering a VGA output port on those machines! (Presumably by effectively soldering a DP- or HDMI-to-VGA adapter onto the motherboard or into the docking station?)
- SSDs. SSD adoption in Windowsland was massively, massively slowed down by the fact that Lenovo/HP/Dell were terrified of putting a system with 128/256GB of (SSD) storage on the shelf next to a system with 1TB of (slow HDD) storage. They figured, probably rightly, that the salespeople at Best Buy couldn't explain the benefits of SSDs to consumers, so the consumer would pick the one with more GBs and that was that.
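(On the BIOS/MBR point above: the in-place conversion tool Microsoft eventually shipped is MBR2GPT.EXE, which arrived with Windows 10 version 1703. A sketch of the usual invocation - `/disk:0` is an assumption; check your actual system disk number with diskpart's "list disk" first, and run this from an elevated Command Prompt:)

```shell
rem Dry run: verify the disk layout can be converted without data loss
mbr2gpt /validate /disk:0 /allowFullOS

rem Convert MBR to GPT in place; after this, the firmware must be switched
rem from legacy/CSM boot to native UEFI or Windows will not boot
mbr2gpt /convert /disk:0 /allowFullOS
```

Even with the tool, note the firmware still has to be flipped to native UEFI by hand - which is exactly the kind of friction that kept fleets on BIOS/MBR for years.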
The Dells, Lenovos, and HPs of the world are, more than anything, terrified of returns, especially on the consumer side (on the business side, I presume they are more worried that an IT manager will look at an overly avant-garde HP and decide to order 500 more conservative Dells instead because they don't want to replace 500 monitors or whatever). They don't want someone to take their new PC home, find that they can't plug it into their VGA monitor or parallel port printer, put in a DVD (it amazes me that they were still spending the money to include a DVD-RW drive on low-end laptops in 2018 or 2019, so they must have thought consumers wanted it) or run some eight year old software or whatever else, and then return it.
(It's worth noting that this is something that has never scared Apple - certainly Apple has never had a problem saying to people "oh, you want to plug a thing that uses connector X into your new Mac? well, here is the adapter from the modern port to X for USD$29" - and in the late 1990s, they didn't even offer the adapters that they did in the early 2010s. No Apple-branded USB-to-serial or USB-to-ADB adapter, no SCSI solution other than a PCI card option for your G3/G4, no external floppy drive that was compatible with Apple's legacy 400/800K floppies, etc. At least when they dropped FireWire, DVI, 30-pin dock connector, headphone jacks, mini-DisplayPort, DVD burners, wired Ethernet, etc, they offered Apple-branded adapters...)
The only way you have ever gotten a transition in Windowsland is if Intel/AMD offers a chip that is BETTER at doing today's workloads AND adds some additional abilities that are completely underused in the short term. That was the genius of AMD64 (and previously, the 386, etc.) - no one bought Athlon 64s, Pentium 4s/Ds, or the first generation of Core 2 Duos/Quads to run a 64-bit OS. People bought them, and Dell/Lenovo/HP bought/sold them, because those processors ran 32-bit Windows (or Linux, for that matter) better than the 32-bit-only chips that preceded them. Then, 5-7 years later, hardware vendors started to think, "gee, it looks like there's an installed base of amd64 chips out there, maybe we should start writing drivers for the amd64 version of Windows." And by about 2009 or 2010, ordinary consumers could migrate to 64-bit Windows with relatively little disruption (though I seem to remember my mom having to throw out a printer with no 64-bit drivers), so HP/Dell/Lenovo started preloading the amd64 version of Windows, at least on consumer systems. And a few years later businesses started making the move...
And when you look at ports - other than audio/Ethernet/wifi becoming built in to most Windows machines, the only port that really got added across the board that I can think of was USB-A. And that's undoubtedly because USB is built into the Intel chipset and therefore, at least for the iteration that's built into the chipset, very cheap to support. But other things like IEEE 1394/FireWire (which I've had on various Windows systems) never, ever reached critical mass in Windowsland because it was just too expensive for Dell/Lenovo/HP to include on systems where customers didn't insist on it. Even USB-C remains shaky in 2022 - there are a lot of low-end Windows systems, including business ones, that only offer USB-A ports. Relatively high-end businessy systems will give you one USB-C port, maybe two. The only Windows laptops that will give you more USB-C are a few high-end MacBook Pro wannabes like the Dell XPS.
Frankly, in Windowsland, cheap and compatible always, always, always carries the day. It has, at the very least, since the big mutiny against Vista, which I continue to believe was a perfectly fine operating system... on "good" hardware. But no one wanted to pay for good hardware anymore, so HP/Dell/Lenovo sold it on underpowered hardware with Intel/Microsoft's reluctant blessing and people were upset at how poorly it performed. And the lesson everybody learned from that debacle was to never push the envelope on anything - just keep the hardware requirements roughly steady as she goes, don't introduce new driver models or anything else that might cause compatibility issues, don't take out any ports or anything that anybody might rely on, etc.