Yeah, the artists... I am aware, but that's a niche category, as is neuroscience. There are very few apps in all these areas combined.
My point is that the Mac has a lot of niches with a lot of specialized software each, which adds up to a lot of mission-critical apps that need to be able to handle each transition gracefully. Apple generally makes it *look* easy on their end, but it’s absolutely not (see: the lack of Windows Mobile compatibility on Windows Phone at an app or API level, Microsoft’s repeated attempts at pushing devs from the Win32 APIs to UWP).
Apple's preparation for transitions is nothing special. The main effort is with app developers. It's manageable when the app/driver ecosystem is as limited as macOS's. The Windows ecosystem is orders of magnitude bigger.
I think the key difference between macOS and Windows in terms of macOS’s ability to handle transitions is that Apple has never been afraid to deprecate and phase out APIs or technologies that are at odds with their game plan (e.g. Carbon and 32-bit in Catalina). Being willing to drop 32-bit compatibility made the Apple Silicon transition the next year much easier, since they didn’t have to add 32-bit instruction support into their chips. Once you only have to worry about x86_64 compatibility your transition gets a lot easier.

This approach means extra work for devs, sure, but Apple also puts a ton of work into making those app/API updates as painless as possible and signaling their intentions far in advance (e.g. deprecating OpenGL in Mojave, even though it’s still supported 4 releases later).

Microsoft, on the other hand, has built their brand and reputation on long-term API support and backwards compatibility, so they don’t have quite the same flexibility. That means you can run most Win95 apps fine on Windows 11 while Macs haven’t been able to run OS 9 apps since the Intel transition, but it also means the system and software ecosystem is a lot more disjointed (e.g. modern color-accented Metro-style settings pages giving way to Windows 98 control panels when you click an “Advanced” button).
 
Apple's preparation for transitions is nothing special. The main effort is with app developers. It's manageable when the app/driver ecosystem is as limited as macOS's. The Windows ecosystem is orders of magnitude bigger.
The word is "convoluted". The PC ecosystem even failed to move away from PS/2 ports. They only ever add to the clutter. It's not a transition if you don't leave the old technology behind; otherwise the drawbacks of being compatible with everything will slow you down. Right now macOS is delivering universal binaries, but before Windows on ARM can say "we're ready", Apple will have eliminated every trace of x86 legacy code from their entire ecosystem. Think about it: if Microsoft had been able to make transitions, Android wouldn't even exist. It's actually normal that every company is only at the forefront of one technological revolution and goes bankrupt with the next. Steve Jobs was absolutely right to highlight that fact at the beginning of the iPhone keynote in 2007.
 
Yeah, the artists... I am aware, but that's a niche category, as is neuroscience. There are very few apps in all these areas combined. Apple's preparation for transitions is nothing special. The main effort is with app developers. It's manageable when the app/driver ecosystem is as limited as macOS's. The Windows ecosystem is orders of magnitude bigger.
But ultimately, what matters for app developers is how hard supporting the new architecture is. That, firstly, depends on what their development tools are. One of the reasons that it took a while for the big players to recompile for Intel Macs is that many of them were using CodeWarrior, and CodeWarrior got abandoned rather than updated to produce Intel binaries. The same big players, having been forced much more closely into Xcode, Cocoa, and all the other Apple development tools since the Intel transition, were much, much quicker to move to Apple Silicon nearly a decade and a half later.

Then you have the completely lazy developers. I don't think it's a coincidence that the overwhelming majority of non-Apple Silicon-native software remaining today (at least on my MBP) appears to be written in Electron. I doubt it's hard to make a Universal 2 Electron app, but when your judgment is so impaired that you think Electron is a good idea, it doesn't surprise me that you don't see a problem with your app running in emulation. Even if making a Universal 2 version of their app would take them 20 minutes, those guys WILL NOT do it until macOS 16 or whatever version it is drops Rosetta 2, or at least throws a nasty, nasty dialog box before opening the app in Rosetta 2.
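As an aside, checking whether a given process is actually running under Rosetta 2 is trivial; Apple documents a sysctl for exactly this. A minimal Swift sketch (the function name is my own, and a failed lookup is treated as "not translated", since the key doesn't exist on Intel Macs):

import Darwin

// Returns true when the current process is being translated by Rosetta 2,
// false when it is running natively. The "sysctl.proc_translated" key does
// not exist on Intel Macs, so a failed lookup counts as "not translated".
func isRunningUnderRosetta() -> Bool {
    var translated: Int32 = 0
    var size = MemoryLayout<Int32>.size
    let result = sysctlbyname("sysctl.proc_translated", &translated, &size, nil, 0)
    return result == 0 && translated == 1
}

print(isRunningUnderRosetta() ? "Translated by Rosetta 2" : "Running natively")

Activity Monitor's "Kind" column ("Apple" vs "Intel") tells you the same thing for any running app, which is an easy way to spot the stragglers.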

The big difference is not app developers as much as it is a chicken and egg problem in Windowsland. In Windowsland, neither Intel nor Microsoft sells the hardware. Dell, Lenovo, HP, etc. do - and they are the ones who decide what chips they'll buy from Intel and what software they will license from Microsoft. Their interest isn't in some big theoretical transition with theoretical benefits - their interest is in competing with each other, largely on price, to sell something that meets the immediate expectations of customers, i.e. something that runs the current/previous version of Windows and is compatible with the software customers care about.
For example:
- the response to Vista, Windows 8, Windows 10 (in the early days) and Windows 11 (today) by Lenovo, Dell and HP was, at least in the business world, to buy a licence for the current version of Windows, then use the downgrade rights to pre-install the older version that business customers prefer.
- Dell/Lenovo/HP/etc sabotaged the Windows 8 vision of migrating Windows to a touch platform - they took the view that their customers were unwilling to pay extra for a touch screen, and to the extent that Microsoft's newest OS ran poorly on a non-touch-screen machine, not their problem, they were going to keep shipping tens of millions of touch-screen-less systems. And a decade after Windows 8, touch on Windows remains largely a joke.
- Intel wanted to migrate the PC world from legacy BIOS to UEFI... and it took until, oh, 2015 or 2016 for most systems to ship with full UEFI boot enabled, at least on the business side - people wanted Windows 7 downgrade systems, Windows 7 is simpler to boot in BIOS mode (I am pretty sure you CAN boot it UEFI if you try hard enough), so let's enable the UEFI CSM backwards-compatibility mode, partition the boot drive as MBR, and off they go. Oh, and if you want to migrate from BIOS/MBR to UEFI/GPT, at least around the time Windows 10 came out, you had to do a full repartition of the drive followed, obviously, by reinstalling Windows from scratch. I think Microsoft later added a tool to convert drives to GPT... but originally, you couldn't do an in-place upgrade from 7 installed in BIOS/MBR mode to 10 without leaving 10 in BIOS/MBR mode after the upgrade.
- VGA ports. I think Intel wanted to get rid of VGA ports for years, but again, every PC OEM kept sticking VGA ports on things, offering the lowest tier of docking stations as VGA-only, etc. (And every cheap PC monitor manufacturer largely only put VGA ports on their cheap models) Finally, Intel took out the ability to output VGA from their on-processor GPUs... and I am pretty sure that HP, at least, found a way to continue offering a VGA output port from those machines! (Presumably by effectively soldering a DP or HDMI to VGA adapter on the motherboard or in the docking station?)
- SSDs. SSD adoption in Windowsland was massively, massively slowed down by the fact that Lenovo/HP/Dell were terrified of putting a system with 128/256GB of (SSD) storage on the shelf next to a system that had 1TB of (slow HDD) storage. They figured, probably rightly, that the salespeople at Best Buy couldn't explain the benefits of SSDs to consumers, so the consumer would pick the one with more GBs and that was that.

The Dells, Lenovos, and HPs of the world are, more than anything, terrified of returns, especially on the consumer side (on the business side, I presume they are more worried that an IT manager will look at an overly avant-garde HP and decide to order 500 more conservative Dells instead because they don't want to replace 500 monitors or whatever). They don't want someone to take their new PC home, find that they can't plug it into their VGA monitor or parallel port printer, put in a DVD (it amazes me that they were still spending the money to include a DVD-RW drive on low-end laptops in 2018 or 2019, so they must have thought consumers wanted it) or run some eight year old software or whatever else, and then return it.

(It's worth noting that this is something that has never scared Apple - certainly Apple has never had a problem saying to people "oh, you want to plug a thing that uses connector X into your new Mac? well, here is the adapter from the modern port to X for USD$29" - and in the late 1990s, they didn't even offer the adapters that they did in the early 2010s. No Apple-branded USB-to-serial or USB-to-ADB adapter, no SCSI solution other than a PCI card option for your G3/G4, no external floppy drive that was compatible with Apple's legacy 400/800K floppies, etc. At least when they dropped FireWire, DVI, 30-pin dock connector, headphone jacks, mini-DisplayPort, DVD burners, wired Ethernet, etc, they offered Apple-branded adapters...)

The only way you have ever gotten a transition in Windowsland is if Intel/AMD offers a chip that is BETTER at doing today's workloads AND adds some additional abilities that are completely underused in the short term. That was the genius of AMD64 (and previously, the 386, etc) - no one bought Athlon 64s, Pentium 4s/Ds, or the first generation of Core 2 Duos/Quads to run a 64-bit OS. People bought them, and Dell/Lenovo/HP bought/sold them, because those processors ran 32-bit Windows (or Linux, for that matter) better than the 32-bit-only chips that preceded them. Then, 5-7 years later, hardware vendors start to think "gee, it looks like there's an installed base of amd64 chips out there, maybe we should start writing drivers for the amd64 version of Windows." And, by about 2009 or 2010, ordinary consumers can migrate to 64-bit Windows with relatively little disruption (though I seem to remember my mom having to throw out a printer with no 64-bit drivers), so HP/Dell/Lenovo start preloading the amd64 version of Windows, at least on consumer systems. And a few years later businesses start making the move...

And when you look at ports - other than audio/Ethernet/wifi becoming built-in to most Windows machines, the only port that really got added across the board that I can think of was USB-A. And that's undoubtedly because USB is built into the Intel chipset and therefore, at least for the iteration that's built into the chipset, very cheap to support. But other things like IEEE 1394/FireWire (which I've had on various Windows systems) never, ever reached critical mass in Windowsland because it was just too expensive for Dell/Lenovo/HP to include it on systems where customers didn't insist on them. Even USB-C remains shaky in 2022 - there are a lot of low-end Windows systems, including business ones, that only offer USB-A ports. Relatively high-end businessy systems will give you one USB-C port, maybe two. The only Windows laptops that will give you more USB-C are a few high-end MacBookPro-wannabes like the Dell XPS.

Frankly, in Windowsland, cheap and compatible always, always, always carries the day. It has, at the very least, since the big mutiny against Vista, which I continue to believe was a perfectly fine operating system... on "good" hardware. But no one wanted to pay for good hardware anymore, so HP/Dell/Lenovo sold it on underpowered hardware with Intel/Microsoft's reluctant blessing and people were upset at how poorly it performed. And the lesson everybody learned from that debacle was to never push the envelope on anything - just keep the hardware requirements roughly steady as she goes, don't introduce new driver models or anything else that might cause compatibility issues, don't take out any ports or anything that anybody might rely on, etc.
 
Like I have said before, Apple has brought the iMac back to what it was originally made for: a consumer-level computer. If Apple releases a 27-inch iMac, I think the only upgrade will be screen size. The iMac has gone back to being a consumer device. The Mac Studio and Studio Display are the prosumer option, and the Mac Pro will continue to be the ultimate pro workstation (when it gets updated).
What’s your reasoning? One product (M1 iMac) that replaced an already consumer-level product?

We can’t really extrapolate much of anything over the last 3 years because of a pandemic, supply shortages, chip transition, etc.

What we do know for certain is that things have not gone to plan, as they tried to transition within 2 years and have not met that goal. So in my opinion, we can't say "Apple's plan is to make the iMac consumer-only because they have only released one Apple silicon iMac" - we just don't have enough info. The MBP, mini, Studio, and iMac have all only had one generation so far. We need to give it time before we assume what Apple's real vision is for any of these lines.

When Apple is 3-4 generations into apple silicon and there still is only a consumer-level iMac I will agree, but frankly there isn’t enough to go off of right now to really know what their plan is. Though calling it the iMac 24” does sort of imply that there are going to be other sizes and there have been rumors from historically credible sources that point to some sort of higher-end iMac.
 
I think the key difference between macOS and Windows in terms of macOS’s ability to handle transitions is that Apple has never been afraid to deprecate and phase out APIs or technologies that are at odds with their game plan (e.g. Carbon and 32-bit in Catalina). Being willing to drop 32-bit compatibility made the Apple Silicon transition the next year much easier, since they didn’t have to add 32-bit instruction support into their chips. Once you only have to worry about x86_64 compatibility your transition gets a lot easier.

This approach means extra work for devs, sure, but Apple also puts a ton of work into making those app/API updates as painless as possible and signaling their intentions far in advance (e.g. deprecating OpenGL in Mojave, even though it’s still supported 4 releases later).

Microsoft, on the other hand, has built their brand and reputation on long-term API support and backwards compatibility, so they don’t have quite the same flexibility. That means you can run most Win95 apps fine on Windows 11 while Macs haven’t been able to run OS 9 apps since the Intel transition, but it also means the system and software ecosystem is a lot more disjointed (e.g. modern color-accented Metro-style settings pages giving way to Windows 98 control panels when you click an “Advanced” button).
Two words: Windows Vista.

That was the last time Microsoft tried to push the envelope. Big under-the-hood improvements, new driver model, serious tightening up of security, new graphics engine that relied on the GPU, etc.

I used to be a Windows guy and an early adopter. (I suppose I'm still a Windows guy and an early adopter on that side, but I also have 3 current + 1 vintage Macs) Ran Win95 on launch day; Win98 a few months after launch (long story involving a free upgrade offer with a PC); XP a few months after launch. Vista was no worse than any of those operating systems in the first year - probably better, in fact (pre-SP1 XP was lousy, most people stuck with Windows 2000 or even 98). But Vista was late and XP was the current version of Windows for over five years, which means people had gotten comfortable with it, all hardware supported it, Moore's law meant it ran fast on 2006-era low-end hardware, etc, and they had forgotten how rough the first year of a new version of Windows always was.

And Windowsland basically mutinied at what was actually a fairly normal OS transition. Just said "no thanks, we don't want to have to replace a few peripherals with some that have Vista-compatible drivers. We don't want to throw out our two-to-four-year-old low-end systems with lousy Intel onboard graphics that, quite intentionally, did not support the features that Vista was known to require. We don't want annoying security prompts when we use software that was wrongly designed to always be run with admin privileges, and we sure don't want to update that software to newer versions with properly-implemented security. We want to keep buying the low-end systems that still barely support the graphics features required for Vista. Just leave us alone with our XP, stop trying to take our money with this fancy hardware we don't need, these updates to third-party software that worked fine, and that's it."

And, long-term-wise, the Vista mutiny killed Windows as a platform. Maybe they would have had a chance to recover when Windows 7 smoothed over the hurt feelings, except then they came out with Windows 8 and the touch interface that was utterly ill-suited to how people actually used Windows. But the bottom line is that software developers could not move beyond XP as a baseline for many, many years, probably until about 2015. (Imagine if Macland in 2014 was still trying to write software compatible with 9.2.2 and/or 10.1!) And so Windows software began to stagnate. Microsoft's new frameworks all flopped. People stopped writing new software for Microsoft's frameworks (either new or old), instead writing web-based things targeting Chrome. Everybody became terrified of increasing the hardware requirements on anything. Etc. Macs became cool. Windows users became more and more entrenched in their ways. Etc.

The disjointedness of the interface is a different issue and has more to do with Microsoft's schizophrenic attitude to UI design. If you look at, say, System Preferences in OS X, I think Ventura is the first release that has fundamentally changed the paradigm. Otherwise it's probably been the same since 10.0.

While Apple may be aggressive with changes under the hood, they are actually shockingly conservative when it comes to UI. Keyboard shortcuts, key UI paradigms, etc haven't changed since the 1984 Mac. Some things may have changed once or twice in the entirety of the OS X era (e.g. I think the Apple menu is largely the same as it was in 10.0). It actually can mean that people with no classic Mac experience but extensive Windows experience are a bit confused trying to transition to a Mac because, well, there are some silly things that macOS Ventura does simply because they have always been done this way - e.g. why do you have to hit Cmd-O to open a document in the Finder, while return puts you into a file renaming mode? I don't think that is the most intuitive way of doing it, but it's what Finder 1.0 did in 1984, so it's still doing it today, 38 years later. Meanwhile, I went from extensively using a Mac SE with 6.0.5, dabbling with a IIsi running 7.5.3, back to the Mac in 2015 (Yosemite, I guess? I forget) and... I could recognize a lot of the UI paradigms that hadn't significantly changed from then! And when I picked up an MDD G4 a few months ago, well, Tiger isn't that different from Yosemite - maybe a little closer to classic Mac OS in some ways (e.g. the lack of Launchpad, so the expectation is that you open software by going to the appropriate folder and double-clicking on it, just like you did in System 6 or pre-Launcher versions of 7), but not that different.

Meanwhile, Microsoft keeps wanting to change things. There was nothing wrong with the Control Panel introduced in Windows 2000. They started rejigging some of the things in Vista/7, but clearly ran out of time to redo everything, so some things stayed with the Windows 2000 look. Then in Windows 8/10, they wanted to switch to a mobile-style settings app, but again, trying to put all the settings into that was probably impossible, so they only did the ones that they thought touch users would care about and hid the rest behind "advanced" options. Then in Windows 11, they finally migrated all the settings... to a different settings app that behaves somewhat differently from the one in 10. Etc. Part of the problem, I think, is that Microsoft seems to still be trying to find a "better" interface (and failing, and not realizing that even if the old interface is worse, everybody had figured out how to use it), while Apple takes the view that the design that Steve Jobs personally approved and that everyone has been using for two decades is Good Enough.

One final thought - there is something that makes a Mac a Mac. You put a 68K 9" B&W Mac running System 6 next to an M1 iMac running Ventura and think "these two machines are siblings" when in reality, they don't have a single hardware component in common (or even loosely related), they don't have a single line of code in common, they can't run the same software, etc. But they are both Macs. They have the same UI paradigms, the same keyboard shortcuts, the same keyboard layout, etc. Meanwhile, what defines Windows, an IBM PC compatible, etc is hardware legacy, not design or UI. Put a Win3.1 machine next to a modern desktop PC and... the most likely commonality you will observe is that they have similar-sized cases with 5.25" bays (though those are increasingly vanishing from many Windows machines). And if you open "Command Prompt", yeah, that looks familiar too. But otherwise the true birth of modern Windows is Win95, and even then, there have been a LOT of (unnecessary) UI changes since then.
 
But ultimately, what matters for app developers is how hard supporting the new architecture is. That, firstly, depends on what their development tools are. One of the reasons that it took a while for the big players to recompile for Intel Macs is that many of them were using CodeWarrior, and CodeWarrior got abandoned rather than updated to produce Intel binaries. The same big players, having been forced much more closely into Xcode, Cocoa, and all the other Apple development tools since the Intel transition, were much, much quicker to move to Apple Silicon nearly a decade and a half later.

Then you have the completely lazy developers. I don't think it's a coincidence that the overwhelming majority of non-Apple Silicon-native software remaining today (at least on my MBP) appears to be written in Electron. I doubt it's hard to make a Universal 2 Electron app, but when your judgment is so impaired that you think Electron is a good idea, it doesn't surprise me that you don't see a problem with your app running in emulation. Even if making a Universal 2 version of their app would take them 20 minutes, those guys WILL NOT do it until macOS 16 or whatever version it is drops Rosetta 2, or at least throws a nasty, nasty dialog box before opening the app in Rosetta 2.

The big difference is not app developers as much as it is a chicken and egg problem in Windowsland. In Windowsland, neither Intel nor Microsoft sells the hardware. Dell, Lenovo, HP, etc. do - and they are the ones who decide what chips they'll buy from Intel and what software they will license from Microsoft. Their interest isn't in some big theoretical transition with theoretical benefits - their interest is in competing with each other, largely on price, to sell something that meets the immediate expectations of customers, i.e. something that runs the current/previous version of Windows and is compatible with the software customers care about.
For example:
- the response to Vista, Windows 8, Windows 10 (in the early days) and Windows 11 (today) by Lenovo, Dell and HP was, at least in the business world, to buy a licence for the current version of Windows, then use the downgrade rights to pre-install the older version that business customers prefer.
- Dell/Lenovo/HP/etc sabotaged the Windows 8 vision of migrating Windows to a touch platform - they took the view that their customers were unwilling to pay extra for a touch screen, and to the extent that Microsoft's newest OS ran poorly on a non-touch-screen machine, not their problem, they were going to keep shipping tens of millions of touch-screen-less systems. And a decade after Windows 8, touch on Windows remains largely a joke.
- Intel wanted to migrate the PC world from legacy BIOS to UEFI... and it took until, oh, 2015 or 2016 for most systems to ship with full UEFI boot enabled, at least on the business side - people wanted Windows 7 downgrade systems, Windows 7 is simpler to boot in BIOS mode (I am pretty sure you CAN boot it UEFI if you try hard enough), so let's enable the UEFI CSM backwards-compatibility mode, partition the boot drive as MBR, and off they go. Oh, and if you want to migrate from BIOS/MBR to UEFI/GPT, at least around the time Windows 10 came out, you had to do a full repartition of the drive followed, obviously, by reinstalling Windows from scratch. I think Microsoft later added a tool to convert drives to GPT... but originally, you couldn't do an in-place upgrade from 7 installed in BIOS/MBR mode to 10 without leaving 10 in BIOS/MBR mode after the upgrade.
- VGA ports. I think Intel wanted to get rid of VGA ports for years, but again, every PC OEM kept sticking VGA ports on things, offering the lowest tier of docking stations as VGA-only, etc. (And every cheap PC monitor manufacturer largely only put VGA ports on their cheap models) Finally, Intel took out the ability to output VGA from their on-processor GPUs... and I am pretty sure that HP, at least, found a way to continue offering a VGA output port from those machines! (Presumably by effectively soldering a DP or HDMI to VGA adapter on the motherboard or in the docking station?)
- SSDs. SSD adoption in Windowsland was massively, massively slowed down by the fact that Lenovo/HP/Dell were terrified of putting a system with 128/256GB of (SSD) storage on the shelf next to a system that had 1TB of (slow HDD) storage. They figured, probably rightly, that the salespeople at Best Buy couldn't explain the benefits of SSDs to consumers, so the consumer would pick the one with more GBs and that was that.

The Dells, Lenovos, and HPs of the world are, more than anything, terrified of returns, especially on the consumer side (on the business side, I presume they are more worried that an IT manager will look at an overly avant-garde HP and decide to order 500 more conservative Dells instead because they don't want to replace 500 monitors or whatever). They don't want someone to take their new PC home, find that they can't plug it into their VGA monitor or parallel port printer, put in a DVD (it amazes me that they were still spending the money to include a DVD-RW drive on low-end laptops in 2018 or 2019, so they must have thought consumers wanted it) or run some eight year old software or whatever else, and then return it.

(It's worth noting that this is something that has never scared Apple - certainly Apple has never had a problem saying to people "oh, you want to plug a thing that uses connector X into your new Mac? well, here is the adapter from the modern port to X for USD$29" - and in the late 1990s, they didn't even offer the adapters that they did in the early 2010s. No Apple-branded USB-to-serial or USB-to-ADB adapter, no SCSI solution other than a PCI card option for your G3/G4, no external floppy drive that was compatible with Apple's legacy 400/800K floppies, etc. At least when they dropped FireWire, DVI, 30-pin dock connector, headphone jacks, mini-DisplayPort, DVD burners, wired Ethernet, etc, they offered Apple-branded adapters...)

The only way you have ever gotten a transition in Windowsland is if Intel/AMD offers a chip that is BETTER at doing today's workloads AND adds some additional abilities that are completely underused in the short term. That was the genius of AMD64 (and previously, the 386, etc) - no one bought Athlon 64s, Pentium 4s/Ds, or the first generation of Core 2 Duos/Quads to run a 64-bit OS. People bought them, and Dell/Lenovo/HP bought/sold them, because those processors ran 32-bit Windows (or Linux, for that matter) better than the 32-bit-only chips that preceded them. Then, 5-7 years later, hardware vendors start to think "gee, it looks like there's an installed base of amd64 chips out there, maybe we should start writing drivers for the amd64 version of Windows." And, by about 2009 or 2010, ordinary consumers can migrate to 64-bit Windows with relatively little disruption (though I seem to remember my mom having to throw out a printer with no 64-bit drivers), so HP/Dell/Lenovo start preloading the amd64 version of Windows, at least on consumer systems. And a few years later businesses start making the move...

And when you look at ports - other than audio/Ethernet/wifi becoming built-in to most Windows machines, the only port that really got added across the board that I can think of was USB-A. And that's undoubtedly because USB is built into the Intel chipset and therefore, at least for the iteration that's built into the chipset, very cheap to support. But other things like IEEE 1394/FireWire (which I've had on various Windows systems) never, ever reached critical mass in Windowsland because it was just too expensive for Dell/Lenovo/HP to include it on systems where customers didn't insist on them. Even USB-C remains shaky in 2022 - there are a lot of low-end Windows systems, including business ones, that only offer USB-A ports. Relatively high-end businessy systems will give you one USB-C port, maybe two. The only Windows laptops that will give you more USB-C are a few high-end MacBookPro-wannabes like the Dell XPS.

Frankly, in Windowsland, cheap and compatible always, always, always carries the day. It has, at the very least, since the big mutiny against Vista, which I continue to believe was a perfectly fine operating system... on "good" hardware. But no one wanted to pay for good hardware anymore, so HP/Dell/Lenovo sold it on underpowered hardware with Intel/Microsoft's reluctant blessing and people were upset at how poorly it performed. And the lesson everybody learned from that debacle was to never push the envelope on anything - just keep the hardware requirements roughly steady as she goes, don't introduce new driver models or anything else that might cause compatibility issues, don't take out any ports or anything that anybody might rely on, etc.
Guys, when you are talking about app developers you forget about most of them. Most developers do not develop commercial software. Most software is developed for internal use and in support of different hardware. Nowadays, there is also a lot of web-related software. In many cases, once such software is developed, the developers get moved to new projects. Redeveloping such software requires a lot of time (years) and resources. Microsoft understands it and acts accordingly. Apple has never been a player in the enterprise and unless they change their ways, they never will be.
 
Guys, when you are talking about app developers you forget about most of them. Most developers do not develop commercial software. Most software is developed for internal use and in support of different hardware. Nowadays, there is also a lot of web-related software. In many cases, once such software is developed, the developers get moved to new projects. Redeveloping such software requires a lot of time (years) and resources. Microsoft understands it and acts accordingly. Apple has never been a player in the enterprise and unless they change their ways, they never will be.
Agreed, but...

It feels like most of that software in the past decade and a bit has been developed for Chrome. Before then, IE6 (there's a reason I think Microsoft is still shipping an IE compatibility mode in Edge...), and before then, non-web-based Microsoft platforms like Visual Basic.

(For the record, I do not understand why such things are written for Chrome. Chrome changes every month and the vendor reserves the right to flip secret settings remotely mid-month. You'd think developers in the type of environment you've just described would want to target a more... stable... platform, wouldn't you?)

If Chrome is the main platform for most of these things, then... that barrier to entry to Apple playing in those segments of the enterprise is basically gone. Google supports Chrome on macOS just as much as they do on Windows...

It's the same thing as what I'll call the Citrix effect - if you are running your legacy internal-use-only-etc can't-afford-to-modernize-it developer-left-town-a-decade-ago custom Windows software on a Citrix server running the oldest possible version of Windows you can get away with, which I think is an increasingly common way of doing things, then... you don't need Windows on the end-user's computer. Citrix Workspace is just as well-supported on Mac as on Windows, though their Apple silicon rollout was somewhat of a mess (and why do they advertise an "Apple silicon" version that is actually universal, and an Intel version that's Intel? The people who write their web site do not understand any of the jargon around Apple silicon...).

That being said, there are plenty of other reasons why Macs wouldn't be particularly welcome in the enterprise. But interestingly, I think iOS largely muscled its way into the enterprise... (then again, not that there was much competition - BlackBerry, which understood the enterprise the way few people other than IBM or maybe Microsoft do, basically imploded when they botched their attempt at an Apple-style OS transition; I don't think Android is more enterprise-friendly than iOS, probably worse and harder to lock down if anything. And everything else was a colossal flop...)
 
Does Apple make more money if I buy a Mac mini and a third-party screen than by selling me a 27" iMac? I am totally dumbfounded by the logic behind discontinuing the 27" iMac unless it is more profitable for them to sell a Mac mini than the iMac.

No. Configure the base model Mac Studio with the base configuration of Studio Display. Compare that to the cost of a 27" iMac with the same amount of RAM (32GB) and the same SSD size (512GB). Funny story: THE COMBINATION OF THE MAC STUDIO AND STUDIO DISPLAY IS AT THE SAME PRICE POINT! Up the RAM and storage from there. SAME PRICE POINT AGAIN! The only area where it's a downgrade is that you can't add unarguably less expensive third-party RAM after the fact. But if you're comparing stock Apple RAM, it's the same price. Remove the Studio Display and it becomes WAY LESS EXPENSIVE than the 27-inch iMac was prior to its final discontinuation.

The only "inconvenience" is that the monitor and the computer are not all the same thing. And even then, that's a difference of two power cables and enough space on one's desk. And given the way larger and way more vocal crowd of people that have been dying for a desktop Mac of the caliber of the 27-inch iMac that is neither Mac mini nor Mac Pro, isn't confined to an all-in-one design, and has the flexibility to be used with any display, whether Apple's or otherwise, that seems like a small price to pay.

Maybe they'll be releasing a 27" iMac in 2023.

They're not done transitioning over to the ARM processor, so I think it's too soon to say that the 27" iMac is dead.

They seemed pretty clear about the Mac Studio + Studio Display combo being the replacement to the 27-inch iMac. With similar RAM and storage configurations, that combo is priced pretty much the same as an iMac (Retina 5K, 27-inch, 2020) was. What do people need from an all-in-one that you don't get from this combo? And no, user-replaceable RAM ain't coming back (but that has absolutely zero to do with the fact that it isn't an iMac and everything to do with the fact that this is how Apple Silicon works at a fundamental level).

Mac Studio plus Studio Display is the replacement for the iMac Pro; it is way overkill for a bottom-of-the-range iMac with a 27" screen and over double the price.

You are correct in that a base model iMac (Retina 5K, 27-inch, 2020) entailed a 6-core 10th Gen Core i5, a paltry 256GB SSD, and 8GB of RAM and that a base model Mac Studio (2022) entails a 10-core M1 Max, a 512GB SSD, and 32GB of RAM and that comparing the low-end of one doesn't at all equate to the low-end of the other.

However, where you are incorrect is that a base model Mac Studio (with those same entry-level specs) with a base model Studio Display would be comparably priced to what a 2020 iMac configuration with an 8-core 10th Gen Core i7, 32GB of Apple-supplied RAM, a 512GB SSD, 10 Gigabit Ethernet, and one of the middle-of-the-road GPU options was priced at. That's kind of the point.

Yeah, you're missing something on the lower-end of the 27-inch iMac range for those that would've bought a base model 10th Gen Core i5 with 8GB of RAM and the 256GB SSD. But really, what you're looking for is an M1 Mac mini or iMac at that point.

I think the problem is we all got used to the 27" iMac including a nearly free 5K display. It was such a good deal for so long that Apple couldn't help but disappoint us when it ended. I think they've tried to stop it from affecting new 27" iMac sales by not offering one for a few years, so we'll all give up and move up to Studio+Studio Display or down to 24" iMac. Sucks for photographers like me, though. We bought the late 2014 and subsequently the 2020 27" iMac because it is hard to beat the 5K display for photo editing! I just can't expect the 2020 to continue getting updates as long as the 2014 did.

Honestly, no. You got used to upgradeable RAM and to the computer and display being one and the same box. Otherwise the combo of the Studio Display and a base model Mac Studio is priced nearly identically to a 27-inch iMac with a 512GB SSD and 32GB of RAM. Like, they literally just made it so that you had the choice to pick a display other than Apple's and to not be screwed when/if anything happened to the display while the computer was otherwise perfectly functional.

I think Apple wants you to buy a Mac Mini and a Studio Display.

Those two together have a 2.3k base price. Didn't the 27" iMac have a 2k base price? Considering inflation that sounds pretty comparable to me in terms of price.

Sure, you lose the sleek all-in-one, but you gain some additional modularity for future upgrades.

I think this is the point people are missing here. People have been wanting the kind of Mac that the Mac Studio is for decades now. There's just WAY more versatility with something like that compared to a 27-inch iMac. I'm as much of a fan of the Intel 27-inch iMac as anyone on here, but even I can't deny that it was the right move for Apple to make when transitioning that part of the Mac line over to Apple Silicon from Intel.

27 inch had a 1.5k base price, so that's pretty much surpassing inflation. Sugar coat it if you want, but it's a loss for consumers. Also, the 27 inch was user upgradable, so if I wanted 16 or 32 gigs of RAM, I came out even further ahead.

The 2020 models started at $1800. Not $1500. And for your $1800, you got a 10th Gen 6-core Core i5, a video card with only 4GB of VRAM, 8GB of RAM, and 256GB of 100% un-upgradable T2-Security-Chip-driven SSD storage. Literally the only people out there who would be making use of such a machine are people that would've been fine with the grunt of a 21.5" iMac, don't install things, but wanted a larger screen OR people that want to play ONE game.

Point being, there are better (and cheaper) options for consumers for whom the i7 and i9, the 8-16GB VRAM GPUs, and the 512GB-and-larger SSDs (that you can't upgrade after the fact) were overkill. And if you really want a 27" 5K display attached to them, you now have choice.

Also, on a 27-inch iMac, you could only ever upgrade the RAM. Nothing else in that machine was user-upgradeable. Also RAM won't ever be user-upgradeable again with Apple Silicon. That's inherent and won't change even if they brought the form factor back from the dead.

It is very bizarre, seeing as they still sell the i5/i7 mini.
Why not keep the 27" iMac available until it's changed to AS?

IT WAS CHANGED TO APPLE SILICON! IT'S CALLED THE MAC STUDIO!

I'm not sure why this evades the grasp of most of the people in this thread.

Apple's version of the same question: Do we make enough profit continuing to sell a whole Mac and a whole 27" screen at a "starting at" price below $2K? Apparently the answer to that was NO. Thus the death (probably temporary) of a very popular iMac 27".

Then Apple was able to test to see if the faithful would pay just as much as that "starting at" price for the monitor alone. "We" did, even evangelizing it to all others like it is the one & only monitor for anyone with a Mini, Studio, (Clamshell) MBpro, etc.

Now that Apple has established the 27" monitor at the old "starting at" price for the same monitor PLUS an entire Mac + Keyboard + Mouse, when they then re-launch what will probably be called iMac Pro at 27"-30", it will probably be re-priced like the former iMac Pro. My guess: "starting at... $3499"... probably requiring another approx. $1000 to get "nicely configured."

How could we possibly rationalize paying that much for an iMac when they used to cost (starting at) BELOW $2K? Well, Apple's ideal/perfect/one & only monitor is pretty well established at about $2K when you choose a stand option. So add the tech guts of the 14" MBpro to put a Mac back in there, add a keyboard + mouse to the box and boom: $3499 (or so).

Don't like that price? Buy yourself a Mac mini + third party monitor. Or maybe Mac mini + Studio for about $500 LESS? Or base studio + third party? Or used base studio + used studio monitor.

I'm confident Mac Mini has the margin in it that Apple wants for that sale. Who knows if that is less than or more than the margin that was in the former "bargain" of a whole Mac + 27" monitor + keyboard + mouse in iMac 27"? Depending on that answer, they MAY make just as much (PROFIT) selling you a Mac Mini instead of an iMac 27", even if you buy a third party monitor.

However, after they roll out iMac (Bigger) PRO for (my guess) starting at $3499 this year or next, they will get their new profit per unit sold target for that Mac and likely make much more than selling you only a Mac Mini. If you can't justify paying that much for the new iMac "bigger," they still get their profit target out of Mini or Studio, even if you go third party monitor, keyboard and mouse.

iMac 27" was perhaps the best overall Mac value offering. Modern Apple Inc. wants maximum possible profit out of every little thing they sell. The new iMac "bigger" will reflect that when it re-launches, probably at about TWO THOUSAND more than the price we fondly remember. In my best-guess opinion, that's WHY the iMac 27" was "retired" - not for all of the other reasoning slung around, including the "separates" argument from Apple themselves. I suspect that for the word "separate" to apply, it's more about separating much more high-margin revenue from buyers when they purchase the iMac "bigger" PRO coming soon.

The Studio display starts at $1600. The iMac (5K Retina, 27-inch, 2020) started at $1800 for a configuration no one in their right mind would've ever purchased without upgrading the hell out of the storage and RAM. Yes, the Studio Display is overpriced. So was the hardware inside that iMac. Go back in time, price out a 2020 iMac with 32GB of RAM, 512GB SSD and something better than the stock i5, and your price tag will be within the same bathroom stall as the base model Studio with the base model Studio display.

The 27 inch iMac was discontinued because they want that segment of people to go with the Mac Studio and studio display. The iMac became more than what it was intended for and Apple cleaned up the line.

THIS DUDE GETS IT!

Instead you're stuck with an old Mac Studio, which isn't fast enough for high-performance computing (HPC) anymore and doesn't even have a still-good 5K display to offer. Good luck selling that to anyone! The appeal of older high technology is not the part which aged the quickest. You don't buy a 10-year-old Porsche motor for your Toyota Corolla chassis, but you might drive a 10-year-old Porsche and enjoy the whole ride. An older iMac is still desirable precisely because of that 5K display.

You really don't know how the used Mac market works, do you? The integrated 5K display doesn't do crap for the resale value, especially once the rest of the iMac dies or becomes useless. Also, iMacs devalue just as other Macs do. The Mac Studio will be no different. This nonsense of "you'll be stuck with an old Mac Studio if your display fails" makes no sense whatsoever.


Why do you think they won't release a new 27” iMac?

You don't think it highly coincidental that Apple discontinues the 27-inch iMac on the same day that they announce the Mac Studio with a display and offer the combo at the same price point as a similarly configured 2020 27-inch iMac?

Check again!

Nope. Intel is still alive and in no danger of going out of business. They're eschewing businesses that were never their bread and butter to begin with while they try to repair their manufacturing business. Not the same as dying.

That's a corporate computer, not a personal one. Those are often horrible trash, because the buyer (manager) is disconnected from the user (worker). Lenovo is just the abandoned business-machine division from IBM after losing the user market to Apple.
You clearly do not work in IT and don't know what you're talking about when it comes to business class PCs. IT buys machines that are free from hassle because if there's hassle on the user's part, then there's hassle on IT's part. The "buyer" in your statement is a department full of technology people that know painfully well what sucks about the kinds of PCs you buy at Best Buy and Costco. They are chosen for stability first and foremost. Are the aesthetics lacking? Maybe on some. But who seriously cares if the machine is rock solid stable, reliable and durable?


I find that modern Lenovos are often hit or miss, but otherwise that particular market segment of PCs usually yields quality hardware that's easy to deploy and manage.
 
You really don't know how the used Mac market works, do you? The integrated 5K display doesn't do crap for the resale value, especially once the rest of the iMac dies or becomes useless. Also, iMacs devalue just as other Macs do.
I regard all Intel Macs as useless, worth $0 to me. But I’d be willing to put up with that, if I liked the display and form factor. The resale value of iMacs 2014-2020 is entirely the display.
You don't think it highly coincidental that Apple discontinues the 27-inch iMac on the same day that they announce the Mac Studio with a display and offer the combo at the same price point as a similarly configured 2020 27-inch iMac?
The Studio Display is a stopgap until the new large iMac. It’s not a forever replacement.
Nope. Intel is still alive and in no danger of going out of business. They're eschewing businesses that were never their bread and butter to begin with while they try to repair their manufacturing business. Not the same as dying.
Intel used to be half of the industry-dominating WINTEL monopoly. You couldn't build a computer without paying them. Now it isn't even present on smartphones and tablets, the most prevalent kinds of computers. Even without the competition of AMD and AS in the ever-shrinking PC sub-market, it is a dying company. And ignoring that means extinction is guaranteed.
IT buys machines that are free from hassle.
You mean, those who "just work™".
The "buyer" in your statement is a department full of technology people that know painfully well what sucks about the kinds of PCs you buy at Best Buy and Costco.
Windows, that’s what sucks on PCs.
They are chosen for stability first and foremost. Are the aesthetics lacking? Maybe on some.
Aesthetics and stability are the same. Things look good, if they work good. Aircraft engineers say: "A beautiful aircraft flies best." If the wings look too small to carry the plane, they probably are!
But who seriously cares if the machine is rock solid stable, reliable and durable?
People who buy Apple Silicon Macs.
I find that modern Lenovos are often hit or miss, but otherwise …
I find that Apple has yet to release a bad AS Mac. They all run amazing!
 
The PC ecosystem even failed to move away from PS/2 ports.
They did, a long time ago. I haven't seen a PS/2 port on a machine for quite some time. They use USB just like everyone else, but tend to have more ports than an Apple machine.
 
Vista was no worse than any of those operating systems in the first year - probably better, in fact
There was a nasty bug in Vista at launch that messed up quite a few people's PCs (breaking the partition table on the disk), but otherwise it was good. Windows 8 was the ugly one to me...
 
I haven't seen a PS/2 port on a machine for quite some time.
MSI MPG X570 Gaming Pro
 
You don't think it highly coincidental that Apple discontinues the 27-inch iMac on the same day that they announce the Mac Studio with a display and offer the combo at the same price point as a similarly configured 2020 27-inch iMac?
It wasn't a coincidence, it was planned, but that still doesn't mean there won't be a larger iMac in the future.
 
There are some badly programmed apps that don't work well with scaling, but if properly developed using the Windows APIs, it's really good these days.
Some older admin-level Windows UI screens are horrible with scaling. I don't like seeing fuzzy buttons and UI.
 
Well, it's not surprising that an Apple fan would not know about the advantages of PS/2 ports for gaming.
I don't care about the reasons. The fact remains that the PC couldn't even leave one old non-universal legacy port behind. Everything needs to be backward compatible forever. Look at this mainboard and compare it with the logicboard in the chin of the 24" iMac. That's why we can't have nice things (without paying Apple tax).
 
I don't care about the reasons. The fact remains that the PC couldn't even leave one old non-universal legacy port behind. Everything needs to be backward compatible forever. Look at this mainboard and compare it with the logicboard in the chin of the 24" iMac. That's why we can't have nice things (without paying Apple tax).
This isn’t about legacy support. This is about improved performance. Mainstream PCs haven’t included PS/2 support for years. It’s boutique gaming motherboards that include it, since lag is 0 ms.

Your argument is barking up the wrong tree. You “don’t care” about the reasons because it contradicts your argument.
 
They seemed pretty clear about the Mac Studio + Studio Display combo being the replacement to the 27-inch iMac.
Especially when they stated the Apple Silicon transition was nearly complete, leaving just the Mac Pro. So it sounds like the Mac Pro will be the next and last thing. Could a 27” iMac (or larger) release in a few years? It’s possible. Only Apple knows. But their current plan is no more 27” iMac.
 
They did, a long time ago. I haven't seen a PS/2 port on a machine for quite some time. They use USB just like everyone else, but tend to have more ports than an Apple machine.
My gaming motherboard that houses a 10th gen i7 and a 3080 Ti still has a PS/2 port. I was shocked because my older gaming system with a 5th gen i7 and a GTX 1080 doesn’t have it.
 
I don't care about the reasons. The fact remains that the PC couldn't even leave one old non-universal legacy port behind. Everything needs to be backward compatible forever. Look at this mainboard and compare it with the logicboard in the chin of the 24" iMac. That's why we can't have nice things (without paying Apple tax).
It's not a legacy port. USB is a multi-purpose interface, not a keyboard/mouse interface. There is no replacement for the PS/2 interface with similar or better characteristics. PC vendors removed the PS/2 interface from all budget and business computers long ago, but for specialized hardware (something the Apple ecosystem has no notion of) they use the most appropriate port available.
 
Well, it's not surprising that an Apple fan would not know about the advantages of PS/2 ports for gaming (no gaming on Macs). These ports are still used on advanced gaming motherboards because they offer lower latency. For more details, read this.
I game and I don’t use it. I know twitch streamers that don’t use it. I know a friend that plays competitively and doesn’t use PS/2. What use case is better than usb?
 
I game and I don’t use it. I know twitch streamers that don’t use it. I know a friend that plays competitively and doesn’t use PS/2. What use case is better than usb?
As mentioned, PS/2 has zero ms lag. However, if you spend enough on USB you can get like 1 ms lag. Most gamers will do the latter, but a few diehards stick to the former, since PS/2 is still technically measurably faster. These motherboards cater specifically to the former. Whether that is truly necessary or not* is a different question.
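The back-of-the-envelope arithmetic behind those numbers: a polled USB device can add up to one polling interval of delay before the host even sees the input, while PS/2 is interrupt-driven, so there's no equivalent polling penalty. A quick Swift sketch (the rates are just common USB HID polling rates, nothing vendor-specific):

// Worst-case extra delay from USB polling is roughly one polling interval.
let pollingRatesHz: [Double] = [125, 500, 1000]
for rate in pollingRatesHz {
    let worstCaseMs = 1000.0 / rate
    print("\(Int(rate)) Hz polling -> up to \(worstCaseMs) ms of added latency")
}

That works out to 8 ms, 2 ms, and 1 ms respectively - which is why 1000 Hz gaming peripherals get you to "about 1 ms", and why the PS/2 diehards can still claim a measurable, if practically irrelevant, edge.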

The point is, it's not about legacy support. It's about catering to a very small niche group of gamer geeks that is willing to pay the premium for that extra 1 ms of reduced latency.

*Probably not, and the high end gaming mice are all USB anyway.
 