Please read the entirety of my post, including the text I'd originally quoted.

The core problem is macOS incorrectly using YPbPr rather than RGB, with no reasonable way around it (unless you're willing to disable SIP). This has nothing to do with a lack of HDMI 2.1. It's been an extremely long-standing issue with macOS (>10 yrs): https://www.mathewinkson.com/2013/0...x-the-picture-quality-of-an-external-monitor/

Using DisplayPort fixes the problem and has macOS use the correct colour mode.

As an added benefit, DisplayPort also unlocks higher refresh rates, but that's beside the point: HDMI shouldn't present a garbled mess regardless of refresh rate. I'd be happy to live with 4K@60Hz so long as colours were fine; that way the DP input might be freed up for my primary machine. (I prefer macOS but require Windows for work...)
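
For anyone curious what the linked fix actually does: it patches the monitor's EDID so macOS no longer sees YCbCr as a supported encoding. Here's a rough Python sketch of that idea, assuming you've already dumped the 128-byte base EDID block (e.g. from the IODisplayEDID property in an ioreg dump on an Intel Mac); installing the resulting override plist is a separate step that varies by macOS version and may require SIP to be relaxed.

```python
# Minimal sketch of the EDID patch behind the linked fix: advertise the display
# as RGB-4:4:4-only so macOS stops picking YCbCr. Assumes you already have the
# 128-byte base EDID block in a file; how you dump it and where the override
# plist goes depends on your macOS version.

def force_rgb(edid: bytes) -> bytes:
    if len(edid) < 128:
        raise ValueError("expected at least the 128-byte base EDID block")
    block = bytearray(edid[:128])
    # Byte 24 (offset 0x18) is the "feature support" byte; for digital displays,
    # bits 4:3 advertise the supported color encodings (00 = RGB 4:4:4 only).
    block[0x18] &= ~0b0001_1000
    # Byte 127 is the checksum: all 128 bytes must sum to 0 mod 256.
    block[0x7F] = (-sum(block[:0x7F])) % 256
    return bytes(block)

if __name__ == "__main__":
    import sys
    raw = open(sys.argv[1], "rb").read()           # e.g. a dumped edid.bin
    open(sys.argv[2], "wb").write(force_rgb(raw))  # patched copy for the display override
```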

I did read the entirety of your post; honestly, the way you wrote it could be interpreted in the way I understood it.

Anyway, regarding YCbCr over HDMI: HDMI has always been geared towards content consumption, and if it used RGB you wouldn't be able to get 4K 60Hz wide color (required for HDR) out of it (HDMI 2.1 ports have enough bandwidth to allow this, but Apple is not using them yet). YCbCr is what game consoles use, for example.
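
To put rough numbers on that bandwidth point (my own back-of-the-envelope figures, assuming the standard CTA 4K60 timing of 4400×2250 total pixels and ~14.4 Gbit/s of usable video bandwidth on HDMI 2.0 after 8b/10b encoding):

```python
# Back-of-the-envelope check of why HDMI 2.0 can't carry 4K60 10-bit RGB but can
# carry it with chroma subsampling. Assumes CTA-861 4K60 timing (4400 x 2250
# total pixels incl. blanking) and ~14.4 Gbit/s usable on HDMI 2.0
# (18 Gbit/s raw, 8b/10b encoded).

PIXEL_CLOCK_HZ = 4400 * 2250 * 60          # ~594 MHz for 3840x2160@60
HDMI20_USABLE_GBPS = 14.4

def gbps(bits_per_pixel: float) -> float:
    return PIXEL_CLOCK_HZ * bits_per_pixel / 1e9

modes = {
    "RGB 4:4:4, 8-bit":    3 * 8,     # 24 bpp
    "RGB 4:4:4, 10-bit":   3 * 10,    # 30 bpp -- needed for wide color / HDR
    "YCbCr 4:2:2, 10-bit": 2 * 12,    # 4:2:2 is carried at a fixed 24 bpp on HDMI
    "YCbCr 4:2:0, 10-bit": 1.5 * 10,  # 15 bpp
}

for name, bpp in modes.items():
    rate = gbps(bpp)
    verdict = "fits" if rate <= HDMI20_USABLE_GBPS else "does NOT fit"
    print(f"{name:21s} {rate:5.2f} Gbit/s -> {verdict} in HDMI 2.0")
```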

Now, Apple could’ve given us the option to use RGB, but I assume they didn’t want to confuse people (“Just works” is often at odds with power users) because they would switch to RGB and not understand why they can’t do Wide Color/HDR when attaching the Mac to their TV. You can argue you don’t need HDR, but a lot of people want to hook their Mac to their HDR TVs.

Honestly, if you need a professional external display and care about differences between RGB and YCbCr you shouldn’t be using HDMI anyway. It’s a port aimed at content consumption and many monitors have HDMI ports so that they can be used with various entertainment devices like game consoles or Blu-ray players.

TLDR: this is probably not a bug, but a design decision made by Apple. Use DisplayPort for external displays and HDMI for TVs and Projectors.
 
Just go out to the first page of this very forum's index and there are loads of threads about monitor issues, and this isn't even the accessories forum.

Maybe solved the flickering issues on the external screens
M1 Air ghosting, flickering with external display
M2 MacBook Air + LG G2 TV HDR Question
HDCP Doesn't Work With External Monitors on MBP M1 Max
M1 MacBook Air: External display not recognized after waking from sleep
Mac mini: 4k @ 120hz?
Mac Mini M1 - Iiyama 5K screen

I'm sure it's possible to find 3rd party screens that work well on Mac, but it's definitely also possible to end up having a total nightmare.

I don’t doubt a lot of people have legitimate issues, but I don’t like to draw conclusions from anecdotal evidence. Personally, I never had any issues with external monitor support (and I used a bunch of 3rd party screens with my Macs over the years) - so I found this surprising.
 
but I don’t like to draw conclusions from anecdotal evidence.
People basically said the same thing about the 2011 MBP GPU issue and the butterfly keyboards failing. I recall having debates with other members who defended Apple to the bitter end, stating that the butterfly keyboards were not designed poorly or defective, but rather it was the user who was typing wrong (seriously). We know how well that ended.

As for the displays, many people smarter than I am pointed out many things, stating that it's most likely a decision on Apple's part for a variety of reasons. All of us who have sub-4K monitors (I have an ultra-wide) get an inferior experience on the Mac because the Mac fails to handle sub-pixel anti-aliasing, or something to that effect - like I said, a lot of smart people offered some fantastic information. Here's a bit of info on it as well: This M1 Mac display hack is a must for 1440p monitors
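
For a rough sense of the scaling problem being described (my own numbers, not from the linked article): macOS renders crisply at roughly 110 PPI (1x) or 220 PPI (2x "Retina"), and it dropped subpixel antialiasing in Mojave, so a quick pixel-density calculation shows which panels fall awkwardly in between:

```python
# Rough pixel-density math behind the "1440p looks bad on macOS" complaint.
# Panels near ~110 PPI work at 1x, panels near ~220 PPI get a clean 2x mode;
# anything in between gets tiny UI at 1x or a scaled (blurrier) mode.
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_in

panels = [
    ("27in 2560x1440 (QHD)",            2560, 1440, 27.0),
    ("34in 3440x1440 ultrawide",        3440, 1440, 34.0),
    ("27in 3840x2160 (4K)",             3840, 2160, 27.0),
    ("27in 5120x2880 (Studio Display)", 5120, 2880, 27.0),
]

for name, w, h, diag in panels:
    print(f"{name:34s} {ppi(w, h, diag):6.1f} PPI  (2x mode would look like {w // 2}x{h // 2})")
```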
 
TLDR: this is probably not a bug, but a design decision made by Apple. Use DisplayPort for external displays and HDMI for TVs and Projectors.
I am of this opinion also.....👍
I have always used DisplayPort (previously used DVI).

My original Mac Pro GPU only had DVI/DP.
The only reason my current GPU (RX 580) has HDMI is because it's a PC card.
 
They don't beat a current i9 desktop.
Yes they do. I was able to get my work done much faster than my i9 iMac could with even the base M1 Mac mini.

It might not for YOU, but it did for ME. And there is a lot of video evidence out there if you want to just search for it beating out i9s.
 
It may be beyond you to consider, but many of us disagree with that sentiment. BTW, the latest Intel and AMD CPUs are significantly faster than the M1. Don't get me wrong, the M1 is a fantastic product, but the M1's success doesn't mean x86 suddenly stopped working.
M1 was released two years ago at this point. I have an almost two-year-old Mac mini. Latest > a couple-year-old processor. Let's see what the M2 Pro, Max, and Ultra bring to the table to compete.
 
I'm not defending Intel, but I definitely can't understand people who think they are dead! You're just not matching reality there. x64 is all I see on the desktop and laptops these days.
I have 7 computers I use to do my work. Most of the Macs are replaced with M1 variants because they are just better than the Intel versions and any Intel Macs I have ever used. The 2010 Mac Pro is still in the workflow, but that is on its last legs (the computer is fine, but it's just time to let it go) and about to be replaced.

However, I have a couple of Windows systems in my workflow too. One of those is a $3,000 custom-built desktop with an i9 and a 3080 Ti. I do not want Intel to go away. I have had very very VERY bad luck with AMD. Their hardware might be decent enough now but they still have HORRIBLE problems with drivers and firmware and the software side. I will always choose Intel and NVIDIA for my PC. Adobe Creative Cloud did not like my AMD 5700 XT in Windows 10, and it was the WORST time when you could not even get an NVIDIA GTX 1060 for less than $400. I waited two years in order to get my NVIDIA card and am happy with my 3080 Ti.

There are parts of my workflow where the Windows system is better than even the M1 Ultra. But there are others (like I said) where the M1 Mac mini is faster than even my i9 Windows setup (let alone the 2019 i9 iMac it replaced). This is why I have so many systems in my workflow.
 
Yes they do. I was able to get my work done much faster than my i9 iMac could with even the base M1 Mac mini.

It might not for YOU, but it did for ME. And there is a lot of video evidence out there if you want to just search for it beating out i9s.
You're thinking of Mac i9s; i9s got a *lot* faster than what the Mac had.
 
People basically said the same thing about the 2011 MBP GPU issue and the butterfly keyboards failing. I recall having debates with other members who defended Apple to the bitter end, stating that the butterfly keyboards were not designed poorly or defective, but rather it was the user who was typing wrong (seriously). We know how well that ended.
I was pretty much done with Apple with their butterfly issues. I do not like AIOs either, not at all. The Mac mini was too low-end for me even maxed out, and the Mac Pro was too expensive for what you got for $6,000. At least with laptops I could use them in clamshell mode. I did not enjoy my 2019 i9 iMac, and the laptops from 2016 up until 2021 were HORRIBLE. Now that M1* is here, I have replaced 4 out of my 5 Macs.
 
There are parts of my workflow where the Windows system is better than even the M1 Ultra. But there are others (like I said) where the M1 Mac mini is faster than even my i9 Windows setup (let alone the 2019 i9 iMac it replaced). This is why I have so many systems in my workflow.
What gen processor is your i9 Windows machine? I'm betting 9th or 10th; that makes a big difference in what they can do. The current is 13th gen, about 200% faster than the 10th gen. Anyway, it really doesn't matter except to say that the M-series chips aren't king of the hill, and one shouldn't expect them to be. They're good machines, no doubt about it, and if the software you need to run runs on them, they're great! I don't own a Mac Studio Max to just sit there doing nothing, but I also have an Intel Mac, and more than one Windows machine as well.
 
What gen processor is your i9 Windows machine? I'm betting 9th or 10th; that makes a big difference in what they can do. The current is 13th gen, about 200% faster than the 10th gen. Anyway, it really doesn't matter except to say that the M-series chips aren't king of the hill, and one shouldn't expect them to be. They're good machines, no doubt about it, and if the software you need to run runs on them, they're great! I don't own a Mac Studio Max to just sit there doing nothing, but I also have an Intel Mac, and more than one Windows machine as well.
Nope, 11th gen. Nothing is king of the hill. This is why I have 7 computers and yes a couple Windows systems for specific parts of my workflow.
 
Just license ARM FRAND and design something new. Microsoft has already telegraphed that they are dead serious about moving away from x86-64 with serious work on Windows 11 ARM64. The Surface Pro uses the Microsoft/Qualcomm SQ-1 SoC, which is ARMv8.
Hah.

You don't understand how transitions happen in PC compatible-land vs in Apple land.

In Apple land, Apple launches new thing, provides an emulator that makes new thing compatible with old thing for a while, then people adjust their software to the new thing. Then compatibility with old thing potentially goes away. That's how they did the PowerPC transition, the Mac OS X transition, the Intel transition, the Apple silicon transition.

In Windows/PC land, people buy the new chips to run the old software, which it runs better than the old chips while the new abilities are unused. 98% of 386s and a good chunk of 486s went to the e-waste pile having never run a 32-bit protected mode operating system, only DOS/Win3.11. The first ~5-7 years of amd64 chips ran 32-bit XP, 32-bit Vista, and maybe 32-bit Windows 7 and never ran a 64-bit OS before going into the e-waste pile. Etc.

Every time Windows-land has tried an Apple-style transition (Itanium, ARM Surface, etc), it has been a colossal flop. The one transition that was a success, from DOS/Win3.11 to 32-bit NT, took about eight years, and Microsoft had to engineer an entire OS family (Win9x) the purpose of which was to provide enough compatibility with both DOS/Win3.11 world and NT land that people would be able to transition slowly. And until XP came out and forced the low-end-hardware vendors and game developers to support NT, if you were in NT-land, you had to be careful buying software and peripherals because a lot of vendors did not support the NT-family OSes.

And there's a simple reason for this. In order to do a transition, you need four things:
1. The new hardware.
2. An operating system compatible with the new hardware.
3. Drivers, both for things inside the system and for things that people will plug into it, and
4. Application software.

If you look at Apple's transitions, Apple can deliver #1-3 on launch day. If you are on March 14, 1994, picking up your shiny new Power Mac, you have the hardware, you have an operating system that runs on it, you have drivers for whatever video/audio/SCSI/networking/etc componentry is in the thing, Apple has provided enough compatibility with third-party peripherals you care about (e.g. the printers that work on your Quadra will work fine on your Power Mac). The only thing you are missing is #4, and Apple has provided you with an emulator that is 'good enough'. Same thing if you are picking up your Intel MBP on January 10, 2006. And because Apple can start building an installed base on day 1, the Adobes and Microsofts and other third-party developers look at this and say "okay, this new platform is happening, time to start recompiling our code guys".

If you look at PC land, ever since IBM stopped setting the standard in the 286 days, different people are responsible for each. New processor architecture is released on a given day; that's nice, there aren't going to be any operating systems. Microsoft may recompile NT for your architecture on whatever timeline they feel like. Linux folks may compile their world for your architecture, again on whatever timeline they feel like (and just because the kernel boots on your thing doesn't mean that the established distributors like Ubuntu/Debian/Fedora/etc will have a distribution for your architecture for a few years.) NetBSD folks will probably port their OS to your architecture; other BSDs may or may not, and even if they do, you're probably not going to be a "Tier 1" (in FBSD speak) architecture for a few years at least.

Then you need drivers. No one is going to be coding drivers for printers and scanners and network controllers and GPUs and whatever else for a platform that doesn't exist. So, on the day that Microsoft ships NT for your architecture, congratulations, you can't plug any hardware into it. Then, of course, you have the software problem, but your OS vendor can at least provide an emulator for that. Maybe. And all the third party software developers are watching this clusterf*** of a transition that's going nowhere and they think they probably should wait to see how things shake out before they assign developers to porting their software.

If you want to see this in action, look at how many developers support i) 64-bit Windows, ii) ARM Windows, and iii) Apple Silicon. You can barely get 64-bit Adobe Acrobat for Windows/arm64; the same thing has been available in 64-bit Intel for macOS for years and in 64-bit Apple Silicon for a while. Is there ANY third party software for ARM Windows, other than maybe some open sourcey things like VLC that get compiled for everything? And actually, VLC doesn't have a release version for ARM Windows, just nightly builds.

Meanwhile, with a kludge like AMD64, AMD releases the processors, they keep running 32-bit XP with existing drivers, software, etc. Which they do better than the other x86-compatible processors on the market. A while later, MS releases amd64 "XP", which no one uses because there is almost no driver support for it. But at least it exists and MS indicates that amd64 is going to be a supported architecture going forward. Vista/7 deliver equivalent feature sets and experiences on both x86 and amd64; by the launch of Windows 7, there are enough drivers for amd64 that consumer machines start to be preloaded with amd64. Meanwhile, businesses are still cautious - my dad, for example, got a new work laptop in 2011 with a sandy bridge Intel that was running 32-bit Windows 7. That's eight years after the Athlon 64 shipped. By... 2013-2015, amd64 is established enough that businesses are now abandoning 32-bit. And it is only in 2021, with the launch of Windows 11, that Microsoft stops shipping 32-bit Windows. So... eighteen years.

If anybody thinks that Windows-land and the established base of Windows software can move away from x86/amd64/etc, they are dreaming. More likely to move to Chrome OS on arm than Windows on arm. Windows on ARM will never get off the ground because... you have no installed base to motivate people to write drivers, no installed base to motivate people to port their application software, and you have nothing to motivate people to buy the devices so you can't build an installed base. Meanwhile HP/Dell/Lenovo keep shipping millions of amd64 devices every month. Microsoft couldn't even get touchscreens added to the Windows platform when they went big on touch with Windows 8 - Lenovo/HP/Dell basically said "no, we don't think people want to pay for touchscreens", 80% or whatever of Windows machines they shipped had no touchscreens, and... here we are, a decade after the launch of Windows 8, and touch on Windows remains a joke.

And Microsoft may be "dead serious" about Windows on arm. Just like they were "dead serious" about a smartphone platform when they realized that smartphones were going to muscle in on the PC. You can be as "dead serious" about something as you want - if you can't ship a functional product that enough people are willing to pay for that a third-party ecosystem builds around your platform, you are doomed.
And Windows on ARM has even less of an ecosystem than Windows Phone...especially since Windows Phone had at least somewhat of that Microsoft mystique from the 90s (people like my dad, when he got his first iPhone, expected that Microsoft would come out with a phone and everyone would switch, because, well... that had been how things had gone the previous couple of decades - Apple invents something, Microsoft copies/"improves"/mainstreams/out-markets/etc it, everybody switches to the Microsoft version. The idea that Microsoft would enter a market pioneered by Apple and... completely flop... and end up exiting in shame was... not really conceivable before Windows 8/Phone.)
 
Or do we have to defend the Motorola 68k while we're at it? Great processor, man! We need more people putting this thing in their machines. Maybe we should resurrect it and tinker with it for 30 years like Intel has done with Pentium and then later Core? Isn't it time to move on?
And I think it's an interesting counter-factual. Look at what happened to 68K - after the 68040, everybody pretty much became convinced that CISC architectures were a dead end, and 68K buyers fragmented. Apple went PowerPC hoping to get some economies of scale with IBM, NeXT went... Intel?, Sun built SPARC, Atari/Amiga failed, etc.

A decade later, everybody who had been 68K was basically either i) dead, or ii) adopting x86/amd64 (i.e. the architecture they all rejected when dumping 68k a decade earlier).

If Motorola had "tinkered" with the 68K Intel-style for 30 years, maybe we would have two vibrant worlds today. But instead, they blew up their non-embedded CPU business and their customers basically all flopped.

And... didn't John Sculley say that one of his biggest mistakes was that Apple hadn't gone x86 instead of PowerPC? PowerPC was cool for 5 years, an albatross for another 5, and then abandoned faster than anyone imagined.

Meanwhile, in the counter-factual world where 68K hadn't been abandoned, we'd probably all be using Copland-derived MacOS on the 68440 or whatever the current iteration of the 68K would be. And maybe we could actually open up MacWrite 1.0 on our 68440s...
 
That may be, but that's not how many businesses work. The administrative staff at my university is all PC-based, and they all use (and need) 2-3 cheap large externals for their work (one displays one large spreadsheet, a second displays another large spreadsheet, and a third displays the payroll program into which they are entering data). I wouldn't be surprised if you see this in private industry as well.
And those monitors will always, always be "normal"-resolution, not something high-resolution like Apple's retina displays.

With all the legacy software in Windowsland, the recent popularity of VDI solutions like Citrix (which may be running on servers with older Windows OSes), etc, reasonable business IT people simply do not want the potential headache of monitors that require scaling. Not to mention the potential additional complexity of having different monitors requiring different scaling settings.

And that is why there are no retina-grade monitor options on the market other than a few "4K" monitors that don't really fit (a 4K 28" monitor at 1920x1080 doubled is going to have some seriously huge text).

One big thing, too, the last few years has been the ultra-wide 3440x1440 monitors. Note how they're a "dated" resolution but one that doesn't require any scaling... which I am sure is a big part of their popularity.
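
To put numbers on both of those points (my own arithmetic, not from the post): the effective UI size is the "looks like" resolution spread over the panel's diagonal, and roughly 110 "points per inch" is the classic desktop size:

```python
# Effective UI density ("points per inch") for a few setups: the "looks like"
# resolution over the panel's diagonal. ~110 points/inch is the classic desktop
# size; much lower means comically large UI, much higher means tiny UI.
import math

def points_per_inch(looks_like_w: int, looks_like_h: int, diagonal_in: float) -> float:
    return math.hypot(looks_like_w, looks_like_h) / diagonal_in

setups = [
    ("28in 4K at 2x (looks like 1920x1080)",   1920, 1080, 28.0),
    ("28in 4K scaled (looks like 2560x1440)",  2560, 1440, 28.0),  # fractional scaling
    ("27in 2560x1440 at 1x",                   2560, 1440, 27.0),
    ("34in 3440x1440 at 1x",                   3440, 1440, 34.0),
]

for name, w, h, diag in setups:
    print(f"{name:42s} {points_per_inch(w, h, diag):6.1f} points/inch")
```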
 
Yes they do. I was able to get my work done much faster than my i9 iMac could with even the base M1 Mac mini.

It might not for YOU, but it did for ME. And there is a lot of video evidence out there if you want to just search for it beating out i9s.
i9 in your iMac is 9th or 10th generation chip. Intel is already on their 13th generation.
 
That's a corporate computer, not a personal one. Those are often horrible trash, because the buyer (manager) is disconnected from the user (worker). Lenovo is just the abandoned business machine division from IBM after losing the user market to Apple.
Ummm, check your history.

IBM abandoned the "user market" in the late-1990s. By about 1998, the IBM Aptivas that were left (the "E" series - I had one) were made by Acer, and a few years later IBM killed the Aptiva line and exited the home market.

They didn't abandon it to Apple. Apple was doing ridiculously poorly at the time. In 2001-2006 or so, if you walked around a university, you would have seen very few Apple computers - the Intel MacBook was really the product that put Apple in that environment.

What happened was simpler than that - you had massive commoditization in the Wintel market in the second half of the 1990s, plus the degree of integration/customization required became less as Intel started offering on-chipset graphics, sound, etc.

People like Gateway, Micron Electronics, and more importantly Dell were kicking IBM/Compaq's a** by doing a build-to-order model using Intel motherboards (before that era, IBM, Compaq, etc designed their own motherboards) and off-the-shelf parts like RAM and hard drives purchased from whoever had the cheapest price that week. Meanwhile, IBM/Compaq/etc had designed their own motherboards and had paid for their components two months earlier, when the system was built and shipped to a warehouse - which was a big problem: if you bought a system in a store from IBM or Compaq in September, they would have paid for the hard drive in July at July's prices; if you bought a Dell, Dell bought the hard drive after you placed your order... and paid September's prices, which in an era of rapid innovation might have been 20% less than July's. So they were simply no longer competitive price-wise.

If you walked into a consumery computer store in 1995, you would have seen Windows machines from IBM, Compaq, Packard Hell, AST, maybe one or two other clone makers I am forgetting... and those guys all exited the market a few years later in the face of the attack from the built-to-order guys selling systems built out of commoditized parts.

Interestingly, the built-to-order guys collapsed a few years later - when component prices stopped plummeting in the way that they had in the late 1990s, building pre-built systems in Asia and shipping them on a boat now made more sense than build-to-order in a U.S. plant (and the built-to-order model required US manufacturing so that you could ship ground relatively quickly). I once ordered a Dell in 2000 on a Thursday; it was built in Texas, shipped over the weekend, and arrived in Ottawa, Canada by noon the next Tuesday.

IBM continued making PCs for business (desktops and laptops) until about 2005, when they sold that business to Lenovo. And to this day, Lenovo is one of the big three business PC vendors, along with HP and Dell; there's a reason that corporate IT types continue happily buying them.
 
i9 in your iMac is 9th or 10th generation chip. Intel is already on their 13th generation.
Those generations are mostly marketing though. There was a BIG improvement in performance with the 12th generation (Alder Lake), but otherwise Intel had been mostly stagnating from Skylake (6th generation) until Rocket Lake (11th generation). At one point they started trying to make up for that by adding more cores - the mainstream desktop chips went from 4 cores to 6-8...
 
Hah.

You don't understand how transitions happen in PC compatible-land vs in Apple land. [...]
For Apple, the transitions are easy because they support a very limited set of devices and there is a very limited app ecosystem. The ecosystem is just big enough to serve the needs of college students (mainly English majors) and housewives/husbands. It's not useful for much else.
 
Those generations are mostly marketing though. There was a BIG improvement in performance with the 12th generation (Alder Lake), but otherwise Intel had been mostly stagnating from Skylake (6th generation) until Rocket Lake (11th generation). At one point they started trying to make up for that by adding more cores - the mainstream desktop chips went from 4 cores to 6-8...
There was a 70% single-core performance jump for the i9 from the 9th to the 13th generation. Multicore performance jumped 3x. Generational increases are not always a given. Just look at M1 vs M2. Not much there. A couple more GPU units and that's it?
 
There was a 70% single-core performance jump for the i9 from the 9th to the 13th generation. Multicore performance jumped 3x. Generational increases are not always a given. Just look at M1 vs M2. Not much there. A couple more GPU units and that's it?
But again, "generation" in Intel-speak is branding.

12th generation Alder Lake is the first big architectural generation shift since Skylake. Then, with the 13th generation Raptor Lake, they upped the clock speeds on the i9s. 500MHz jump in the max turbo frequency... if you have the cooling for it.

Note that the only Raptor Lakes currently shipping are the K series, I think, or at least these are the only ones in Wikipedia. The K series have wild TDP and have just been getting wilder the last few generations.

What will be interesting to see is the non-K Raptor Lakes.

Looking at https://www.cpubenchmark.net/singleThread.html now and looking at the i9s - so the move from Alder Lake to Raptor Lake and that big increase in GHz gets you about... 12%... higher single-thread performance. But going from the 11th gen to the 12th gen (the one that I describe as the big shift) got you about 20% more. Alder/Raptor Lakes and contemporaneous Ryzens are basically dominating on this graph. The i7-7700 non-K in my Windows desktop doesn't look that dated until you see the Alder Lakes...

Oh, how I miss the days of the Intel tick-tock model - new core design with performance boost, followed by new manufacturing process that lets you ramp up clock rate within the same TDP, then new core design, etc. And I think one big reason M2 is meh compared to M1 is that there isn't a TSMC process shift between the two, is there?
 
For Apple, the transitions are easy because they support a very limited set of devices and there is a very limited app ecosystem. The ecosystem is just big enough to serve the needs of college students (mainly English majors) and housewives/husbands. It's not useful for much else.
That may be a little extreme, but yes, Apple used to be the go-to machines for education, film/video, and graphic design. Now, not so much, and yet when I look out at my classroom, I see a sea of Apple logos facing me with faces peeking above them. When Apple held those niches, they held a solid 5-6 percent of the market share. Clearly they found that lifestyle branding is far more profitable than excelling at specific tasks. And yet, here I am in my Apple ecosystem.
 
That may be a little extreme, but yes, Apple used to be the go-to machines for education, film/video, and graphic design. Now, not so much, and yet when I look out at my classroom, I see a sea of Apple logos facing me with faces peeking above them. When Apple held those niches, they held a solid 5-6 percent of the market share. Clearly they found that lifestyle branding is far more profitable than excelling at specific tasks. And yet, here I am in my Apple ecosystem.
Education, at least up until some point in the 1990s, overwhelmingly meant computer labs (and, sure, teachers/faculty/etc). Hence education-centric products up to the "molar Mac" G3 that were clearly intended for use in a computer lab, complete with the DUAL headphone jacks so TWO kids could listen to multimedia software. Hence things like the LC Apple II card that were designed to get LCs into labs full of Apple IIs. Etc.

Then I think there started to be a bit of a revolution among parents, etc - a sense that it was silly for kids to learn Macs when the real world (and increasingly the home computer) ran on Windows. No doubt that was a key contributor to the darkness of the dark era. Doesn't help that some of the things they removed, like floppy disks, would have made some computer lab workflows a lot more challenging (you can't have each kid having his/her own floppy/Zip disk/etc in an iMac G3 world... and USB flash drives were a few years away).

As for your classroom and the sea of Apple logos, my sense is that the product that drove that shift was the white Intel MacBook in 2006. Prior to that, post-secondary classrooms would have been full of Dells and Toshibas and whatnot. Maybe a few big-budget die-hard Mac fans with PowerBooks and... I don't remember iBooks being very popular in the post-secondary crowd. Within a few years... major, major-league shift to MacBooks. Not sure what drove it - the coolness of the iPod (which really went mainstream in, oh, 2004 or so?) may have been a part of it, maybe pricing, maybe portability (a lot of the consumer Windows machines tended to be 15"+ laptops that were on the bulky side, particularly in that weird ~2002-2004 period where there was a mobile "Pentium 4" that guzzled power like a desktop), maybe disappointment with low-quality consumer Windows machines, etc. Also, I wonder if perhaps iBooks had re-entered the high-school scene in the early 2000s such that when those users went on to post-secondary education, they stuck to Mac.
But especially the fashion-conscious women who, in 2001 or 2002, would have bought the standard-issue Toshiba running Windows Me from the university computer store... by 2007-2008, had completely adopted MacBooks.
 
IBM continued making PCs for business (desktops and laptops) until about 2005, when they sold that business to Lenovo. And to this day, Lenovo is one of the big three business PC vendors, along with HP and Dell; there's a reason that corporate IT types continue happily buying them.
Thanks for wrapping up the entire boring PC history. The facts remain that Apple leads the PC industry in profit share, holds a 90% share of the ARM-based PC segment, is the only brand that still grows its share every quarter, and Big Blue has left the PC market entirely. Being an innovator instead of a monopolist paid off in the long run. And again, nobody cares if penny-pinching managers are happy with their cheap Chinese junk. Users prefer Macs and they pay up for the best computers.
 
Education, at least up until some point in the 1990s, overwhelmingly meant computer labs (and, sure, teachers/faculty/etc). [...]
I mean that is part of the issue, right? If I go into one of the few universities that still have a computer store attached to the bookstore, you have the plastic Dells and bottom-of-the-line Lenovo machines, and next to them a full-metal MacBook Air. If I'm the kids' parent, the shiny metal one seems more sturdy and just looks nicer. So yeah, I'll probably get them an Air or a 13-inch Pro. The richer kids get the 16-inch Pros because only the best, even if it's never used. It's not that you can't get a nice-looking PC, it's just that the recommended models tend to be pretty low-rent. I miss the days of Apple really excelling at the OS, and amazing apps, but those days are long gone. I think OS X is marginally better from a user standpoint than Windows, but it's not night and day like it used to be. But damn if my students all don't have to have their $1,500 phones the week they come out.
 