You got the computer you needed at the time.

macOS 26 should work on your Intel Mac, right?

Who’s to say M1 Macs don’t go obsolete next year, too?!

Don’t be sad about your self-proclaimed mistake. Be glad you’ve had a great computer for the last 6+ years and will be getting an M6 or better next time.

Unfortunately it doesn’t make the macOS 26 cutoff …

I’ll use this for as long as possible. It still works well for my needs. But the regret returns each time the fans kick in when I use certain photo editing programs.
 
You were absolutely right to wait. Today, the Windows ARM environment is finally mature enough for serious, productive use. But let’s be honest: that’s not thanks to Microsoft. The truth is, Apple is the one who jump-started the ARM desktop ecosystem.

Windows on ARM has existed since 2012, starting with Windows RT and later with devices like the Surface Pro X. But for eight years, Microsoft left that platform in a half-baked state:
  • no proper emulator,
  • incomplete development tools,
  • limited support in Visual Studio,
  • and almost no native ARM software from third-party developers.
Users were stuck with slow x86 emulation, and developers had little to no incentive to adapt. 😣

Then in 2020, Apple launched the M1 chips — and took a completely different approach:
  • blazing performance, even in Gen 1,
  • dev tools ready on day one,
  • Rosetta 2, an emulation layer that actually worked,
  • and above all, real pressure on the software industry to recompile for ARM.
Within months, major developers who had ignored Windows ARM for years were suddenly pushing native ARM builds (but for macOS). The key is that many of those apps and frameworks (Electron, Chromium, Unity, Qt, etc.) are cross-platform. So now, those same native builds also work on Windows ARM, by extension. But only because Apple forced the market to move.

So yes, today Windows ARM is actually viable. But let’s be clear: we got here because of Apple’s momentum, not Microsoft’s vision. If we had waited for Microsoft alone to lead the ARM transition, we might still be waiting in 2030. 😜
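As an aside, you can see the translation layer at work from a Mac's shell. This is a quick sketch using the documented `sysctl.proc_translated` flag (1 for a process translated by Rosetta 2, 0 for native, absent on Intel Macs); the message strings are my own:

```shell
# Report whether the current process is native arm64,
# x86_64 translated by Rosetta 2, or plain x86_64.
arch="$(uname -m)"
translated="$(sysctl -n sysctl.proc_translated 2>/dev/null || echo 0)"
if [ "$arch" = "arm64" ]; then
    echo "Running natively on Apple Silicon"
elif [ "$translated" = "1" ]; then
    echo "x86_64 process translated by Rosetta 2"
else
    echo "Native x86_64 (no translation)"
fi
```

Run the same snippet inside a Rosetta-launched shell (`arch -x86_64 zsh`) and the middle branch fires, which is how cross-platform apps detect they should prompt the user to fetch the native build.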
I didn’t know that Windows on Arm was so old. Microsoft certainly seems to have picked up the pace recently with regard to OS and dev tool ports to Arm. It will be very interesting to see which of their other major software lines get the Arm treatment in the coming years. If Apple really is the reason, then I’ll take that.
 
The M-move hype has long fizzled - the performance boost of later M chips is minuscule at best, integrated memory is far from sufficient, and graphics performance continues to be worse than in top GPUs.
Effective Windows virtualization or BootCamp? Forget it.
Not everyone is a “creative artist”, and not everyone feels this “huge” advantage in using M-chip Macs - I have both and my 2019 Mac Pro continues to be the main machine by far.
I think the issue is that game code written to accelerate video never included versions that fully took advantage of the tightly coupled memory management of the M-series SoCs.

As for RAM size, the M-series now offers SoCs with 64 and 128 GB of RAM, which is more than enough for even fairly complex programs.
 
I’m not a power Mac user but I recently switched from a 2017 mini to an M4 and it’s night and day.
For my uses I can’t see that I will need an upgrade for a decade.

Oh, I'm definitely a Power Mac user. Got a few G5 towers, a few iMac G4s, some Platinum G3s, a blue & white G3, every generation of G4 tower...

Never had a 2017 mini. In fact, I've never seen a 2017 mini. Almost like it doesn't exist or something.
 
I knew Apple was going to make the transition when the A12 SoC was already showing performance within a small fraction of the Intel and AMD CPUs of 2018 vintage. The M1 benefited from being closely coupled with macOS, so when it rolled out in the fall of 2020 its speed surprised a lot of people.
I think you’re confusing performance per watt with performance. The A12 wasn’t a patch on Intel chips for performance, it was just really efficient. Even current gen Apple chips struggle against a decent PC in pure performance if efficiency isn’t a consideration.
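The distinction is worth spelling out: performance per watt divides a benchmark score by the power drawn to achieve it, so a chip can lose on raw score while winning handily on efficiency. A toy calculation (the scores and wattages below are invented for illustration, not real A12 or Intel figures):

```python
# Hypothetical, illustrative figures only -- not real benchmark data.
chips = {
    "efficient mobile chip": {"score": 1100, "watts": 5},
    "desktop chip":          {"score": 1300, "watts": 45},
}

for name, c in chips.items():
    ppw = c["score"] / c["watts"]  # performance per watt
    print(f"{name}: score {c['score']}, {ppw:.1f} points/W")
```

On these made-up numbers the desktop chip is about 18% faster outright, yet the mobile chip delivers roughly 7.6× the performance per watt, which is the kind of gap the two posts above are arguing past each other about.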
 
I think the issue is that game code written to accelerate video never included versions that fully took advantage of the tightly coupled memory management of the M-series SoCs.
There’s also the issue of power draw. For every desktop Mac Apple sells six laptops. An RTX 4090 on a laptop would be fun for the entire thirty seconds of battery life. It doesn’t make sense for Apple to design GPUs to blow the roof off when their core audience wants laptops that don’t singe thighs.
 
There’s also the issue of power draw. For every desktop Mac Apple sells six laptops. An RTX 4090 on a laptop would be fun for the entire thirty seconds of battery life. It doesn’t make sense for Apple to design GPUs to blow the roof off when their core audience wants laptops that don’t singe thighs.
That excuse doesn't hold for Mac desktops.
 
The M-move hype has long fizzled - the performance boost of later M chips is minuscule at best, integrated memory is far from sufficient, and graphics performance continues to be worse than in top GPUs.
Effective Windows virtualization or BootCamp? Forget it.
Not everyone is a “creative artist”, and not everyone feels this “huge” advantage in using M-chip Macs - I have both and my 2019 Mac Pro continues to be the main machine by far.
That’s simply not true lol

“miniscule at best” 🤦‍♂️

M1 -> M4 is at least a 70% gain in single thread performance (sometimes waaaay more than that thanks to SME), and multi-thread performance has more than doubled.

The GPUs have grown even more than that - Apple is arguably in second place on graphics now; they are well ahead of AMD and Intel for applications like Blender and running LLMs locally (Metal is well supported, unlike the half-baked APIs AMD and Intel are pushing). I don’t think we’ll see them aim for an outright performance win over NVIDIA because they have no interest in chasing the 600W reticle-size GPU end of the market 😅

The progress made in just 4 years faaaaaar outpaces the previous status quo.
 
I was going to post the same thing. When I ran my Intel Mac Pro my wife yelled at me about the electric bill. Now that I have a Mac Studio that just sips power we are happy campers. Conversely, my nephew has a giant gaming PC with a top-end Ryzen and Nvidia card, and his dad can’t figure out why the electric bill is so huge.
If only Apple Silicon were powerful enough to emulate said top-end Ryzen and Nvidia card at full speed, then maybe his dad wouldn’t need to worry about the huge power bill.

Or devs could take a look at the Mac and develop games for it, or port their old games. As of now, Windows still has the largest game library to date, and Linux is nowhere near the same level.
 
ARM is nice, but Apple threw away their advantage of legally being able to run Mac and Windows on one computer when they ditched x86.

They could also have switched to AMD, which had better performance per watt than the Intel CPUs they were using, without a huge transition.
Last I checked you could still run Windows via Parallels… and Apple Silicon is leagues ahead of Intel or AMD in terms of performance per watt.
 
Apple isn’t interested in market share, they are interested in profit share. If they are taking the top of the PC market, that’s all they care about.

Sure, Apple has always prioritised high margins over market share. However, this doesn’t mean they’re entirely disinterested in gaining more of it. There are only so many ways to maximise margins, and at some point, gaining more market share is also necessary. Look at what they’re doing in China: they’re desperately trying to increase iPhone market share.

But at this moment, gaining more market share for the Mac is much easier than for mobile devices, precisely because their share is so small (16% for macOS is almost insignificant; it’s absolutely marginal). Any other company would have dropped the Mac completely and just focused on mobile (the iPhone is around 50% of the market).
 
Thanks, but that's more of a visual bandaid and doesn't address this mess:

View attachment 2522871

This topic has been beaten to death, but it's a kludge and I'm looking forward to the day they can engineer their way out of it.
Look, it's a kludge, but you end up with extra workable space that's great for menus.
Putting a BLACK background makes it look like a regular no notch screen with menus.

It looks and works fine for me... :)
Your mileage and expectations may vary, naturally...
 
That excuse doesn't hold for Mac desktops.
I thought I addressed that. These chips are designed primarily for the laptop. They’re mobile first and that was the right call. Apple decided not to build out the highest-power variation on their chips — I think it was being called Extreme — because, among other things, it was laptop-unfriendly.

Conversely, Intel and Nvidia started on the desktop and found laptop life less comfortable. That was the driver behind the Apple Silicon transition: Intel could no longer make the chips Apple needed at the thermal footprint Apple wanted.
 
Totally wrong and they have to focus on those.

I simply disagree.

Take a look at the AI competition, where they are way behind everyone else. That's because they can't even make server- or supercomputer-grade chips of their own. Since Apple is refusing to use Nvidia GPUs, it only means they suck. That's why Apple needs to create Mac Pro chips so they can use them for their own server and AI development. According to people who blamed the CFO, they are still using five-year-old GPUs, and the budget to buy more GPUs was even reduced.

Apple can buy whatever computing power they need to develop AI applications rather than spend time developing new chips that likely will not be huge leaps over what dedicated GPU developers are doing. They also have the money to buy the needed expertise they may lack.

Apple likely sees the value to Apple in AI is integrating it into their existing products and as a service, not in developing the iron to run it. I think that is a wise course since the hardware changes rapidly but the underlying programming doesn't change as rapidly.

It also gets back to the fact that Apple isn't a computer company anymore; Macs are in some ways a legacy of what it was way back when. Services and integrated solutions for consumers and businesses are a big part of their future and where the money will go, IMHO.

Also, there is still plenty of demand for the high-end and workstation computers that Apple ditched.

The worldwide server market size estimates I've seen vary from $64 to over $120 billion, with the high-end segment around $7 billion. Apple has much better opportunities than chasing niche markets.

Apple already makes laptops that are high enough performance to provide enough mobile computing power for many users. Make a better GPU and more powerful energy efficient chips for the existing market and expand it; it makes sense to concentrate efforts on them rather than chase a tiny market.
 
Apple can buy whatever computing power they need to develop AI applications rather than spend time developing new chips that likely will not be huge leaps over what dedicated GPU developers are doing. They also have the money to buy the needed expertise they may lack.

Apple likely sees the value to Apple in AI is integrating it into their existing products and as a service, not in developing the iron to run it. I think that is a wise course since the hardware changes rapidly but the underlying programming doesn't change as rapidly.

It also gets back to the fact that Apple isn't a computer company anymore; Macs are in some ways a legacy of what it was way back when. Services and integrated solutions for consumers and businesses are a big part of their future and where the money will go, IMHO.
Wrong, eventually Apple needs their own chips for their own servers, supercomputers, and AI, just like they did with Intel to Apple Silicon, because it's the new future. Now, Nvidia is dominating the market with 90% share. Even Apple admitted that they can't develop AI because they are using old GPUs, based on articles from a few months ago.

The worldwide server market size estimates I've seen vary from $64 to over $120 billion, with the high-end segment around $7 billion. Apple has much better opportunities than chasing niche markets.

Apple already makes laptops that are high enough performance to provide enough mobile computing power for many users. Make a better GPU and more powerful energy efficient chips for the existing market and expand it; it makes sense to concentrate efforts on them rather than chase a tiny market.
Laptops can NOT replace desktops and workstations. Same story for PCs: the desktop and workstation market is very niche and yet still important because it's not replaceable. You are telling me that laptops can replace supercomputers. Since Apple is diving into the 3D and AI markets, the need for high-end desktops and workstations is higher than ever before. So tell that to all the Mac Pro users and people who need high performance.
 
Wrong, eventually Apple needs their own chips for their own servers, supercomputers, and AI

You seem fixated on the idea Apple needs to be in the server and supercomputer market when the reality is it's a tiny market and Apple's resources are better spent elsewhere.

It's not just chips, but service, support, and getting developers to develop for a new machine instead of sticking with the industry standards.

just like they did with Intel to Apple Silicon, because it's the new future.

They left Intel to be able to control the roadmap and not be at the mercy of Intel's delays and problems, so they could control the future of their devices.


Now, Nvidia is dominating the market with 90% share. Even Apple admitted that they can't develop AI because they are using old GPUs, based on articles from a few months ago.

And as I pointed out Apple has the spare change to buy whatever computing power they need.

Laptops can NOT replace desktops and workstations.

Depends on the use, and Apple's high end laptops and desktops are powerful enough for a lot of heavy lifting.

Same story for PCs: the desktop and workstation market is very niche and yet still important because it's not replaceable.

And Apple stayed in the desktop market with 3 different offerings, all based on existing chips.

You are telling me that laptops can replace super computers.

No, I simply said it's a market not suited for Apple's strategy.

Since Apple is diving into the 3D and AI markets, the need for high-end desktops and workstations is higher than ever before. So tell that to all the Mac Pro users and people who need high performance.

If they need more than a top-of-the-line Mac Pro for their work, they are in a small and very specialized market, and a MacBook Pro is not the machine for them; it's a market that doesn't fit with Apple's strategy. The notion that a company needs to be in every market is wrong, and it's what drives some companies out of business.
 
Reading this on my 2019 iMac and shedding a tear…why didn't I wait a bit longer!!
Could be worse. I ordered a 2020 Intel MacBook Pro when they were announced, months before the M1 announcement.

Now that it’s not going to receive any new major macOS updates, I’m going to replace it with the 2026 Air.
 
You seem fixated on the idea Apple needs to be in the server and supercomputer market when the reality is it's a tiny market and Apple's resources are better spent elsewhere.

It's not just chips, but service, support, and getting developers to develop for a new machine instead of sticking with the industry standards.
Then tell me WHY Apple made their own server with Apple Silicon chips while they are struggling with AI development thanks to five-year-old GPUs, since the CFO didn't provide enough budget? Ironic.

They left Intel to be able to control the roadmap and not be at the mercy of Intel's delays and problems, so they could control the future of their devices.
Same thing for AI.

And as I pointed out Apple has the spare change to buy whatever computing power they need.
Buy what? Google or Microsoft? You are ignoring the fact that Nvidia dominates the AI market with 90% share, and yet you are telling me there are alternatives. This is why Apple is falling behind: they don't have any chips for AI, or they have to use slow chips.

Depends on the use, and Apple's high end laptops and desktops are powerful enough for a lot of heavy lifting.
And they will NEVER be enough for high-end and workstation uses. If you really think the Max and Ultra chips are powerful, take a look at the RTX 5090. You are only justifying the deteriorating performance. "Powerful enough" is just a biased phrase used to justify overall limited performance.

And Apple stayed in the desktop market with 3 different offerings, all based on existing chips.
Which only means they can't go beyond that. The Mac Pro was planned to use 4x Max chips via MCM, but it never happened because it was extremely expensive and yet not powerful. This is why the Ultra chips have huge problems: MCM does not provide twice the performance.

No, I simply said it's a market not suited for Apple's strategy.
Apple is also focusing on 3D, gaming, and AI, which require high specs and performance, which already counters your claim. Why do you keep ignoring professional users? Those markets were always around 1% of overall market share, and yet there are still many pro users.

If they need more than a top of the line MacPro for their work they are in a small and very specialized market and a MacBook Pro is not the machine for them; one that doesn't fit in with Apple's strategy. The notion a company needs to be in every market is wrong and what drives some companies out of business.
Again, Apple is already focusing on 3D, gaming, and AI. You totally misunderstood their strategy after all.
 
Why bother with a computer that can’t melt your face off doing things that only a tiny tiny fraction of a percent of people are doing with on-premise workstations?

Why don’t all cars have eighteen wheels?
 
Why bother with a computer that can’t melt your face off doing things that only a tiny tiny fraction of a percent of people are doing with on-premise workstations?

Why don’t all cars have eighteen wheels?
Rendering high resolution complex video scenes isn’t exactly niche. Yes they can do it, but rendering faster means more iterations and less “that’ll do” which leads to better results. Faster processors are nicer for a lot of people, and some of us always plug in so don’t need efficiency. There’s room for more than one requirement here, Apple just tend to focus on the mainstream.
 
There’s room for more than one requirement here, Apple just tend to focus on the mainstream.
What I’d really like is for Apple to make an Apple Silicon eGPU for when I’m plugged in and driving a big monitor.
 
Then tell me WHY Apple made their own server with Apple Silicon chips while they are struggling with AI development thanks to five-year-old GPUs, since the CFO didn't provide enough budget? Ironic.

From what I've seen, Apple is developing its own solution to use in house to drive its AI products, which is a far cry from making servers and high-end workstations for broader sale. Apple is apparently developing a special version of macOS for them. This allows Apple to tailor the system to their AI needs and is in keeping with Apple's penchant for controlling the entire ecosystem. Apple wants their AI solution to power how you use your phone, iPad, home automation, etc.

Selling access to their AI solution for iOS/iPad developers would be a path to additional revenue as well; and in keeping with Apple's increasing focus on services.

Buy what? Google or Microsoft? You are ignoring the fact that Nvidia dominates the AI market with 90% share, and yet you are telling me there are alternatives. This is why Apple is falling behind: they don't have any chips for AI, or they have to use slow chips.

You keep conflating AI with hardware. Apple could buy the hardware easily if they wanted, and if they need expertise they could simply buy a company like Anthropic to get talent, or simply hire away the talent they want. AI is about software, not hardware, which is basically a commodity.

And they will NEVER be enough for high-end and workstation uses.

And they don't need to be, because that is not a market Apple is interested in.

If you really think the Max and Ultra chips are powerful, take a look at the RTX 5090. You are only justifying the deteriorating performance. "Powerful enough" is just a biased phrase used to justify overall limited performance.

No, it means Apple is making powerful chips for the markets they care about.

Apple is also focusing on 3D, gaming, and AI, which require high specs and performance, which already counters your claim.
You can do a lot of 3D rendering on existing Macs just fine.

Apple's gaming focus is on handhelds, and it just throws a bone to the Mac side by offering a way to port games. They know most game studios aren't going to bring their games to the Mac, but if one wants to do so cheaply there is a way.

As for AI, Apple seems to focus on incorporating it into their products rather than building the next OpenAI for general use.
Why do you keep ignoring professional users?

I'm not; there are plenty of pro users for whom the existing product line is just fine. Trying to go into very niche markets with entrenched players, operating systems, and software, with a new product incompatible with what are industry-standard tools, is a losing game.

Those markets were always around 1% of overall market share, and yet there are still many pro users.

There just isn't enough money in going after small markets; Apple is better off spending R&D on more valuable markets and leaving those to the existing players.

Again, Apple is already focusing on 3D, gaming, and AI. You totally misunderstood their strategy after all.

NO, I understand why Apple is not interested in the high end workstation and server market.
 