Since there has also been talk about axing Rosetta and its connection to the winding down of Intel support: Rosetta 2 actually got a major upgrade with Sequoia. It now supports AVX and AVX2.

Wow, this is great news. There is still a fair amount of software I use that doesn't support ARM/Apple Silicon; in the case of SQL Edge, they actually dropped ARM support. Given predictions that Apple would treat Rosetta 2 as a transition tool (as it did with the PPC->Intel Rosetta) rather than a long-term option, I've been hesitant to commit to Apple Silicon. However, between making it available not just for macOS but also for Docker and emulators, and now this, it feels like a longer-term commitment.

For people who build their own software, it's also good news: if Mac/Intel users optimize for any Haswell-or-later processor (and Skylake has been the minimum officially supported processor since Ventura), the resulting binary wouldn't work under Apple Silicon/Rosetta 2. Now it sounds like it will, which will help people migrate in the future without having to worry about whether every program they have is a fat binary.

The just-posted benchmarks of AMD with AVX2 versus Apple Silicon/Rosetta 2 with AVX are interesting. I would be curious to see benchmarks comparing AVX[1] versus AVX2 on Intel or AMD, versus AVX[1] on Apple Silicon/Rosetta 2, versus AVX2 on Apple Silicon/Rosetta 2. Enabling AVX2 instructions in LLVM for my Intel 8th- and 10th-generation processors hasn't made as big a difference over AVX as I expected, so I wasn't sure it was worth creating potential incompatibilities with non-Intel processors and/or Rosetta 2. Either way, the compatibility is good for applications/binaries that use AVX2 for whatever reason.
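To make the build-target point concrete, here is a minimal sketch (the kernel and the flags are purely illustrative) of the difference between baking AVX2 into a binary at compile time and guarding it behind a runtime check, which is where the Rosetta 2 question comes in:

[CODE]
/* Illustrative only. Built with "clang -O2 -march=haswell", the compiler may
 * emit AVX2 anywhere, so the binary simply won't run where AVX2 is missing.
 * Built for a generic x86-64 target instead, the AVX2 code below is isolated
 * in one function and only taken after a runtime check. */
#include <immintrin.h>
#include <stdio.h>

__attribute__((target("avx2")))
static void add8_avx2(const int *a, const int *b, int *out) {
    __m256i va = _mm256_loadu_si256((const __m256i *)a);
    __m256i vb = _mm256_loadu_si256((const __m256i *)b);
    _mm256_storeu_si256((__m256i *)out, _mm256_add_epi32(va, vb));
}

static void add8_scalar(const int *a, const int *b, int *out) {
    for (int i = 0; i < 8; i++) out[i] = a[i] + b[i];
}

int main(void) {
    int a[8] = {1, 2, 3, 4, 5, 6, 7, 8}, b[8] = {8, 7, 6, 5, 4, 3, 2, 1}, out[8];
    /* Reportedly still returns 0 under Rosetta 2 even on Sequoia; see the
     * runtime-check caveat discussed later in the thread. */
    if (__builtin_cpu_supports("avx2"))
        add8_avx2(a, b, out);
    else
        add8_scalar(a, b, out);
    printf("%d ... %d\n", out[0], out[7]);
    return 0;
}
[/CODE]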
 

Definitely. Do be aware that there are a few caveats to the new AVX/AVX2 support, though.

For one thing, if the codebase dynamically checks for AVX support at runtime, whether via sysctl or CPUID, it will be told the CPU does *not* support AVX, so it may report an error or run in a fallback "no-AVX" mode. I've tested this myself. However, if forced to run anyway, the AVX instructions execute just fine. I don't know whether this is a bug or intended behaviour.
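If anyone wants to reproduce that, here is a minimal sketch of both detection paths (the standard hw.optional sysctl keys plus the compiler's CPUID builtin); build it as x86_64 and run it natively and under Rosetta 2 to compare what gets reported:

[CODE]
/* Minimal sketch: query AVX/AVX2 support the two ways mentioned above. */
#include <stdio.h>
#include <sys/types.h>
#include <sys/sysctl.h>

static int sysctl_flag(const char *name) {
    int value = 0;
    size_t size = sizeof(value);
    if (sysctlbyname(name, &value, &size, NULL, 0) != 0)
        return -1;   /* key not present */
    return value;
}

int main(void) {
    /* The standard macOS sysctl keys for the optional x86 vector features */
    printf("hw.optional.avx1_0 = %d\n", sysctl_flag("hw.optional.avx1_0"));
    printf("hw.optional.avx2_0 = %d\n", sysctl_flag("hw.optional.avx2_0"));

    /* CPUID-based checks via the compiler builtin */
    printf("cpuid avx  = %d\n", __builtin_cpu_supports("avx"));
    printf("cpuid avx2 = %d\n", __builtin_cpu_supports("avx2"));
    return 0;
}
[/CODE]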

The second aspect is performance. Apple Silicon has two different mechanisms that do something broadly similar to AVX: NEON and SME/AMX. Based on performance, I at least feel confident in saying that AVX doesn't map to AMX, and probably not to SME instructions (which go to the AMX units) either, though SME is M4-only anyway. That's good for latency but bad for throughput, so raw GFLOPS numbers will be somewhat low compared to what would be possible with AMX; then again, AMX has its own set of constraints, such as acting as a streaming-mode co-processor.
 
I'm very happy that the 16" i9 MBP I bought in May 2020 is supported by another round of macOS. I can edit 4K/6K footage just fine in DaVinci Resolve, Lightroom/Photoshop have no problems with the raw files from my Sony a7 IV, and I can boot into Windows for gaming. I'm actually surprised how well it handles Horizon Forbidden West. Performance-wise I see no reason to upgrade, so it's good that Apple will not be forcing me to do so for the next three years. Given how the PPC-to-Intel transition went, I actually expected Sequoia to drop Intel support.
 

Yes, supporting AVX/AVX2 but not reporting them is kind of weird. I read that decision as implying that emulating AVX/AVX2 instructions is not as fast as emulating the typical fallback code, but I would be curious to test that and confirm it's the reasoning behind the decision. Maybe something to do the next time I have an M-series machine handy...
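For the record, the test I have in mind would look roughly like this (the kernel and sizes are arbitrary): time the same dot product once through an AVX path and once through the plain scalar fallback, build it as x86_64, and run it under Rosetta 2:

[CODE]
/* Rough sketch of the proposed test: the same dot product via AVX and via
 * plain scalar code. Build as x86_64, e.g. "clang -O2 -mavx -arch x86_64",
 * and run under Rosetta 2 to see whether translated AVX actually beats
 * translated scalar code. */
#include <immintrin.h>
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (1 << 24)   /* arbitrary, just large enough to time */

static float dot_avx(const float *a, const float *b) {
    __m256 acc = _mm256_setzero_ps();
    for (int i = 0; i < N; i += 8)
        acc = _mm256_add_ps(acc, _mm256_mul_ps(_mm256_loadu_ps(a + i),
                                               _mm256_loadu_ps(b + i)));
    float lanes[8];
    _mm256_storeu_ps(lanes, acc);
    float sum = 0.0f;
    for (int i = 0; i < 8; i++) sum += lanes[i];
    return sum;
}

static float dot_scalar(const float *a, const float *b) {
    float acc = 0.0f;
    /* keep the fallback genuinely scalar so the comparison is honest */
    #pragma clang loop vectorize(disable) interleave(disable)
    for (int i = 0; i < N; i++) acc += a[i] * b[i];
    return acc;
}

int main(void) {
    float *a = malloc(N * sizeof(float)), *b = malloc(N * sizeof(float));
    for (int i = 0; i < N; i++) { a[i] = 1.0f; b[i] = 0.5f; }

    clock_t t0 = clock();
    float r1 = dot_avx(a, b);
    clock_t t1 = clock();
    float r2 = dot_scalar(a, b);
    clock_t t2 = clock();

    printf("AVX:    %.0f in %.3f s\n", r1, (double)(t1 - t0) / CLOCKS_PER_SEC);
    printf("scalar: %.0f in %.3f s\n", r2, (double)(t2 - t1) / CLOCKS_PER_SEC);
    free(a); free(b);
    return 0;
}
[/CODE]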

I can accept that programs run under Rosetta 2 won't be as fast as native code and likely won't take advantage of special hardware features. If it runs most things at better than 50% of native speed, excepting things like the NPU, that's pretty good.

The one thing I didn't understand in my limited testing (possibly two years ago) was that Rosetta 2 didn't seem to take advantage of native libraries. A Rosetta 2-run program using the matrix-multiplication routine from Accelerate didn't run as fast as the same code running natively, even though I would have thought Rosetta 2 would just substitute the call with the native version and so run at basically full native speed. As I remember it, Rosetta 2 even emulated the plain C matrix-multiplication versions I had handy faster than the calls to Accelerate (whereas, running natively, Accelerate routines are normally much faster than anything written in C).
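For context, the comparison was along these lines (matrix size and the timing method are arbitrary); build it once natively and once as x86_64 so it runs under Rosetta 2:

[CODE]
/* Hypothetical micro-benchmark of the kind described above: a naive C matrix
 * multiply versus Accelerate's cblas_dgemm.
 * Build: clang -O2 matmul.c -framework Accelerate -o matmul
 * (add "-arch x86_64" for the Rosetta 2 run). */
#include <Accelerate/Accelerate.h>
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N 512

static void naive_dgemm(const double *a, const double *b, double *c) {
    for (int i = 0; i < N; i++)
        for (int k = 0; k < N; k++)
            for (int j = 0; j < N; j++)
                c[i * N + j] += a[i * N + k] * b[k * N + j];
}

int main(void) {
    double *a = calloc(N * N, sizeof(double));
    double *b = calloc(N * N, sizeof(double));
    double *c = calloc(N * N, sizeof(double));
    for (int i = 0; i < N * N; i++) { a[i] = 1.0; b[i] = 2.0; }

    clock_t t0 = clock();
    naive_dgemm(a, b, c);
    clock_t t1 = clock();
    cblas_dgemm(CblasRowMajor, CblasNoTrans, CblasNoTrans,
                N, N, N, 1.0, a, N, b, N, 0.0, c, N);
    clock_t t2 = clock();

    printf("naive C:    %.3f s\n", (double)(t1 - t0) / CLOCKS_PER_SEC);
    printf("Accelerate: %.3f s\n", (double)(t2 - t1) / CLOCKS_PER_SEC);
    free(a); free(b); free(c);
    return 0;
}
[/CODE]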

In any case, as long as it is clear that Apple understands the Intel/AMD64 ISA is important for the foreseeable future and that it needs to keep Rosetta 2 around (and ideally keep improving it...), that helps with committing to a non-Intel platform. No one has to like that ISA -- just accept that it is a common target and that it isn't going anywhere anytime soon.
 
Yeah; I've been thinking about trying that out myself, but I doubt I'll find the time anytime soon.
Hm; I think there are (or were) some restrictions on changing modes in the middle of a running process. For example, if you had a program that could act as a plugin host, you could only use x86 plugins when the host was running under Rosetta and only ARM plugins when the host app was running natively.
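(Aside: a host or plugin can at least check which mode it ended up running in, via the documented sysctl.proc_translated key; a minimal sketch:)

[CODE]
/* Minimal "am I running under Rosetta 2?" check using the documented
 * sysctl.proc_translated key: 1 = translated, 0 = native, and the key is
 * absent on Intel Macs. */
#include <stdio.h>
#include <sys/types.h>
#include <sys/sysctl.h>

int main(void) {
    int translated = 0;
    size_t size = sizeof(translated);
    if (sysctlbyname("sysctl.proc_translated", &translated, &size, NULL, 0) != 0)
        translated = 0;   /* key missing: assume native */
    printf("running under Rosetta 2: %s\n", translated ? "yes" : "no");
    return 0;
}
[/CODE]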
 

Yes, my understanding of the way Rosetta 2 works is that at least an entire thread (and possibly the entire process) has to be one architecture or the other. Calls between the two require, at a minimum, a "thunk" to translate across ABI calling conventions.

My assumption, however, is that the Cocoa UI and similar stuff runs native versions of those frameworks, and that Rosetta 2 isn't, for example, emulating Intel versions of the Cocoa UI frameworks for Intel applications run on Apple Silicon. If so, I would assume they could do the same for Accelerate code. But maybe there is something about Accelerate that makes that harder, or they really are running the Cocoa frameworks under emulation for legacy applications as well.
 
Question:
"Will MacOS 15 drop support for Intel?"

The answer is "NO".
(nothing follows)
 
Now the question is: "Will macOS 16 drop support for Intel?"

I wonder if Apple will make a formal announcement about ceasing all Intel support, or whether they will just quietly keep culling supported Intel models until none remain. Pretty much anyone with an older Intel model has only found out whether it was still supported at WWDC, when a new macOS is announced.
 

They will just cut support with a future release of macOS. It is not typical for Apple to pre-announce those decisions.
 

If they want to get more into enterprise sales (e.g. hospital IT departments), they are going to have to learn how to announce these things in advance. A lot of organizations don't like things being declared obsolete/unsupported with less than a year's notice.
 
Some people point to when Apple dropped 32-bit support. They started making regular statements during Sierra that 32-bit support would be ending, continued those statements during High Sierra, and then added statements and pop-up notifications in Mojave. So at least with 32-bit support, they telegraphed the end for a couple of years. The difference is that 32-bit support affected everyone and many developers; maybe they don't feel the need to be so transparent with the dwindling Intel base. At this point, I know I have a little over three years of support at minimum for my 2019 MBP. So even if Sequoia is the last release to support Intel, I'll make it to October 2027, which is a nice 8-year run of support for my MBP. That's assuming I don't jump to an M5/M6 by then, but currently my workflow relies on x86 VMs and my MBP is still going strong for what I need it to do.
 

They are not interested. They don't publicly disclose how long they support a given release of macOS. It is already the case that they are not interested in catering to the needs of enterprise customers.


They are not likely to pre-announce when support for x86 is being dropped.
 
I'm sure that the discontinuation of Intel Macs will take much longer, considering that the Mac Pro was released only one year before the transition to ARM started.

The 2013 Mac Pro lasted from OS X 10.9 Mavericks to macOS 12 Monterey (nine years of the latest updates plus three additional years of security updates). Going by that precedent, the 2019 Mac Pro would last from macOS 10.15 Catalina through macOS 18, releasing in 2027, and Intel Macs would be fully obsolete in 2030 when macOS 21 releases.
 
The 2019 Mac Pro is a niche system that likely didn't sell in great numbers. They're hardly going to let it drag their Intel maintenance burden out to macOS 18.
 
The 2019 Mac Pro is a niche (but massively expensive) system for high-end users, and there is currently no viable alternative Apple platform if you need large amounts of RAM.

It's far too early to retire it, at least until Apple puts out an Apple Silicon machine that can handle several hundred gigabytes of RAM.
 

What they lack in volume they may make up in margin. Plus their owners may be key accounts.

It's also unclear how large a burden Intel support is on the OS as a whole. It's not as if most of macOS is written in assembly, and they have 15+ years of maturity in the OS on Intel, so those portions of the kernel and libraries should be stable. I bet the resources dedicated to Intel support in the OS are just a rounding error at Apple at this point.

If I had to guess, it will all come down to revenue projections: revenue lost from pissing people off by ending support "too soon" versus revenue gained from forcing people to upgrade.

This is all speculation of course since none of us have the actual revenue or expense projections of such a decision and those who do aren't going to share...
 
The final Power Mac G5 came with Tiger 10.4.2, and the last update it got was Leopard 10.5.8. There was much wailing and gnashing of teeth, especially on this forum, but if Apple wants to drop something unceremoniously, it will.
 
This.

The G5 Late 2005 models were on sale until the Mac Pro was released August 7, 2006. Of course there would have been stock remaining in sales channels, so you could have purchased one brand new from an authorized reseller even after this date.

Mac OS X 10.6 Snow Leopard, without PPC support, was available from August 28, 2009.

Worst case, you would have had less than three years of the ability to run the current version of the operating system.
 
But nowadays, Apple couldn't do this anymore. Even the Android market has begun to understand that people won't accept their devices getting only 2-3 years of software updates when everybody knows the hardware is powerful enough to support many more years of them. Don't forget the ecological factor as well.

I think that at this point, even if Apple stops supporting Intel Macs in next year's OS, it will have been an "acceptable" lifespan for the most recent Intel Macs. With the two extra years of security updates, support would last until September 2027 at the earliest. So I think it's "acceptable", but I expect at least one more major OS release for the 2020 Macs.
 
The only real hope for Intel Macs to continue is the Mac Pro market and how many industries purchased it for production farms, etc. Apple is focused on business needs as well, so that "may" be the incentive that is needed. But if not many were purchased, then...we may see a short run like before.

Another interesting factor is the rumored Mac mini and its change of design. While consumers always want a new and fresh design, production and internet farms, etc., who purchase hundreds of Mac minis, place them in server racks. Their racks have been designed to fit the Mac mini since 2010, so after 14 years a new design would require them to buy new racks along with new purchases...so we will see how things go.

I think we have a few years to go before Intel Macs are totally wiped from Apple's mind. But...again...we will just have to wait and see...
 
Don't forget the ecological factor as well.
You give Apple far too much credit. It regularly touts how green it is, all the while gluing its components in and making them unnecessarily difficult, if not outright impossible, to upgrade. A standard repair for Apple is: receive malfunctioning device, put device in the "recycle" bin, give the customer a new/returned Apple device.

The fact that the Mac Studio has been shown to be capable of having its storage upgraded, yet Apple does not even offer this as a post-purchase option, tells you all you need to know about Apple's green credentials.
 
Yeah, let's not forget that the only reason Apple has green credentials is that it is allowed to specify its own measurements. To use the modern vernacular, we 'need to stop normalising' letting companies that deliberately design products that cannot be upgraded call themselves 'environmentally friendly' and 'carbon neutral'.
 