
How much would you be able to pay for a Radeon RX 7900 XTX card driver for Apple Silicon Mac?

  • I wouldn't pay for the drivers

    Votes: 23 67.6%
  • $100

    Votes: 4 11.8%
  • $200

    Votes: 1 2.9%
  • $300

    Votes: 0 0.0%
  • $400

    Votes: 3 8.8%
  • $500

    Votes: 3 8.8%

  • Total voters
    34

Xenobius

macrumors regular
Original poster
Dec 10, 2019
182
473
I asked myself the question: "How much would I pay for a driver for a Radeon RX 7900 XTX card, to be able to use it for Metal rendering in Blender Cycles, OTOY Octane X or Maxon Redshift?" If I could use several 7900 XTX cards for rendering, I think I would be willing to pay even $500. What would be your choice?
 

mcnallym

macrumors 65816
Oct 28, 2008
1,191
925
Well, you're going to have to pay Apple to add support for third-party GPUs on Apple Silicon first.

As the hardware engineer said, that is not a problem Apple is interested in solving.
Apple will want a bit more than 500 dollars to develop that.

So Intel macOS would be where it could be added, and then only for PC-edition versions of the cards.
 

Wokis

macrumors 6502a
Jul 3, 2012
931
1,276
Stockholm, Sweden
It's a big problem to solve, so if I were in need I would probably be ready to pay a lot. $500.

Something way simpler would be a driver for the 2019 Intel Mac Pro. Even that isn't being served up. Horrible service for their most loyal customers.

People with large GPU needs had better leave the Mac as a platform, or KVM their way over to a PC whenever such work is needed.
 

Xenobius

macrumors regular
Original poster
Dec 10, 2019
182
473
Well, you're going to have to pay Apple to add support for third-party GPUs on Apple Silicon first.

As the hardware engineer said, that is not a problem Apple is interested in solving.
Apple will want a bit more than 500 dollars to develop that.

So Intel macOS would be where it could be added, and then only for PC-edition versions of the cards.
I mean commercial drivers written by an independent developer. I realize that this would probably require changing macOS settings to "Reduced Security" and "Allow user management of kernel extensions from identified developers" – similar to, for example, the Audio Hijack app.
 

leman

macrumors Core
Oct 14, 2008
19,357
19,430
I mean commercial drivers written by an independent developer. I realize that this would probably require changing macOS settings to "Reduced Security" and "Allow user management of kernel extensions from identified developers" – similar to, for example, the Audio Hijack app.

Even if someone were to write such drivers, you'd have to rewrite software to take advantage of them. There is no way to plug third-party GPU drivers into macOS, so such a driver would need to provide its own API that applications would have to use.
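To make that concrete, here is a purely hypothetical sketch of what such a vendor-specific compute API might look like. None of these names (`HypotheticalAccelerator`, `upload`, `dispatch`) correspond to any real product or library; the point is only that a renderer would need a dedicated code path for it, because its existing Metal backend couldn't be reused.

```python
# Purely hypothetical sketch: what an application might be forced to target if a
# third-party card shipped with its own user-space compute library instead of
# plugging into Metal. None of these names exist in any real product.

class HypotheticalAccelerator:
    """Stand-in for a vendor-supplied user-space compute library."""

    def __init__(self, device_index: int = 0):
        self.device_index = device_index  # which PCIe card to talk to

    def upload(self, name: str, data: bytes) -> None:
        """Copy scene or buffer data across PCIe into card memory."""
        ...  # would wrap the vendor's own transfer call

    def dispatch(self, kernel: str, args: dict) -> bytes:
        """Run a compute kernel on the card and return the raw results."""
        ...  # would wrap the vendor's own kernel-launch call


def render_tile_on_card(card: HypotheticalAccelerator, scene: bytes, tile: tuple) -> bytes:
    # A renderer (Cycles, Octane, Redshift, ...) would need a separate backend
    # like this; macOS offers no way to route its Metal calls to such a device.
    card.upload("scene", scene)
    return card.dispatch("path_trace", {"tile": tile, "samples": 512})
```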

All in all? Sounds like a terrible idea. If you want to use a 7900 XTX, getting a PC box seems like a no-brainer. M2 Ultra is practically as fast as 7900 XTX in Blender anyway (and significantly faster on complex scenes).

P.S. Why are you asking about the 7900 XTX and not the significantly faster Nvidia GPUs if you care about rendering?
 

Serqetry

macrumors 6502
Feb 26, 2023
346
527
I mean commercial drivers written by an independent developer. I realize that this would probably require changing macOS settings to "Reduced Security" and "Allow user management of kernel extensions from identified developers" – similar to, for example, the Audio Hijack app.
Sorry, but this is not possible; it doesn't matter what settings you change.

The only driver that could be written for macOS would have to be for Intel machines only.
 

Xenobius

macrumors regular
Original poster
Dec 10, 2019
182
473
M2 Ultra is practically as fast as 7900 XTX in Blender anyway (and significantly faster on complex scenes).
But you could use multiple RX 7900 XTX graphics cards and have many times better rendering performance.
Why are you asking about the 7900 XTX and not the significantly faster Nvidia GPUs if you care about rendering?
Because only Radeons had Metal support. Unfortunately there is no such thing as 'Nvidia' in the Apple world.
 

Xenobius

macrumors regular
Original poster
Dec 10, 2019
182
473
Because there is no interface for writing GPU drivers on Apple Silicon macOS. See my post above.
There are no GPU driver tools for Apple Silicon on Linux either, and yet such drivers have been created. So maybe someone exceptionally talented would be able to figure something out.
 

VivienM

macrumors 6502
Jun 11, 2022
496
341
Toronto, ON
Because there is no interface for writing GPU drivers on Apple Silicon macOS. See my post above.

There are no GPU driver tools for Apple Silicon on Linux either, and yet such drivers have been created. So maybe someone exceptionally talented would be able to figure something out.
By 'interface', I think @leman meant a way to actually 'connect' the driver to the OS.

Linux certainly has the ability to have lots of video drivers... and more importantly, it's open source; you have the source code in front of you. All the Apple Silicon on Linux folks had to do was reverse-engineer the M1 graphics, write a Linux driver for it, add it to their kernel tree and, once it was ready, send it upstream to Mr. Torvalds & co. to be added into the main kernel tree. Nothing controversial here - the frameworks already exist in the Linux kernel to support Intel, AMD, NVIDIA and lots of others - see https://en.wikipedia.org/wiki/Direct_Rendering_Manager and scroll down to the supported hardware section.

That is a lot easier than writing a driver for a (closed-source) OS that wasn't designed to have such drivers written for it.

Somewhere in between, you have an OS like (x86) Windows, which is closed source but designed to accommodate a lot of drivers for a lot of different hardware types.

It's worth noting, the entire concept of drivers in the Windows sense is somewhat unusual outside the 'IBM-compatible' x86 world. The more common model would have been the CP/M or Android model where the hardware maker licences the OS, modifies the OS source code to support the hardware they want to include, and then ships their product with their build of the OS that they are then responsible for supporting (or not supporting, in the case of many Android OEMs). The Apple model has been somewhat like that too - for some hardware types, the OS supports what Apple feels like it's going to support, and then doesn't have any way to add drivers for other models; for other hardware types, e.g. printers, the OS will allow third-party drivers.
 
  • Like
Reactions: Colstan

Serqetry

macrumors 6502
Feb 26, 2023
346
527
There are no GPU driver tools for Apple Silicon on Linux either, and yet such drivers have been created. So maybe someone exceptionally talented would be able to figure something out.
Yeah, this has absolutely nothing to do with the problem. Writing drivers for an OS that uses GPU drivers (Linux) is one thing... Apple Silicon macOS literally has no way to access GPUs other than the one built into the SoC. People really just need to get it out of their heads that with enough dedication and positive thinking there might be a way to use a PCIe GPU with Apple Silicon someday. It will never happen unless Apple rethinks their whole ASi philosophy... something Apple has no intention of ever doing. They are very happy with their decision to dump external GPUs.
 

leman

macrumors Core
Oct 14, 2008
19,357
19,430
But you could use multiple RX 7900 XTX graphics cards and have many times better rendering performance.

Which multiple graphics cards? The Mac Pro has only two x16 PCIe slots, and they share the same 16 lanes. I am really unsure what kind of "many times better" performance you expect from a configuration like that.

Because only Radeons had Metal support. Unfortunately there is no such thing as 'Nvidia' in the Apple world.

There is no third-party Metal support on Apple Silicon machines at all. Just because AMD drivers exist on the Intel version of macOS does not mean they are easy (or even possible!) to make for Apple Silicon. The ARM fork of macOS uses the iOS graphics stack; it looks entirely different from the x86 one.

There are no GPU driver tools for Apple Silicon on Linux either, and yet such drivers have been created. So maybe someone exceptionally talented would be able to figure something out.

So a bunch of very talented hackers have reverse-engineered Apple hardware and firmware protocols and patched Linux to run on Apple Silicon. It's very impressive work and I respect them a lot. But Linux is designed to be patched. What you are talking about is hacking and patching macOS itself. Entirely different beast.

Anyway, my point is that you have an idea, but you don't back it up with technical knowledge or a concrete implementation plan. The way you talk about it also makes clear that you don't know much about how the graphics stack works on macOS and Apple Silicon. It's OK to daydream, but if you actually want to get things done it might be a good idea to check your daydreams against reality. I think enough folks have explained to you why what you want is not feasible or useful. The rest is up to you.
 
  • Love
Reactions: Colstan

Xenobius

macrumors regular
Original poster
Dec 10, 2019
182
473
Which multiple graphics cards? The Mac Pro has only two x16 PCIe slots, and they share the same 16 lanes. I am really unsure what kind of "many times better" performance you expect from a configuration like that.

There is no third-party Metal support on Apple Silicon machines at all. Just because AMD drivers exist on the Intel version of macOS does not mean they are easy (or even possible!) to make for Apple Silicon. The ARM fork of macOS uses the iOS graphics stack; it looks entirely different from the x86 one.

So a bunch of very talented hackers have reverse-engineered Apple hardware and firmware protocols and patched Linux to run on Apple Silicon. It's very impressive work and I respect them a lot. But Linux is designed to be patched. What you are talking about is hacking and patching macOS itself. Entirely different beast.

Anyway, my point is that you have an idea, but you don't back it up with technical knowledge or a concrete implementation plan. The way you talk about it also makes clear that you don't know much about how the graphics stack works on macOS and Apple Silicon. It's OK to daydream, but if you actually want to get things done it might be a good idea to check your daydreams against reality. I think enough folks have explained to you why what you want is not feasible or useful. The rest is up to you.

OK, forget about the GPU, because it's really not about the graphics card.

Actually, it's about a Metal/Neural/AI accelerator. The system would send a task to the PCIe/Thunderbolt card, and the accelerator would send back the results of the calculations. In a 'simple' case, these would be consecutive 'tiles' of the rendered raytracing image, distributed to multiple PCIe accelerators (image attached; see also the sketch at the end of this post) – with a large number of raytracing samples, high bandwidth is not necessary for this (but obviously desirable). The result of the calculation, as a whole image, is displayed by the Apple Silicon GPU with its superfast memory. We have not sinned, Apple's dogma is not trampled, and everyone is happy. For the sake of clarity, let's assume that we are not using an RX 7900 XTX but, for example, an AMD MI210 (https://www.amd.com/en/products/server-accelerators/amd-instinct-mi210). In this case, the Mac Pro 2023 fans and power supply would finally come in handy for something meaningful. Is this also prohibited by Apple? After all, there are audio accelerators with their own DSP processors, for example, and Apple has no problem with those.
Let's dream a little. Why not?

TilesRendering.jpg

AMD_Instinct.png
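For what it's worth, the tile-splitting part of the idea is easy to express in code. Below is a minimal, hypothetical sketch: split a frame into tiles, farm the tiles out to a pool of workers (standing in for whatever accelerators would do the path tracing), and reassemble the image on the host. `render_tile` is a placeholder; nothing here implies macOS could actually hand the work to a PCIe card.

```python
# Minimal sketch of the tile-distribution scheme described above. The worker
# pool stands in for whatever accelerators would do the path tracing;
# render_tile() is a placeholder, not a real driver or Metal call.
from concurrent.futures import ProcessPoolExecutor
from itertools import product

WIDTH, HEIGHT, TILE = 1920, 1080, 128  # frame size and tile edge in pixels

def render_tile(origin):
    """Pretend to path-trace one tile; return its origin, size and RGB floats."""
    x0, y0 = origin
    w = min(TILE, WIDTH - x0)
    h = min(TILE, HEIGHT - y0)
    rgb = [0.5] * (w * h * 3)  # placeholder pixel data
    return x0, y0, w, h, rgb

def render_frame(num_workers=4):
    """Split the frame into tiles, render them in parallel, reassemble on the host."""
    framebuffer = [[(0.0, 0.0, 0.0)] * WIDTH for _ in range(HEIGHT)]
    tile_origins = list(product(range(0, WIDTH, TILE), range(0, HEIGHT, TILE)))
    with ProcessPoolExecutor(max_workers=num_workers) as pool:
        for x0, y0, w, h, rgb in pool.map(render_tile, tile_origins):
            for j in range(h):
                for i in range(w):
                    base = (j * w + i) * 3
                    framebuffer[y0 + j][x0 + i] = tuple(rgb[base:base + 3])
    print(f"assembled {WIDTH}x{HEIGHT} frame from {len(tile_origins)} tiles")
    return framebuffer

if __name__ == "__main__":
    render_frame()
```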
 

ChrisA

macrumors G5
Jan 5, 2006
12,723
1,884
Redondo Beach, California
I asked myself the question: "How much would I pay for a driver for a Radeon RX 7900 XTX card, to be able to use it for Metal rendering in Blender Cycles, OTOY Octane X or Maxon Redshift?" If I could use several 7900 XTX cards for rendering, I think I would be willing to pay even $500. What would be your choice?
I worked as a developer for years, and many times my boss would ask me to do cost estimates for some project. At first I was totally shocked at how LITTLE work can be done for $1,000,000. No way on Earth could a driver like this be designed, written, tested and supported for "only" $1M.

Do you really think there are tens of thousands of people who would pay $500 for a driver?

Software seems cheap. You can buy MS Office for only a few hundred dollars. But that is because they sell millions of copies. If they only sold a few thousand per year, the price might be $4,000 per seat.

OK, but today you CAN get what you want. Buy a Linux PC and run Blender on it. You can do most of the work on the Mac, then do the rendering via remote login. The Linux PC can be "headless", with no keyboard or monitor.
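A hedged illustration of that workflow, assuming Blender is installed on the Linux box and SSH key authentication is set up: the host name and paths below are made up and would need to be adjusted. The script drives a background (no GUI) Cycles render remotely and copies the finished frames back to the Mac.

```python
#!/usr/bin/env python3
# Rough sketch: drive a headless Blender render on a Linux PC from the Mac.
# "render-box", the user name and all paths are examples only.
import subprocess

REMOTE = "user@render-box"              # the headless Linux PC
BLEND = "/home/user/scenes/shot.blend"  # scene file already on the remote machine
OUTDIR = "/home/user/renders/frame_"    # Blender appends the frame number

def render_frames(start: int, end: int) -> None:
    """Render a frame range remotely with Blender's command-line interface."""
    # -b: run without a GUI, -E CYCLES: use the Cycles engine,
    # -o: output path pattern, -s/-e: frame range, -a: render the animation.
    cmd = f"blender -b {BLEND} -E CYCLES -o {OUTDIR} -s {start} -e {end} -a"
    subprocess.run(["ssh", REMOTE, cmd], check=True)

def fetch_results(local_dir: str = "./renders") -> None:
    """Copy the finished frames back to the Mac."""
    subprocess.run(["scp", f"{REMOTE}:{OUTDIR}*", local_dir], check=True)

if __name__ == "__main__":
    render_frames(1, 10)
    fetch_results()
```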
 

leman

macrumors Core
Oct 14, 2008
19,357
19,430
In a 'simple' case, these would be consecutive 'tiles' of the rendered raytracing image, distributed to multiple PCIe accelerators (image attached) – with a large number of raytracing samples, high bandwidth is not necessary for this (but obviously desirable). The result of the calculation, as a whole image, is displayed by the Apple Silicon GPU with its superfast memory. We have not sinned, Apple's dogma is not trampled, and everyone is happy. For the sake of clarity, let's assume that we are not using an RX 7900 XTX but, for example, an AMD MI210 (https://www.amd.com/en/products/server-accelerators/amd-instinct-mi210).

If you want to do raytracing, an AMD Instinct makes no sense whatsoever. M2 Ultra is going to be 50-100% faster than a MI210 for this.

If you want to do ML, $500 per month gets you an H100 for 8 hours per day (roughly 240 GPU-hours, or about $2 per hour). That's the equivalent of 3-4x MI250X. And the best part? You don't need to spend $50,000+ on a workstation.
 
  • Like
Reactions: Colstan

Xenobius

macrumors regular
Original poster
Dec 10, 2019
182
473
M2 Ultra is going to be 50-100% faster than a MI210 for this.

But 4x 7900 XTX would be 800% faster than M2 Ultra.

I think I'm running out of ideas on how to describe more simply what I mean. The idea is to unlock the pure computing power of PCIe cards for Apple Silicon, for performing Metal calculations. That's all.
Otherwise, in 2 years the M2 Ultra will be a mid-range computing card, and in 4 years it will be a low-end calculator without any upgrade options.
 

leman

macrumors Core
Oct 14, 2008
19,357
19,430
I think I'm running out of ideas on how to describe more simply what I mean.

I understand very well what you mean. It’s just that you don’t seem to understand that what you are asking is not technically feasible or meaningful.


The idea is to unlock the pure computing power of PCIe cards for Apple Silicon, for performing Metal calculations. That's all.

There is no interface for plugging third-party GPUs into Metal on Apple Silicon Macs. This is simply not possible without reverse-engineering a significant part of the OS. It could be a fun project for a passionate hacker, sure, but the practical utility would be zero.

Otherwise, in 2 years the M2 Ultra will be a mid-range computing card, and in 4 years it will be a low-end calculator without any upgrade options.

Yes, this is true. And it is a clear limitation of Apple's design. But there is just no way around it.
 

crsh1976

macrumors 68000
Jun 13, 2011
1,598
1,829
Otherwise, in 2 years the M2 Ultra will be a mid-range computing card, and in 4 years it will be a low-end calculator without any upgrade options.
Isn't that the whole idea and business model behind the ASi switch?

With everything integrated/soldered and nothing upgradable after purchase, Apple is more than happy to sell you an entire new computer once your current one is old/slow - that works for the consumer/prosumer market, which is Apple's main audience.

It doesn't fit with professional workstation needs, but Apple long decided to move away from that market anyway (it fully understands nobody in their right mind buys a whole new $10-20k workstation every 2-3 years).

The architecture is not designed to work with 3rd-party add-ons (and that's by design, not some accident or afterthought) - the middle ground is the latest Mac Pro, allowing some add-ons but nothing that strays too far from the consumer-centric "buy a new one every other year" strategy.
 

senttoschool

macrumors 68030
Nov 2, 2017
2,592
5,368
But 4x 7900 XTX would be 800% faster than M2 Ultra.

I think I'm running out of ideas on how to describe more simply what I mean. The idea is to unlock the pure computing power of PCIe cards for Apple Silicon, for performing Metal calculations. That's all.
Otherwise, in 2 years the M2 Ultra will be a mid-range computing card, and in 4 years it will be a low-end calculator without any upgrade options.
Why can't you sell your M2 Ultra Mac Studio/Mac Pro and get an M3 Ultra? M4 Ultra? M5 Ultra? Etc.

I mean, you're going to sell the 7900 XTX and buy the 8900 XTX right?

Meanwhile, in a non-crypto-bubble world, AMD GPUs have low resale value relative to Nvidia cards and Apple gear.
 

MRMSFC

macrumors 6502
Jul 6, 2023
347
358
Otherwise, in 2 years the M2 Ultra will be a mid-range computing card, and in 4 years it will be a low-end calculator without any upgrade options.
That’s overly dramatic. At the current rate it would be at least five years before the top of the line hits midrange.
 
  • Like
Reactions: AlphaCentauri

Xenobius

macrumors regular
Original poster
Dec 10, 2019
182
473
That’s overly dramatic. At the current rate it would be at least five years before the top of the line hits midrange.
Really? I don't think so.

GPUs.jpg


...and this does not take into account the power of AMD/Nvidia RT cores, which Apple Silicon does not have. Besides, in a crappy PC you can use several graphics cards and get several times the total computing power.

Puget_4090.jpg


The situation would look a little bit better if Apple released a Mac Pro with M2 Extreme now. Unfortunately, that didn't happen and it looks terrible.
 

MRMSFC

macrumors 6502
Jul 6, 2023
347
358
Really? I don't think so.

View attachment 2228971

...and this does not take into account the power of AMD/Nvidia RT cores, which Apple Silicon does not have. Besides, in a crappy PC you can use several graphics cards and get several times the total computing power.

View attachment 2228985

The situation would look a little bit better if Apple released a Mac Pro with M2 Extreme now. Unfortunately, that didn't happen and it looks terrible.
One, good luck fitting 7 4090s in a case, and not blowing a fuse doing so.

Two, TFLOPS aren't a hard measure of performance. (Radeon's Vega architecture comes to mind, with the Radeon VII comparable in TFLOPS to Nvidia's 30 series but performing like a 1080 Ti in graphics.)

Three, in raster, the gen-to-gen performance jump is minimal (and in some cases, backwards).

Four, the M2 Ultra performs comparably to a 3070 in Blender rendering with Nvidia's OptiX on.

And let’s do a performance analysis of competitors gen over gen.
According to lazy googling leading to userbenchmark, the 10 series got about a 30% uplift in performance to the 20 series. 20 to 30 got about another 30%, and the 30 to 40 series being, on average, 30% (the 4090 being the outlier).

Typical good gen over gen improvement over 7 years. Hardly “calculator level” as per your prediction.

But I guess we should see what monstrosity the 5090 is, since the goalpost seems to be the top performing gpu.
 

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,579
1,012
The situation would look a little bit better if Apple released a Mac Pro with M2 Extreme now. Unfortunately, that didn't happen and it looks terrible.
Apple makes very opinionated computers that fit most workflows, but not all. If your workload doesn't fit, you're screwed and you have a decision to make: whether the loss of performance matters more to you than giving up macOS. For some people it doesn't, and they keep using Apple hardware. For others it does, and they use hardware from other companies.
 

Numa_Numa_eh

Suspended
Jun 1, 2023
87
105
Really? I don't think so.

View attachment 2228971

...and this does not take into account the power of AMD/Nvidia RT cores, which Apple Silicon does not have. Besides, in a crappy PC you can use several graphics cards and get several times the total computing power.

View attachment 2228985

The situation would look a little bit better if Apple released a Mac Pro with M2 Extreme now. Unfortunately, that didn't happen and it looks terrible.
Hmmm. As others have pointed out, TFLOPS are not a good way to compare in general.

In GFXBench the M2 Ultra compares to a 4080. In Geekbench Compute the Ultra is close to a 4080, and in Blender only OptiX-based Nvidia cards and the 7900 XTX (which is very close to the Ultra) beat it.

You're overstating things here. I have no idea what you mean about AMD and RT cores; they don't appear to help AMD very much.
 
  • Like
Reactions: MRMSFC