It seems you think writing a device driver is easy. Something like a modern GPU cannot be supported commercially without actual documentation from the GPU manufacturer. Case in point: the open source community could not even begin to support Broadcom's WiFi chipsets because there was no documentation to even start from.

If you think it's easy, which it seems you do judging from your posts, I don't know what to say to you, because it is extremely hard. Please do not equate what the Asahi folks are doing with the Apple Silicon GPU to what Apple could do with an nVidia GPU, because the open source community has no obligation to support AS Macs. It is a take-it-or-leave-it kind of situation.

Who said anything about Asahi or NVIDIA just now?

I'm talking about AMD. I'm telling you their Linux drivers are fully documented and open source.
The Linux community nowadays prefers AMD GPUs for this very reason.
If Apple really wanted, they could partner with AMD instead and use their open source drivers and documentation as a base.

Or they could instead go with Intel Arc, whose drivers are also open source.
 
The restrictions imposed by Linux's GPL licensing make this legally impossible.

They don't.

They could either make their driver open source too, or just take a look at how the open source version works and implement their own code (writing everything from scratch, without copying the code).

Or they could check whether FreeBSD's AMD drivers use a BSD license, which allows commercial use (even without attribution, if you so wish), and use them as a basis instead.

Or they could just check the comprehensive documentation.
They really have MANY options here.

I would just like to emphasize that Apple and Microsoft DO use GPL and BSD software in some applications and credit them accordingly. For example, MacOS descends from DarwinOS, which has a BSD license.
 
The closed source driver argument seems reasonable at first, but if you look at it more closely, you'll see it's bollocks.
You seem to be confusing things. The discussion of closed source drivers is only in reference to nVidia. There are many vocal users on here who will not be happy until Apple (or someone else with Apple’s blessing and support) has built a box with an AMD Ryzen CPU and nVidia GPU. They are not interested in any other CPU or GPU.
Here's why: Apple doesn't have just NVidia as an option. AMD Radeon drivers just happen to be open source, so Apple could even borrow inspiration from Linux drivers if they want to code support for third-party eGPUs.
Apple has moved in a different direction. They have built systems based around a Unified Memory Architecture and are not interested in supporting other GPUs, either PCIe-connected or via a Thunderbolt expansion box. As Blackmagic Design has shown, one can get substantial performance improvements by coding to the architecture. Apple wants to encourage that to happen as quickly as possible and so does not plan to support any other GPUs.
Don't like AMD? Now we have Intel Arc too, whose drivers are also open source. It's the newest and weakest of the three options, but at least it is an option.
An option for what? What problem are you trying to solve? Is there a huge community of users begging for macOS to support Intel's Arc GPUs? Can you point to some major customer that has adopted that system?
But obviously, this has never been about proprietary drivers. It's about holding control of the whole stack.
This will eventually backfire, since we're not in the 1990s anymore.
Apple has taken a different approach to system design. Instead of GPUs connected across a slow PCIe bus, requiring separate memory and costly transfers between CPU RAM and GPU RAM (as well as the need to duplicate storage), they have built systems with Unified Memory, enabling their GPUs to use very large amounts of RAM and eliminating duplicate storage and expensive transfers between their CPUs, GPUs and specialized processors. You are correct, it is absolutely about controlling the whole system, in order to optimize it based on their design. Apple has always been about controlling the whole system, integrating their software with their hardware; they are now just big enough to be able to custom design even more of the components because they are a much bigger company than they were. I am not sure why you think this is new, or which companies in the 1990s did anything like it and failed.
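
To make the Unified Memory point concrete, here is a minimal sketch in Swift (an illustration, not anything from Apple's actual stack): on Apple Silicon, a Metal buffer created with the storageModeShared option is a single allocation that both the CPU and the GPU address directly, so there is no staging copy across a PCIe bus.

```swift
import Metal

// Minimal sketch of Unified Memory in Metal on Apple Silicon.
// A .storageModeShared buffer is one allocation visible to both
// the CPU and the GPU -- no upload/blit step is needed.
guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("No Metal device available")
}
print(device.hasUnifiedMemory) // true on Apple Silicon

let count = 1024
let buffer = device.makeBuffer(length: count * MemoryLayout<Float>.stride,
                               options: .storageModeShared)!

// The CPU writes directly into the same memory a GPU kernel would read.
let ptr = buffer.contents().bindMemory(to: Float.self, capacity: count)
for i in 0..<count { ptr[i] = Float(i) }

// On a discrete GPU, the equivalent data would typically live in
// .storageModePrivate VRAM and require an explicit blit from a
// CPU-visible staging buffer.
```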
 
You seem to be confusing things. The discussion of closed source drivers is only in reference to nVidia. There are many vocal users on here who will not be happy until Apple (or someone else with Apple’s blessing and support) has built a box with an AMD Ryzen CPU and nVidia GPU. They are not interested in any other CPU or GPU.

I'm not confusing things. I'm saying that if the argument for them ditching Nvidia support is solely that their drivers are closed source (i.e., they want an OPEN alternative), then other companies do open their drivers. So what they really want is sole control of the stack. Blocking Nvidia was just an excuse for that.

They DON'T HAVE to use a discrete Nvidia or AMD GPU. But they could use those open source drivers as a base and provide eGPU drivers for macOS if they wanted. But that wouldn't let them control the whole stack, and it would allow users to delay upgrading, because they could run a more powerful eGPU once their integrated GPU got outdated.

Apple has always been about controlling the whole system, integrating their software with their hardware; they are now just big enough to be able to custom design even more of the components because they are a much bigger company than they were.

My peeve here is that people blame Nvidia for Apple discouraging third-party graphics drivers, but ultimately the fault is not Nvidia's. I'm arguing that if THAT were the problem, they could just block Nvidia and support other vendors.
 
They could either make their driver open source too, or just take a look at how the open source version works and implement their own code (writing everything from scratch, without copying the code).
Leaving aside the fact that this discussion is completely irrelevant to the topic: if they were to look at the GPL code, they would risk infecting any code developed by those people. That is why companies do "clean room" re-implementations.
I would just like to emphasize that Apple and Microsoft DO use GPL and BSD software in some applications and credit them accordingly. For example, MacOS descends from DarwinOS, which has a BSD license.
Apple used to use some software that was licensed under GPL 2 (the more flexible GPL license), but with the move to GPL 3 they have pretty much eliminated any GPL code (hence their replacement of GCC with LLVM and Clang).

Second, macOS does not "descend" from Darwin, in that Darwin did not pre-date macOS. Darwin is just the name that Apple gave to their conglomeration of CMU's Mach kernel and FreeBSD's UNIX APIs.

However, as I already pointed out, none of this is relevant to the discussion. The closed source driver issue is just one of the reasons that Apple does not use nVidia GPUs; it is not related at all to why they do not use AMD or other GPUs.
 
Leaving aside the fact that this discussion is completely irrelevant to the topic: if they were to look at the GPL code, they would risk infecting any code developed by those people. That is why companies do "clean room" re-implementations.

Then they could just look at the provided documentation and develop their own code, or just open source the specific code for AMD drivers. It would be support for a third-party driver, after all.
 
I'm not confusing things. I'm saying that if the argument for them ditching Nvidia support is solely that their drivers are closed source (i.e., they want an OPEN alternative), then other companies do open their drivers. So what they really want is sole control of the stack. Blocking Nvidia was just an excuse for that.
Let me try this again. Their reasons for not supporting external GPUs are based purely on their choosing to support a different architecture that has many advantages (and some disadvantages) over the more common discrete GPU architecture. Their reasons for not supporting nVidia for the last 11 or more years are based on a number of factors, just one of which is nVidia's refusal to provide Apple the source to their drivers.
They DON'T HAVE to use a discrete Nvidia or AMD GPU. But they could use those open source drivers as a base and provide eGPU drivers for macOS if they wanted. But that wouldn't let them control the whole stack, and it would allow users to delay upgrading, because they could run a more powerful eGPU once their integrated GPU got outdated.
They do not want to support eGPUs because they want developers to optimize for their Unified Memory Architecture. Supporting other companies' GPUs, either via PCIe or external Thunderbolt-connected card slots, would just make it harder to get developers to do the work needed to get the maximum performance from UMA. Developers are not going to make an investment if they do not feel they have to do so. Look how long it took Microsoft and Adobe to move their code from Carbon to Cocoa.
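
As a rough illustration of the per-architecture work being described (a hypothetical helper, not code from any shipping app), a Metal renderer that supports both discrete GPUs and Apple's UMA ends up maintaining two upload paths; MTLDevice's hasUnifiedMemory property is the real API for telling them apart:

```swift
import Metal

// Hypothetical helper showing the two upload paths a cross-architecture
// Metal renderer has to maintain. On UMA hardware one shared allocation
// suffices; on a discrete GPU the data must be blitted into private VRAM.
func makeVertexBuffer(device: MTLDevice,
                      queue: MTLCommandQueue,
                      data: [Float]) -> MTLBuffer? {
    let length = data.count * MemoryLayout<Float>.stride
    if device.hasUnifiedMemory {
        // Apple Silicon path: CPU and GPU share the allocation directly.
        return device.makeBuffer(bytes: data, length: length,
                                 options: .storageModeShared)
    }
    // Discrete GPU path: stage in CPU-visible memory, then blit into VRAM.
    guard let staging = device.makeBuffer(bytes: data, length: length,
                                          options: .storageModeShared),
          let vram = device.makeBuffer(length: length,
                                       options: .storageModePrivate),
          let cmd = queue.makeCommandBuffer(),
          let blit = cmd.makeBlitCommandEncoder() else { return nil }
    blit.copy(from: staging, sourceOffset: 0,
              to: vram, destinationOffset: 0, size: length)
    blit.endEncoding()
    cmd.commit()
    cmd.waitUntilCompleted()
    return vram
}
```

On Apple Silicon the discrete path is dead code; collapsing that kind of branching is the sort of optimization work the post says Apple wants developers to commit to.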
My peeve here is that people blame Nvidia for Apple discouraging third-party graphics drivers, but ultimately the fault is not Nvidia's. I'm arguing that if THAT were the problem, they could just block Nvidia and support other vendors.
The issues are completely separate. Apple's lack of nVidia support is based on their issues with nVidia (detailed by me earlier in the thread), but their reason for not supporting discrete GPUs has nothing to do with that (nor is it based on a desire to prevent people from upgrading piecemeal; the customers that matter in this space do not do that, they upgrade whole systems).
 
They do not want to support eGPUs because they want developers to optimize for their Unified Memory Architecture. Supporting other companies' GPUs, either via PCIe or external Thunderbolt-connected card slots, would just make it harder to get developers to do the work needed to get the maximum performance from UMA.

And how is that working out?
I'm sure Metal is a big hit!
 
Then they could just look at the provided documentation and develop their own code, or just open source the specific code for AMD drivers. It would be support for a third-party driver, after all.
Again, completely irrelevant. Their reasons for not supporting discrete GPUs have nothing to do with driver source code, nor has anyone said they did. That discussion is only related to Apple's lack of support for nVidia GPUs for the last decade.
 
And how is that working out?
I'm sure Metal is a big hit!
Blackmagic Design, Maxon, OTOY, and many others seem to think it is working out very well. We will not know for sure for a few more years, but everyone that has really optimized for the architecture has seen huge performance gains.
 
Blackmagic Design, Maxon, OTOY, and many others seem to think it is working out very well. We will not know for sure for a few more years, but everyone that has really optimized for the architecture has seen huge performance gains.

Surely it works well. But developers would port things to Metal with or without eGPU support, since it's a niche product anyway.

No one in their right mind would REQUIRE you to buy an eGPU just to run video editing software or an image editing program. If Apple supported them, we would just have more... gasp... OPTIONS!
 
Surely it works well. But developers would port things to Metal with or without eGPU support, since it's a niche product anyway.
Sorry, that is not true. It took Adobe years to port to some of Apple's new APIs, despite their apps being much slower and less stable as a result. It was not until Apple forced their hand by deprecating those older APIs that Adobe finally ported. More importantly, this is not just about porting to Metal; it is about optimizing for the Unified Memory Architecture. That requires much more work, and if there were an easy way for them to avoid it, many would take it (again, we can simply look at the real examples of major companies like Adobe).
No one in their right mind would REQUIRE you to buy an eGPU just to run video editing software or an image editing program.
Actually, many companies did require that one have either an eGPU or a discrete GPU to run their software.
If Apple supported them, we would just have more... gasp... OPTIONS!
No, we would have inferior products, as people would not do the work to really support UMA.

I am curious, since you are so invested in this topic: do you have a 2019 Mac Pro, and if not, what was the last Pro desktop you purchased from Apple? Do you use a GPU-accelerated workflow professionally?
 
If Apple really wanted, they could reverse engineer and even write drivers for Nvidia cards. Would they have the same performance as drivers written by Nvidia themselves? Probably not, but if the open source community can do it with far fewer resources, I don't see why a trillion-dollar company would not be able to.

Of course, this is not a technical issue. It's more about politics + a clash of egos.

Who said anything about Asahi or NVIDIA just now?
Erm ... you did? I brought up Asahi to make a point about reverse engineering.

I assure you, it is definitely a technical issue.

I'm talking about AMD. I'm telling you their Linux drivers are fully documented and open source.
The Linux community nowadays prefers AMD GPUs for this very reason.
If Apple really wanted, they could partner with AMD instead and use their open source drivers and documentation as a base.

Or they could instead go with Intel Arc, whose drivers are also open source.
It seems you still think relying on open source code is a safe business model for Apple, or for any commercial business. Well, apparently Apple does not. Apple contributes to open source projects, but I'm not aware of Apple relying on open source projects to drive their business unless they are in control.

I would think AMD releases full technical specifications of their GPUs and shares GPU driver code with Apple. There's no need for Apple to use the open source AMD drivers, which will typically be one or more generations behind in graphics chipset support. Not a good business model if you ask me.
 
I'm not clear on what you are saying here. Can you clarify? You want Windows to make a big change to impact the industry, and ARM code for rebuilt legacy apps ... won't be under Microsoft's hold? It never has been; anyone can compile code for Windows for any supported compile target. Right now that's ARM, x86, and x86_64; at one time or another it was also Itanium, PPC, DEC Alpha, MIPS.... I clearly don't understand your point. Please rephrase.

Taking what you've stated here, you're right if you focus on development teams such as those behind Adobe's suite of applications: Illustrator, Photoshop, InDesign, etc. They'll develop for any platform. Yet if you've been paying attention, their ARM-based apps have been lagging, initially on iOS/iPadOS in terms of features/functionality, which allowed a lot of hungry new developers to fill the holes they missed. They have since improved over the last decade.

Although if you look at apps such as LimeWire (the file-sharing apps of old), and many apps such as Microsoft's ADUC and other infrastructure or utility apps, you'll notice there isn't a proper equivalent, and where one exists it has a very different feature set beyond the basics.

Looking at developer teams that are primarily or solely Apple-platform-first, such as Cultured Code, which existed initially on Mac OS X (and prior): they jumped to iOS very early on and have continued to offer incredible performance and rapid bug fixes, with no issues adapting their code for Apple Silicon, universal binary or not. Such devs or dev teams will continue to offer great solutions.

However, if we look at devs/dev teams that primarily focus on x86-64 using .NET/C++/Python/Visual Studio, they are usually not too interested, lag behind, or don't even bother making their own apps, or alternative apps, for ARM. Examples are CRM apps like Maximizer or Dynamics 365, or even S&P's Capital IQ and Bain for Excel add-ins. There was a decade-old need for Microsoft's Office for macOS team to bring long-requested features such as creating PSTs, but they never have.

Sure, I'm citing mostly apps or add-ins for Windows that are used in a corporate environment, but that doesn't mean they're not wanted or needed on the ARM platform. Yet this is exactly what I'm stating is going to be very different in the future, possibly or ideally.
 
Taking what you've stated here, you're right if you focus on development teams such as those behind Adobe's suite of applications: Illustrator, Photoshop, InDesign, etc. They'll develop for any platform. Yet if you've been paying attention, their ARM-based apps have been lagging, initially on iOS/iPadOS in terms of features/functionality, which allowed a lot of hungry new developers to fill the holes they missed. They have since improved over the last decade.

Although if you look at apps such as LimeWire (the file-sharing apps of old), and many apps such as Microsoft's ADUC and other infrastructure or utility apps, you'll notice there isn't a proper equivalent, and where one exists it has a very different feature set beyond the basics.

Looking at developer teams that are primarily or solely Apple-platform-first, such as Cultured Code, which existed initially on Mac OS X (and prior): they jumped to iOS very early on and have continued to offer incredible performance and rapid bug fixes, with no issues adapting their code for Apple Silicon, universal binary or not. Such devs or dev teams will continue to offer great solutions.

However, if we look at devs/dev teams that primarily focus on x86-64 using .NET/C++/Python/Visual Studio, they are usually not too interested, lag behind, or don't even bother making their own apps, or alternative apps, for ARM. Examples are CRM apps like Maximizer or Dynamics 365, or even S&P's Capital IQ and Bain for Excel add-ins. There was a decade-old need for Microsoft's Office for macOS team to bring long-requested features such as creating PSTs, but they never have.

Sure, I'm citing mostly apps or add-ins for Windows that are used in a corporate environment, but that doesn't mean they're not wanted or needed on the ARM platform. Yet this is exactly what I'm stating is going to be very different in the future, possibly or ideally.
Sorry; I still don't think I understand what you are saying. You're saying... devs aren't making Windows apps for ARM? But macOS developers did make iOS apps? Is that the key point? Several of those paragraphs, the last in particular, are not at all clear. I'm sorry.
 
And of course, nothing is stopping Apple from shipping drivers (proprietary, third-party, or otherwise) to support using GPUs purely as compute accelerators, with nothing to do with video out. I don't think they will, but they certainly could; there is basically zero chance of it being part of macOS's graphical display pipeline.
 
Sorry; I still don't think I understand what you are saying. You're saying... devs aren't making Windows apps for ARM? But macOS developers did make iOS apps? Is that the key point? Several of those paragraphs, the last in particular, are not at all clear. I'm sorry.
Try reading without your bias. Remember I came at this with a corporate focus.
 
nVidia used to be the primary GPU vendor to Apple, but ended the relationship so badly that Apple has said it would never work with them again.

I see this mentioned in the forums here and there without anything to back it up. I know Apple doesn't work with Nvidia, but I do recall the 2008 MacBook having great performance and graphics versus the industry by using Nvidia's southbridge chipset.

When, where, and how did this relationship get so vicious between them?!
 
I'm sure they could if they really wanted to. It would be a good compromise, especially if you still need Boot Camp.
They could also make Windows computers if they wanted to. It would be just as good a compromise for those who wanted to run Windows.

Apple has transitioned to Apple Silicon. The fastest way to get software companies to support it is to make it clear that is the only option for running macOS. I am not sure why this is so hard for people to get.
 
They could also make Windows computers if they wanted to. It would be just as good a compromise for those who wanted to run Windows.

Apple has transitioned to Apple Silicon. The fastest way to get software companies to support it is to make it clear that is the only option for running macOS. I am not sure why this is so hard for people to get.

Except it is not. It can also run other operating systems (although you cannot set those operating systems as the primary operating system, and cannot erase MacOS).
 
Except it is not. It can also run other operating systems (although you cannot set those operating systems as the primary operating system, and cannot erase MacOS).
I have no idea to what the "it" in that sentence refers, nor do I understand what point you think you are making in your second sentence.

However, I will try once again to explain as simply as possible: Apple has made a transition to Apple Silicon. They may at some point in the future decide to make some other transition. They have already done this five times before:
  1. From Motorola 68K chips without MMUs to those with them.
  2. From 68K chips to PowerPC chips.
  3. From Mac OS 9 to Mac OS X.
  4. From PowerPC chips to x86.
  5. From x86 to x86_64.
Each time they made the transition in one direction. The clearer it was that the transition was irreversible, the faster software companies supported the new architecture. The transition to Mac OS X took much longer than it should have because they kept coddling developers with technologies like Carbon. I expect they will not make that mistake again.

There is only one architecture that they will support - their Apple Silicon’s Unified Memory Architecture, until they decide to make another transition.

It is very possible that this new direction will not suit every previous Mac user, nor will every niche software market be served. However, as a user of these systems, I am happy with their new direction and want this transition to happen as fast as possible. For me, the best way to get very high end Apple Silicon systems is to have applications that would take advantage of them. The way for that to happen is for software vendors to adopt this architecture.
 
Except it is not. It can also run other operating systems (although you cannot set those operating systems as the primary operating system, and cannot erase MacOS).

Not anymore. With the transition to Apple Silicon, the newer M1 and later Macs can no longer run Windows natively the way the Intel Macs could. There is no Boot Camp anymore for those Macs.
 