I'm almost sure that the very stable AMD driver is something Apple wrote and maintains itself. For whatever reason, good or bad, Apple is either unwilling or unable to do the same for NVidia cards - it could be NVidia having a license agreement (if you use our cards, you have to use our driver). This would make sense to protect the expensive Quadro line, which are basically binned GeForces with highly stable drivers (mobile Quadros are nothing more than that; some desktop parts differ slightly in real specifications, though never enough to justify their price premiums). [emphasis mine]
If Macs were using GeForces as Quadros, it would give Apple a huge price advantage over HP and other workstation vendors who pay the Quadro premium (and perhaps encourage HP to write a "GeForce as Quadro" driver of its own). NVidia could be saying "we'll sell you GeForces, but we'll only port the gaming driver, and you can't use your own" - if you want a more stable driver, you pay for Quadros. Apple is a big enough fish in AMD's pond that AMD lets Apple use its own driver.
Of course, it could also be Apple simply being lazy! NVidia could have no objection to Apple using any driver they want (and there could be no technical obstacle to a stable Apple driver for GeForces), but Apple might be saying "we already have a nice Mac GPU driver, let's task those developers with designing more Memojis instead of writing an NVidia driver". I wouldn't put it past them.
I don't think that's the issue; it seems to be more of a general conflict between the companies:
https://appleinsider.com/articles/1...in-macos-and-thats-a-bad-sign-for-the-mac-pro
Also, I understand from the above-linked article that the issue isn't that Apple thinks it needs to write the drivers itself in order to get the stability it requires:
"It's not like we have any real work to do on it, Nvidia has great engineers," said one [Apple] developer in a sentiment echoed by nearly all of the Apple staff we spoke with. "It's not like Metal 2 can't be moved to Nvidia with great performance. Somebody just doesn't want it there."
In addition, even if Apple did write the driver, I'm not sure it could turn a GeForce into a Quadro just by providing a Quadro-type driver. In particular, I've read that the differences between Quadros/Teslas and GeForces go beyond binning and driver stability, including:
1) Error correction.
2) Unified memory (memory sharing) and, perhaps relatedly, GPU-Direct RDMA, which I gather allows GPUs to communicate directly with each other without going through the CPU.
3) Much faster interconnect speed (NVLink vs. PCIe).
[see:
https://www.microway.com/knowledge-...of-nvidia-geforce-gpus-and-nvidia-tesla-gpus/ ]
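For what it's worth, the split described in that article can be sketched as a toy feature table. This is a minimal illustration only: the flags below are my reading of the Microway summary, not an official spec sheet, and real support varies by generation and model.

```python
# Toy feature comparison based on my reading of the Microway article above.
# These flags are illustrative, NOT an authoritative spec; actual support
# depends on the specific GPU generation and model.
GEFORCE = {"ecc_memory": False, "gpudirect_rdma": False, "nvlink": False}
TESLA   = {"ecc_memory": True,  "gpudirect_rdma": True,  "nvlink": True}

def feature_gap(consumer, pro):
    """Return the features the pro part has that the consumer part lacks."""
    return sorted(f for f in pro if pro[f] and not consumer[f])

print(feature_gap(GEFORCE, TESLA))
# -> ['ecc_memory', 'gpudirect_rdma', 'nvlink']
```

The point of the sketch is just that each line item above shows up as a gap a driver swap alone may not close - which is exactly the driver-vs-hardware question below.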
So my question is: how much of each of these is due to the drivers vs. the hardware? IIUC, ECC RAM requires different hardware, so I wouldn't be surprised if implementing error correction in GPUs required different hardware as well, which would mean Apple couldn't turn a GeForce into a Quadro simply by supplying a different driver.
Not saying NVIDIA isn't charging a large premium for all this; I just wanted to get an accurate understanding of the differences between the chips, which seem to go significantly beyond what you wrote.
Also, even if one could turn a GeForce into a Quadro just by changing the driver (which doesn't seem to be the case), NVIDIA could still both allow Apple to write the drivers and protect its commercial interests through a licensing agreement that limited those drivers to a GeForce feature set.