
William Payne
macrumors 6502a
Original poster
Jan 10, 2017
Wanganui, New Zealand
I've noticed there is very little info out there about people's use of workstation-grade cards in their Mac Pros, and another poster suggested I start a thread, so here I am.

I do not want a debate about why to use a workstation card over a GeForce or AMD equivalent. I just want somewhere people can talk about what they are using, why they made that choice, and how the card is working for their needs.

By workstation cards I mean Nvidia Quadro cards and their AMD equivalents.
 
Mr. Payne, I'm thinking the combination of:

- High Sierra
- eGPU
- pro AMD cards in the iMac Pro
- a modular Mac Pro
- Nvidia supporting the Mac
- ARKit
- the machine learning push

will see Mac users returning to pro cards in greater, albeit still relatively small, numbers. Even then, I bet most discussion will happen on support boards for AutoCAD, Blender, TensorFlow, etc.

Unfortunately, most of the people doing interesting stuff with GPUs on the Mac right now are probably not enthusiasts, and they're definitely not posting about it here. Their IT departments will test some exotic expansion gear from whichever vendor, and if it works, they buy more to put into production. NDAs and the like mean a lot of this groundbreaking work doesn't make it to the general forums. Frankly, I'd expect this area to remain pretty quiet until official eGPU support lands in the mainstream OS.

Just for the sake of discussion, I use my Macs professionally in many capacities, password cracking being the most GPU intensive. For this, pro cards are prohibitively expensive.

Might not hurt to start off with what you use your pro cards for. Also wouldn't mind seeing a poll - how many folks are using professional cards? Why or why not?

Good thread.
 
Since getting my Mac Pro I had been fully intending to put a Pascal Quadro in it and leave the stock GT 120 in place as well. However, this morning I had a "why didn't I think of this earlier?" moment: I could try to find a Mac Edition K5000 to replace my GT 120, and then if I need "more power!" I can just put single-slot Pascal cards in the second 16x slot.
 
The reason the GT 120 is a good "boot screen card" is that it's slot powered. That leaves both mini 6-pin connectors free to drive the really powerful card.

If you replace it with a K5000, the second card's power supply may be affected. (Of course, there are lots of workarounds, e.g. an extra PSU, SATA power, the Pixlas mod, etc.)
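To put rough numbers on that, here is a quick sketch of the classic Mac Pro's GPU power budget. The 75 W slot and 75 W per mini 6-pin figures are the usual PCIe conventions, and the K5000 needing a single 6-pin is its published spec, but nothing here is measured, so treat the arithmetic as an assumption:

```swift
// Rough classic Mac Pro GPU power budget (assumed figures:
// ~75 W from a PCIe slot, ~75 W from each of the two mini 6-pin aux connectors).
let slotWatts = 75
let miniSixPinWatts = 75
let auxConnectors = 2

// A slot-powered boot card (GT 120) draws only from its own slot,
// so both aux connectors stay free for the main card.
let budgetWithGT120 = slotWatts + auxConnectors * miniSixPinWatts   // 225 W
print("Main card budget with a slot-powered GT 120: \(budgetWithGT120) W")

// A K5000 boot card needs one of the aux connectors for itself,
// leaving the main card with its own slot plus a single mini 6-pin.
let budgetWithK5000 = slotWatts + 1 * miniSixPinWatts               // 150 W
print("Main card budget with a K5000 boot card: \(budgetWithK5000) W")
```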
 
Correct, I would have to steal power from elsewhere if required.
 
Also, to break the ice, I'll start with my uses. Whether it makes sense or not, I honestly want these cards for two reasons.

1. Colour accuracy. I know this can be software dependent, but I want to be able to calibrate as well as I can. Some will say "it doesn't matter!" and that is OK; it is a personal choice I am making for my photography.

2. Reliability/error correction. Here is another thing where some will say it does not matter. Most people want "maximum performance!!!" I would like that as well, but within reason: I will gladly sacrifice a couple of minutes of performance if it means the chances of anything going wrong are dramatically reduced. Data corruption is a definite avoid-at-all-costs situation.

What do I do? I'm just a guy trying to transition his photography from hobby to full-time working pro.

I'm also interested in CAD/design, but on a more minor scale; I work in an engineering shop by day, so I get to see the awesome software they use.
 
I have an ASUS RX 480 STRIX OC 8GB with a hacked *.kext!

I didn't want to spend the premium on a K5000 or a FirePro, and support was also an issue.
Besides, I needed a great card NOW (late Dec. '16).

I'm an indie game developer who does freelance motion graphics and dabbles in Fusion 360 in my off time.

The reasoning behind my AMD choice:
Nvidia seems more Windows-oriented, with better DX and CUDA performance than OpenGL/CL.
Lower price.
The ASUS card I bought is rated at 147 W (PCIe slot + 1x aux power = 150 W), which meant I could get a second card if need be. The Nvidia cards were rated at 170+ W (you have to use both aux power cables for that) for the same performance as the AMD card, at double the price.

For photography, ECC memory on the GPU isn't as useful as it would seem.
 
Thanks for joining in. Yeah, ECC on the GPU is more of a nice-to-have feature, but it's not the end of the world if it's not there.

I'm actually unsure whether it would have any useful effect compared to RAID storage; here I'm talking exclusively about pictures. Since your picture is already created on the SLR camera, I would imagine that keeping the file from corrupting on the storage side would be the thing to focus on.

3D modelling, on the other hand, would definitely benefit from ECC memory.

English is not my first language (or second, for that matter), so I'm hard pressed to find the right words to express what I mean here at 03:13 AM.
Hope it somehow makes sense!
I'll be checking back in a couple of hours, after some much-needed sleep.
 
It is definitely important to have a safe storage setup. There are guys out there who do just fine without ECC anything. I don't mind being overkill with this kind of stuff, so hey, if I'm given the option to use it, I'll use it.
 
The engineering shop uses Macs? Which apps?

Sorry, no, the engineering shop I work in uses SolidWorks on custom Windows workstations. However, a customer came in a few months back with SolidWorks running via Boot Camp on a MacBook Pro.

As for Mac-compatible engineering apps, I know Autodesk Fusion 360 is available for the Mac, as well as other Autodesk software. I know of some others, but for the life of me I can't remember the names.
 
I am not going to argue about whether a workstation card is a good choice; that's your choice. But I do want to join the discussion.

AFAIK, colour accuracy, error correction, reliability and the like are all driver related. In Windows, workstation cards have their own driver, which is what makes those differences. But under macOS, most of those cards are not even officially supported. Even when they are (e.g. the K5000), they use the same driver as the gaming cards, which does not provide extra colour accuracy, reliability, etc., and ECC is disabled. This is one of the main reasons why lots of us didn't choose a workstation card: we pay the premium but cannot enjoy any of the benefits. In some cases even the 24/7 support may be useless for us.

However, in terms of hardware, I believe workstation cards are more reliable (less chance of getting faulty hardware). I don't know their failure rate; it's just my guess. Maybe workstation cards have more or less the same failure rate as gaming cards and just come with quick and helpful hardware support from the manufacturer. I really don't know.

Anyway, there are definitely more (powerful) single-slot workstation cards to choose from. Or if you want a card with four DisplayPorts, etc., it's also much easier to achieve that with a workstation card.

So, in terms of hardware, I can see the difference. But in terms of software, there should be no difference, and the software stops the hardware from giving us any benefit under macOS.
 
I don't want to argue anything either, but I had a really long back-and-forth discussion with Nvidia tech support about that. It basically ended with them saying that all Quadro features should work on macOS. I have yet to test it out, however.

The Nvidia driver manager on my Mac Pro does give me the option to turn ECC on if I have an ECC-supported card.

I also discussed the 10-bit-per-channel colour thing with Create Pro in England via online chat (I live in New Zealand). They said that 10-bit works, but they don't offer Quadro cards because demand for them is extremely low.

Lastly, the spec page for the Mac Edition K5000 (not a flashed card, but the genuine Mac version) lists all the standard Quadro features as functional in macOS:

https://www.nvidia.com/content/PDF/data-sheet/NV-DS-QUADRO-K5000-for-Mac-US-NV-LR.pdf

I am also aware that 10-bit-per-channel output was not supported until El Capitan. I don't want this thread to become a big debate, just a place where we can discuss the use of these cards.

Plus, I really want to confirm what in fact works.
 
On Nvidia, only the Quadro line supports 10-bit, so an old Quadro may be the best option for Photoshop (a Mac Quadro 4000 is cheap, and a PC Quadro 4000 is even cheaper).

It looks like the RX 460 and RX 480 support 10-bit output in OS X; I did a search here and found a few people talking about it.
RX 460: https://forums.macrumors.com/threads/help-dell-u2713h-not-30-bit.2031761/#post-24288991
Not the best topic, but in the image in that post you can see he's using an RX 460 and it's giving 10-bit output: "30-bit color (ARGB2101010)".

I think ECC GPU RAM is not a thing for Photoshop; the work is done on the CPU and in system RAM, not on the GPU, and even then ECC system RAM is more of a high-end-scenario thing, from my understanding.

I'm on OS X 10.10 with a GTX 770 4GB, which seems to work fine for the apps I use: CS6 PS/PP/AE, FCPX and Resolve (not that the Adobe apps use the GPU much :mad:). I'll look at the AMD cards next time I get a GPU, especially if I start doing more work in Resolve or in 10-bit colour.

I'd guess the RX 460 or RX 560 are the best-value 10-bit GPUs for Photoshop. (I assume the RX 560 is 10-bit.)
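For anyone who wants to sanity-check their own setup, the "30-bit color (ARGB2101010)" string in that linked post looks like it comes from the System Information display report (system_profiler SPDisplaysDataType in Terminal shows the same data). As a rough alternative, here is a small Swift sketch that prints the depth AppKit reports for each attached screen; I'm not certain NSScreen reflects the deep framebuffer with every driver, so treat it as a hint rather than proof of a working 10-bit pipeline:

```swift
import AppKit

// Print the pixel depth AppKit reports for each attached display.
// 10 bits per sample would suggest a 30-bit framebuffer, but whether that
// matches the real output path depends on the GPU driver, so this is only a hint.
for screen in NSScreen.screens {
    let depth = screen.depth
    let bitsPerSample = NSBitsPerSampleFromDepth(depth)
    let bitsPerPixel = NSBitsPerPixelFromDepth(depth)
    print("Display reports \(bitsPerSample) bits per sample, \(bitsPerPixel) bits per pixel")
}
```

Run it from Terminal with "swift depthcheck.swift" (the filename is just an example).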
 
Is there anyone out there with a Quadro in a cMP who can calibrate their setup and show whether or not they get full 10-bit colour support? That would put the whole thing to rest, yet I can't find an example of it on the internet.
 
In my post I link to an RX 460 that can output 10-bit, so while it is not a Quadro, yes, it can be done.
If I bothered to pull out my Quadro 4000 and upgrade to a newer version of OS X, I'd be able to pull out my X-Rite calibrator and go.

A Google search gives lots of hits:
https://www.google.co.uk/search?q=1...9pWd6HG4n38AeRq6q4CQ#q=10+bit+calibration+osx

Even just on this forum there are a lot of topics on it.

Or, if you're working with video, a Blackmagic PCIe card will also give you 10-bit.
 
Hey, thanks for your reply. Sorry, I saw your link after my post. I will read it and do a search.
 
I've done that before, no problem.

Just want to ask what display you're using, as I'm fairly sure 8-bit displays will not really benefit much from 10-bit output on the GPU, and you may need current calibration hardware for correct calibration.
DisplayCAL is an open-source calibration app that's worth a look, but it's a lot more complex than X-Rite's software:
https://displaycal.net/
Some of the older calibration hardware has problems with newer displays or wide-gamut displays.

Also, you may have to change Photoshop's preferences; from memory it's set up for CMYK press printing or something and defaults to the American settings.
http://www.color-management-guide.com/photoshop-color-settings.html
Worth double checking, as I'm a tad outdated on this, so I'm not 100% sure that site is up to date.
 
I am working towards buying an Eizo ColorEdge monitor.
 
Ah well, that's going to be OK ^^ Eizo do sell displays with a calibrator bundled in, which might be worth it. (If you really want a 10-bit display, skip the 8-bit + 2-bit A-FRC displays.)

The best Photoshop and colour tutorials I've ever seen are by Gry Garness: http://www.grygarness.com/ Some really amazing PDFs, well worth getting.
I think this is the link to the books: https://www.eureka-publishing.com/product-category/ebooks/
https://www.eureka-publishing.com/p...lor-quality-in-digital-photography-workflows/
(This one covers colour workflow the most: https://www.eureka-publishing.com/p...lor-quality-in-digital-photography-workflows/ )

Northern Lights has some info, and it's always worth going to Adobe's forums to ask about workflows and settings (plus their help files).

Found this too:
https://photographylife.com/what-is-30-bit-photography-workflow
 
Thanks, orph. I really appreciate the links.

Well, it looks like the debate over whether it works is over. I hope more workstation card users post and talk about what they are up to.
 
Not on this forum. Here, I only find people who are judging the usefulness of a GPU for everything professional based on gaming benchmarks.

Which can be a little frustrating. I personally have a Windows machine if I want to game, and I have little interest in running Windows via Boot Camp; if I wanted Windows, I never would have bought a Mac in the first place.

I used to game a lot when I was younger. I'm approaching 30 now, and for the last six years I've just lost interest.

I still play the odd game, but what happens is I think "wow, cool game!", so I get it, play it for a day or so, then slowly lose interest and stop.

I'm just not into it anymore.
 