Workstation graphics cards: who is using what? Why? How are things working out?

Discussion in 'Mac Pro' started by William Payne, Jul 14, 2017.

  1. itdk92 macrumors 6502

    itdk92

    Joined:
    Nov 14, 2016
    #26
    Also, the K5000 has insanely bad performance for the price.
     
  2. William Payne thread starter macrumors 6502a

    Joined:
    Jan 10, 2017
    Location:
    Wanganui, New Zealand.
    #27
    I know this is just my personal opinion, but I think the price-to-performance difference between a workstation card and a gaming card is acceptable, factoring in the feature difference.

    Obviously that is if you want/need those features.
     
  3. flat five, Jul 16, 2017
    Last edited: Jul 16, 2017

    flat five macrumors 603

    flat five

    Joined:
    Feb 6, 2007
    Location:
    newyorkcity
    #28
    there's AutoCAD and Alias from Autodesk..

    Rhino is on Mac (which is what I personally use)
    (Grasshopper for Rhino is also on Mac now though still in beta.. still very usable in its current state)

    a few others which may be used in more specialized 'engineering' fields would be Vectorworks, Modo, Cinema 4D, Maya, MoI3D, Blender, SketchUp, and OpenSCAD.

    Apple designers/prototypers use Alias and Rhino.. that said, I imagine they also have Boot Camped machines running NX and Catia and the like, as well as CAM software (which is sorely lacking on macOS).

    ---
    that aside, the GPUs in the upper-end Mac builds are good to great for all of this software.. (like the Radeon Pro 560 in the MBP or the 580 in the iMac).. it's highly unlikely you're going to get better CAD performance using more 'pro' GPUs than these.
    (unless we're talking about something like GPGPU-based rendering or simulations.. in which case more vRAM generally means larger scenes can be computed at a faster pace; the rough sketch just below puts numbers on that)
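
    as a back-of-the-envelope sketch in python (every byte count here is an illustrative assumption, not a measured number), you can see why bigger scenes want more vRAM:

    Code:
        # rough, illustrative vRAM estimate for a GPU-rendered scene
        # all sizes are assumptions for the sake of the example
        def scene_vram_bytes(n_triangles, n_textures,
                             tex_w=4096, tex_h=4096, bytes_per_texel=4):
            # 3 vertices per triangle; position(3) + normal(3) + UV(2) floats, 4 bytes each
            geometry = n_triangles * 3 * (3 + 3 + 2) * 4
            # uncompressed RGBA textures
            textures = n_textures * tex_w * tex_h * bytes_per_texel
            return geometry + textures

        # e.g. 2M triangles plus twenty 4K textures is already ~1.4 GiB,
        # close to the ceiling of a 2 GB card before framebuffers are counted
        print(scene_vram_bytes(2_000_000, 20) / 2**30, "GiB")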
    --- Post Merged, Jul 16, 2017 ---
    how so?

    idk, I don't think I've ever experienced a bit flip when modeling (though I very well could have).. thing is, with modeling at least, you see the results.. if there's an error, you'll very likely notice something went weird..

    further, the more adept designers/engineers/modelers are triple-checking their work.
     
  4. devon807 macrumors 6502

    devon807

    Joined:
    Dec 31, 2014
    Location:
    Virginia
    #29
    I know this is a long shot, but does anyone use a FirePro despite the lack of official drivers? I was wondering if they can be used as accelerator cards in FCPX when paired with a standard GPU (e.g. a GTX-series card or something similar).
     
  5. William Payne thread starter macrumors 6502a

    Joined:
    Jan 10, 2017
    Location:
    Wanganui, New Zealand.
    #30
    For anyone running SolidWorks: I know the PC machines at work benefited from being upgraded to Quadro cards.
     
  6. mpta, Jul 17, 2017
    Last edited: Jul 17, 2017

    mpta macrumors member

    Joined:
    Jan 17, 2013
    #31
    Hi guys, I am using a Quadro 4000 (cMP, 12-core 3.0) in Blender and it's very slow in the viewport in texture mode :(. My GTX 670 in my old Hackintosh was way faster. Also, I never got 10-bit mode working, even with special software and the right cables. I gave up.
     
  7. William Payne thread starter macrumors 6502a

    Joined:
    Jan 10, 2017
    Location:
    Wanganui, New Zealand.
    #32
    You said nMP? So I take it you are running it via eGPU? I wonder if that is affecting your performance.
     
  8. mpta macrumors member

    Joined:
    Jan 17, 2013
    #33
    Of course, my mistake (it's a cMP, not an nMP).
     
  9. William Payne thread starter macrumors 6502a

    Joined:
    Jan 10, 2017
    Location:
    Wanganui, New Zealand.
    #34
    Ah, OK. Looking at the specs of the Quadro 4000 vs the GTX 670, yeah, I wouldn't compare them. The GTX 670 is a newer-generation card than the Quadro 4000; being a Kepler card, it is more comparable to a Quadro K4000. I don't know if you would see any improvement though, as I haven't tried them side by side.
     
  10. SoyCapitanSoyCapitan macrumors 68040

    SoyCapitanSoyCapitan

    Joined:
    Jul 4, 2015
    #35
    Yeah, to confirm what has been said: you don't need ECC for photography, and a consumer-level GPU has more than enough power. Even an Intel iGPU can handle Photoshop and Capture One.
     
  11. William Payne thread starter macrumors 6502a

    Joined:
    Jan 10, 2017
    Location:
    Wanganui, New Zealand.
    #36
    I just want to emphasise that this thread isn't about whether workstation cards are needed or not; it's a place to talk about what people are doing with the cards.

    Plus, on the Nvidia side, if you want 10-bit colour your choices are made for you.
    --- Post Merged, Jul 17, 2017 ---
    For anyone wondering, here is a great piece of info regarding how Adobe Creative Cloud utilises Nvidia GPUs.

    http://images.nvidia.com/content/qu...obePremierPro-SolutionOverview-US-Fnl-WEB.pdf
     
  12. SoyCapitanSoyCapitan macrumors 68040

    SoyCapitanSoyCapitan

    Joined:
    Jul 4, 2015
    #37
    I've been in photography for 4 decades so bear with my ignorance ;)

    Adobe has its own Mercury engine, which is more efficient than plain OpenGL. It supports a few OpenCL features; it isn't accelerated by CUDA.

    I also confirm what others have said about 10-bit output. Barely anyone ever needs it; perhaps only half a percent of global photography has required it, because most imagery (almost everything) doesn't have a colour palette anywhere close to what 10-bit can provide.
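
    To put rough numbers on that (just simple arithmetic):

    Code:
        # colours representable at 8 vs 10 bits per RGB channel
        for bits in (8, 10):
            print(f"{bits} bpc: {(2**bits)**3:,} colours")
        # 8 bpc: 16,777,216 colours
        # 10 bpc: 1,073,741,824 colours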

    We have to bear in mind that most publishing has been done on the Mac, and only a handful of AMD GPUs have supported 10-bit output, starting with El Capitan.

    The advent of HDR video content will speed adoption.

    Here's what the platforms provide:

    - Nvidia's settings offer 8- and 10-bit output on Windows, even on consumer cards three years old.

    - Nvidia doesn't offer 10-bit output on the Mac.

    - A small selection of AMD cards offer 10-bit output on the Mac; most recent cards support it.

    - AMD's 10-bit output option doesn't show in Windows, so I presume it switches automatically.
     
  13. William Payne, Jul 17, 2017
    Last edited: Jul 17, 2017

    William Payne thread starter macrumors 6502a

    Joined:
    Jan 10, 2017
    Location:
    Wanganui, New Zealand.
    #38
    See, this is what I was hoping for: guys like yourself who have been there, done that, and have a lot more experience than me, so I can learn and discuss. Thank you for joining in. I should be the one saying excuse my ignorance.

    It is just very hard to get information.
     
  14. tomscott1988 macrumors regular

    tomscott1988

    Joined:
    Apr 14, 2009
    Location:
    UK
    #39

    I would disagree. I have a 5770 and I'm a photographer. I have a 27" ACD and an older 23" ACD that I use for music/email, and if I have the 23" plugged in, Lightroom is soooo slow. It's pretty slow normally, as it's written so poorly, but it's noticeably faster with one display, given the card's piddly amount of RAM.

    It doesn't seem to matter what machine you use with Lightroom. I also have a high-end i7 Windows machine with a 1070 and it still runs slowly, but a card with 2 GB or more of RAM runs more smoothly if you have higher-resolution monitors like 2K or 4K; 1080p panels are much easier to drive.

    My hesitation about getting a newer card is that none seem to work well out of the box. The Nvidia cards seem to have really poor performance in creative apps because of poor OpenCL/GL performance, so the lower-end ATI cards outperform them in those apps. And the ATI cards don't all have native support, so you're editing kext files etc.; they still aren't natively supported. I use the machine for a living, and not messing with it is the reason I use a Mac.

    Unless you use older cards of both variants, performance gains are minimal in actual usage.

    All the options seem to give limited performance gains relative to their actual power on a 5,1, and the drivers seem pretty poor. Hopefully with High Sierra the RX 400- and 500-series cards from ATI will work natively; then I'll buy one. Hopefully this mining craze will cool off too, meaning prices get back to normal.
     
  15. William Payne thread starter macrumors 6502a

    Joined:
    Jan 10, 2017
    Location:
    Wanganui, New Zealand.
    #40
    It's funny you mention Lightroom speed, as Adobe have just these last few days admitted that Lightroom is painfully slow.
     
  16. SoyCapitanSoyCapitan macrumors 68040

    SoyCapitanSoyCapitan

    Joined:
    Jul 4, 2015
    #41
    Thanks. It's best to go on specialist post-production forums with some hardcore geeks. This forum is a generalised one, and unfortunately many Mac users haven't been able to experience what hardware and software are truly capable of, because Apple is always late in supporting industry standards.

    Monitor-wise, yes, go for an Eizo or NEC with at least 96% Adobe RGB coverage.

    The cheaper 10-bit panels are for 10-bit consumption, which is different from 10-bit or wide-gamut production. DCI-P3 panels aren't suitable for the best prints either.

    The more expensive monitors let you proof for print, digital, etc. You can't soft-proof properly for print on a cheaper high-contrast panel even if it says 'wide gamut'.
     
  17. tomscott1988 macrumors regular

    tomscott1988

    Joined:
    Apr 14, 2009
    Location:
    UK
    #42
    Yeah, let's hope they do something about it. Unfortunately it is CC suite-wide.
     
  18. William Payne thread starter macrumors 6502a

    Joined:
    Jan 10, 2017
    Location:
    Wanganui, New Zealand.
    #43
    This is an old post and I'm not sure if you will read this, but it may help someone. I talked to Nvidia regarding the stock Mac Edition Quadros: the Quadro 4000 for Mac was not a 10-bit card; however, the K5000 for Mac does support 10-bit.
     
  19. Asgorath macrumors 65816

    Joined:
    Mar 30, 2012
    #44
  20. h9826790 macrumors 604

    h9826790

    Joined:
    Apr 3, 2014
    Location:
    Hong Kong
    #45
    Yeah, it's pretty easy to get 10-bit now as long as you have the right GPU and monitor (I am also running 10.13.1).
    [Attached screenshot: Screen Shot 2017-11-10 at 08.11.21.jpg]
    The system may not turn that on by default, but this little free piece of software can fix it with just a few clicks.

    http://resxtreme.com
     
  21. William Payne thread starter macrumors 6502a

    Joined:
    Jan 10, 2017
    Location:
    Wanganui, New Zealand.
    #46
    OK, here is where it gets tricky. 10-bit on a GeForce is via DirectX. Fine for your monitor, basic tasks, and gaming.

    Photoshop does not do 10-bit over DirectX; it does it via OpenGL, which GeForce cards do not support. Only Quadro and AMD pro cards do.

    That is after months of back and forth between Adobe and Nvidia regarding this very subject.

    If you don't use Photoshop or software using OpenGL as the API, then none of this matters.
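
    If you want to check what a given card and driver actually hand you, here is a minimal sketch (assuming Python with the glfw and PyOpenGL packages installed; the bit-depth query is only valid in the legacy/compatibility OpenGL context, which is what GLFW creates unless told otherwise):

    Code:
        # ask the driver for a 10-bits-per-channel framebuffer and report what we got
        import glfw
        from OpenGL.GL import glGetIntegerv, GL_RED_BITS, GL_GREEN_BITS, GL_BLUE_BITS

        glfw.init()
        glfw.window_hint(glfw.RED_BITS, 10)
        glfw.window_hint(glfw.GREEN_BITS, 10)
        glfw.window_hint(glfw.BLUE_BITS, 10)
        window = glfw.create_window(640, 480, "10-bit test", None, None)
        assert window, "window/context creation failed"
        glfw.make_context_current(window)

        # [10, 10, 10] means a 30-bit context; a GeForce on the Mac would
        # typically fall back to 8s here, per the discussion above
        print([glGetIntegerv(b) for b in (GL_RED_BITS, GL_GREEN_BITS, GL_BLUE_BITS)])
        glfw.terminate()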
     
  22. h9826790 macrumors 604

    h9826790

    Joined:
    Apr 3, 2014
    Location:
    Hong Kong
    #47
    Obviously there is no DirectX in macOS. And our GeForce cards support 10-bit colour.

    Anyway, my Photoshop on macOS detected that I am using a GeForce card and let me enable 10-bit colour.
    [Attached screenshot: Screen Shot 2017-11-11 at 06.51.51.jpg]
    TBH, I am very new to this area and don't even know how to test whether my 10-bit setup is actually working. Is there any simple test I can do to confirm your info is right?
     
  23. William Payne, Nov 10, 2017
    Last edited: Nov 10, 2017

    William Payne thread starter macrumors 6502a

    Joined:
    Jan 10, 2017
    Location:
    Wanganui, New Zealand.
    #48
    I have the 30-bit monitor choice selected on a stock GT 120, haha.

    The best way to tell is with a gradient: black to white with all the shades of grey in between. If you get banding, it's 8-bit; if it's a smooth transition, it's 10-bit. (There's a quick sketch below for generating a test ramp.)

    The whole 8-bit versus 10-bit thing can be very confusing. There is also true 10-bit versus 8-bit plus dithering, which produces a false 10-bit.

    It is super confusing.
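
    Here is a quick way to make such a test ramp yourself (a minimal sketch, assuming Python with numpy and Pillow installed; view the PNG in a 30-bit-aware app such as Photoshop with 30-bit display enabled):

    Code:
        # generate a smooth black-to-white ramp as a 16-bit greyscale PNG
        # continuous = 10-bit path working; visible steps = 8-bit somewhere
        import numpy as np
        from PIL import Image

        width, height = 1920, 256
        ramp = np.linspace(0, 65535, width, dtype=np.uint16)  # one row of grey values
        img = np.tile(ramp, (height, 1))                      # repeat the row vertically
        Image.fromarray(img, mode="I;16").save("ramp16.png")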
    --- Post Merged, Nov 10, 2017 ---
    If you want to get even more confused, here is some interesting reading that goes into this topic. Caveat: it is about Windows, but it does explain a bit.

    https://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/12
    --- Post Merged, Nov 10, 2017 ---
    I am going to be doing more research on this, as I am finding colour management to be a really fascinating topic. I am fully planning on getting an Eizo ColorEdge screen sometime in the new year, and have just bought a Quadro P4000 to test this stuff out. I will do a full report sometime next year after I can fully do the research. My current screen is only 8-bit, so I can't really give a proper opinion yet.
    --- Post Merged, Nov 10, 2017 ---
    macOS does support 10-bit and has drivers for it, but that is with the AMD cards; I am not knowledgeable enough to comment on their use in a cMP.

    I just find this a fascinating topic. I would like to eventually test this out with both GeForce and Quadro cards to get a real answer on how it works in the Mac workspace.
     
  24. William Payne thread starter macrumors 6502a

    Joined:
    Jan 10, 2017
    Location:
    Wanganui, New Zealand.
    #49
    That image regarding the Samsung monitors is a head-scratcher, haha; they are not 10-bit monitors.
     