
iluvmacs99

macrumors 6502a
Original poster
Apr 9, 2019
Hi everyone,

I plan to keep my Mac Pro 5,1 on High Sierra permanently, because much of my software is compatible with HS but not with Mojave. I also use VideoProc to assemble quick 560p, 720p and 1080p video clips into a single movie when I don't feel like firing up DaVinci Resolve or iMovie, and it has worked great so far. Sometimes I farm the job out to my MacBook Air and use Quick Sync, which is quicker than the Mac Pro's CPUs even for 4K H.264 clips. This year, though, I'll be experimenting with using VideoProc to transcode 4K HEVC 10-bit (H.265) video down to 1080p, but I have no hardware acceleration for this yet. CPU transcoding is painfully slow on both my Mac Pro and my MacBook Air. I am aware that running Mojave would let me activate the AMD hardware accelerator on my RX 580 and have VideoProc use it for hardware transcoding. But I would prefer to keep the RX 580 and have it coexist with an Nvidia Pascal card, using the Pascal purely as a GPU transcoder. Right now, the RX 580 and an Nvidia GT 120 coexist wonderfully and my current apps don't conflict with this setup. I assume that replacing the GT 120 with a Pascal card would give me the same compatibility, plus HEVC hardware encoding capability.
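For reference, the CPU-only job described above can be sketched with FFmpeg (filenames here are hypothetical, and this assumes an ffmpeg build with libx264); this is the software path that is painfully slow on these machines:

```shell
# Software-only 4K HEVC 10-bit -> 1080p H.264 transcode (CPU decode + CPU encode).
# scale=1920:-2 keeps the aspect ratio and forces an even frame height.
ffmpeg -i clip_4k_hevc10.mp4 \
       -vf scale=1920:-2 \
       -c:v libx264 -preset medium -crf 20 \
       -c:a copy \
       clip_1080p.mp4
```

A 3840x2160 source scaled to width 1920 lands at 1080 lines, so the `-2` divisor is just a safety net for odd source dimensions.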

Which brings me to the Nvidia Pascal cards, which do support HEVC 10-bit hardware transcoding. I'm wondering if any of you here with a GTX 1050 Ti, GTX 1060, GTX 1070, GTX 1080 or 1080 Ti can share whether the Pascal card gets recognized by VideoProc and used as a hardware accelerator. How about the GT 1030 and the GTX 1050? They are smaller-form-factor cards that don't need the extra power draw of their bigger brothers, and therefore don't need the Pixlas mod on my Mac Pro.

Thank you.
 
if anyone of you here who has a GTX1050 Ti, GTX 1060, GTX 1070 and GTX 1080 and 1080Ti can share some experiences if the pascal card gets recognized by VideoProc and get used as a hardware accelerator?

Only in Windows, not in macOS.

If all you want is HWAccel, and not necessarily VideoProc, then Linux also works.
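On Windows or Linux with Nvidia's driver and an NVENC-enabled FFmpeg build, the same job can run on the Pascal card's dedicated engines; a minimal sketch (input filename hypothetical):

```shell
# Decode on the GPU (NVDEC via -hwaccel cuda), scale on the CPU,
# then encode on the GPU (NVENC). Needs the Nvidia driver plus an
# ffmpeg build compiled with NVENC support.
ffmpeg -hwaccel cuda -i clip_4k_hevc10.mkv \
       -vf scale=1920:-2 \
       -c:v h264_nvenc -b:v 8M \
       -c:a copy clip_1080p.mp4
```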

Nvidia never released NVDEC / NVENC support in macOS.
Also, please double-check that your GT120 + RX580 combo is really working flawlessly:

1) Connect the monitor to the RX580
2) Boot High Sierra
3) Open the Photos app
4) Select one photo and double-click it

I don't know if Apple has fixed the issue yet. But if the bug is still there, the photo will become transparent. This is specific to the GT120 + RX580 combo; if you remove the GT120, the RX580 displays the photo correctly.
 
The bug is there if I have a monitor attached to the GT120, or even just a DVI-to-VGA adapter plugged into the DVI port with no monitor behind it. If I leave the GT120 alone with no adapter or monitor attached to its ports and only use the RX580, then it's perfectly fine. Same with the other apps.

Which is the reason why I was hoping to use an Nvidia GTX card only as a headless GPU, specifically for hardware transcoding from one file format to another, and wondered if anyone had tried doing so. When I had DaVinci Resolve 15, it saw both cards and allowed me to use both, but because 15 no longer supported the GT120, it crashed the software. Resolve 16 fixed this by no longer showing the GT120. But from what you said, it seems my plan won't work.
 
@iluvmacs99 - this is exactly what I want clear guidance on, but I've never been able to establish (despite the length and breadth of this extraordinarily valuable forum) whether an NVIDIA GPU can deploy its tentacles through macOS to seamlessly engage functions such as 'hardware accelerated encode/decode' as and when required.

Fortunately - it seems @h9826790 has just given a short but precise answer to our question.

Unfortunately - by confirming that NVIDIA GPUs are apparently invisible to the rest of the blindingly well-engineered and integrated architecture of the cMP (*EVEN WITH* carefully curated hardware drivers diligently supplied by NVIDIA and blessed by Apple) I find myself AGAIN asking an old question..

WHAT, in the name of Edward R. Vedder, is the point of buying an expensive NVIDIA or AMD video card beyond 'Display Output Ports' and 'Screen Resolutions'?

As you correctly pointed out - a GT 1030 with Pascal architecture will serve up 4K@60Hz to my giant LCD display every day of the week - and it only takes up one x4 slot in the heart of the beast (with almost no power draw).

For whatever reason - there has been materially inadequate discussion on the topic: what [else] does a GPU do?

I think the answer is something like..
1) it produces video output on one or more displays.
2) it binds like a symbiotic parasite to the heart and soul of your system and triples the speed of certain, numerically complex digital production outputs by harnessing the power of 'parallel processing' and 'compute units'.

Sadly - as near as I can tell - purpose number two is mostly myth as far as Mac users are concerned.

I'd love to see someone document/itemise the discrete verifiable benefits of owning a GTX 1080 over a GT 1030 (I've been thinking of buying a 1080 after all).

This is what a discrete verifiable benefit sounds like:

- With a powerful graphics card (such as the GTX 1080) the speed at which Gemini can run its de-duplication algorithm is increased by a factor of 3.7

- With a powerful graphics card (such as the GTX TITAN) the refresh rate when reviewing iPhone photos (image files) using 'Quick Look' in finder falls from 0.22 secs to 0.09 secs
- With a powerful graphics card (such as the VEGA 64) you can use screen recording software (Capto, Video Proc, Captivate, Screen Capture Pro, Camtasia) and it won't slow down everything on screen like your mouse pad is made of molasses

Just tell me *ONE* other thing these expensive, PCIe lane hogging, hotter than the sun, pixel managers actually do!

PS - I am on High Sierra forever* because I can choose and use GPU solutions from Team Red and/or Team Green (after I work out what the hardware they manufacture actually does).

*which I regard as a gift to (or privilege conferred upon) the final remaining macOS workflow-dependent professionals before they handed over all future development to the C team in the iOS division.
 

@zedex Actually, purpose #2 is not a myth. There are applications that will use the GPU as a rendering node or compute device. That was the main reason I got a used Mac Pro instead of a Mac Mini 2018 Core i3: with the Mini I would also have needed to buy an eGPU to take advantage of my paid copies of Topaz, DxO and DaVinci Resolve.

DaVinci Resolve is one of the video applications I use that can use more than one GPU to render the final video and motion graphics, and I know of a company that custom-built Mac Pro 5,1s with more than one GTX 1080 Ti to do just that under Resolve, beyond merely driving displays. The reason I got the Mac Pro was so I could use the RX-580 for AI rendering of digital photos with the Topaz AI products, and for motion graphics and color grading in Resolve.

The Topaz applications (several of them) can use either my AMD Radeon RX-580 or my Nvidia GT-120. To select one, I just connect it as my main display driver and the OS defaults to that GPU. I tested both the RX-580 and the GT-120 and both worked, though the RX-580 renders all my digital images much, much faster than the lowly GT-120. I also know from the Topaz forum that someone else used a GTX 1080 Ti as the GPU of choice in his Mac Pro, and that worked for him too. So to me, any CUDA-supported application will make use of the GPU for more than just display.

VideoProc actually defaults to Nvidia: when I launch it, it only sees the GT-120 and not the RX-580, and because the GT-120 is not a Kepler card, it won't work. In fact, VideoProc's requirements for Mac list the GT 640, 650, GTX 680 and GTX 780, all of which were installed in older iMacs (albeit in their mobile versions) and are Kepler based.
Digiatry (maker of VideoProc) did confirm that the GTX 680 "WILL" do hardware transcoding via VideoProc, which led me to believe that as long as Apple has native drivers for the Kepler GPUs, CUDA transcoding should work. But I don't have a GTX 680 to confirm Digiatry's claim, and I don't want one anyway, since the Kepler cards can't do HEVC. Only the Pascal cards can.

The GT 1030 is the only Pascal card that does not support hardware encoding (it has no NVENC), so you can only use it as a display GPU. The minimum card would be the GTX 1050 or GTX 1050 Ti. But if you have a GT 1030, you can download a free trial of VideoProc to see whether it detects the card and allows hardware transcoding or not.
 
Digiatry (maker of Video Proc) did confirm that the GTX-680 card "WILL" do hardware transcoding via VideoProc and so this led me to believe that as long as Apple had native drivers for the Kepler GPU cards, then the CUDA transcoding should work.

100% won't work in macOS.

CUDA won't work with the native driver. CUDA only works with the web driver.

I doubt the CS at Digiatry really know what they are talking about. Or maybe they simply assume it works in macOS like it does in Windows.

Anyway, I have a 1080 Ti, and I can tell you that VideoProc 100% cannot use it in macOS.

Also, VideoProc may not be able to show the graphics card that does HWAccel on the cMP.

e.g. I activated my Radeon VII's HWAccel on my cMP in Mojave, and as you can see, VideoProc can't show any GPU on this page. But HWAccel is actually working properly.
[Screenshot: FULL HWAccel Mojave.png]


And when I use it to convert videos (hardware transcode), it shows that I am using Intel Quick Sync. That's obviously wrong.

IMO, this software was simply never optimised for the cMP, because the cMP never supported HWAccel natively. I can't see anything special in this software so far; it just uses VideoToolbox (the OS-level HWAccel framework) to convert videos. It has no idea which GPU it is using. VideoToolbox takes full control of that.

Same as Compressor, HandBrake, FFmpeg... or any other software. All hardware encoding is done via VideoToolbox. If VideoToolbox uses Quick Sync, then Quick Sync. If VideoToolbox uses VCE, then VCE. If VideoToolbox uses the T2, then the T2. The software has no control in this area, and doesn't even know which GPU is doing the job.
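The same point shows up on the command line: FFmpeg's VideoToolbox encoders just hand frames to the OS, which picks whatever engine it has; the application never selects a GPU. A sketch, assuming an ffmpeg build with VideoToolbox enabled and hypothetical filenames:

```shell
# Hardware transcode via the OS: VideoToolbox chooses the engine
# (Quick Sync, VCE, T2...); the application cannot pick a GPU.
ffmpeg -hwaccel videotoolbox -i clip_4k_hevc10.mp4 \
       -vf scale=1920:-2 \
       -c:v h264_videotoolbox -b:v 8M \
       -c:a copy clip_1080p.mp4
```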

You are not the first person I've seen who hopes or believes VideoProc can do hardware transcoding on the cMP (without my hack to activate HWAccel). And I'm sure you won't be the last.

IMO, their TechSpec page (and maybe even the CS) is quite bad at stating the actual support per OS. e.g. according to their page, we could even have hardware transcoding in Mountain Lion with a HD7950 installed. I am 100% sure that won't work. There is simply no HWAccel support for HD7xxx cards at the OS level. And software can never do more than the OS allows it to do.
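One way to see what the OS actually exposes, rather than what a vendor page claims, is to ask FFmpeg which VideoToolbox encoders it registered; if the OS offers no hardware encoder, nothing useful shows up:

```shell
# List the OS-level hardware encoders visible to ffmpeg on macOS.
# An application can only ever use what appears here.
ffmpeg -hide_banner -encoders | grep videotoolbox
```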

In fact, I wonder: if you asked them "Can I use VideoProc to hardware transcode a video with a GTX 680 in OS X Snow Leopard?", they might also say "will" (or "should", etc.).
 
@h9826790 Perfect! That's exactly what I needed to know. Thank you very much for sharing your experiment with the GTX 1080 Ti and VideoProc, which confirms that what you said about macOS and CUDA is precisely correct.

Thank you.
 