
monkeybagel (Original poster)
It has been a good while now since Mojave was released, and unfortunately we are getting the same stonewall answers that Apple is known for. I have Mojave on my two MacBooks, and it is a stable OS, but I don’t see a reason to upgrade my GPU to a supported one, or to wait in anticipation for a driver to be released. My 5,1 is very stable on 10.13.6, and I see very little incentive to upgrade. BTTM (Back to My Mac) was removed, which I use, and I think it’s safe to say that with Apple going out of its comfort zone to make the 5,1 “sort of” support Mojave, this will be the last macOS release for the Mac Pro 5,1. With that said, I can’t see spending money on upgrades just to get the rarely used features in 10.14 while losing an important one.

What are others thinking about this?
 
I have the same dilemma. My 5,1 is working well with a flashed GTX 960 and NVMe SSDs on High Sierra for Adobe, Logic, web dev, and Windows development in Boot Camp. My need is 3x DP displays. I am considering buying an RX 570 to get to Mojave, but there’s nothing I need in Mojave. The question is at what point to replace the machine, and with what :(
 
My 5,1 is very stable on 10.13.6, and I see very little incentive to upgrade.
I can’t see spending the money on upgrades just to get the rarely used features in 10.14

Seems like you've answered your own question...

I'll stick with 10.13.6 (or 10.13.X) for as long as I can, but I do have an RX580 on standby in case I am forced to upgrade for client compliance. If you do not "NEED" to make a decision about your future today, then stick with what you're doing - it clearly is working for you.
 
The question is at what point to replace the machine and with what

The same question that MacPro users have been asking for years. Many leave the platform. Others go hackintosh. Others just wait and cross fingers. As soon as what you have no longer works for your needs, it's time to move on. If you have not reached that point right now, you have time to make a decision.
 
It has been a good while now since Mojave was released, and unfortunately we are getting the same stonewall answers that Apple is known for. I have Mojave on my two MacBooks, and it is a stable OS, but I don’t see a reason to upgrade my GPU to a supported one, or to wait in anticipation for a driver to be released. My 5,1 is very stable on 10.13.6, and I see very little incentive to upgrade. BTTM (Back to My Mac) was removed, which I use, and I think it’s safe to say that with Apple going out of its comfort zone to make the 5,1 “sort of” support Mojave, this will be the last macOS release for the Mac Pro 5,1. With that said, I can’t see spending money on upgrades just to get the rarely used features in 10.14 while losing an important one.

What are others thinking about this?
From lessons learned way back in OS 9, I don't consider upgrading to the latest macOS until the x.y.3 update; by then the major bugs are (for all intents and purposes) fixed. So I'd hold off making a video card change until then, unless you have a requirement to upgrade to Mojave. I'm still using an R9 280X (MVC updated) on 10.13.6. I have a compatible Sapphire RX 580 Pulse available to use, but so far I haven't needed to upgrade to Mojave. It would be great to have the boot screen with the RX 580, but I'm awaiting 10.14.3+ and RTX 2060 reports. Oh, well...
 
My MVC flashed GTX 1080 is fast, draws low power, and ain't goin' nowhere. The feud is silly and childish. I sold 200 of my shares of Apple last week as my minor mark of defiance. It's gone up since then :oops:

Lou
 
My MVC flashed GTX 1080 is fast, draws low power, and ain't goin' nowhere. The feud is silly and childish. I sold 200 of my shares of Apple last week as my minor mark of defiance. It's gone up since then :oops:

Lou
I hear you. 1080s & Tis are going to serve folks well for a long time. I've emailed and tweeted my displeasure to Nvidia & Apple, signed petitions, etc. The only thing left to do now is keep turning Z-series machines into Hackintoshes. Hey, it's actually saving me money.
 
I finally built my new 9900k system and I'm not looking back... Had enough of Apple.
It's Windows 10, but I don't have to spend months and months hoping for drivers and compatibility every time.

Going to Hackintosh this thing at some point.
 
My problem is that I have been leveraging the advantages of OS X since 2006, when Apple went to Intel. Although I was a Microsoft Certified Trainer for a while, I personally prefer gestures over a touch screen, and the software-rental model Microsoft is inevitably moving toward is a big turn-off for me. Not to mention installing Windows 10 Enterprise and finding Candy Crush Saga on the Start menu. After Vista didn't do too well and Windows 8 tanked, I was eagerly looking for an alternative, and I have loved OS X since. The UNIX foundation is stable, it has the commercial apps I need, and I can administer ESXi/vSphere and Windows Server easily. The iMac Pro is a fast machine, but Apple does not make a matching monitor that could sit beside it to run dual displays - it would be a different panel and manufacturer, have a different height, etc. This is why I will not buy an iMac. For educational purposes, I looked into what the "hackintosh" was all about in the 10.6 days, and it was so unstable it was nowhere close to production quality - and that doesn't even factor in the EULA.

In those days, you had to purchase off-the-shelf components from the same vendors Apple purchased from, in the hope of getting a stable system. I have not looked at an HP Z Workstation, but it is interesting that they can apparently be made to run OS X fairly easily.

I am guessing that Mojave will get the usual current-minus-two support window like other OS X versions, so the Mac Pro was expensive up front, but cheap and long-lasting in the end. I will cross that bridge when I get there.
 
For educational purposes, I looked into what the "hackintosh" was all about in the 10.6 days, and it was so unstable it was nowhere close to production quality - and that doesn't even factor in the EULA.

In those days, you had to purchase off-the-shelf components from the same vendors Apple purchased from, in the hope of getting a stable system. I have not looked at an HP Z Workstation, but it is interesting that they can apparently be made to run OS X fairly easily.

I am guessing that Mojave will get the usual current-minus-two support window like other OS X versions, so the Mac Pro was expensive up front, but cheap and long-lasting in the end. I will cross that bridge when I get there.
I paid to have my first Z820s Hackintoshed. Curious to see if it was feasible to do (and maintain) this myself, I dragged one home and haven't gotten it to complete installation yet. This weekend I'll cave and make an account at tonymacx86.

The ones at work are flawless.

Really wish relevant parties would come together and give us Mojave drivers. I am absolutely not waiting for it, though, and every month or so that goes by there seems to be a new Z820 on the bench; once Hackintoshed they're functionally what the cMP should have been upgraded to.
 
I always assumed the main reason Apple favour AMD is that they give them strong volume discounts on GPUs, whilst Nvidia, being the market leader, won't. AMD tend to win games console contracts too, perhaps for the same reason. If Apple is choosing between equivalent GPUs for an iMac (e.g. an RX 580 or GTX 1060), they'll go for the one that costs them £50 less in parts every time. The spoils of being the sole provider of hardware for a platform.

I also read somewhere that Nvidia won't let Apple label the gaming GPUs in Mac / iMac Pros as Quadros, whereas AMD are OK with Apple calling theirs FirePros. As the drivers are the main difference between consumer and workstation GPUs, and on macOS drivers are bundled with the OS, it gives Macs a boost in perceived value, given how much pro GPUs usually cost.

Nvidia have been the technically superior choice for years, in terms of power efficiency, highest available performance, and their CUDA processing. Yet Apple act like they don't exist, as they tend to do for anything that would cut into their profit margins (hello expandable i7 desktop??).

Speaking of CUDA, perhaps Apple is vetoing it in Mojave, in order to push developers to support Metal 2 instead. Trying to get a new API off the ground is difficult if there's an existing, well-supported alternative. Maybe this is a deal-breaker for Nvidia?
 
Speaking of CUDA, perhaps Apple is vetoing it in Mojave, in order to push developers to support Metal 2 instead. Trying to get a new API off the ground is difficult if there's an existing, well-supported alternative. Maybe this is a deal-breaker for Nvidia?

That is basically the argument many make, but it is NOT yet fully proven that the GTX 10XX (at least 1070+) and/or RTX series GPUs do not meet the Metal2 requirements. Apple officially acknowledges and supports the GTX 680 Mac Edition, which is limited to GPUFamily1v3/GPUFamily1v4 (Metal1) compatibility through Apple's native drivers. Without proper NVIDIA Web Drivers for Mojave, it remains an open question whether any of the newest NVIDIA GPUs are capable of Metal2/GPUFamily2v1. Metal2 currently requires Mojave (10.14+).

FYI, there are fewer than a handful of checkmarks' difference between the published specs for 1v3, 1v4, and 2 in the feature set table:
https://developer.apple.com/metal/Metal-Feature-Set-Tables.pdf
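For anyone curious how those tiers are actually reported, here is a rough Swift sketch (assuming the macOS 10.14 SDK, where the `MTLFeatureSet` cases from Apple's table above are available) that lists each installed GPU and which macOS GPU family levels it claims to support:

```swift
import Metal

// Enumerate every Metal-capable GPU in the machine and report
// which macOS GPU family tiers each one supports.
// GPUFamily1_v3 shipped with 10.13; 1_v4 and 2_v1 arrived with 10.14.
let tiers: [(name: String, set: MTLFeatureSet)] = [
    ("macOS_GPUFamily1_v3", .macOS_GPUFamily1_v3),
    ("macOS_GPUFamily1_v4", .macOS_GPUFamily1_v4),
    ("macOS_GPUFamily2_v1", .macOS_GPUFamily2_v1),
]

for device in MTLCopyAllDevices() {
    // supportsFeatureSet(_:) is how an app asks whether the driver
    // exposes a given capability tier for this GPU.
    let supported = tiers
        .filter { device.supportsFeatureSet($0.set) }
        .map { $0.name }
    print("\(device.name): \(supported.joined(separator: ", "))")
}
```

On a 5,1 running Mojave with a supported AMD card, GPUFamily2_v1 is where Metal2 support would show up; without Mojave web drivers there is no way to run a probe like this against the newer NVIDIA cards, which is exactly the open question.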
 
That is basically the argument many make, but it is NOT yet fully proven that the GTX 10XX (at least 1070+) and/or RTX series GPUs do not meet the Metal2 requirements.

I'm sure the 10/20 series fully support Metal 2 features.

My point was more that if CUDA is available, software such as Redshift and Octane (renderers) that use it exclusively will keep using it. Only if CUDA is not available will such companies be motivated to rewrite their code to use Metal 2. OTOH, I'm not sure how many macOS apps are CUDA-only, and of those, how many will just drop the macOS port instead.
 
I'm sure the 10/20 series fully support Metal 2 features.

My point was more that if CUDA is available, software such as Redshift and Octane (renderers) that use it exclusively will keep using it. Only if CUDA is not available will such companies be motivated to rewrite their code to use Metal 2. OTOH, I'm not sure how many macOS apps are CUDA-only, and of those, how many will just drop the macOS port instead.

Adobe has greatly improved Metal support in CC 2019 video applications, especially when compared to tests with both GTX 1080 FE and RX 580 in CC 2018 applications (versions about 2-3 months before CC 2019's release). It is ALMOST to the point where Metal performance is on par with CUDA performance for many tasks. There are not a ton of other major software companies that have embraced Metal in this way, but Adobe wants to stay relevant on Apple products (both macOS and iOS).

If Adobe could fix the remaining glitches with Metal-based renders (especially with 3rd party plugin integration), it would help a ton. After Effects is nearly OpenCL and Metal only at this point (plus software only), except for specific plugins that can use CUDA or OpenGL. Their ray-traced 3D renderer is basically being phased out (officially deprecated by Adobe). Media Encoder equally supports CUDA, OpenCL, and Metal (and software only). Premiere Pro also supports CUDA, OpenCL, and Metal (and software only).

Do any of the Octane and OctaneRender options support Mojave? Last I knew they were capped at High Sierra (and much older) versions for full compatibility, likely due solely to the lack of a CUDA option. I can see Octane/OctaneRender becoming nearly Windows and/or Linux only in the future. A lot of Autodesk software has already gone that way with the latest versions. If Apple thinks pushing Motion to fill this void on the Mac is even close to an option, they are seriously mistaken. Using CUDA or pushing cloud rendering is the path forward for many of these vendors.
 
From lessons learned way back in OS 9, I don't consider upgrading to the latest macOS until the x.y.3 update; by then the major bugs are (for all intents and purposes) fixed. So I'd hold off making a video card change until then, unless you have a requirement to upgrade to Mojave. I'm still using an R9 280X (MVC updated) on 10.13.6. I have a compatible Sapphire RX 580 Pulse available to use, but so far I haven't needed to upgrade to Mojave. It would be great to have the boot screen with the RX 580, but I'm awaiting 10.14.3+ and RTX 2060 reports. Oh, well...
So what is stopping you using Mojave with your R9 280X? Or using both R9 280X and RX 580 at the same time in Mojave?
 

I had one of those MBPs with an 8600M GT, and like the rest mine went bad. Apple replaced the logic board for free, outside of warranty and with no AppleCare. With two entire generations of 15/17" MBP using this chip, that's a lot of repairs - as well as bad rep, as it made flagship Apple products appear unreliable. As a related article said, it wasn't so much the cost (Nvidia ultimately covered much of that), but that Nvidia repeatedly lied about the scale of the problem and said the MBP would be unaffected.

It's understandable that if a supplier causes huge problems with your top product, then deliberately downplays the scale of the issue, hampering your efforts to resolve it smoothly, you might swear off using them again. But this was over a decade ago, and Nvidia products have been reliable since.

This ultimately represents another aspect of the Apple tax - the downside of one company controlling your platform. Dell may have been pissed off with Nvidia too, but they would just have to swallow it if their customers demanded Nvidia GPUs, or competitors would fill the demand. Apple, by contrast, can hold a grudge forever if (most) customers aren't going to leave the platform over it.

Having said that, it's weird to block customers from using their own Nvidia cards on Mojave - and why now, when they were working fine on High Sierra? This kind of pettiness is indeed worrying when applied to a new modular MP.
 
The AMD dGPUs were faulty in the 2011 MBPs. Each supplier has had some problems. Honestly it seems like any dGPU is a liability on a laptop. I can’t wait until eGPU is a fully functional and cost appropriate option.
 
I have the same dilemma. My 5,1 is working well with a flashed GTX 960 and NVMe SSDs on High Sierra for Adobe, Logic, web dev, and Windows development in Boot Camp. My need is 3x DP displays. I am considering buying an RX 570 to get to Mojave, but there’s nothing I need in Mojave. The question is at what point to replace the machine, and with what :(

It would be advisable not to buy an RX 570 and instead stick with Apple's recommended ones. There are reports on this forum indicating the RX 570 is not compatible with OS X.
 
The AMD dGPUs were faulty in the 2011 MBPs. Each supplier has had some problems. Honestly it seems like any dGPU is a liability on a laptop. I can’t wait until eGPU is a fully functional and cost appropriate option.

eGPU is available for TB3 with AMD GPUs in either expansion boxes or via closed solutions like those from Blackmagic Design that incorporate docking station and port expansion style functions. That is not the solution many WANT but it is available in some form...

Regardless, this thread seems to be getting way off topic from NVIDIA 9XX series and Mojave at this point.
 
Regardless, this thread seems to be getting way off topic from NVIDIA 9XX series and Mojave at this point.

True, though there's no news on Nvidia Mojave drivers. Basically, they're not coming out, so it's a case of stick with High Sierra or buy an RX580 / Vega.

I think the frustration here is that although AMD GPUs are fine, and sometimes great, being a Mac user means being denied access to some of the best GPUs on the market. Rather than just make the best machine possible, even when charging £5000 for a base iMac Pro, Apple are like 'AMD are good enough for you; our corporate politics / profit margins take priority'.
 
Basically, they're not coming out, so it's a case of stick with High Sierra or buy an RX580 / Vega.

APPLE officially supports an extremely limited number of GPUs for the Mac Pro 5,1 with Mojave. The decision and direction are really simple: either use what APPLE wants/recommends, or stick with your High Sierra setup. It's not the answer 90% of the people on this forum want, but it's reality.

Until there is actual clarity on the 7,1, most people appear to be willing to kick the can for now. There are rumors of a class-action lawsuit against Apple for anticompetitive practices related to this. Let's see if this NVIDIA web driver issue is magically resolved before that happens.
 
There are rumors of a class-action lawsuit against Apple for anticompetitive practices related to this. Let's see if this NVIDIA web driver issue is magically resolved before that happens.
I'd seriously like to see that happening.

The fact alone that something like that has to be considered says a lot about what kind of people run Apple.

So sad! (Oops. Did I really just paraphrase the one that shall not be named? )
 
There are rumors of a class-action lawsuit against Apple for anticompetitive practices related to this.

I doubt that would work. What's next, insist Apple use Ryzen CPUs? No one's forcing anyone to use a Mac. The Mac Pro is essentially vintage at this point, so it's unlikely Apple would be forced to support it. They'd probably just block 10.15 from running on it instead. eGPUs are another matter I guess, but they're probably 0.0001% of the user base.

Despite what I've written elsewhere in this thread, I really am perplexed by Apple. I don't know why they can't just be nice to (a niche of) their users. Preferably support Nvidia hardware, but at the very least not obstruct it.
 