I agree with your sentiment, and I think what it really comes down to is that Apple and Nvidia have been at odds with each other for several years now. The result is that Apple decided to use AMD products in their machines and shows very little care for the customers who want to use Nvidia cards. They are marginalizing users of the older, upgradable Mac Pros both to force purchases of newer hardware and to eliminate the last foothold that Nvidia has in the Mac market.
These decisions make sense from the standpoint of selling more hardware and making money, but make no sense at all in terms of cultivating a base of professional users.
I am not a professional user, so while it ticks me off to have to replace an Nvidia card that I purchased less than a year ago with an AMD card, both for compatibility reasons and to stop constantly worrying about when the next "web driver" will be released, I still placed an order for an RX 560. It still beats the alternatives of overpaying for a machine with no upgrade path or building an unsupported hackintosh that I would have to keep running just to enjoy the benefits of Apple's operating system.

I'm not sure there's a rivalry between Nvidia and Apple. I think the AMD partnership for GPUs was out of necessity for custom design work. Nvidia will not do customized SKUs for manufacturers; what's in Nvidia's lineup is what you buy. Apple had their own thermal limits in mind, especially with the "pro" devices that use GPUs. AMD was likely the only one willing to do these custom SKUs for Apple.

It's a similar reason why both the Xbox One and PS4 use AMD chips for CPU and GPU. AMD was willing to do custom designs; Nvidia was not.

Apple also probably does not want to use CUDA, but wants to push their own standard. Since AMD supports open standards such as OpenCL, it's easier for Apple to move forward using a fork of an open standard than to be stuck on a proprietary technology that they have no control over.

What's going on now with the Nvidia drivers not being approved by Apple is probably just more of Apple being Apple. Apple doesn't profit in any way from Nvidia devices being used. In fact, since not a single Apple product in the last five years has used Nvidia, Apple probably figures: why support it? All it does is help either Hackintoshes or users running their own eGPU solutions that weren't purchased from Apple. They have no monetary or vested interest in helping support Nvidia.
 
Yeah, I can understand not having a vested interest in a product that isn't making them money. The bit about needing custom GPU designs makes some sense too. The unfortunate part is that Apple's failure to provide an easily customizable Pro solution over the last few years is driving the hackintosh situation somewhat, and that situation certainly costs them money. I am not a pro user myself, but I know folks who are, and several of them have gone to hackintoshes because they like what Apple's operating system provides but think that the hardware options have gone to crap. For what they are doing with their machines, I can't say that I disagree with them. I personally wouldn't go that route because I don't need the headaches of running on unsupported hardware, and I don't need a high-end rig for what I do with my computer (basic computing tasks, very limited gaming). My machine is overkill for what I use it for, but I got a screaming deal on it used, and it is better to have more power than you need. Also, a hackintosh violates the EULA for the operating system.
If you aren't a pro user, and you aren't using your Mac Pro for high-end gaming, you can put in a lower-end AMD GPU (RX 560) like I did for Metal support. It sucked to have to swap GPUs after a year, but fewer headaches make it worthwhile.


I'm also wary of a hackintosh given Apple's current direction. I believe it's only a matter of time before Apple starts embedding checks into the OS for the T2 chip: no T2 chip, no boot.

We've already seen that the T2 chip is causing problems with booting other OSes, so I feel like it's just a matter of time before they use it to lock things down even further.
 

Keeping in mind how long pre-T chip Macs will likely be supported by macOS, that'll be a while out.
Seeing the new AMD announcements, I have a bad feeling about the Mac Pro. I think it's more likely that Apple is taking so long to launch the Mac Pro because they are waiting for AMD's new 7nm GPU line and support for PCIe 4.0.

https://arstechnica.com/gadgets/201...uture-7nm-gpus-with-pcie-4-zen-2-zen-3-zen-4/


Curious which version they will use. With the Vega 64 in the current iMac Pro, the pro and consumer cards differed only in clock speeds, so I believe Apple made a deal to brand it a Pro card when the distinction was pretty blurry. With the Radeon VII, however, the VII does not support full-rate FP64 (running at 1/8th rate) and does not support PCI-E 4, while the MI50, with identical silicon, supports both.
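For a rough sense of what that rate cap means, here's a quick back-of-the-envelope Python sketch. The shader count and boost clock are approximate published Vega 20 figures (my assumption, not anything from this thread), and the 1/8 figure is just the cap quoted above, so treat the printed numbers as illustrative only:

```
# Back-of-the-envelope peak-throughput sketch for the FP64 rate cap.
# Shader count and boost clock are approximate published Vega 20 figures;
# the outputs are illustrative, not measured results.

shaders = 3840      # stream processors on the Vega 20 die (Radeon VII / MI50)
boost_ghz = 1.75    # approximate boost clock, GHz

# Peak FLOPS ~ stream processors x 2 ops per clock (fused multiply-add) x clock
fp32_tflops = shaders * 2 * boost_ghz / 1000.0

print(f"FP32 peak:        ~{fp32_tflops:.1f} TFLOPS")
print(f"FP64 at 1/2 rate: ~{fp32_tflops / 2:.1f} TFLOPS  (uncapped, MI50-style)")
print(f"FP64 at 1/8 rate: ~{fp32_tflops / 8:.1f} TFLOPS  (the consumer cap quoted above)")
```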
 
I'd say even more likely given that it's a couple of months later now, RTX is fully launched with no macOS web drivers yet either, and the Radeon VIIs look to pack a punch. Competition-wise it's all good, but the drivers have to be there for bragging rights to really come into the mix. Just my opinion though, LOL.

They are all GPUs; I'd just like to be able to get the one I want, not whichever one doesn't need to be hacked to work. That's not freedom, that's the lesser of two evils (because they both take money from me, LOL).
 

I think I am just going to stay stuck in the past until security updates for Mojave run out, running my current hardware. When I reach that point, I will either have to invest in new Apple hardware or switch to Windows. For cost reasons, it will likely be the latter. The Macintosh is quickly becoming a niche, luxury product.
I am quite sure that I won't be able to run the next version of Mac OS on my machine, even via unsupported means.
 
Everything here is sorted out by software and drivers. The die for the Radeon VII and the HPC MI50 and MI60 GPUs is the same. What differentiates them is the BIOS and drivers.

Apple will buy the same parts, but it's up to them to decide whether they will enable full FP64 support through the BIOS and drivers.
 

Don't worry about having the most "current" macOS; it'll give time for whatever game plan to develop so we can all make the best decisions for each of us. Final builds are always the most solid and can last for years. We have time, if we have the patience to spare.
 


That's why it's a murky situation. On Windows, AMD charges a significant markup for the drivers that enable those features, even when the silicon is the same. Curious whether they'll allow Apple to unlock both, or whether Apple will pay for it.

Before, it hadn't really come up, because even consumer Vega used the same FP64 rate and the same PCI-E revision as the pro cards.
 

OK, so I'm clueless on FP64 (after searching the web I see it's computation related), but is it used in scientific and statistical software? Engineering and CAD software?

I understand it's Floating Point 64, but is it just a 32/64 jump for GPUs that's only coming now, following the transition to 64-bit compute architecture?

These are rhetorical questions, since this isn't an AMD hardware thread. My reference was only used as an indicator of why we are still waiting on Mojave web drivers, and of the "it's up to Apple to approve them" situation.

In my hopeful eyes, that means in the OS. They don't need to get a GPU into a machine to get their code into the OS, if it just works and Apple gives the green light. Fingers crossed and whipped cream with a cherry on top, but only time will tell.

Point being: Nvidia, AMD, old tech, new tech, none of that matters in the real world, where we really live, if it gets the project done. Newer GPUs just help you get a job done faster, in most cases closing the gaps in your project timelines. If your projects aren't lined up back to back, your gains will be minimal at best. It always boils down to having a surplus of either time or money. Please stop demanding that consumer-grade GPUs include more pro-level features. I used to be able to afford to be an early adopter; now I have to sit back and wait, because at one time ATI was an option, but AMD ruined it by making the only two GPUs to ever fail on me in over 26 years of working with computers (starting with the Intel 386 and Apple II, with the failures being an HD 5770 and HD 6770).

In my mind it's all related to licensing fees, nothing more. It's the same reason pro-level cards cost a lot more money: more features.
 


These are floating-point formats; essentially, how many digits of precision calculations can carry on a computer. FP64 is double precision, FP32 is single, FP16 is half. It's not like the 32/64-bit jump in processors: all modern GPUs can run FP64 code. The difference is in how consumer cards are capped. As I mentioned, the Radeon VII runs FP64 at 1/8th the rate of FP32, artificially, while the MI50 runs it at full-rate FP64 (otherwise known as 1/2 rate, since that's how you would expect twice-the-precision math to perform relative to single rate).

Your use cases are right. Say you're designing a rocket part: you want that increased precision for exact tolerances. Or you're modelling protein folding, etc. It matters anywhere a higher degree of accuracy is needed, whereas games don't need to be that precise, and now even FP16 is starting to be used since not all elements of a game need even single precision.

What happens here will have a substantial impact on the iMac Pro's value proposition. If it gets full-rate FP64 while Apple is only paying Radeon pricing, it could be quite a good value (so long as you wanted a 5K screen to go with it).
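If it helps to see the precision gap concretely, here's a tiny CPU-side Python sketch using NumPy. It only illustrates the single- vs double-precision difference described above; it says nothing about GPU throughput or any particular card:

```
import numpy as np

# float32 (single) carries roughly 7 significant decimal digits,
# float64 (double) roughly 15-16.

# Adding 1.0 to 100,000,000 is below float32's resolution at that magnitude:
print(np.float32(100_000_000.0) + np.float32(1.0))   # 1e+08        -- the +1 is lost
print(np.float64(100_000_000.0) + np.float64(1.0))   # 100000001.0  -- kept

# Same story with a constant: single precision keeps far fewer digits.
print(np.float32(np.pi))   # 3.1415927
print(np.float64(np.pi))   # 3.141592653589793
```

Whether a workload actually needs those extra digits (tight CAD tolerances, scientific simulation) is what decides whether the FP64 rate cap matters at all.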
 

I can see how any GPU using those features would want proper drivers and acceleration for them, but from what I found it's present in both brands of GPUs, and on Nvidia all the way back to Kepler, with at least the 780 Ti (slowly by comparison, I'm sure). AMD had already absorbed ATI by that point:

https://www.engadget.com/2006/07/24/amd-buying-ati-for-5-4-billion/

But this goes back to my point: this is when GPUs started taking off in price, and why I want acceleration back before moving beyond High Sierra. These crazy sales pitches about scientific applications of gaming GPUs are literally driving customers away (at least me), not selling them more GPUs for more potential applications (as reflected in their stock prices over the last two quarters). They think gamers will pay more for a luxury; most adult gamers won't unless their profession relies on it. Buy the GPU for your application. Would you go off-roading in a Tesla? Those are nice-to-haves, not components necessary for operation, if we're only talking about half-speed vs. full-speed anything when by your numbers we're talking about a four-fold rate above the intended application. CUDA made gaming cards unnecessarily expensive for the gaming community until developers used it, blurring the prosumer/pro-series lines for both vendors, since AMD responded by introducing their own level of pro features. I'm willing to support creatives' digital artistry, since the tech required is the same and is usually quite beautiful, and there is a reason the saying is "starving artist" and not "starving engineer."

JIT manufacturing made more sense in the '90s during the first tech bubble, and it makes more sense in 2019 to avoid another one. If everybody buys the newest GPU, who's buying the old one, or the next generation after that?

If anyone is designing something structural such as a rocket, and I was the one in that rocket, I would hope they were using quality parts and equipment at all stages of design and production. Just my 2 cents; maybe you work at Estes Rockets and no one's life is at stake, but structural support still relies on the same principles. Drivers for the tech are always key, though.

But I do believe the iMac Pro should have a pro-series GPU at its asking price. This just isn't that forum either, and its drivers are fine.
 

I agree. As long as there are still security updates, I will be fine. Once security updates are done, I either buy a new Mac or go full Windows on my current one. I don't do anything on mine that requires cutting edge hardware.
 
Having spent four weeks doing my best to live with Windows, I'll either buy a new Mac or build a new Hac. Apple and Nvidia's relationship is problematic; the Microsoft logo sits in the dictionary next to "problematic" as the very definition of it.
 

I agree, but a brand new Mac is probably out of the affordable range at this time, and the lack of expandability is a concern. I am not personally fond of going the hackintosh route (I have nothing against people going that route, it just isn't the way that I prefer to go). For these reasons, I am not sure what the best solution will be for me. I will have a few years to figure it out.
 
Look, I know AMD graphics cards are typically cheaper than Nvidia cards. I also know that Nvidia cards are typically superior for many tasks. I personally think that Apple should give the customer the option to upgrade to an Nvidia chip. I, for one, would pay a little more for the superior option.
Of course, and this is painfully obvious, Apple isn't overly concerned with the things that their customers want. Nothing anyone says here is likely to make a bit of difference in the direction that Apple chooses to go with their products. I know that Nvidia's CUDA drivers have to be rewritten for Metal compatibility for Nvidia cards to be useful to pro users again. I am hopeful that it happens. It would be even better if those drivers were integrated into the OS like the drivers for Kepler cards were. No need to update drivers with every incremental OS update that way.
 
It seems like Apple has more and more been embracing its marketing myopia. It's all about how the product looks in ads and how thin it is, and less about how the customer uses it and what the customer wants.
 

There is little doubt that their products are attractive looking, but that means little to many customers if a product doesn't provide the features that meet their needs, especially at the price points most of their products sit at.
I was an iPhone user up until my most recent upgrade, but the cost/feature balance got too lopsided for me. I switched to a $180 Android phone that more than meets my needs at 1/5 of the cost.
I love the elegance of Apple's computer operating system, but on their computers the cost is starting to outweigh the benefits of Apple's computing environment for me. And I really don't want to build and maintain a hackintosh.
My trusty Mac Pro 3,1 is still accomplishing what I need with a few key upgrades, but I will soon be at the point where the upgrade decision will need to be made.
 
Nearly a year now. Any change to this situation? I'm stuck on High Sierra. Apple truly does not care about this market at all, nor about anyone who would want to expand their computer's capabilities. "Just buy a new one every three years." **** that. Ten years old and still going strong. I'm able to run Mojave even with a GT 120, for ****'s sake. The only thing that doesn't render correctly on that chip is iBooks Author, but I need that for work. The only thing keeping me from Mojave is the lack of support for my GTX 980. I'm sure that this computer can run whatever comes after Mojave as well, with some hacking. But not if there's no driver!
 
I've given up on waiting for Nvidia. I am now holding out hope for native Radeon VII Mac support for my GPU rendering. On eGPU.io, someone got a Radeon VII working, and AMD has told people to wait for the next release of macOS.
 

In the hackintosh community, there are already a few users with mostly functional Radeon VII cards in their systems.
 