Jul 4, 2015
4,487
2,551
Paris
That's certainly debatable. Back in my late-'90s/early-to-mid-2000s yesteryears I used to build desktops with others as an enthusiast. Sure, ATi (later acquired by AMD) had some good cards, such as the 8000 and 9000 series; I had a 9500 Pro and a 9800 Pro that I really liked. But overall nVidia always had better quality and stability, especially when it came to their drivers. Remember the Detonator drivers, when you'd see double-digit performance increases? ATi always lacked in that department and still does to this day ("fine wine"), especially with their historic Catalyst drivers. A new driver would fix one problem but at the same time create another with a different application, which led to consumers desperately using beta and hacked-up third-party drivers to get all their programs working properly. Overall the drivers from nVidia were far superior to ATi's offerings. Also, today AMD cannot compete with Nvidia's high-end offerings. So it's fair to say that, quality- and driver-wise, nVidia has higher quality offerings than AMD.

I don’t know why you wrote all that. It had nothing to do with my post and it didn’t impress me ;))) because I have owned nearly every kind of GPU since the 90s :p

Nvidia doesn't output 10-bit video unless you step up to Quadro, so let's not even debate GeForce cards in a forum dedicated to pro users. Nvidia makes great gaming cards. Radeon is better for certain professionals. These companies know how not to step on each other's toes these days.

Back TO WHAT I SAID: Apple is a lifestyle brand. When they changed their corporate name from Apple Computer to Apple Inc., the direction of the company went from computing to lifestyle.

They create a narrow lineup of tools that reflects that they are a lifestyle brand. Their emphasis is communication, fashion, health, and creativity.

Companies like that cannot throw kitchen-sink levels of options onto their product pages. It's messy, creates too many support issues and a loss of focus, and they end up looking like an awful Dell website. An ugly computer website.
 

orph

macrumors 68000
Dec 12, 2005
1,884
393
UK
The fine wine thing is just that ATI/AMD does not have the cash to get the drivers fully done as fast as NVIDIA.
I'd really like to know what kind of wine OSX has for drivers :D It seems to be really good at times and then stays still for ages.

I think the RTX cards got the 10-bit output, and asynchronous compute improved a lot. I was never sure what the 10-bit problem was; the GTX 10xx cards have 10-bit out, but I saw reviews reporting a roughly 10% penalty in games, maybe due to the color compression not working on full 10-bit output (never mind that by default on Windows they had the limited 16–235 range or something set).
But I only saw that in reviews, so I was never really clear on it.
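(For reference, 16–235 is the usual "limited range" video-levels convention, and expanding an 8-bit limited-range value to full range is simple arithmetic. A minimal sketch, assuming plain 8-bit values and the standard 16–235 → 0–255 mapping:)

```swift
// Limited-range (16–235) to full-range (0–255) expansion for 8-bit video,
// i.e. full = (limited - 16) * 255 / 219, clamped to the valid range.
func limitedToFull(_ v: UInt8) -> UInt8 {
    let expanded = (Double(v) - 16.0) * 255.0 / 219.0
    return UInt8(min(max(expanded.rounded(), 0), 255))
}
// limitedToFull(16) == 0, limitedToFull(235) == 255
```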

I don't have any of them though :oops: I can see the old GPU parts working in OSX without a lot of work, but the new parts I'd assume will take a while, as I assume there will be bugs and problems on Windows which they will want to fix in time for their 7nm shrink before they spend too much time on OSX.
 

GrumpyCoder

macrumors 68020
Nov 15, 2016
2,072
2,650
Is it now RTX and bust?
That's what it looks like. A lot of FEs blowing up, and NVIDIA doesn't handle it very well. People are beginning to switch to ASUS cards; curious if those are better or will show issues as well.
 

cube

Suspended
May 10, 2004
17,011
4,972
That's what it looks like. A lot of FEs blowing up, and NVIDIA doesn't handle it very well. People are beginning to switch to ASUS cards; curious if those are better or will show issues as well.
It seems they put a decent cooler on it this time, but the design would still have heat problems.
 

cube

Suspended
May 10, 2004
17,011
4,972
I will jump on the bandwagon. RT cores seem like a waste of silicon now that I have seen benchmarks. I think ray tracing should be done with more general purpose hardware, which would push actual use even further into the future.

I was expecting RTX cards not to be powerful enough for real time ray tracing to be practical.
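To illustrate what "more general purpose hardware" would mean here: the core of ray tracing is just intersection math that any compute unit can run. A minimal sketch (RT cores actually accelerate BVH traversal and ray-triangle tests in fixed function; a ray-sphere test just shows the flavor):

```swift
import simd

// A ray with a normalized direction vector.
struct Ray {
    var origin: SIMD3<Double>
    var direction: SIMD3<Double> // assumed normalized
}

// Returns the distance to the nearest intersection with a sphere, or nil on a miss.
// Solves t^2 + 2bt + c = 0, where b = dot(oc, dir) and c = dot(oc, oc) - r^2.
func hitSphere(center: SIMD3<Double>, radius: Double, ray: Ray) -> Double? {
    let oc = ray.origin - center
    let b = simd_dot(oc, ray.direction)
    let c = simd_dot(oc, oc) - radius * radius
    let discriminant = b * b - c
    guard discriminant >= 0 else { return nil } // ray misses the sphere
    let t = -b - discriminant.squareRoot()      // nearest root
    return t >= 0 ? t : nil
}
```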
 

namethisfile

macrumors 65816
Jan 17, 2008
1,186
168
I will jump on the bandwagon. RT cores seem like a waste of silicon now that I have seen benchmarks. I think ray tracing should be done with more general purpose hardware, which would push actual use even further into the future.

I was expecting RTX cards not to be powerful enough for real time ray tracing to be practical.

A GTX 1080 Ti can do ray tracing without the RT cores. But the performance would be even worse than it is now with BFV and RTX cards. This is why the other Nvidia cards don't support it; it's pointless. The same is true for AMD cards: they could make it run with DXR enabled, it just wouldn't run well. Maybe 1 FPS.

The dip in performance with DXR enabled is not a surprise, since the tech just came out.

What we don't know is whether the performance can be improved with future software patches and optimizations down the road, and/or whether the RT cores in the RTX GPUs are physically tapped out and the hardware has reached its limit without either more RT cores or architecture tweaks in future RTX cards.

Also, we don't know whether the RT cores can be used for general compute tasks when a game isn't using DXR. For example, in macOS, can the RT cores be leveraged via Metal to, say, encode/decode/render videos, etc., alongside the regular CUDA cores and Tensor cores? If so, that's a lot of "compute" to work with...
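For what it's worth, here is a minimal sketch of how an app dispatches generic compute work through Metal; it assumes a hypothetical kernel named processFrame compiled into the app's default Metal library. Metal exposes the GPU as a single device, so whether RT or Tensor cores ever pick up any of this work is invisible at this level, which is exactly the open question.

```swift
import Metal

// Hypothetical example: dispatching a generic compute kernel through Metal.
// "processFrame" is an assumed kernel name; which on-chip units actually
// execute the work is entirely up to the driver.
guard let device = MTLCreateSystemDefaultDevice(),
      let queue = device.makeCommandQueue(),
      let library = device.makeDefaultLibrary(),
      let function = library.makeFunction(name: "processFrame"),
      let pipeline = try? device.makeComputePipelineState(function: function)
else { fatalError("Metal setup failed") }

let pixelCount = 3840 * 2160 // one value per pixel of a 4K frame
guard let data = device.makeBuffer(length: pixelCount * MemoryLayout<Float>.stride,
                                   options: .storageModeShared),
      let commandBuffer = queue.makeCommandBuffer(),
      let encoder = commandBuffer.makeComputeCommandEncoder()
else { fatalError("Could not create command objects") }

encoder.setComputePipelineState(pipeline)
encoder.setBuffer(data, offset: 0, index: 0)

// One thread per pixel; the driver splits the grid into threadgroups.
let grid = MTLSize(width: pixelCount, height: 1, depth: 1)
let group = MTLSize(width: pipeline.maxTotalThreadsPerThreadgroup, height: 1, depth: 1)
encoder.dispatchThreads(grid, threadsPerThreadgroup: group)
encoder.endEncoding()

commandBuffer.commit()
commandBuffer.waitUntilCompleted()
```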

In Windows, can RT cores be leveraged for non-ray-tracing tasks? We don't know!

Just a general understanding of what makes up an RT core, and also the Tensor cores, would be useful to know.

To say RT cores are a waste of silicon is too early. We don't know what makes an RT core an RT core and what it can do. We don't know whether RT cores, for example, are fundamental to the Turing architecture, such that they aren't just sitting idly in games that don't have ray tracing enabled. For example, an RTX 2070 has fewer CUDA cores than a GTX 1080, but in game benchmarks the RTX 2070 always comes out ahead while doing it with fewer cores. So are the RT cores in the 2070 somehow being utilized there, or is the architecture just so different that the RTX 2070 outperforms the GTX 1080 despite having fewer cores?

So, yeah, basically we need to really know what an RT core is before we can say, "It's a waste of silicon."
 

CreeptoLoser

Suspended
Jul 28, 2018
369
333
Birmingham, Alabama
You have to get RTX out there in developers' hands anyway, even if it is not practical for something like 4K gaming yet.

Where it is practical now is in 3D modelling/SFX apps that can use the ray tracing APIs. CG artists can get very quick feedback.
 

darksithpro

macrumors 6502a
Oct 27, 2016
582
4,572
It seems the RTX 2080 Ti is the first single card, aside from the $3,000 Titan V, that can play pretty much every game (non-ray-traced) at 4K above 60 fps at full settings. Before, only 1080 SLI and the Titan V could do it, not even the 1080 Ti. That is an accomplishment.
 

Eweie

macrumors regular
Oct 5, 2013
152
84
Does anyone know if the 2060 has boot screen support on the 5,1 Mac Pro? Or was that a hoax? Thanks.
 

apanyz

macrumors newbie
Feb 6, 2019
1
2
Hello, yesterday I installed my Nvidia RTX 2060 Founders Edition in my Mac Pro 5,1.
I can confirm the boot screen works perfectly!
At the moment only Windows 10 is installed on this machine, but tonight I'll try to install Mojave.
 

zozomester

macrumors 6502
Apr 26, 2017
363
266
Hungary
Hello, yesterday I installed my Nvidia RTX 2060 Founders Edition in my Mac Pro 5,1.
I can confirm the boot screen works perfectly!
At the moment only Windows 10 is installed on this machine, but tonight I'll try to install Mojave.
Thank you for your info. Unfortunately, the web driver is not yet available for Mojave.
 

Eweie

macrumors regular
Oct 5, 2013
152
84
Hello, yesterday I installed my Nvidia RTX 2060 Founders Edition in my Mac Pro 5,1.
I can confirm the boot screen works perfectly!
At the moment only Windows 10 is installed on this machine, but tonight I'll try to install Mojave.
Awesome, thank you.
 

AidenShaw

macrumors P6
Feb 8, 2003
18,667
4,676
The Peninsula
What do you folks think of the points raised in this video:

One Andy Ihnatko quote stands out for me: "I think that's why a lot of pro users are walking away from the Mac." (08:15 into the clip, although it doesn't make much sense without the preceding 8 minutes.)
 

flowrider

macrumors 604
Nov 23, 2012
7,229
2,956
^^^^And his Top-Down/Bottom-Up comment sure makes a lot of sense. Bottom-Up is something Apple does not agree with for its top Mac Pro models.

Lou
 

joebclash

macrumors regular
Jun 14, 2016
209
119
What do you folks think of the points raised in this video:


As a loyal Apple user since the Apple IIc, this just makes me sad. Another example of Apple putting its ego ahead of its most loyal customers. What has happened to Apple? This madness needs to stop...
 

jeanlain

macrumors 68020
Mar 14, 2009
2,430
933
What do you folks think of the points raised in this video
The commenters are misinformed and the arguments are quite weak.
- About the Nvidia hardware issues in Macs: the worst issues were in 2008 with the infamous GeForce 8600M GT, and we saw Nvidia in Macs up to 2013-2014. I don't recall particular hardware issues in 2013.
- About Metal vs. CUDA: the argument is stupid. First, Nvidia have been releasing Metal drivers since the beginning of Metal, and their web drivers do support Metal. Second, Metal is first and foremost an iOS API. It's not a direct competitor to CUDA. Metal powers a lot of frameworks and its adoption is already huge (hundreds of thousands of iOS apps). Apple doesn't need to ban CUDA on the Mac to push Metal. Actually, the Mac is an afterthought when it comes to Metal, and CUDA is probably the least of their concerns. Third, back in 2009, Apple's own OpenCL had to compete against CUDA on its own merits (in a way Metal doesn't have to). Yet Apple didn't ban Nvidia from Macs then.
- When one commenter says "Nvidia want to write their drivers, Apple wants to write their drivers", he doesn't know how Metal works. The implementation of Metal in drivers is completely up to the GPU vendors, much more so in Metal than in OpenGL. He probably believes that the drivers that come bundled with macOS are written by Apple. No: AMD write the Metal drivers for their GPUs, just as Nvidia do for those few GPUs that are supported in Mojave.
As said in the AppleInsider article, there was probably a dispute between Nvidia and Apple executives over a threatened patent lawsuit related to mobile GPUs around 2013. I suspect there was a heated discussion at some point, after which Apple said "screw them".
 