I still find it baffling that Apple went with Intel and not AMD, considering AMD offers more than twice as many cores (64 vs. 28) yet costs less. Let's not even get started on the security omnishambles Intel has become.
 
I have been running the W5700 [with 8GB RAM] in an eGPU for the last few months. I have no complaints at all, and it made my 16" MBP faster than my old iMac Pro with the Vega 64.
I use it for real-time rendering and general 3D work.
 
  • Like
Reactions: wilhoitm and sos47
I'm not sure if RDNA 2 is useful in a Mac system, because RDNA 2 is focused purely on gaming. AMD changed their strategy by creating two new architectures instead of using one universal architecture (GCN). This means a Radeon Pro version of the RX 5900 is not possible.

There is far more to the Mac lineup than just the workstation-class systems. As long as an RDNA2 GPU makes it into one of the other Macs (e.g., MBP 16", iMac, iMac Pro), then a mainstream-market card with the same GPU will likely get a driver that works pretty well in a Mac Pro 2019 also. It wouldn't be an MPX module, but the Mac Pro 2019 doesn't have to take solely MPX GPU modules to upgrade.

That Apple will never use an RDNA2 GPU anywhere in any of their systems is a pretty big stretch. The architecture is skewed toward gaming, but AMD is also talking about substantive performance-per-watt improvements. For the MBP 16" and iMac those actually matter, even if Apple didn't optimize for the ray-tracing hardware (if any). Yes, "if any": like Nvidia, there may be 'smaller' RDNA2 implementations that drop the ray-tracing hardware, just as Nvidia has lower-priced options that do. By the second half of 2021 that 'improved' 7nm TSMC fab process should be more affordable, and hence be used to replace the original RDNA1 parts with equivalent RDNA2 options. (Nvidia isn't going to sit still. They are probably 'moving' to new parts in 2020, so RDNA1 isn't going to be competitive on the leading edge past the end of 2020.)

For the Mac Pro specifically, the "compute"-focused track is probably the next step after the W5700X gets stable. I suspect Apple will limp along with the 580X until the low-end RDNA2 replacements come out. (Maybe a 5600 XT derivative if AMD has excess supply next year and gives them a very good price, if the low-end RDNA2 replacements are taking too long.)


They need to make a different GPU for workstations.
As for a "whole new architecture"... I think that may not be as far off as you think.



But I'm not sure if AMD is going to do that soon or not.

Nor am I sure it will take as much time.
Be aware: AMD will start using two whole new architectures for their GPUs: RDNA and CDNA. RDNA will focus on gaming while CDNA focuses on compute.
.....
We probably need to expect a CDNA-based GPU, not an RDNA-based GPU like the RX 5900 or 5800.

There is a decent chance that CDNA is pretty close to being Vega 30 (or a third iteration of Vega). The MI100 probably isn't that far away from the timeline that RDNA2 is on.

Vega 20 (the MI50/MI60 and Radeon VII products) was already incrementally better at compute tasks than the Vega 10 architecture. An iterative tweak to that would make a better compute option. AMD doesn't have to completely start over from scratch.

( RDNA isn't really a complete start over from scratch either. )

If they wove the 7nm power optimizations in with a change to allow more CUs (past 64), and added bfloat16, they would probably have a very good compute/ML card. It wouldn't need max clock rates if it were mainly fed embarrassingly parallel workloads. (Even without bfloat16 it could work in that space.)
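For reference, bfloat16 is cheap to bolt onto hardware because it is literally the top 16 bits of an IEEE-754 float32: the same sign and exponent bits, with the mantissa truncated from 23 bits to 7. A quick illustrative conversion in Swift; the helper names are made up and NaN edge cases are ignored:

[CODE]
// bfloat16 keeps float32's sign bit and all 8 exponent bits, so it covers
// the same numeric range as float32 with far less precision.
func floatToBFloat16(_ x: Float) -> UInt16 {   // hypothetical helper
    let bits = x.bitPattern
    // Round to nearest even before dropping the low 16 mantissa bits.
    let rounding: UInt32 = 0x7FFF &+ ((bits >> 16) & 1)
    return UInt16((bits &+ rounding) >> 16)
}

func bfloat16ToFloat(_ b: UInt16) -> Float {   // hypothetical helper
    Float(bitPattern: UInt32(b) << 16)
}

print(bfloat16ToFloat(floatToBFloat16(3.14159)))   // ~3.140625
[/CODE]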
 
Last edited:
  • Like
Reactions: wilhoitm
Just went on the site, and I get their stupid popup about "customised ads". 9:55 this morning. Got rid of it. Pissed off. Macrumors, get your act together.
 
May I ask what your uses will be for this machine?
Fortnite 😂
I'm glad you took the time to answer my question. I definitely was not trolling. I am a retired attorney and am looking to upgrade to a more powerful Mac. The ability to upgrade your Mac in the future is, IMO, a very good reason to buy the Mac Pro. There are videos on YouTube where people are still making the original cheese grater run very fast. As you can see from my sig, I'm a flight simmer, so I'm thinking about the Mac Pro or maybe the (hopefully) upcoming iMac Pro refresh.
Flight simulation on a Mac ?
I had to buy a PC for that...
 
Last edited:
Finally. Now I have to wait for the Apple Store in Delaware to reopen so I can save $700 in Maryland sales tax.
I believe MD law still requires you to pay sales tax (directly to the Comptroller).

Just sayin’...
 
How about this flight simulator setup?

 
Doesn't make any sense, because Apple went away from OpenGL a long time ago and created their own graphics system called Apple Metal that only works with AMS cards... I don't understand this Nvidia obsession in various people around this forum....
AMD not AMS, fix the typo lol

It is not an obsession; it's simply that we like Nvidia cards more than the cheap AMD brand.

CUDA can speed things up faster than AMD.

So why do we have to use AMD? Because that is what Apple wants.

That is what Mr. Cook and company chose for us.

They took away our right to choose.
Monopoly at its best.

That's why I can't wait for Tim to go away.

Wait until the 3000 series comes out and tell me who wouldn't like to have one of those in their Mac Pro. Unfortunately, thanks to Apple, Mac Pro owners can't use Nvidia cards.
 
Is it? The Titan does half of this.

If those results are from macOS, then you won't see results from cards like Pascal or RTX, because they are not supported in macOS due to the lack of drivers.

You mentioned the Nvidia Titan, but that is an old architecture (Pascal), not the latest, which has a much higher score. If you go to Windows, you will see the results from the cards that are missing from the macOS numbers.

I know we are talking about macOS and not Windows, but if we had macOS drivers for those cards then you would see the numbers there. You only see AMD because that is what Apple supports, and the Nvidia results are from previous generations, which are slower than the newest GPUs.
 
Last edited:
I'm pretty sure AMD's efforts with Blender will be optimized for RDNA2 and RDNA2 GPUs' ray-tracing & deep-learning hardware more than anything else, to have parity with Maxwell & RTX's stellar tensor cores plus the latter's ray-tracing cores.

Better.

The AMD ProRender engine is GPU neutral. Team Red, Team Green, it doesn't care.
 

1. I believe Final Cut kinda depends on AVX-512. AMD passed on adding that instruction set to Epyc/Threadripper/Ryzen, because only a handful of products use it and they didn't want to waste die space on it. That would not prevent Apple from contracting with AMD for a semi-custom chip; a semi-custom Ryzen is what is going into the next-gen consoles. OTOH, would Apple be willing to pay for a semi-custom chip? They aren't selling very many 7,1s, so it probably wouldn't be cost effective. (A quick way to check for AVX-512 yourself is sketched after this list.)

2. They probably still have an ongoing contract. That is why the Ryzen 1600AF exists: AMD has an ongoing contract with GlobalFoundries for 12nm parts.

3. No one ever got fired for buying IBM, Intel.

4. Apple would be completely dependent on AMD for both CPUs and GPUs.
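
Re: point 1, here's a minimal sketch of how you could check for AVX-512 support yourself on an Intel Mac. I believe macOS exposes a "hw.optional.avx512f" sysctl key on CPUs that have it; treat the key name as my assumption:

[CODE]
import Darwin

// Query the (assumed) "hw.optional.avx512f" sysctl key; the call fails
// on machines where the key is absent, which we treat as "no AVX-512F".
func hasAVX512F() -> Bool {
    var value: Int32 = 0
    var size = MemoryLayout<Int32>.size
    guard sysctlbyname("hw.optional.avx512f", &value, &size, nil, 0) == 0 else {
        return false
    }
    return value == 1
}

print(hasAVX512F() ? "AVX-512F reported" : "AVX-512F not reported")
[/CODE]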
 
Does anybody have a comparison of render times (MP4, H.264) in Compressor between the 580X and the 5700X?

As far as I understand, the 5700 is double the speed (in benchmarks!) of the 580. If it's really double the speed, it would be worthwhile spending the additional $600.

A second 5700 will probably not help:
I was looking today (Mac Pro 2014) and Compressor is using only one D500 (one at 80%, the second at 20%). Using Compressor I have serious heat-dissipation issues after 10 minutes (I've already revved the fan up to the max). Once the heatsink reaches 106°F (41°C) the Intel CPU throttles down and render times double. With Compressor the heatsink gets up to 115°F (46°C). Render speed is best below 104°F (40°C), i.e., open a window or run the AC.
 
Hopefully this means a Blackmagic External GPU Pro update soon with this chip in it.
 
There is also a chance that the W5700X is faster than the Vega II at hardware H.264/HEVC encoding.
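If anyone wants to verify which encoder they're actually getting, a VideoToolbox probe along these lines should do it. This is a sketch, assuming the "require hardware" encoder-specification key behaves the way I expect:

[CODE]
import CoreMedia
import VideoToolbox

// Ask for an HEVC compression session and *require* the hardware encoder,
// so creation fails if only the software fallback is available.
var session: VTCompressionSession?
let spec: [CFString: Any] = [
    kVTVideoEncoderSpecification_RequireHardwareAcceleratedVideoEncoder: true
]
let status = VTCompressionSessionCreate(allocator: nil,
                                        width: 1920,
                                        height: 1080,
                                        codecType: kCMVideoCodecType_HEVC,
                                        encoderSpecification: spec as CFDictionary,
                                        imageBufferAttributes: nil,
                                        compressedDataAllocator: nil,
                                        outputCallback: nil,
                                        refcon: nil,
                                        compressionSessionOut: &session)
print(status == noErr ? "Hardware HEVC encoder available"
                      : "No hardware HEVC encoder (status \(status))")
[/CODE]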
 
  • Like
Reactions: worldburger
Tim Cook leaving Apple isn’t going to get NVIDIA GPUs in Macs. NVIDIA burned that bridge and it’s not ever going to be rebuilt. I wouldn’t be surprised if there is something in Steve’s Will about never doing business with NVIDIA ever again.
 
Also, the people posting about AMD being a cheap brand obviously have not seen the prices for the workstation and server cards. AMD still currently has the fastest single card as far as compute goes.

CUDA isn't really anything that special. Every single app that uses CUDA could be coded to use Metal; it is just a question of how stubborn the developer is. CUDA used to be attractive because of cross-platform use, but that bridge has crumbled, so if you must use a Mac you might as well learn to code against the Metal APIs if you want things to be fast. (A rough sketch of the Metal analog of a CUDA kernel launch is below.)
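To make that concrete, here is roughly what the CUDA-style "allocate, launch kernel, wait" sequence looks like in Metal from Swift. Just a sketch, and the kernel name "scale" is made up for the example:

[CODE]
import Metal

// Assumes a bundled .metal file containing a (hypothetical) kernel:
//   kernel void scale(device float *data [[buffer(0)]],
//                     uint id [[thread_position_in_grid]]) { data[id] *= 2.0; }
guard let device = MTLCreateSystemDefaultDevice(),
      let queue = device.makeCommandQueue(),
      let library = device.makeDefaultLibrary(),
      let function = library.makeFunction(name: "scale") else {
    fatalError("Metal setup failed")
}
let pipeline = try! device.makeComputePipelineState(function: function)

// Shared CPU/GPU buffer: the rough analog of cudaMallocManaged + memcpy.
var input: [Float] = [1, 2, 3, 4]
let buffer = device.makeBuffer(bytes: &input,
                               length: input.count * MemoryLayout<Float>.stride,
                               options: .storageModeShared)!

// Encode and dispatch: the analog of a CUDA <<<grid, block>>> launch.
let commands = queue.makeCommandBuffer()!
let encoder = commands.makeComputeCommandEncoder()!
encoder.setComputePipelineState(pipeline)
encoder.setBuffer(buffer, offset: 0, index: 0)
encoder.dispatchThreads(MTLSize(width: input.count, height: 1, depth: 1),
                        threadsPerThreadgroup: MTLSize(width: 4, height: 1, depth: 1))
encoder.endEncoding()
commands.commit()
commands.waitUntilCompleted()
[/CODE]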
 