And so reasonably priced!
Is there a decent graphics option if you're not a Saudi prince?
Why yes. It's built into the M1 SoC.
M1 is yesterday's news. Now it's on to M2! But M2! But M1! But M1!
Thank you. The prices are just not justified. This is coming from a 2019 Mac Pro user who knew I was getting into overpriced territory compared to just building a PC back then.
As someone who has tested the regular dual 6900 XT vs the Vega II Duo, the 6900 XT smashed it - so I doubt the W6900X, while being great with 32GB of VRAM, will be much better than the 6900 XT. In fact, in some cases it will be slower due to the clock speeds.
I mean, $999 MSRP vs $6,000 is a HUGE delta. There's a lot of fluff priced into this one; I don't think it will be reflected in the performance. It's even more than the W6800X Duo!
If a single W6800x is $2800, the W6900x should really have been close to $4000 tops, right below the $5000 W6800x Duo.
I mean, for less than 2x W6900X you can get four W6800X GPUs with two Duos - that's a big gap IMO.
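Back-of-the-envelope on the list prices quoted in this thread ($5,000 per W6800X Duo, $6,000 per W6900X); a minimal sketch of the per-GPU math, nothing more:

```swift
// Per-GPU price math using only the MSRPs mentioned above.
let w6800xDuo = 5_000.0   // one Duo card, two GPUs
let w6900x    = 6_000.0   // one card, one GPU

let fourGPUsViaDuos = 2 * w6800xDuo   // $10,000 buys 4 GPUs
let twoW6900x       = 2 * w6900x      // $12,000 buys 2 GPUs

print("Two W6800X Duos: $\(Int(fourGPUsViaDuos)) for 4 GPUs, $\(Int(fourGPUsViaDuos / 4)) per GPU")
print("Two W6900X:      $\(Int(twoW6900x)) for 2 GPUs, $\(Int(twoW6900x / 2)) per GPU")
```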
Are you sure they aren't both Navi21 dies?
They are, but they are binned quite differently. Only a very very small percentage work at the necessary process corners.
I am not sure I understand what you mean. All the WGPs in both the consumer version and workstation version are operational. Navi 21XTX and 21XTXH don't have any broken parts. And to be honest I am not sure the clock binning is really a thing between them either.
EDIT: as in are there 21XTX parts that can't run faster than the base/boost clocks?
It depends on the software support.
Bro, a retail 6900 XT is $999; a retail W6900X is $6,000. A $5,000 extra price tag does not justify the slight increase in performance and 32GB of VRAM. I am simply stating that for those prices we should have seen huge increases in performance.
As far as I know none of the reference 21XTX run at the base/boost clock AMD advertises. They all run faster. Doing some more poking around, it seems like the 21XTXH versions actually have faster memory (18 vs 16 gigabits per second).
Clock binning is *always* a real thing, and just because you can overclock a part from a lower bin to run at a higher frequency doesn't mean it will function that way for a long time, at appropriate thermals, etc.
When you fab any chip, the dies form a bell curve when you plot any of the shmoo parameters.
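To put a rough number on that, here is a minimal sketch (my own illustration; the clock figures are invented, not real Navi 21 binning data) of why only a small slice of the bell curve clears a top bin:

```swift
import Foundation

// Treat the maximum stable clock of fabbed dies as roughly normally distributed
// (the "bell curve" over process corners). All numbers below are hypothetical.
let meanClockMHz = 2250.0   // assumed average max stable clock across a wafer
let sigmaMHz     = 100.0    // assumed spread
let topBinCutoff = 2450.0   // assumed cutoff for the premium (XTXH-style) bin

// Fraction of dies at or above the cutoff, via the normal distribution's upper tail.
func fractionAbove(_ cutoff: Double, mean: Double, sigma: Double) -> Double {
    let z = (cutoff - mean) / (sigma * 2.0.squareRoot())
    return 0.5 * erfc(z)
}

let share = fractionAbove(topBinCutoff, mean: meanClockMHz, sigma: sigmaMHz)
print(String(format: "~%.1f%% of dies clear the top bin", share * 100))
// With these made-up numbers only ~2% of dies qualify, even though every die
// has all WGPs enabled - which is the point about binning vs. "broken parts".
```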
Well those Siemens NX results are interesting. Even the GeForce cards do poorly. The results are so close together that it seems like an artificial limitation in the app.
It depends on the software support.
Best GPUs For Workstations: Viewport Performance of CATIA...
It's been a while since our last in-depth viewport performance look, so we're going to get up to speed with the latest drivers and software. We've integrated SPEC's latest SPECviewperf 2020 into our testing, and also include a little Blender - so let's see how things fare with our seventeen... (techgage.com)
You can compare the RX 5700 XT to the W5700 on this page. Although they are not identical in specs, they are closely related. In some of the benchmarks the W5700 isn't any better (or is even worse) than the consumer card. But when it is fully optimized, it flies. No idea what the exact difference is (everyone seems to omit the details), but I suspect there will be something similar with these new cards vs consumer-grade cards.
If you have properly implemented software support the difference can be dramatic and worth the price uplift. Hopefully Apple will optimize for these new cards within their pro apps, and third-party vendors will do the same.
No big difference between RX 5700XT and Pro W5700:
[benchmark chart]
A huge quantifiable difference:
[benchmark chart]
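For what it's worth, the starting point for that kind of per-card optimization on the Mac is just asking Metal what it's running on; a minimal sketch (my own, not taken from the benchmarks above):

```swift
import Foundation
import Metal

// Enumerate every GPU the system exposes and print the properties an app
// would typically branch on (name, removable MPX/eGPU, memory budget).
for device in MTLCopyAllDevices() {
    let workingSetGiB = Double(device.recommendedMaxWorkingSetSize) / Double(1 << 30)
    print("""
    \(device.name)
      removable (MPX/eGPU): \(device.isRemovable)
      low power:            \(device.isLowPower)
      recommended working set: \(String(format: "%.0f GiB", workingSetGiB))
    """)
}
```

Whether an app then actually takes a faster path on a Pro card is up to the app and driver, which is presumably where the gap in some of those charts comes from.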
The 2013 Mac Pro was famous for thermally killing its pro-grade GPUs, so I wouldn't take the manufacturer's word at face value.
Your comment cracked me up because Apple continues to push the big-dollar items. My iMac was manufactured in 2014. I looked at an upgrade with a similar configuration and the tag is close to 3 grand. So that's not going to happen this year unless I refinance my car. I love the products but the price tags have always been a tad unreasonable.
And could you kindly provide some details?
Yep. NVIDIA screwed Apple in the past.
Pretty sure this is in reference to several generations in a row of 15” MacBook Pros that had to have repair programs due to Nvidia chips going bad.
Bad solder, NVIDIA blamed Apple, Apple had to recall (8600M GT), NVIDIA didn't reimburse, then some other crap happened when they went to renegotiate their contract, Apple said eat sh** and die, the end!
It's Apple's fault, so why would Nvidia reimburse? The GPU is soldered directly to the system board. Only if it were a plug-in module provided by Nvidia would it be Nvidia's fault. In this case it's correct for Nvidia to tell Apple to eat sh**.
I don't think Apple uses AMD VCN (VCE on Vega), so I am not sure the new GPU will help much in that regard.
I spent far too much on my MP build back in 2019. Thankfully I'm able to write it off through work, but still... I didn't go all out on my build, and fortunately my box is still faster than the M1 in FCPX etc. by quite a bit.
16-core / 384GB RAM / Vega II Duo, Afterburner card, Pegasus R4i internal MPX module RAID.
One thing that always made me scratch my head was how this machine is supposedly so powerful, yet if I have an h265 video on my desktop, it won't preview it. Sure, I can open it in VLC or FCPX, but any video out of my Sony or Canon cameras... nah, it chokes on those, hard. Newer codecs, I get it... and the Radeon Pro Vega II Duo was outdated to begin with when it launched.
I just ordered a W6800X Duo and I'm hopeful it's worth it. My M1 Mac mini sits quietly on my desk as if to say PICK ME COACH when I sit down to edit. I stay with my MP because I also have an internal OWC Accelsior 8TB RAID 0 card, so it mows through videos better than the M1.
I was going to sell this MP and grab something else next year... but this new card has given me hope... for now anyway. I am sure the M1X or M2 will be quicker, but given many of us spent 15-20-30k on the new MP powerhouse in Nov/Dec 2019 only to have Apple lawl at us and release the M1 a year later? Come on... they knew they were going Silicon, and they told us these "hey guys, it'll be upgradeable for a long time to come" storylines and we drank the Kool-Aid. :/
My order says Aug 16-18 for arrival. The question to answer is... will I return it, or will it do a simple task like previewing h265/HEVC videos without opening them? Will it show me thumbnails for h265/HEVC, or will it still choke on videos from newer cameras using h265?
I wonder what my Vega II Duo will sell for used?
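On the h265 preview question, one quick check is simply to ask VideoToolbox what the OS currently reports for hardware decode; a minimal sketch (mine, not specific to any particular GPU):

```swift
import CoreMedia
import VideoToolbox

// Ask the system whether it currently reports hardware decode for HEVC/H.264.
// Finder/Quick Look previews lean on the same media stack, so this is a rough
// proxy for "will h265 thumbnails and previews be smooth" on a given setup.
let hevcHW = VTIsHardwareDecodeSupported(kCMVideoCodecType_HEVC)
let h264HW = VTIsHardwareDecodeSupported(kCMVideoCodecType_H264)
print("Hardware HEVC decode reported: \(hevcHW)")
print("Hardware H.264 decode reported: \(h264HW)")
```

Running it before and after a GPU swap would show whether anything changed; if it stays false, the preview situation probably comes down to the VCN point above rather than the new card's raw power.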
So did AMD.
Yep. NVIDIA screwed Apple in the past.
Sounds like the expectation is for users to use Thunderbolt displays instead of HDMI.
So, what happened to HDMI 2.1? A consumer 6800 has 2.1 while Apple seems to be stuck at 2.0, which means no 8K display support.
I was looking forward to this upgrade, but it's not worth it. I don't really hit the limits of the current W5700 I'm using right now, even with two 5K displays plus other displays, because I mostly do real-time and desktop stuff. (Although the 16GB of VRAM is getting a little tight sometimes.) The big upgrade I was hoping for was support for 8K via HDMI 2.1.
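Rough arithmetic on why HDMI 2.0 rules out 8K while 2.1 doesn't (my own back-of-the-envelope, assuming 8-bit RGB and ignoring blanking):

```swift
import Foundation

// Raw pixel-data rate for 8K60 at 8-bit RGB, versus what each HDMI generation
// can carry after coding overhead. Blanking intervals and audio are ignored.
let width = 7680.0, height = 4320.0, refreshHz = 60.0, bitsPerPixel = 24.0
let requiredGbps = width * height * refreshHz * bitsPerPixel / 1e9   // ≈ 47.8 Gbit/s

let hdmi20Gbps = 14.4   // 18 Gbit/s TMDS minus 8b/10b coding overhead
let hdmi21Gbps = 42.6   // 48 Gbit/s FRL minus 16b/18b coding overhead

print(String(format: "8K60 needs ≈ %.1f Gbit/s of pixel data", requiredGbps))
print("Fits HDMI 2.0? \(requiredGbps <= hdmi20Gbps)")   // false, not even close
print("Fits HDMI 2.1? \(requiredGbps <= hdmi21Gbps)")   // false without DSC or 4:2:0, fine with them
```

So even HDMI 2.1 leans on DSC or 4:2:0 for 8K60, but 2.0 simply doesn't have the headroom, which fits the Thunderbolt-display expectation above.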
The AMD Instinct MI100 won't be of any real value to anyone until Apple launches a Xeon W33xx-series Mac Pro with PCIe 4.0 slots. Then we may see it released as a BTO option... for the dozen or so people who want it. I just don't see the demand from the typical Mac Pro customer, given the relative lack of HPC or AI customers using macOS for their research nowadays. Windows or Linux drivers, of course. But macOS drivers? I really doubt there's enough market momentum to get that off the ground.
When is AMD going to release macOS drivers for their AMD Instinct MI100 Accelerator GPU? I want Apple to offer that GPU for their Mac Pro.
https://www.amd.com/en/graphics/servers-radeon-instinct-mi