Of course. I thought you were talking about "tested" off-the-shelf configurations.

I would imagine that the Vega 64 isn't an option in the BlackMagic line due to power supply requirements.
You just showed a Vega 64.

An overclocked Vega 64 would have to be looked at. I would imagine that BlackMagic just wants to use a card made by AMD. And they are targeting professionals.

A simple enclosure without fancy stuff (USB, etc) should be fine if you stay within its limits.
 
So many people are complaining that Apple upgraded a model because they bought one, pre-upgrade, recently or relatively recently.
Are you guys completely new to the world of personal computing?
It's been the same story for 40 years now.
FFS, get over it.
The only one I feel sorry for is Apple: if they don't update their machines regularly, they get blasted; if they do update them, they get blasted.
 
I hope it’s true but it seems that:

Radeon Pro Vega 16 could be like a GeForce GTX 470 or GeForce GTX 560 Ti, which are 2010-2011 cards.

https://www.techpowerup.com/gpu-specs/radeon-pro-vega-16.c3331

Radeon Pro Vega 20 could be like a GeForce GTX 660, which was released in 2012!

https://www.techpowerup.com/gpu-specs/radeon-pro-vega-20.c3263


They should include nVidia cards (or at least offer the option), even if they have to add a fan and a few millimeters to the case.
Are you comparing mobile to desktop?

BTW, that site is not completely reliable.
 
Frontier would theoretically give you an improvement for professional use. Not for gaming.

You could buy an eGPU enclosure and a gaming Vega 64 separately.

And don't buy a reference cooler.
Thank you for the clarification.

To wit: I game on my Xbox One, and I would argue that buying a Mac mini and an eGPU for gaming makes absolutely zero sense from a financial perspective.

Any number of YouTube videos by Bitwit, Gamers Nexus, et al. show how to build a solid 1080p gaming PC for $500-$700, but I digress.

As for the reference cooler, I assume you mean the cooling solution for the FE version is inadequate? If so, I am not sure what the solution would be as the FE is the only version of the Vega 64 to sport 16GB of HBM memory.

For around $900-$950, a Sapphire Nitro+ Vega 64 with an eGFX 650 should beat the Vega 20 BTO in the MacBook Pro. I imagine the Vega 56 could as well, and a Sonnet Tech setup is about $800 right now.
 
You can see now on their page how much faster the new GPU is. They say it is 45% faster in C4D (a program I use a lot), and given reviews showing a score of 98.44 fps (notebookcheck.net) on the 560X model, that works out to 142.73 fps on the Vega 20. That is impressive for a laptop this thin. Or the 50% advantage in Rise of the Tomb Raider. Let's wait and see if that is true...
Sounds like the nVidia snobs can stuff it now...
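For what it's worth, the arithmetic in that quote does follow from the numbers given. A quick back-of-the-envelope check (the 98.44 fps baseline and the 45% uplift are taken straight from the post above, not independently verified):

```python
# Sanity check of the figures quoted above (not independently verified).
base_fps = 98.44         # Radeon Pro 560X score cited from notebookcheck.net
claimed_uplift = 0.45    # Apple's "45% faster in C4D" claim for the Vega 20 option

vega20_fps = base_fps * (1 + claimed_uplift)
print(f"Projected Vega 20 result: {vega20_fps:.2f} fps")  # ~142.74 fps, in line with the ~142.73 quoted
```

Whether real-world C4D performance actually scales that linearly is another question, but the projected number itself is just the claim applied to the review score.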
 
This is a significant COST increase. And BTW, Intel has been charging more for each new generation of "equivalent" CPUs. So is Apple just supposed to "eat" that cost-increase?

Think about it.
The recent refresh of the MacBook Pro lineup is actually a good example of the OP’s wish that technology improvements alone (absent a price increase) shouldn’t result in higher prices. 13” dual-core configs became quad-core, and 15” quad-core became 6-core, without Apple raising prices.
 
Interesting... but expensive upgrades... It's hard to imagine the performance can be much better within the same cooling system that the MBPro has. Unless performance per watt is a lot better, of course...?
It would be interesting to see what Apple is REALLY capable of putting inside the MBP chassis as is. They’ve had the exact same one since 2016, and only now are they adding the option of a Vega GPU, right before the holidays... lots of people who bought theirs within the last few months are going to be disappointed. But all that matters is that Apple’s sales will benefit, right?
 
Why should it be?

You paid for the product at that time, at the price current at that time. It's like that with everything. At some point you decide to buy or to wait. If you always wait for the next revision or model, you never end up buying anything :)

I also don’t understand why some Apple consumers say a 2600 dollar product is deprecated or obsolete when the next one comes out with only a few minor changes. It is not like the iPhone X is total crap and stops working now that the XS is out...
The price paid for a high-end pro machine invites complaints of many kinds, from throttling, to keyboard issues, to mid-cycle refreshes that could've conceivably been offered at the product's introduction. Whoever said a prior version is obsolete would be wrong. Others are saying they'd like a near-perfect product for a high price, and I understand that, even though paying through the nose for a high end Apple machine doesn't offer perfection, far from it.
 
Yes, that's what pros say to their employer to justify the purchase. It's for GPU compute processing but we know full well that on their down time they're going to boot camp Windows and fire up those AAA games.

Well... yeah but my point was that gaming performance metrics are pointless on a card like this.
 
Are you comparing mobile to desktop?

BTW, that site is not completely reliable.

Well, I was just naming a GPU with comparable specs.

If we talk about current nVidia mobile cards, it gets worse: for example, a 2017 mobile nVidia card like the GTX 1070 Max-Q is +164% over the desktop GTX 660.

https://gpu.userbenchmark.com/Compare/Nvidia-GTX-660-vs-Nvidia-GTX-1070-Mobile-Max-Q/2162vsm301524


It’d be great if Apple gave the choice. I think a lot of pro users would prefer Apple to use nVidia cards, but I’m sure most non-pro users definitely would too. And non-pro users are the main buyers of Mac devices.
 
Yeah, Nvidia makes me cough indeed. And hurl, on MacOS.

Nvidia drivers for MacOS have always been garbage compared to AMD GPU drivers. I'm so thrilled Apple has finally stopped selling Macs with Nvidia and is sticking with Radeon.
Wasn't Apple just using the standard nVidia "Web Driver" package?
 
I'm one of them, lol

Same. $5k. Primary use for design/video too. Apple Support have directed me back to the retail store to ask for a swap but stipulated the return period is 14 days.

All of this on the back of twice-daily kernel panic crashes and a month of Apple troubleshooting, wiping, and reinstalling OSes that screwed up my business/daily operation, but which eventually led to swapping the machine. That was 4 weeks ago. (Still had 2 KPs since.)

Gutted.
 
I would argue that buying a Mac mini and an eGPU for gaming makes absolutely zero sense from a financial perspective.

Only if that's your sole reason for buying one. Most people buy a Mac Mini specifically to fulfil a particular role in their computing setup (e.g. Pro Audio). Adding an eGPU allows one to turn a Mac Mini into a respectable gaming machine, after hours when the work is done.
 
Interesting... but expensive upgrades... It's hard to imagine the performance can be much better within the same cooling system that the MBPro has. Unless performance per watt is a lot better, of course...?

At low clocks, Vega is **much** better than Polaris in terms of perf/watt, mainly due to HBM as well as the better architecture.

These chips will be screaming fast.
 
Those cards aren’t even in the same ballpark when it comes to compute performance.

The comparison is misleading.

Well, I was just naming a GPU with comparable specs.

If we talk about current nVidia mobile cards, it gets worse: for example, a 2017 mobile nVidia card like the GTX 1070 Max-Q is +164% over the desktop GTX 660.

https://gpu.userbenchmark.com/Compare/Nvidia-GTX-660-vs-Nvidia-GTX-1070-Mobile-Max-Q/2162vsm301524


It’d be great if Apple gave the choice. I think a lot of pro users would prefer Apple to use nVidia cards, but I’m sure most non-pro users definitely would too. And non-pro users are the main buyers of Mac devices.
 
But each pair of ports shares one controller (hence my two LG 5Ks have to be on opposite sides of my 15" MBP), so how do they achieve four eGPUs working at full speed then?
I think you are confusing the PCIe data and the DisplayPort data. Both appear on the same TB3 connector; but they are different data-streams.

AFAICT, an eGPU uses the raw PCIe "bus", and the Display uses the DisplayPort "bus". And the eGPU turns "commands" over the PCIe bus into "Pixels" on the DisplayPort bus. And there's a BIG difference in the amount of data required over each. But in an eGPU situation, the actual DISPLAY usually connects to the eGPU enclosure directly; thus, yes, each TB3 port can service its own eGPU, even though it takes TWO TB3 ports to drive the "pixel data" (DisplayPort Data) for a 5k display.

I think. ;-)
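For anyone curious about the numbers behind that, here is a rough back-of-the-envelope sketch. The bandwidth figures below are approximate, commonly cited values assumed for illustration; they aren't taken from any post in this thread:

```python
# Rough bandwidth arithmetic for the TB3 / eGPU / 5K display discussion above.
# All figures are approximate assumptions for illustration, not official spec quotes.

width, height = 5120, 2880       # 5K panel resolution
refresh_hz = 60
bits_per_pixel = 24              # 8 bits per channel, uncompressed

pixel_stream = width * height * refresh_hz * bits_per_pixel  # raw "pixel data" (DisplayPort side)
dp_1_2_link = 17.28e9            # usable payload of one 4-lane DisplayPort 1.2 (HBR2) link
tb3_link = 40e9                  # headline Thunderbolt 3 bandwidth (PCIe + DisplayPort combined)

print(f"5K@60 pixel data : ~{pixel_stream / 1e9:.1f} Gbit/s")  # ~21.2 Gbit/s
print(f"One DP 1.2 link  : ~{dp_1_2_link / 1e9:.2f} Gbit/s")   # not enough on its own -> two DP streams
print(f"TB3 link total   : ~{tb3_link / 1e9:.0f} Gbit/s")
```

Which is consistent with the point above: the GPU commands going out over PCIe are comparatively small, while the pixel stream is what eats the bandwidth, so a 5K display needs two DisplayPort streams even though a single eGPU lives happily on one TB3 port's worth of PCIe.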
 
Seriously, a $400 option... it's disgusting how greedy Apple has become; that chip only costs 50 bucks.


HBM2 memory is SUPER EXPENSIVE. In fact, if I recall, nVidia only put HBM2 on its Titan V card. All the others, with the exception of perhaps Quadro, used GDDR5 (GDDR5X on the 1080 Ti) and GDDR6 on their high-end RTX cards.
 
Well, I was just naming a GPU with comparable specs.

If we talk about current nVidia mobile cards, it gets worse: for example, a 2017 mobile nVidia card like the GTX 1070 Max-Q is +164% over the desktop GTX 660.

https://gpu.userbenchmark.com/Compare/Nvidia-GTX-660-vs-Nvidia-GTX-1070-Mobile-Max-Q/2162vsm301524


It’d be great if Apple gave the choice. I think a lot of pro users would prefer Apple to use nVidia cards, but I’m sure most non-pro users definitely would too. And non-pro users are the main buyers of Mac devices.
The GTX 1070 Max-Q is an 80+ watt part. Apple doesn’t make gaming laptops; it’s not what most of their customers want.
 
The recent refresh of the MacBook Pro lineup is actually a good example of the OP’s wish that technology improvements alone (absent a price increase) shouldn’t result in higher prices. 13” dual-core configs became quad-core, and 15” quad-core became 6-core, without Apple raising prices.
True enough. But I think in that case, Apple either negotiated a good deal with Intel, or felt that they had sufficient margin to safely absorb the increased cost.

But to expect Apple to do that ad infinitum, or with significant cost increases (which I would assume the inclusion of a Vega-class GPU qualifies as), is a reductio ad absurdum, LOL!
 