Why not just sell an enclosure?
This. Release a damn enclosure already and let us use any GPU we want. And make it cheap.
Can I sidetrack here and ask:
Is it possible to play games on a new MBP 16", like the upcoming Flight Simulator or Unreal Tournament 99 or something like that?
Maybe someone could explain where the shortcomings are, what's possible and what isn't, and where the difference lies.
I've never understood what an eGPU really does. Can you have the smallest machine and, with an eGPU, get all the power you need for those games?
Many thanks!
Because a custom-built PC will never have problems? I don’t understand your point here.
Are you kidding me? Your post is a prime example of whataboutism.
Building a PC takes research. It will not work if the components are incompatible.
The Mac mini comes with no such expectations. It should just work, as fanboys here frequently tout.
Apple could make a Mini-ATX motherboard. Now, *that* would be something very fun!
Apple could un-cripple the mini and put in a real GPU, but what fun would that be?
Don't be, still enjoying it very much; it is great with all of its ports. Nicely complements the Vega 64 16GB in my iMac Pro and the Nvidia 2080 Ti inside a Razer Core X Chroma attached to an Ubuntu-based Intel NUC Frost Canyon.
That can't be the whole story, though. Then TB3, USB4 and TB4 would all be the same thing.
Given that TB4 will run on PCIe 4, it seems silly not to use that opportunity to double its bandwidth.
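For rough context, a quick back-of-the-envelope comparison of raw PCIe link rates (my own numbers, not from the thread; a sketch assuming a 4-lane link and 128b/130b encoding):

```python
# Back-of-the-envelope PCIe link bandwidth (raw data rate after line encoding).
# Assumption: 4 lanes, 128b/130b encoding for both PCIe 3.0 and 4.0.
def pcie_bandwidth_gbps(gt_per_s: float, lanes: int = 4) -> float:
    """Approximate usable data rate in Gbit/s for one direction of a PCIe link."""
    encoding_efficiency = 128 / 130  # 128b/130b line code
    return gt_per_s * lanes * encoding_efficiency

print(f"PCIe 3.0 x4: ~{pcie_bandwidth_gbps(8.0):.1f} Gb/s")   # ~31.5 Gb/s
print(f"PCIe 4.0 x4: ~{pcie_bandwidth_gbps(16.0):.1f} Gb/s")  # ~63.0 Gb/s
```

Roughly a doubling, which is where the hope of doubled Thunderbolt bandwidth comes from.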
As far as I can deduce, TB4 is just going to be an Intel-certified implementation of USB 4 with all the trimmings (including full TB1,2,3 backwards compatibility).
Whereas generic USB4 = USB 2 + optional USB 3 @ 5 Gbps + optional USB 3 @ 10 Gbps + optional PCIe @ 20 Gbps x 2 (the latter based on TB3 and only optionally backward compatible with legacy Thunderbolt devices; I read somewhere that full backwards compatibility with TB devices required some odd legacy data rates that weren't required by the USB 4 spec. Sorry, can't find the link again).
...so, basically, it sounds like Intel are hoping to capitalise on the USB-IF's propensity for creating mass confusion. They did say that, post-USB4, they were going to focus on certification and developer support.
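To make that easier to scan, here's that reading sketched as a small feature matrix. It reflects the summary above, not the official USB-IF or Intel spec documents, so treat the exact required/optional split as an assumption:

```python
# Feature matrix as described in the post above (one reading of the specs,
# not the official USB-IF/Intel documents).
usb4_generic = {
    "USB 2 fallback": "required",
    "USB 3 @ 5 Gbps": "optional",
    "USB 3 @ 10 Gbps": "optional",
    "PCIe tunnelling @ 20 Gbps x 2": "optional",  # based on TB3
    "legacy Thunderbolt compatibility": "optional",
}

# TB4 = Intel-certified USB4 "with all the trimmings": everything required.
thunderbolt4 = {feature: "required" for feature in usb4_generic}

for feature in usb4_generic:
    print(f"{feature:34s} USB4: {usb4_generic[feature]:8s} TB4: {thunderbolt4[feature]}")
```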
This.
Why not just sell an enclosure?
Because when the end user puts in an unsupported graphics card (we are talking Apple users), they get stuck with the support costs.
How long can Apple sell the base model iMac Pro with this Vega 56 card?
So can other eGPU setups. The quietness of the setup couldn't outweigh the lack of an upgrade path, imo. Probably why we're here now with the product being discontinued.
Huh? 2 1/2 years? You do know that Apple releases new iPhones and iPads every year, right?
Yeh, but the base iMac Pro is still using the Vega 56! Another reason an iMac Pro refresh could be around the corner.
The base Mac Pro is offered with an AMD 580X.
That isn't possible to easily achieve with a home-made design for less money?
1. Don't use an off-the-shelf discrete GPU card.
2. Vertical cooling stack design.
The DisplayPort video streams from the GPU are not on a card edge. There are some direct output ports, but the video out can also be relayed out through the Thunderbolt ports. On external PCIe card enclosures, the discrete GPU card's outputs all come out through the card's edge. None of those card edges provision a Thunderbolt port (at this time; some years down the road that may not be as true as it is now). Therefore, they can't really drive most Mac-targeted Thunderbolt displays all that well.
Because they don't have a card edge and aren't trying to mix cooling and video out into the same small contained space, they can do all the cooling in a different direction, basically orthogonal to the video output. It's also not a particularly "max frame rate at any cost" oriented design, so they don't overclock as much. (With AMD GPU chips, backing off the overclock is a significant power saver.)
No magic. Just solid, straightforward engineering that isn't crippled by design constraints from the late '80s and early '90s.
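As a rough illustration of that last point, standard dynamic-power scaling with made-up example numbers (not Blackmagic's actual tuning):

```python
# Dynamic power of a chip scales roughly as P ~ C * V^2 * f, so a modest
# clock reduction (which usually allows a voltage reduction too) compounds.
def relative_power(freq_scale: float, volt_scale: float) -> float:
    """Power relative to stock, given frequency and voltage scale factors."""
    return (volt_scale ** 2) * freq_scale

# Hypothetical example: run the GPU at 90% clock and 95% voltage.
print(f"~{relative_power(0.90, 0.95) * 100:.0f}% of stock power")  # ~81%
```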
FYI: I'm using the BM Pro Vega 56 on my 2013 Mac Pro for occasional Windows gaming. I've also had it running on my 2017 MacBook Pro, but I don't use the mobile arrangement much anymore. Honestly, it's not hard to get working at all, same steps as any other eGPU running Boot Camp; egpu.io is the ultimate source for info about getting an eGPU running on a Mac under Windows. For what it's worth, running an eGPU under Windows on a Mac is not officially supported by Apple/Boot Camp and requires workarounds regardless of the eGPU unit.
I'm not sure why the majority of people here don't seem to understand the ultra-quiet nature of the Blackmagic eGPU and how it can run at 100%, 24x7, 365 days a year without even breaking a sweat.
Some people prioritize those things over cheapness and card-swapping, etc.
I wonder if this has something to do with the jump away from x86.
The article/first comment literally gives the reason:
Blackmagic today informed MacRumors that it is no longer manufacturing its Blackmagic eGPU Pro due to AMD discontinuing its Radeon RX Vega 56 graphics chip.