Can I sidetrack here and ask:
is it possible to play games on a new MBP 16", like the upcoming Flight Simulator or Unreal Tournament 99 or something like that...

Maybe someone could explain where the shortcomings are, what's possible and what isn't, and where the difference lies.

I've never understood what an eGPU really does. Can you buy the smallest machine and, with an eGPU, have all the power you need for those games?


Many thanks!
 

I probably wouldn't want to play games for sustained periods on an expensive MacBook Pro, but from my point of view the GPU (you don't say whether it's the 5300 in the base SKU or the 5500 in the top SKU) would probably cope well if you lowered the resolution.

You would soon run into heat from the CPU, and then fan noise (if that annoys you), and in the long term that heat could be shortening the lifespan of the MBP.

An external GPU takes some of the load off the system, letting the onboard GPU rest, but you'd need to hook up an external monitor to it. I've never done this with my Macs because of the sheer cost versus the benefit.
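On the "where are the shortcomings" question, a lot of it comes down to link bandwidth: a Thunderbolt 3 eGPU only gets a PCIe 3.0 x4 data path, while an internal desktop GPU typically sits on x16. A quick back-of-envelope sketch (approximate raw link rates only, not real-world throughput):

```python
# Back-of-envelope comparison of the PCIe bandwidth available to an internal
# desktop GPU vs. a GPU in a Thunderbolt 3 eGPU enclosure.
# Figures are raw link rates, not real-world throughput.

GBPS_PER_PCIE3_LANE = 8.0  # PCIe 3.0 runs 8 GT/s per lane (~7.9 Gbps usable)

internal_x16 = 16 * GBPS_PER_PCIE3_LANE  # typical internal GPU: PCIe 3.0 x16
tb3_link = 40.0                          # Thunderbolt 3 link rate in Gbps
tb3_pcie_data = 4 * GBPS_PER_PCIE3_LANE  # eGPU data path: PCIe 3.0 x4 over TB3

print(f"internal x16 GPU : {internal_x16:.0f} Gbps")
print(f"TB3 eGPU data    : {tb3_pcie_data:.0f} Gbps (of a {tb3_link:.0f} Gbps link)")
print(f"internal link is {internal_x16 / tb3_pcie_data:.0f}x wider")
```

That 4x gap mostly hurts workloads that stream a lot of data across the bus; games whose assets fit in VRAM lose less, which is why eGPUs are usable but rarely match the same card sitting in a desktop.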

You could fit a graphics card into an eGPU enclosure that blows away anything in any Mac if you were willing to shoulder the cost, but the enclosure alone runs several hundred dollars. And there are limits to how useful extra GPU compute is to the average user beyond playing games or rendering video faster. I think it's more of an edge case for a lot of potential users, and the sheer cost of the option should take it off the table for many.

If you are only looking at casual gaming, then I daresay you could build an entire budget PC for the price of the enclosure you'd be putting your PC graphics card into. This assumes you have a screen to plug the PC into, but you'd need one for the eGPU too.

By the end of the year the next-generation games consoles will be out, which should satisfy a lot of gamers who play the kinds of games that get released on consoles. If you wanted to play PC games, then building a PC would be better too: imagine the kind of build you could put together with, say, $300-400 excluding the GPU and monitor.

At the end of the day, Apple sees a graphics card as a way to drive screens effectively. The iGPU is deemed good enough for machines like the Mac mini or any Mac below the 15" or 16". Video editors looking for better render performance need to pay more, and gamers really ought to look elsewhere to satisfy their gaming needs.

Coming back on topic, though: while eGPUs are the only option for MacBook Pro users who would benefit from more GPU grunt, they're not cost-effective for Mac mini users unless the dock can also be shared with an occasional laptop user. The cost-of-entry problem for Mac mini GPU power can only really be solved by adding a dGPU to a future mini SKU.

The optimal way forward for mini users would be for Apple to look at Intel's Ghost Canyon setup: Intel has done away with the Skull Canyon and Hades Canyon approach of cramming everything inside a tiny, rattly box that made a lot of noise under load.

The Ghost Canyon chassis allows a suitably sized PC graphics card to be added to the setup.

In an Apple version the motherboard would have a soldered BGA CPU, so people couldn't fit their own. The RAM would at least be replaceable (2 slots, up to 128GB), and the Apple SSD would sit on a custom board but at least be replaceable. Apple would get bonus points for using a standard small-form-factor PSU.

Look at the recently launched i9-10980HK, which has an official 65W TDP-up mode allowing a base clock of 3.1GHz across 8 cores / 16 threads. Would that interest Mac mini users in a next model? Lower-end 45W Comet Lake-H CPUs would be available in due course...

And logically, there would be one physical PCIe x16 slot wired electrically as x8, with the other 8 lanes split between the Apple SSD and a Titan Ridge controller looking after 2 Thunderbolt 3 ports.
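That lane split can be written out as a simple budget. To be clear, this allocation is the speculation in the paragraph above, not any real Apple design:

```python
# Hypothetical PCIe 3.0 lane budget for the speculated Mac mini tower:
# a physical x16 slot wired at x8, with the remaining 8 lanes split
# between the Apple SSD and a Titan Ridge Thunderbolt 3 controller.

TOTAL_CPU_LANES = 16

allocation = {
    "x16 slot (x8 electrical)": 8,
    "Apple SSD": 4,
    "Titan Ridge (2x TB3 ports)": 4,
}

# The speculated split uses exactly the CPU's 16 lanes
assert sum(allocation.values()) == TOTAL_CPU_LANES

for device, lanes in allocation.items():
    print(f"{device}: x{lanes}")
```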

An Apple variant of this could easily top $2.5k, but it would be the spiritual successor to the trashcan Mac Pro price-wise.

In effect it would be akin to Apple building a Mac mini into its own eGPU case...
 
Because a custom-built PC will never have problems? I don’t understand your point here.

Are you kidding me? Your post is a prime example of whataboutism.

Doing a PC build needs research. It will not work if components are incompatible.

The Mac Mini has no such expectations. It should just work as fanboys here frequently tout.
 
Are you kidding me? Your post is a prime example of whataboutism.

How so? You seem to be implying something inherent about the eGPU approach that makes those problems more common. I don’t see that. Attaching PCIe through a card isn’t inherently less error-prone than through a Thunderbolt cable. The signaling is the same. There’s just more latency.

Doing a PC build needs research. It will not work if components are incompatible.

The Mac Mini has no such expectations. It should just work as fanboys here frequently tout.

Yes, that’s true. It should. Apple needs to do better.

But I don’t get your point. Are you saying people shouldn’t bother with mass-market computers because they’re not perfect anyway, so one might as well custom-build?
 
Don't be, I'm still enjoying it very much; it is great with all of its ports. It nicely complements the Vega 64 16GB in my iMac Pro and the Nvidia 2080 Ti inside a Razer Core X Chroma attached to an Ubuntu-based Intel NUC Frost Canyon.

Oh, the ports, oh wow; no computer carries enough ports nowadays, right? Look at it sitting next to that Razer Core with a 2080, must be amusing.
 
That can't be the whole story, though. Then TB3, USB4 and TB4 would all be the same thing.

As far as I can deduce, TB4 is just going to be an Intel-certified implementation of USB 4 with all the trimmings (including full TB1,2,3 backwards compatibility).

Whereas generic USB4 = USB 2 + optional USB 3 @ 5Gbps + optional USB 3 @ 10Gbps + optional PCIe @ 20Gbps x 2 (the latter based on TB3 and optionally backward compatible with legacy Thunderbolt devices - I read somewhere that full backwards compatibility with TB devices required some odd legacy data rates that weren't required by the USB 4 spec. Sorry, can't find the link again).

...so, basically, sounds like Intel are hoping to capitalise on the USB-IF's propensity for creating mass confusion. They did say that, post-USB4, they were going to focus on certification and developer support.

Given that TB4 will run on PCIe 4, it seems silly not to use that opportunity to double its bandwidth.

The TB4 details are sketchy, but Intel has clarified that it is 4x faster than the 10 Gbps version of USB 3 - so exactly the same max speed as Thunderbolt 3.

Even if the controller is plugged into PCIe4, doubling the bandwidth of a bit of external wire without restricting the cable length is non-trivial...
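For what it's worth, Intel's "4x faster than 10 Gbps USB 3" phrasing is easy to sanity-check; it lands exactly on the Thunderbolt 3 link rate:

```python
# Intel describes Thunderbolt 4 as 4x the speed of USB 3.2 at 10 Gbps.
# That works out to the same 40 Gbps link rate as Thunderbolt 3.

usb3_gen2 = 10       # Gbps, USB 3.2 Gen 2
tb4 = 4 * usb3_gen2  # Intel's "4x faster" claim
tb3 = 40             # Gbps, Thunderbolt 3 link rate

print(f"TB4 = {tb4} Gbps, TB3 = {tb3} Gbps, identical: {tb4 == tb3}")
```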
 

I'm reluctant to dig deeper into the upcoming USB specs at the moment. It's already a minefield trying to buy the various flavours of USB-C cable (depending on whether you're charging, transmitting data, carrying video, or all of the above for use in a dock), so I can well understand a bid by Intel to bring forth Thunderbolt 4 as a "full fat" version of the various USB 3 and 4 implementations.

No wonder Apple have been reluctant to put USB-C ports into iPhones. Imagine the coming confusion when people try to use all manner of unbranded cheapo cables with an iPhone 12 Pro for more than just charging!
 
Because when end users put in an unsupported graphics card (we are talking Apple users here), they get stuck with the support costs.

Perhaps Apple don't think people will buy what is effectively a headless MacBook Pro 16" that people are asking for?

How long can Apple sell the base model iMac Pro with this Vega 56 card?

The base Mac Pro is offered with an AMD 580X. :)

The implication here is that a GPU is offered only on the assumption that it's driving a big Retina screen, and Apple won't give the time of day to anything else where an iGPU is considered capable enough of driving a smaller Retina screen (the MBA and 13" MacBook Pro, for example).

This leaves us with a mini where Apple are clearly assuming the iGPU is suitable for the majority of users, who are either running it headless (e.g. the co-location crowd) or perhaps running cheaper, lower-DPI monitors such as 1440p or 1080p screens.

Let's be absolutely fair here and say at least Apple have offered a solution - eGPU - albeit a very expensive one!
 
So can other eGPU setups. The quietness of the setup couldn't outweigh the lack of an upgrade path, IMO. That's probably why we're here now with the product being discontinued.

Show me another eGPU setup that can drive the Pro Display XDR at 6K without another Titan card or bridge or whatever.
 
1. It doesn't use an off-the-shelf discrete GPU card.
2. It has a vertical cooling-stack design.


The DisplayPort video streams from the GPU don't come off a card edge. There are some direct output ports, but the video out can also be relayed through the Thunderbolt ports. On external PCIe card enclosures, the discrete GPU card's outputs all come out through the card's edge, and none of those card edges provide a Thunderbolt port (at this time; some years down the road that may be less true than it is now). So they can't really drive most Apple-targeted Thunderbolt displays all that well.

Because there's no card edge and no attempt to mix cooling and video out in the same small contained space, all the cooling can run in a different direction, basically orthogonal to the card. It's also not a "max frame rate at any cost" design, so it doesn't overclock as much (with AMD GPU chips, backing off the overclock is a significant power saver).

No magic. Just solid, straightforward engineering that isn't crippled by design constraints from the late '80s and early '90s.
That isn't possible to easily achieve with a home made design for less money?
 
I'm not sure why the majority of people here don't seem to understand the ultra-quiet nature of the Blackmagic eGPU and how it can run at 100%, 24/7, 365 days a year without breaking a sweat.

Some people prioritize these things over cheap and card-swap, etc.
 
FYI: I'm using the BM Pro Vega 56 with my 2013 Mac Pro for occasional Windows gaming. I've also had it running on my 2017 MacBook Pro, but I don't use the mobile arrangement much anymore. Honestly, it's not hard to get working at all; it's the same steps as any other eGPU running Boot Camp, and egpu.io is the ultimate source for info about getting an eGPU running under Windows on a Mac. For what it's worth, an eGPU running Windows on the Mac is not officially supported by Apple/Boot Camp and requires workarounds regardless of the eGPU unit.

How can you run it on a 2013 Mac Pro? Doesn't it need TB3?

Also, why did you pay $1200 for an eGPU? You could have bought a dedicated PC.
 

Oh, I understand it all right. The design of the BM Pro was beautiful (I love the cooling approach and the look), but at $600 extra it was something I could give up.
 
With the Pro Display XDR just releasing, I would like to imagine Blackmagic will do an update. If not, will a 2018 mini with an LG 5K or Pro Display be out of luck...?
 