RDNA's rollout has been maddening. Even on Windows, drivers have been a mess. I think yields must be low, since AMD is still relying so heavily on Vega for APUs and for the Mac Pro.

Bingo! And yet people here will scream from the rooftops Apple should switch to Zen CPUs.

I applaud Dr. Su and her team’s efforts turning around AMD, but they still cannot walk and chew gum at the same time.

Perhaps within another 2-3 years this won’t be an issue, they really need someone more like Tim Cook or a skilled operations person to help them. Perhaps it’s simply engineering head count, but there is an issue with execution in the different departments right now.

Apple moving to Zen would give AMD a heart attack.
 
RDNA is almost a year old come June and it's now just ramping up. Vega 56/64 also were MIA for months after launch. I'll give NVIDIA this, they do launch with actual products shipping but they are still on 12/16nm.
 
  • Like
Reactions: Zdigital2015
Bingo! And yet people here will scream from the rooftops Apple should switch to Zen CPUs.

I applaud Dr. Su and her team’s efforts turning around AMD, but they still cannot walk and chew gum at the same time.

Perhaps within another 2-3 years this won’t be an issue, they really need someone more like Tim Cook or a skilled operations person to help them. Perhaps it’s simply engineering head count, but there is an issue with execution in the different departments right now.

Apple moving to Zen would give AMD a heart attack.

Yeah blame AMD!

- They make Intel look slow
- They forced Apple and Blackmagic to make the eGPU
- They forced Apple to use AMD GPU only

/s
 
Last edited:
Blackmagic is still listing them, but since they signed an exclusive with Apple to sell them, none will be available to purchase.
 
I have an internal Asus RX 580 8GB, just for FCPX (not a gamer). Though games look astonishing on it too. Looking toward new AMD models, maybe a 5700 in the next year or so.
 
As far as I am aware, the Blackmagic is still the only eGPU that can run the LG UltraFine 5K monitor. Or does anyone know about another solution?

I bought a "Blackmagic Pro" just today, as I want to run an LG UltraFine 5K and an LG 34WK95U (4.5K) in parallel on the same eGPU. I would much prefer an upgradable solution (Razer Core X, etc.), but haven't found a way yet to implement this.
 
  • Like
Reactions: Dan08 and jimthing
Yeah blame AMD!

- They make Intel look slow
- They forced Apple and Blackmagic to make the eGPU
- They forced Apple to be AMD only

/s

I’m not blaming AMD, I’m rooting for them all the way. They have pushed Intel and, to a lesser extent, NVIDIA, but they are still struggling to supply at the scale Apple would need for its particular requirements.

Can anyone cogently argue that the RDNA/Navi rollout has been good, smooth, or orderly? It’s been fits and starts for almost a damn year, and we still don’t have Big Navi GPUs yet.

AMD is going through growing pains and I empathize, but they wouldn’t be able to fulfill all their other PC OEM orders and be Apple’s exclusive CPU supplier at this time.
Apple could un-cripple the mini and put in a real GPU, but what fun would that be? :rolleyes:

Or Intel could have equipped their desktop CPUs with decent Iris GPUs as they did at one time instead of that UHD crap, but they didn’t. Intel Mac minis have never had a dGPU (the AMD 6630M notwithstanding), so you’re asking for a new SKU. A double-thick mini, or something akin to the Xbox Series X square tower. I wouldn’t mind seeing something like that myself, but I’m not holding my breath.
 
I love the idea of these - especially as someone with the LG 5K monitor, and I believe this is the only eGPU with that support - but my question is at what point does TB3 become the bottleneck, and should we all wait for a TB4 eGPU that can presumably handle higher throughput to fully take advantage of a 2080 Ti-class card? I believe TB3 currently carries only x4 PCIe - would TB4 be able to do an x16-type card? Someone pleasssseee explain

The x4 PCIe 3.0 link is a bottleneck, but it only makes a circa 10% difference in graphics performance versus an x16 link. eGPU users do get a drop in performance, but I think it's more important that the price of adding an eGPU to a Mac via Thunderbolt 3 makes the total package cost around double the price of adding a card directly into the average desktop PC.

Imagine paying $300-500 on top of an expensive AMD graphics card. The deal gets even worse in value if you only wanted a modest AMD GPU to run some basic jobs on your Mac that the iGPU couldn't handle, like games or basic video rendering. We may be years away from decent-performing Intel iGPUs showing up.

Thunderbolt 4, probably around 2 years away from a Mac, would only see the speed of the link double. I would expect it to be effectively the speed of an x4 PCIe 4.0 link, which is itself roughly the speed of x8 PCIe 3.0 lanes. As already pointed out above, TB4 isn't going to greatly increase the effective speed of a GPU. It's the more advanced GPUs that will be able to produce more performance in a couple of years' time, so I wouldn't be worried about GPU performance over cost.
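
To put rough numbers on the x4 vs x16 point, here's a back-of-the-envelope sketch using the standard published PCIe per-lane rates (the constants and the `link_bw` helper are purely illustrative, not anything Intel or AMD ships):

```python
# Rough PCIe link bandwidth arithmetic (standard published per-lane rates).
# PCIe 3.0 runs at 8 GT/s per lane with 128b/130b encoding,
# so each lane carries about 0.985 GB/s; each later generation doubles that.
GEN3_PER_LANE_GBPS = 8 * (128 / 130) / 8  # ~0.985 GB/s per lane

def link_bw(gen, lanes):
    """Approximate usable bandwidth of a PCIe link, in GB/s."""
    per_lane = GEN3_PER_LANE_GBPS * (2 ** (gen - 3))
    return lanes * per_lane

print(f"x4  Gen3: {link_bw(3, 4):.1f} GB/s")   # ~3.9 GB/s  (TB3-class tunnel)
print(f"x16 Gen3: {link_bw(3, 16):.1f} GB/s")  # ~15.8 GB/s (desktop GPU slot)
print(f"x4  Gen4: {link_bw(4, 4):.1f} GB/s")   # ~7.9 GB/s  (roughly x8 Gen3, as above)
```

In practice TB3 reserves part of its 40 Gb/s for DisplayPort and protocol overhead, so real eGPU throughput lands somewhat below the x4 Gen3 figure, but the ratios are the point.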

I believe the Vega 56 in the Blackmagic eGPU would be superseded by the Radeon VII, but that's a hugely expensive, noisy space heater of a GPU, even though it does major on the kind of compute that Apple are interested in. The cost-effective option would surely be to go for RDNA or RDNA2 based GPUs such as the Radeon RX 5700 range. The RDNA2 GPUs are due at the end of the year and are supposedly going to have a big performance increase.

While eGPUs are a nice addition to laptop users who have the portability factor, they are pricey additions for Mac mini users who would see the total cost of their machine skyrocket if they wanted to add extra graphics power. Those who wanted graphics power from the start in a headless Mac remain a market that Apple don't want to tap but I believe an all-SSD iMac product which at least allows easy replacement of RAM might force a change of mind for some users at least.
 
Apple could un-cripple the mini and put in a real GPU, but what fun would that be? :rolleyes:

In a way they did. By their own account they managed to put themselves into a thermal corner... within a cylinder-shaped device. Imagine how the mini would fare... ;)
 
It's way cheaper and smarter to just roll your own. :D

Cheaper yes, smarter, not really. The Blackmagic units, while expensive, were whisper quiet and also doubled as hubs- they had plenty of USB-A ports and also HDMI.

You're right though I would definitely just stick a 5700XT blower into a Sonnet box for a $600 total price and call it a day

PS - there needs to be a Mac mini model that has a dedicated GPU. It's been like 15 years now, Apple, just do it. Stick in a 5300M 4GB or 5500M 8GB or Vega 20 and call it a day. Surely if the 16" MBP can handle it, then a Mac mini with desktop-esque cooling can do it
 
From a not-so-rich user perspective... my wife and I wanted to be able to play games on our 2017 13" MBP laptops. When these eGPUs came out I was initially ecstatic. However, upon learning the cost, there was just no way I was going to buy two of these.

My wife and I ended up getting the Sonnet eGPUs with the RX 580 for a fraction of the cost. They've served us well for years now. I even took the GPU out of one of them and put it in a new system I built (Windows).

Just puts me off buying something like this - not upgradeable or usable in other systems?!

The Sonnet eGPU is $250 for the 500W edition. In 2018 the 250W edition with the RX 580 was $449.
 
  • Like
Reactions: NickName99
Imagine that. Maybe they should have made it user upgradable.

That wasn’t the point of the Blackmagic or Blackmagic Pro units. Integrated functionality was the point. Any idiot can glom an eGPU and put a card in a chassis as long as his opposable thumbs work. There are several other solutions on the market, and this is a lease, not a buy for most people. Once they come off lease, they’ll be cheap expansion for some users.
 
That wasn’t the point of the Blackmagic or Blackmagic Pro units. Integrated functionality was the point. Any idiot can glom an eGPU and put a card in a chassis as long as his opposable thumbs work. There are several other solutions on the market, and this is a lease, not a buy for most people. Once they come off lease, they’ll be cheap expansion for some users.

They had the USB C passthrough along with charging and supposedly silent operation right? That is at least what I remember their big sell was. That price tho was too rich for my blood. :p
 
  • Like
Reactions: jimthing
Cheaper yes, smarter, not really. The Blackmagic units, while expensive, were whisper quiet and also doubled as hubs- they had plenty of USB-A ports and also HDMI.

You're right though I would definitely just stick a 5700XT blower into a Sonnet box for a $600 total price and call it a day

PS - there needs to be a Mac mini model that has a dedicated GPU. It's been like 15 years now, Apple, just do it. Stick in a 5300M 4GB or 5500M 8GB or Vega 20 and call it a day. Surely if the 16" MBP can handle it, then a Mac mini with desktop-esque cooling can do it

The integrated PSU is the piece that can’t handle a 35W dGPU and a 65W CPU in the Mac mini, and I would prefer not to have another brick hanging off the mini as I did with my 2009. Perhaps once Apple moves to ARM/Arm, or once Rocket Lake-S with an integrated Xe GPU is released, the mini will be fine.
 
All they had to do was make it an enclosure and people would still want it. I can't figure out why they didn't see this as a bad idea. The whole point of external GPU was so that you could upgrade the GPU without buying a new enclosure.
 
  • Like
Reactions: LiveM
They had the USB C passthrough along with charging and supposedly silent operation right? That is at least what I remember their big sell was. That price tho was too rich for my blood. :p

Yes, they have extra USB ports, HDMI or DisplayPort outputs, plus Pro Display XDR support for unsupported Macs with TB3, and Thunderbolt Display support for the dozen or so Thunderbolt displays on the market (mostly LG and BenQ).

The RX580 wasn’t horrible, but the Vega 56 was too expensive and too late. If I was leasing equipment and my designers needed this solution, it’s safer to go with something like this, but I would never buy one at retail cost. I have an ASUS XG Station Pro with a Sapphire Nitro+ RX580 that cost me $520 out the door, so the BM eGPU at $699 wasn’t horrible back then, but it is now.

AMD just needs to make sure the drivers are stable, and Apple too, which is why we just got 5700 XT support a couple of days ago.
 
  • Like
Reactions: BigMcGuire
The integrated PSU is the piece that can’t handle a 35W dGPU and a 65W CPU in the Mac mini, and I would prefer not to have another brick hanging off the mini as I did with my 2009. Perhaps once Apple moves to ARM/Arm, or once Rocket Lake-S with an integrated Xe GPU is released, the mini will be fine.
Or just make it a little thicker and stop thinking that the smallest thing possible, at the cost of performance, is what everyone wants. I would not care at all if the Mac mini were double its thickness if that meant better cooling options.
 
  • Like
Reactions: amartinez1660
PS- there needs to be a mac mini model that has a dedicated GPU. It's been like 15 years now Apple, just do it. Stick a 5300M 4G or 5500M 8GB or Vega 20 and call it a day. Surely if the 16" MBP can handle it than a mac mini with desktop-esque cooling can do it

And nobody will be satisfied either. The midrange mobile solutions are still too mediocre for gaming, too slow for pro 3D graphics and compute, and only give you a slight boost over integrated, for a lot more power and cost. The only people you'd satisfy are a slice of lower-demand professional applications like CAD/CAM, which doesn't exist on the Mac.

Intel now spends more transistors on the GPU than on the CPU, and their iGPUs are about as fast as 30 W desktop dGPUs. They also avoid the PCIe bottleneck, and are highly efficient when data is shared between CPU and GPU.

A lot of people still are thinking about integrated graphics like they were 10 years ago, when that view is no longer relevant.

What's more important is the driver and software support. GPUs aren't easy to program for. NVIDIA has more software engineers than hardware engineers. Their easy-to-use CUDA API let them take over the entire GPU compute market. Intel is finally learning this by cleaning up their drivers.
 
  • Like
Reactions: amartinez1660
Or just make it a little thicker and stop thinking that the smallest thing possible, at the cost of performance, is what everyone wants. I would not care at all if the Mac mini were double its thickness if that meant better cooling options.

Apple ekes out the most life from the limited number of chassis they create for their devices. The tooling (CNC milling) isn’t a cheap process for them, and I appreciate that they build things out of aluminum as they do. That being said, a double-thick mini or a tower styled like the Xbox Series X would be a nice change of pace. I don’t mind the eGPU route, really, but the limited number of lanes on Intel’s CPUs does limit practical expansion on the mini or any other Mac not using a Xeon. Rocket Lake may fix that, but will Apple have already moved on by the time Intel releases it?

The fact that a lot of users here dislike Apple’s engineering decisions doesn’t lessen their validity relative to how other OEMs do things. Intel dragging their ass on PCIe 4.0, LPDDR4X, DMI 4.0, et al. ends up being Apple’s problem, which is why ARM/Arm is where they are headed.
 
They can drive the Pro Display XDR and are really quiet compared to a build-your-own.

Yeah I've heard they're basically dead silent much like the Mac Pro...I wish this attention to noise pollution was also addressed in the recent MBPs, I like a very quiet work environment unless I'm really pushing the system.
 
TB4 will actually have the same bandwidth as TB3, according to Intel. Not sure what changes are there to warrant a name change, maybe the use of PCIe 4.0 lanes.

I also like how, in an indirect/convoluted way, Intel compared TB4 to TB3 by saying TB4 will be 4x faster than USB 3.2 Gen 2 instead of just simply saying there will be no bandwidth gain over TB3 lol.

That can't be the whole story, though. Then TB3, USB4 and TB4 would all be the same thing.

Given that TB4 will run on PCIe 4, it seems silly not to use that opportunity to double its bandwidth.
 
That can't be the whole story, though. Then TB3, USB4 and TB4 would all be the same thing.

Given that TB4 will run on PCIe 4, it seems silly not to use that opportunity to double its bandwidth.

USB4 has some technical cleanup and improvements, so it isn't strictly identical or directly compatible, but the speeds are identical. TB4 appears to be merely Intel's marketing name for their chips implementing a set of optional USB4 features.

The idea of going to PCIe 4 is to lower the number of lanes needed for the same bandwidth, easing PCB layout, allowing cheaper packaging, making the system more scalable, etc.
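
The lane-count saving is easy to sketch, assuming the usual doubling of per-lane rate each PCIe generation (the `lanes_needed` helper is mine, just for illustration):

```python
import math

# Lanes needed to hit a target bandwidth shrink as the PCIe generation rises.
# Assumed per-lane rates: Gen3 ~0.985 GB/s, doubling with each generation.
def lanes_needed(target_gb_s, gen):
    per_lane = 0.985 * (2 ** (gen - 3))
    return math.ceil(target_gb_s / per_lane)

target = 3.9  # GB/s, roughly an x4 Gen3 (TB3-class) link
print(lanes_needed(target, 3))  # 4 lanes at Gen3
print(lanes_needed(target, 4))  # 2 lanes at Gen4
print(lanes_needed(target, 5))  # 1 lane at Gen5
```

Half the lanes for the same bandwidth is the whole pitch: fewer pins, fewer traces, easier board routing.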
 
  • Like
Reactions: avtella