Yeah, that sounds reasonable. I don't know much about the RX 5500 vs the RX 580. Wouldn't the RX 5700 be a bit of a downgrade compared to what it had previously? But maybe going top-of-the-line wouldn't gain that much while adding too much cost?

One thing to watch on prices is how much RAM. I noticed with the Mac Pro add-on option, people were quoting $400-ish for the price and complaining about Apple's markup, but I think Apple's card has 2x the RAM of the ones most people are buying.

But, bottom line for me, I'd love to see Blackmagic maintain a 'regular' and 'pro' model of eGPU with this design, keeping whatever is the reasonable 'normal' and 'high-end' GPU in each model. I hope that is what is going on here... though one would think it would be best to have the replacement ready when you announce you're dropping the current one.

The RDNA GPUs will be better value in the future as support ought to be better; AMD drivers aren't in a good place generally at the moment.

Speed of RAM is also a major factor. GDDR6 is faster than GDDR5, and HBM is faster still, but costs escalate as AMD use the faster RAM to help them get performance. There's even talk of the Sony PS5 skipping traditional DDR4 RAM and going straight to high-performance GDDR6 to eliminate bottlenecks in the system - they are already using high-performance NAND and not the cheap stuff. It's all about loading times for these games consoles.
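As a rough illustration of why the memory type matters: peak bandwidth scales with the per-pin data rate. The figures below (a 256-bit bus, ~8 Gb/s per pin for GDDR5 and ~14 Gb/s for GDDR6) are typical ballpark values, not the spec of any particular card:

```python
def mem_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width / 8 bits per byte) * per-pin rate."""
    return bus_width_bits / 8 * data_rate_gbps

# Ballpark per-pin rates on a typical 256-bit bus.
print(mem_bandwidth_gb_s(256, 8))   # GDDR5: 256.0 GB/s
print(mem_bandwidth_gb_s(256, 14))  # GDDR6: 448.0 GB/s
```

Same bus width, ~75% more bandwidth just from the faster RAM - which is the bottleneck argument in a nutshell.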

The issue with Blackmagic's eGPU is the fact that the GPU is non-replaceable within the enclosure - within 3 years it'll be old news and you're left with an expensive doorstop with increasingly poor performance per watt.

This wouldn't be as much of an issue if the unit itself did something unique - for instance, if there was a cooling solution that kept it properly silent for a long time while being caned, either by video rendering or game playing. I can't imagine Blackmagic will have sold many of these either, which would also have contributed to the high unit cost.

With so few users then, there's probably not too much clamour to fix things when (like the AMD driver) things go a bit shaky.

The sad fact is there's nothing in the Mac arena to encourage developers to sit up and look for games development - even with the presence of Metal on the scene which I'd imagine is more for iPad developers than Mac developers.

And the most cost effective solution for Mac users wanting a headless Mac with better graphics would actually be to get one with a discrete GPU already onboard - something which the Mini isn't built to cater for.

Try doing a comparison with an entry-level MacBook Pro 16", and then price up a Mac mini with comparable storage - adding an enclosure bought from Amazon UK.

Off-the-shelf MacBook Pro 16" with 512GB SSD and 5300M graphics: £2399

Mac mini i5, 16GB RAM, 512GB SSD (matching the MBP16): £1299.
Razer Core X Chroma enclosure with Thunderbolt cable: £400 (I know cheaper ones exist, but Amazon UK just bumped the price of the Core X up quite high)
Radeon 5500 XT: £200

Mini system: £1899 vs the MBP for an extra 500 quid - and then ask yourself what a Retina screen is worth, plus the ability to carry the MBP away, or the ability to survive a power cut thanks to the built-in battery. I use my MBP in clamshell mode - I have rarely used the built-in screen so far, so in effect I already have my Mac mini with a built-in GPU.
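For a quick sanity check, the totals can be tallied like this (GBP, using the quoted figures above - illustrative, not current retail prices):

```python
# Quoted figures in GBP; illustrative, not current prices.
mbp16 = 2399  # MacBook Pro 16", 512GB SSD, 5300M

mini_build = {
    "Mac mini i5 / 16GB / 512GB": 1299,
    "Razer eGPU enclosure + TB3 cable": 400,
    "Radeon 5500 XT": 200,
}

total = sum(mini_build.values())
print(total)          # 1899
print(mbp16 - total)  # 500 - the MBP premium
```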

Yes, there will be performance variances between the GPUs and CPUs of the two systems, but shockingly the CPU benchmarks don't seem to be far apart. Yes, I'd accept the GPU difference is far greater between a 5500 XT and a 5300M. And the MBP can't have memory upgraded after purchase, whereas the Mini can. And the 5500 XT can be upgraded again too.

But I'm happy to leave the gaming to the consoles or PC and go with decent built-in GPU performance on an external monitor. And there's nothing to stop me in the future from purchasing an eGPU and suitable graphics card - the MBP has 4 Thunderbolt 3 ports as well. And I can pick the MBP up and go somewhere else and do some work if I wanted to.

Bear in mind this is before you find any of the various deals on the MBP. Shortly after the 16" MBP was introduced you could purchase it at a substantial discount as an off-the-shelf system, and MacRumors is always listing deals for the stock MBP16".

It's just a bit harder to justify the cost of building up a Mac mini to compete. And there's possibly a major reason why Apple are showing no interest in beefing the Mini up with a permanent discrete GPU.

which seems to indicate that the 5500 works fine - he even got a damn login screen! (which is apparently something others have had issues with).

Totally agree with the guy's line that Sapphire cards appear to be better for Macs. They generally seem to get decent reviews.
 
Yeah, that sounds reasonable. I don't know much about RX 5500 vs RX 580. Wouldn't the RX 5700 be a bit of a downgrade compared to what it had previously?

The 5700 seems to be about 37% better on average compared to the 580, and only about 5% better than the Vega 56.
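To put those percentages in perspective, here's the arithmetic with made-up index scores chosen to be consistent with the quoted gaps (the RX 580 normalised to 100 - these are not real benchmark results):

```python
# Hypothetical performance indices consistent with the quoted gaps
# (RX 580 normalised to 100); not actual benchmark numbers.
rx580, vega56, rx5700 = 100.0, 130.5, 137.0

def pct_faster(a: float, b: float) -> float:
    """How much faster a is than b, as a percentage."""
    return (a / b - 1) * 100

print(round(pct_faster(rx5700, rx580)))   # 37
print(round(pct_faster(rx5700, vega56)))  # 5
```

The takeaway: a big jump over the 580, but barely a sidestep from the Vega 56.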

But, bottom line for me, I'd love to see Blackmagic maintain a 'regular' and 'pro' model of eGPU with this design, keeping whatever is the reasonable 'normal' and 'high-end' GPUs in each model.

Yup.

I hope that is what is going on here... though one would think it would be best to have the replacement when you announce you're dropping the current.

Yeah, that's what has me worried.
 
Cooling of which? I can't imagine that helps the mini's cooling (well, unless the BM regularly runs cooler than the mini's body?)

The mini. Even when not being heavily used, the fan slowly spins, which creates air draw from the bottom up through the top. So, I have a constant, slight air draw across the mini case. I'm not sure how much that helps ultimately, but it has to help a bit.

When it is running harder and the fan spins up more, it should have even more impact.

The BM runs extremely cool. When it is maxed out, it seems to hover between 50 and 60 degrees (at the GPU).

I feel bad for the people who paid a premium for this, and now it's no longer a product line that Blackmagic is pursuing. I did question the fact that you couldn't change the GPU; it's not ideal in that sense, and other eGPUs make more financial sense, so that's probably why it failed.

Well, don't feel bad, I LOVE mine. I'll buy another if my needs change (ie. I need more GPU power, assuming they make a replacement).

And, I wouldn't assume they aren't pursuing this any longer. If they aren't then I'll have to contact some hardware maker connections and start up a business. There isn't any competition for this thing currently.

Are they not in fact still selling the non-pro model? Isn't the biggest difference the GPU itself inside?

I think they still sell the non-pro, which makes me think we'll be seeing an update to the pro. If they were indeed abandoning the concept, I'd think they'd just announce EOL for both.

Yeah, the GPU itself, and I think port/controller-wise, the 'pro' can drive more 5K+ displays. In fact, I think it is the only option currently to drive the XDR.

The RDNA GPUs will be better value in the future as support ought to be better; AMD drivers aren't in a good place generally at the moment.

Speed of RAM is also a major factor. GDDR6 is faster than GDDR5, and HBM is faster still, but costs escalate as AMD use the faster RAM to help them get performance. There's even talk of the Sony PS5 skipping traditional DDR4 RAM and going straight to high-performance GDDR6 to eliminate bottlenecks in the system - they are already using high-performance NAND and not the cheap stuff. It's all about loading times for these games consoles.

Thanks, yeah, I just haven't followed the 5500 at all. I was thinking of it more in comparison to the Vegas Apple is putting in the new Mac Pro, for example. I'd guess Blackmagic would/should go more that direction for the pro model.

As an aside, I find it interesting that the PS5 and new Xbox are both going with AMD, despite games like Minecraft making a big deal of Nvidia's ray-tracing.

The issue with Blackmagic's eGPU is the fact that the GPU is non-replaceable within the enclosure - within 3 years it'll be old news and you're left with an expensive doorstop with increasingly poor performance per watt.

This wouldn't be as much of an issue if the unit itself did something unique - for instance, if there was a cooling solution that kept it properly silent for a long time while being caned, either by video rendering or game playing.

Yeah, the non-replaceable nature of the GPU is the downside, but I don't think it would be possible to make anything close if that weren't the case. It would have to be WAY bigger to put a standard card inside.

And, I'm not sure what you're talking about in terms of cooling and silent. It absolutely cools perfectly at 100% load, no matter how long you run it. I've run mine 100% for weeks at a time, and it hangs between 50-60 degrees.

While it isn't technically 100% silent, if there is any kind of room noise, you'll not hear it. If you are in a super-quiet room - AND know what you're listening for - you can hear a very slight, low-pitched 'whoosh' kind of sound. When I've heard most people talking about this, they just don't get how quiet we're talking about here. The things they call quiet are loud by comparison.

As for doorstop... would you say the same thing about Apple's entire product line then, save the Mac Pro? If someone buys an iMac Pro, or that new 16" MBP you mention, is it a doorstop in 3 years? If not, then it's no different for the BM. And, with the BM, I can easily switch to a different one and keep the rest of my system.

And the most cost effective solution for Mac users wanting a headless Mac with better graphics would actually be to get one with a discrete GPU already onboard - something which the Mini isn't built to cater for.

Try and do a comparison with an entry level MacBook Pro 16", and then price up a Mac mini ...

... and then ask yourself how much a retina screen is and the ability to carry the MBP away or the ability to survive a power cut due to the built in battery. I use my MBP in clamshell mode - I have rarely used the built in screen so far so in effect I already have my Mac mini with a built in GPU.
...

But I'm happy to leave the gaming to the consoles or PC and go with decent built-in GPU performance on an external monitor.
...
And there's possibly a major reason why Apple are showing no interest in beefing the Mini up with a permanent discrete GPU.

The problem with that is the MBPs just aren't built for serious GPU use (or at least they haven't been in the past). I've ruined a couple MBPs over the years doing 3D rendering, video encoding, or Folding@home, etc. on them. While a mini isn't necessarily built for that either, I think I'm going to have pretty good luck, long-term, from off-loading the GPU work, and then disabling Turbo Boost on the i7 most of the time. It doesn't run so hard that way. (And, then there is the T2 to help offload some more of the video-encoding work.)

And... I just don't want a laptop, so while that sounds like a good deal, it isn't that compelling to me. I survive a power cut just fine, as I'd not even consider running w/o a UPS.

Remember, GPUs are about a lot more than gaming. I tend to mostly use my console, too. I use my GPU for 3D/CAD, Folding@home, etc. (Primarily CAD right now, but if I start doing more involved 3D work, I'll maybe want a better GPU some day.)

re: mini - it just can't handle a dGPU. They'd have to radically alter the design, I'd think. I'm not opposed to that, but with the advent of eGPUs, I'm not sure there is much of a point any more. Unless Apple makes a low-end Mac Pro (ie. case with card-slots), a dedicated GPU is pointless. The iGPU is fine for the average user and those who need it can add an eGPU (or a couple).
 
That wasn’t the point of the Blackmagic or Blackmagic Pro units. Integrated functionality was the point. Any idiot can glom an eGPU and put a card in a chassis as long as his opposable thumbs work. There are several other solutions on the market, and this is a lease, not a buy for most people. Once they come off lease, they’ll be cheap expansion for some users.

Who the hell leases computer components? Can I turn my 1080 Ti into a lease so I can get a 2080 Ti soon?
 
Yeah, the non-replaceable nature of the GPU is the downside, but I don't think it would be possible to make anything close if that weren't the case. It would have to be WAY bigger to put a standard card inside.

And, I'm not sure what you're talking about in terms of cooling and silent. It absolutely cools perfectly at 100% load, no matter how long you run it. I've run mine 100% for weeks at a time, and it hangs between 50-60 degrees.

While it isn't technically 100% silent, if there is any kind of room noise, you'll not hear it. If you are in a super-quiet room - AND know what you're listening for - you can hear a very slight, low-pitched 'whoosh' kind of sound. When I've heard most people talking about this, they just don't get how quiet we're talking about here. The things they call quiet are loud by comparison.

As for doorstop... would you say the same thing about Apple's entire product line then, save the Mac Pro? If someone buys an iMac Pro, or that new 16" MBP you mention, is it a doorstop in 3 years? If not, then it's no different for the BM. And, with the BM, I can easily switch to a different one and keep the rest of my system.

I read that the Blackmagic is probably the most civilised of the eGPU solutions (noise-wise), but I just can't get over the price of it and the inability to upgrade/replace the GPU. The big plus is that everything is built to fit the case, so it's no great surprise that it's a great product. But £1200 for a Vega 56, or £600 for a Radeon RX 580 (from the UK Apple Store)?

I shudder at those prices, but then I always said that I would pay more for a civilised enclosure - so perhaps I am missing a trick.

The problem with that is the MBPs just aren't built for serious GPU use (or at least they haven't been in the past). I've ruined a couple MBPs over the years doing 3D rendering, video encoding, or Folding@home, etc. on them. While a mini isn't necessarily built for that either, I think I'm going to have pretty good luck, long-term, from off-loading the GPU work, and then disabling Turbo Boost on the i7 most of the time. It doesn't run so hard that way. (And, then there is the T2 to help offload some more of the video-encoding work.)

And... I just don't want a laptop, so while that sounds like a good deal, it isn't that compelling to me. I survive a power cut just fine, as I'd not even consider running w/o a UPS.

Remember, GPUs are about a lot more than gaming. I tend to mostly use my console, too. I use my GPU for 3D/CAD, Folding@home, etc. (Primarily CAD right now, but if I start doing more involved 3D work, I'll maybe want a better GPU some day.)

re: mini - it just can't handle a dGPU. They'd have to radically alter the design, I'd think. I'm not opposed to that, but with the advent of eGPUs, I'm not sure there is much of a point any more. Unless Apple makes a low-end Mac Pro (ie. case with card-slots), a dedicated GPU is pointless. The iGPU is fine for the average user and those who need it can add an eGPU (or a couple).

Good arguments. If I were caning a machine for hours on end I'd certainly not choose a MacBook Pro, but for occasional mixed use a MacBook Pro could make a sensible upgrade for people who just want a slightly better GPU. The Mini could make a decent workhorse, and separating it from the GPU in this case might have been interesting if the total noise could be kept down.

Your use case is very specialised and appears to play straight into the Mac Pro, but I guess the value for money or affordability might be difficult to square off.

I'd have liked to see what Blackmagic could do with an RDNA 2 GPU next year. If they decide to continue with a Blackmagic eGPU Pro, it should be with an RDNA GPU that will last 3 years and still have enough juice left over for the used market.

If I want to game, a Mac would never, ever be on my list of tools for that job. A PC, console, or even an iPad would come first.
 
I play games for my enjoyment; I have had zero desire to record any. And besides, you're assuming that I have a Mac, which I don't.

Ok, then you don't need a quiet system, I guess. Others do.

Who the hell leases computer components? Can I turn my 1080 Ti into a lease so I can get a 2080 Ti soon?

Well, if they don't lease them, they write them off over a few years. In those types of businesses, there often isn't a lot of upgrading happening. Equipment (if kept) gets shifted to other uses.
 
I read that the Blackmagic is probably the most civilised of the eGPU solutions (noise-wise), but I just can't get over the price of it and the inability to upgrade/replace the GPU.

Yeah, and it isn't even close. The next-quietest eGPU solution is quite noisy when running hard. That might not matter to some users, but for those it does matter to, there really isn't any competition.

I'll fully admit it isn't a great value aside from that aspect (unless you just like the design and form factor). It has a few other capabilities that set it apart (like the ability to drive the XDR), but that won't matter to many.

The big plus is that everything is built to fit the case, so it's no great surprise that it's a great product. But £1200 for a Vega 56, or £600 for a Radeon RX 580 (from the UK Apple Store)?

I shudder at those prices, but then I always said that I would pay more for a civilised enclosure - so perhaps I am missing a trick.

Yes, it certainly isn't for everyone. That said, aside from needing a quiet system while recording or other such needs, noisy computers also drive me nuts. When my wife brings home her work laptop and we work in the same space, I'm glad that isn't an all-the-time thing. Same with laptops and other computers I've owned in the past. Drives me nuts. :)

The actual quality is another aspect, I suppose, if you enjoy that kind of thing. The build-quality and design kind of surpass my expectations, actually. It is a thing of beauty (and I mean more than the looks).

Good arguments. If I were caning a machine for hours on end I'd certainly not choose a MacBook Pro, but for occasional mixed use a MacBook Pro could make a sensible upgrade for people who just want a slightly better GPU. The Mini could make a decent workhorse, and separating it from the GPU in this case might have been interesting if the total noise could be kept down.

The main problem with the mini is the CPU. It is kind of like an MBP in terms of thermal behavior, though the fan is slightly less annoying. Maybe more like the 16" (I've heard it is better than previous units for noise tone, too). But, I discovered that turning off Turbo Boost leaves me with 80%+ of the performance while the fans run WAY, WAY less often and hard. And, I'm more about cores than peak performance anyway.

When we lived in a quieter location (we have a fairly busy street outside now), I could hear the mini, especially if I pushed it. Now, I don't typically hear it over the general noise for any kind of typical thing I do. Hitting all the cores 100% for long will make it heard, but nothing I do typically does that besides 3D rendering or certain video encoding (though with HEVC/T2, that's mostly a thing of the past, too).

Before I disabled Turbo Boost, the fans would spin up quite easily, even when just launching some apps. I was like, oh crud, this won't do! So Turbo Boost Switcher Pro has become one of my most beloved utilities.

Your use case is very specialised and appears to play straight into the Mac Pro, but I guess the value for money or affordability might be difficult to square off.

Yeah, a budget thing. As much as I'd like a new Mac Pro, I can't justify THAT much expense right now. I really wish there was a more middle option, but the mini + eGPU is a lot more powerful than most people realize. The only thing Apple sells that is faster is the iMac Pro (& maybe the new 16" MBP?).
 
Ok, then you don't need a quiet system, I guess. Others do.
I am quite happy with how cool my Razer runs; they have an extremely efficient cooling system that allows my laptop to run without throttling, something the MBP and iMac cannot do. By the way, I'm not bothered by the fans.
 
I am quite happy with how cool my Razer runs; they have an extremely efficient cooling system that allows my laptop to run without throttling, something the MBP and iMac cannot do. By the way, I'm not bothered by the fans.

That's good. It will save you money and give you peace of mind.

(Trust me, I am sometimes envious that people on other platforms can buy systems that can cool themselves properly under load, that don't cost $5000+)
 
Who the hell leases computer components? Can I turn my 1080 Ti into a lease so I can get a 2080 Ti soon?

At my previous employer, we leased everything that we could and monitors and such were always valid items that we would lease. The Blackmagic eGPUs were allowed items for lease. I tried one, liked it, but Photoshop didn’t take advantage of it the way that we hoped it might, so we passed on them. If they had worked for us, we would have leased them, but never bought them outright as the soldered GPU and cost would have not been my first choice. However, the integration is way better than the DIY kit, so there are trade offs.

So, I am who the hell leases computer components. Gotta problem with that, sugar plum?
 
Well, if they don't lease them, they write them off over a few years. In those types of businesses, there often isn't a lot of upgrading happening. Equipment (if kept) gets shifted to other uses.

We leased through Apple and did 36 months for Macs and 24 months for iPad Pros (separate leases). The Blackmagic was a consideration, but the work was Photoshop work which showed no benefit. We pared back the lease for a couple of reasons and the BM was the first thing cut. Amortizing over three years works well, but I would never have told my bosses to buy these. I like them, but the value prop for the Vega 56 was just not there. The RX580 was slightly better, but these got caught in the gap between the end of GCN and the beginning of RDNA. I hope Blackmagic has an updated eGPU and eGPU Pro out the door sooner rather than later. Unfortunately, AMD’s GPU rollout has just been really slow and arduous.
 
... I like them, but the value prop for the Vega 56 was just not there. The RX580 was slightly better, but these got caught in the gap between the end of GCN and the beginning of RDNA. I hope Blackmagic has an updated eGPU and eGPU Pro out the door sooner rather than later. Unfortunately, AMD’s GPU rollout has just been really slow and arduous.

I agree, they cost more than they should. I think they are used to a particular market and priced accordingly (and that market will probably buy them at the prices they ask w/o too much question). If they broadened their market a bit, I think they could make up what they lose in margin on quantity and then some.

But, the key is keeping a top-of-the-line GPU in that pro model, and keeping the base model up-to-date enough to be compelling at a low enough entry point. I'm not really the target market, but their features pulled me in. I'm likely a bit of an odd case, though.
 
At my previous employer, we leased everything that we could and monitors and such were always valid items that we would lease. The Blackmagic eGPUs were allowed items for lease. I tried one, liked it, but Photoshop didn’t take advantage of it the way that we hoped it might, so we passed on them. If they had worked for us, we would have leased them, but never bought them outright as the soldered GPU and cost would have not been my first choice. However, the integration is way better than the DIY kit, so there are trade offs.

So, I am who the hell leases computer components. Gotta problem with that, sugar plum?
Companies.

That makes zero sense. How are you supposed to sell assets or give them to employees if you're just renting them?
 
That makes zero sense. How are you supposed to sell assets or give them to employees if you're just renting them?

Sorry, what? Leasing computers, components, cars, and other nouns that start with a 'c' is a perfectly common thing for companies to do.
 
That makes zero sense. How are you supposed to sell assets or give them to employees if you're just renting them?
Uh, we write off the usage over the lease term and then return the old tech to Apple and start a new lease with new tech. We don’t have time to sell assets or give them to employees, that’s not how that works. Leasing has worked incredibly well to keep us current on tech and separate Mac and iPad leases allow for different terms to keep up on iPad tech changing more often. Buying $40K worth of gear ties up too much capital and then it has to be disposed of. Why would I buy when I want to keep things refreshed at 24 or 36 month intervals? That’s crazy!
 
I agree, they cost more than they should. I think they are used to a particular market and priced accordingly (and that market will probably buy them at the prices they ask w/o too much question). If they broadened their market a bit, I think they could make up what they lose in margin on quantity and then some.

But, the key is keeping a top-of-the-line GPU in that pro model, and keeping the base model up-to-date enough to be compelling at a low enough entry point. I'm not really the target market, but their features pulled me in. I'm likely a bit of an odd case, though.

The problem is there's clearly a cost involved around low production quantity, and because the GPU is sealed in, they would have to replace the product every 3 years. For it to remain a competitive product in the third year, the GPU would have to be of a high spec to start with. It's a bit of a risk to lower the price on this to try and gain more sales.

I noted that they put the usual USB ports etc. on it, which is nice.

What could have been even more interesting would have been some internal drive bays and a SATA bridge to bring in people who might have added inexpensive storage - and maybe a second Thunderbolt controller to look after all that because a GPU should keep a Thunderbolt 3 link busy by itself.
 
What could have been even more interesting would have been some internal drive bays and a SATA bridge to bring in people who might have added inexpensive storage - and maybe a second Thunderbolt controller to look after all that because a GPU should keep a Thunderbolt 3 link busy by itself.
If it needs two TB3 cables you're kind of losing any benefit by sticking them in a box together - you're just adding heat.
 
If it needs two TB3 cables you're kind of losing any benefit by sticking them in a box together - you're just adding heat.

Indeed, and that's why these connections aren't more common sadly. They'd be stealing bandwidth from a GPU.
 