
Newfiejudd

macrumors regular
Original poster
I would love to see an external GPU utilizing TB2 for the rMBP. Who else would be interested in such a device?

I know these do exist right now, but they are homemade. It would be great if we could get a high-end dedicated GPU in an external enclosure, and even make it user-upgradeable. With TB2 we should be able to utilize all 16 lanes :eek: .

I know some companies have mentioned this, but I don't believe they ever released anything. Just a pipe dream I guess, but what a way to make an impact on the mobile gaming scene. An rMBP or Air with the ability to use a GTX 780M, I'm sure it could be done. I'm also thinking companies are just too worried about a sales decline over providing the public what they want.

A high-end mobile GPU with an internal power supply and a TB2 connection. Work and game on the same system. :D
 
For my humble needs, I'd say it's not needed. I'm happy with my 2012 rMBP's GPU as it stands.
 
If they could do it cheaply enough, this would be cool. Just a box with power and a PCIe slot so you can choose your own card.
 

I would buy one, but never to use ANOTHER mobile-class GPU. If anything, a desktop-class 770 or 780, even though they are going to be severely hampered by bandwidth. Still, it will probably always be better than a mobile GPU. By the way, if memory serves, TB2 is like 8 PCIe 2.0 lanes, not 16; and if memory doesn't serve, it's like 4 lanes.

Anyhow, if they're not for sale, it's rumoured to be because Intel does not license TB for this use. But there are some guides on how to get it done; I might try at some point just as a little project.
 
I'd love to see some of those homemade guide links. :)
I was looking into that back when I got my first Mac (with TB1, though) and figured this would be awesome. Especially considering the whole "heat/thermal" discussion around dedicated graphics cards.
 

I stand corrected.

TB2 = 20 Gbps "Thunderbolt 2" (x4 2.0); this gives 88% (Nvidia) or 94% (AMD) of full x16 2.0 desktop performance
TB = 10 Gbps "Thunderbolt 1" (~x2 2.0 + 12.5%); this gives 73% (Nvidia) or 86% (AMD) of full x16 2.0 desktop performance
EC/mPCIe = 5 Gbps (x1 2.0)

----------

There is more info than you can grasp in a day:
http://forum.techinferno.com/diy-e-gpu-projects/
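For context, a rough back-of-the-envelope check of those link speeds (not from the thread; it assumes PCIe 2.0's 5 GT/s per lane with 8b/10b encoding, i.e. roughly 500 MB/s usable per lane):

```python
# Rough effective-bandwidth comparison for the links discussed above.
# Assumption: PCIe 2.0 gives ~500 MB/s usable per lane after 8b/10b overhead.

PCIE2_LANE_MBPS = 500  # usable MB/s per PCIe 2.0 lane

links = {
    "x16 PCIe 2.0 (desktop)": 16 * PCIE2_LANE_MBPS,   # ~8000 MB/s
    "Thunderbolt 2 (~x4 2.0)": 4 * PCIE2_LANE_MBPS,   # ~2000 MB/s
    "Thunderbolt 1 (~x2 2.0)": 2 * PCIE2_LANE_MBPS,   # ~1000 MB/s
    "ExpressCard/mPCIe (x1 2.0)": 1 * PCIE2_LANE_MBPS,
}

for name, mbps in links.items():
    pct = 100 * mbps / links["x16 PCIe 2.0 (desktop)"]
    print(f"{name}: ~{mbps} MB/s ({pct:.0f}% of x16 link bandwidth)")
```

Note how the measured gaming figures quoted above (88-94% of desktop performance at x4) hold up far better than the raw 25% link-bandwidth ratio would suggest: games rarely saturate the bus.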

Wow thanks for this. Obviously there is quite the following for this. Why aren't manufacturers jumping on board with this?
 

If gaming is the intended use, this is more than enough, since games don't move much data back and forth: the CPU sends chunks of work and the GPU processes them locally, so the reduced link bandwidth is fairly acceptable.
 
Sonnet makes a commercial TB enclosure for GPUs, and they have also announced a TB 2.0 version coming soon. But it costs 500 euros on its own. Ouch. For 100 I would actually get one. But 500?
 

If you read a little further into that forum, you'll see the bitter truth about Thunderbolt development for eGPUs.

It appears to stem from two unwilling parties which you'd think are central to the development of Thunderbolt.
 
Eventually, when the prices are right, I intend to get a 3840x2160 external screen as a replacement for my current 23" 1080p. When that happens I'll either need a gaming desktop, an external GPU, or a new notebook. Chances are the first two are cheaper and better, and an eGPU is somewhat mobile too.
I think it could be great, but it has to compete on price with a small gaming box. If in 1-2 years a custom little desktop is not significantly more expensive, it isn't really worth it.

If the box costs too much, I would rather go for a desktop, which offers other benefits as well. It also has to be able to handle GPUs in the $150-$250 range, so the external power supply needs to be strong enough, and those aren't all that cheap. You can get an 80W external power supply for next to nothing, but >150W units are quite expensive.
The GPU also needs to be user-replaceable, not too noisy, and not too low-end or mobile. A 760/770 or 7870/7950 is about what I would aim for, or the next generation in that spot.

I think it will always have some hassle associated with it, and lots of cables. It has to come at a reasonable price. If you'd rather get a console or a small gaming desktop, it isn't going to find much adoption.
 
Intel won't allow it. The SilverStone product is as good as dead.

Do you remember the MSI GUS II, shown a year-plus ago? It was a first-generation TB enclosure. Dead; Intel wouldn't certify it, and you know what that means.

Go to Village Instruments' Facebook page and ask them how it is going with their solution and why it's not on the market. Check what they say about Intel's "support" and "certification" processes.

---

The truth is, Intel wants you to buy their Crystal Well (GT3) GPU now, and the next version when Broadwell comes up. Allowing external GPU solutions would make their GPUs obsolete.

Who would pay more for an internal GPU if you could go with the bare minimum to display the OS interface without problems and then switch to an external box over TB2? I don't know about you, but I wouldn't.
 

This sounds very depressing. I hope it isn't true.
 
A how-to:

http://www.journaldulapin.com/2013/08/24/a-thunderbolt-gpu-on-a-mac-how-to/

And here's a video of an Air getting it going on its little screen with a GTX 760:

http://www.youtube.com/watch?v=pxxtd2kVf0I

This is a very problematic solution (you will have lots of 'code 12' problems to work through, and no one will guarantee it works seamlessly), and not a very efficient one (the TB-to-ExpressCard adapter slows the link by a lot).

I need a safe and stable solution, something I can rely on. Unfortunately I can't waste time setting it up and fixing it every time a software update or something else goes wrong.
 

I think what you want is quite far away.

Personally, I would take this as a proof of concept or a little side project for fun, like a hackintosh. Not a reliable thing, since the vendors don't seem to want to get onto it.

Maybe in a couple of years...
 
I would be interested in the idea. In a few years, instead of upgrading the whole laptop, you'd just get a new external GPU. I'd also like to see Apple build one into a 4K monitor. No laptop could drive that well, but an Nvidia 660 or something wouldn't have any trouble.
 
I might do it, but will it work under OS X? I suspect it would work without problems in Windows, because you can update the drivers yourself.

So I think the trick is getting a GPU that is used by the Mac Pro or iMac and using that?

Or simply waiting for Apple to start selling external GPUs.
 
We're looking at 2.5 GB/s, and based on this article (the first thing I found when searching for Kepler bandwidth tests): http://www.anandtech.com/show/5458/the-radeon-hd-7970-reprise-pcie-bandwidth-overclocking-and-msaa/1

...we could expect to see something like 80-90% of peak potential performance for a 680. I'm not sure what that means for a 780, but I expect the results would be similar under most conditions (in other words, it should still be faster than a 680 even with the bandwidth limits).

Maybe not ideal (get that to 30 Gbps / 3.75 GB/s and it'd be nearly 100%), but still quite impressive. 80% of the performance of a desktop-class GPU is nothing to scoff at. With even the rMBP's mid-range 2.3 GHz chip going toe to toe with a desktop-class 3770K (at stock clocks, obviously), you'd see some very solid performance out of this arrangement.
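The conversion behind those figures is just bits to bytes (8 bits per byte); a tiny sketch to make the arithmetic explicit:

```python
# Convert a raw link rate in Gbps to GB/s by dividing by 8 bits per byte.
def gbps_to_gb_per_s(gbps: float) -> float:
    return gbps / 8

print(gbps_to_gb_per_s(20))  # Thunderbolt 2: 2.5 GB/s
print(gbps_to_gb_per_s(30))  # a hypothetical 30 Gbps link: 3.75 GB/s
```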

Is Intel holding this back somehow? Otherwise I'd expect to see commercial eGPU enclosures appearing already. I realize this is being discussed above, but do we know it's actually true?

Honestly, I see eGPUs as a big advantage for Intel: their iGPUs are not likely to beat Nvidia's offerings anytime soon; at best they'll stay on track with dGPU options and maintain a fairly narrow gap like what we see this year. But because that gap is so small, the existence of eGPUs would make iGPU-only systems far more appealing. Why pay extra or deal with glitchy graphics switching for a 20% frame-rate boost? Just hook into a desktop-class eGPU when you're at home or at a desk, and use Intel's very solid iGPU on the go.

If anyone could sell this to Intel it'd be Apple, so I hope they're interested in pursuing it.
 
This is indeed a very interesting subject, and there's surely going to be a growing market when 4K monitor prices drop. I would love to hook up a 4K monitor to my rMBP in the near future by means of an external GPU. This could be a very good opportunity for GPU makers; not many people can afford a whole new laptop just to use a 4K monitor.
 

Check the links I posted right above yours.

----------


From the link above, from a guy who got it working, it looks as if getting this to work on OS X is leaps and bounds easier than on Windows, though only with external displays.

I'll probably jump into it next month.

----------


This would change the rules of the game. On one hand I agree with Atomic Walrus that it should be good for Intel, but on the other, they may want to avoid competition in GPU power, so that you don't go and buy the weakest Iris offering to drive the laptop on the go and then hook up the eGPU at work/home.

Tricky.
 
This sounds very depressing. I hope it isn't true.

It is true, and I can't deny how upset it has made me for the past two years, knowing that affordable expansion could exist but the big guy doesn't allow it. Village Instruments were denied a simple developer license without a reasonable explanation. Other eGPU chassis/slot producers were asked to recall their products.

Zodiac has accurately summed up how the very creators of Thunderbolt are stymieing its uptake and accessibility.
 