
winston1236

macrumors 68000
Original poster
Dec 13, 2010
1,902
319
Hey everyone, I'm looking for some help with a purchase.

I'm a professional motion graphics artist. My work machine is a beast, but I need a laptop that's a lot more portable than a Mac Pro.

So that leads me to the MacBook Pro, and I'm stuck on the graphics. I run primarily After Effects, Photoshop, Illustrator, and Cinema 4D.

I've been reading conflicting info on these two graphics options. Which would be better for my use? Anyone have any experience with the two?
 

thekev

macrumors 604
Aug 5, 2010
7,005
3,343
Illustrator would be identical. Any difference in Photoshop would be trivial in actual use. After Effects you would have to check, because Iris Pro can't use CUDA; the features that use CUDA are very specific, and you may never use them.

Cinema 4D is the only thing where I'm a little torn. It makes extensive use of OpenGL drawing for its viewports: if you assign a shader to one of your models, there's one shader used for offline rendering and another used for shading objects in the viewport. The GPU has nothing to do with offline rendering unless you use a specific renderer that mentions it; it does, however, influence the speed of viewport interaction and playback framerates. You can check OpenGL benchmarks.

If you decide to go with Iris Pro, order directly from Apple. That way, if you realize after the fact that you underestimated your needs, you have their return policy available. Do note that the CPU and 750M both maxed out simultaneously probably draw more power than the peak 85W the charger can supply.
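To put rough numbers on that last point, here's a back-of-the-envelope power budget; the CPU figure is the rated 47W TDP of the quad-core Haswell chips in these machines, while the GPU and "everything else" figures are assumptions, not measurements:

```python
# Back-of-the-envelope power budget for a 15" MacBook Pro under
# combined CPU + GPU load. Figures are assumptions, not measurements.
CPU_TDP_W = 47   # rated TDP of the quad-core Haswell (e.g. i7-4850HQ)
GPU_TDP_W = 45   # GeForce GT 750M, commonly cited around 40-50 W
REST_W    = 15   # rough guess: display, SSD, RAM, fans
CHARGER_W = 85   # Apple 85W MagSafe 2 adapter

total = CPU_TDP_W + GPU_TDP_W + REST_W
print(f"Estimated peak draw: {total} W vs {CHARGER_W} W from the charger")
# -> Estimated peak draw: 107 W vs 85 W from the charger
# Under sustained combined load the shortfall comes out of the battery,
# which is why the machine can discharge slowly even while plugged in.
```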
 

Yebubbleman

macrumors 603
May 20, 2010
6,005
2,585
Los Angeles, CA
Hey everyone, I'm looking for some help with a purchase.

I'm a professional motion graphics artist. My work machine is a beast, but I need a laptop that's a lot more portable than a Mac Pro.

So that leads me to the MacBook Pro, and I'm stuck on the graphics. I run primarily After Effects, Photoshop, Illustrator, and Cinema 4D.

I've been reading conflicting info on these two graphics options. Which would be better for my use? Anyone have any experience with the two?

I'd go with the "Iris Pro/GeForce GT 750M" combo that you get on the higher-end model. The latter will give you flexibility and power should you need it, and the former will kick in to help you save battery life when you don't. Win-win. Plus, if you configure the lower-end model with more storage, the price difference between it and the higher-end model narrows considerably.

For the most part, the lower-end model without the discrete GPU is there for those whose needs are comparable to those of 13" MacBook Pro customers, but who want the larger screen.
 

Toltepeceno

Suspended
Jul 17, 2012
1,807
554
SMT, Edo MX, MX
For the most part, the lower-end model without the discrete GPU is there for those whose needs are comparable to those of 13" MacBook Pro customers, but who want the larger screen.

I notice the difference between the Iris 5100 (13") and Iris Pro 5200 (15") is the eDRAM cache. Any idea what that really translates to?

Since the 13" is dual-core and the 15" is quad-core, the needs usually start out different between the two. Just pointing that out. I do know there is probably not a lot of difference in graphics, but I won't claim to know what difference the aforementioned eDRAM cache makes.
 

Yebubbleman

macrumors 603
May 20, 2010
6,005
2,585
Los Angeles, CA
I notice the difference between the Iris 5100 (13") and Iris Pro 5200 (15") is the eDRAM cache. Any idea what that really translates to?

Since the 13" is dual-core and the 15" is quad-core, the needs usually start out different between the two. Just pointing that out. I do know there is probably not a lot of difference in graphics, but I won't claim to know what difference the aforementioned eDRAM cache makes.

I don't know what specific difference that element makes. The two IGPs are close, but the Iris Pro is better. Iris Pro is supposedly close to the performance of the NVIDIA GeForce GT 650M used in the Ivy Bridge 15" MacBook Pros and higher-end Ivy Bridge 21.5" iMacs. I'd guess that Iris (non-Pro) is probably comparable to the AMD GPUs used in the 15" and 17" MacBook Pros from the Early and Late 2011 generations (the two Sandy Bridge generations).
 

Allograft

macrumors 6502
Oct 19, 2014
334
238
For the most part, the lower-end model without the discrete GPU is there for those whose needs are comparable to those of 13" MacBook Pro customers, but who want the larger screen.

That's me. Office apps, Safari, Evernote, no real heavy-duty multimedia stuff. Integrated graphics work great.
 

dollystereo

macrumors 6502a
Oct 6, 2004
907
114
France
Discrete GPUs in laptops tend to fail...
I would never buy a MacBook Pro with a discrete GPU again (I have a 2011 MBP). Anyway, you could get the entry model with Iris Pro, and an eGPU far more powerful than the crappy GT 750M as a desktop docking station.
The Iris Pro is a very capable GPU: 1 TFLOPS of computing power and very good OpenCL performance.
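For what it's worth, a quick sanity check of that figure (a sketch using Intel's publicly listed Iris Pro 5200 specs; the EU count, per-clock throughput, and turbo clock below are assumptions from those published numbers):

```python
# Sanity check of the "1 TFLOPS" figure for Iris Pro 5200.
# Assumed specs from Intel's published numbers: 40 EUs, each doing
# 16 FP32 FLOPs per clock (2x 4-wide SIMD with fused multiply-add),
# and a max turbo clock of roughly 1.3 GHz.
eus = 40
flops_per_eu_per_clock = 16
clock_hz = 1.3e9

gflops = eus * flops_per_eu_per_clock * clock_hz / 1e9
print(f"Peak FP32 throughput: {gflops:.0f} GFLOPS")
# -> Peak FP32 throughput: 832 GFLOPS
# So "1 TFLOPS" is a round-up; the nominal peak is ~0.83 TFLOPS,
# which is still in the same ballpark as the GT 750M.
```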
 

Toltepeceno

Suspended
Jul 17, 2012
1,807
554
SMT, Edo MX, MX
I don't know what specific difference that element makes. The two IGPs are close, but the Iris Pro is better. Iris Pro is supposedly close to the performance of the NVIDIA GeForce GT 650M used in the Ivy Bridge 15" MacBook Pros and higher-end Ivy Bridge 21.5" iMacs. I'd guess that Iris (non-Pro) is probably comparable to the AMD GPUs used in the 15" and 17" MacBook Pros from the Early and Late 2011 generations (the two Sandy Bridge generations).

Thank you for the info. :)
 

winston1236

macrumors 68000
Original poster
Dec 13, 2010
1,902
319
Discrete GPUs in laptops tend to fail...
I would never buy a MacBook Pro with a discrete GPU again (I have a 2011 MBP). Anyway, you could get the entry model with Iris Pro, and an eGPU far more powerful than the crappy GT 750M as a desktop docking station.
The Iris Pro is a very capable GPU: 1 TFLOPS of computing power and very good OpenCL performance.

That's a good point; I hadn't considered an eGPU until now. The Iris Pro does seem to be better than I imagined. I think I'm probably just biased by how bad Intel graphics used to be. As for portability, I'd be fine using an eGPU when I needed it. I don't really need the laptop as a render machine; I can always take that to the desktop.
 

Yebubbleman

macrumors 603
May 20, 2010
6,005
2,585
Los Angeles, CA
Discrete GPUs in laptops tend to fail...
I would never buy a MacBook Pro with a discrete GPU again (I have a 2011 MBP). Anyway, you could get the entry model with Iris Pro, and an eGPU far more powerful than the crappy GT 750M as a desktop docking station.
The Iris Pro is a very capable GPU: 1 TFLOPS of computing power and very good OpenCL performance.

That's a good point; I hadn't considered an eGPU until now. The Iris Pro does seem to be better than I imagined. I think I'm probably just biased by how bad Intel graphics used to be. As for portability, I'd be fine using an eGPU when I needed it. I don't really need the laptop as a render machine; I can always take that to the desktop.


You can't get an eGPU. Thunderbolt (1 or 2) doesn't yet offer anywhere near the bandwidth of the PCIe x16 slots that GPUs tend to require. That's why you're seeing a lot of Thunderbolt versions of PCIe x1 and x4 devices, but not of GPUs. That's also why the cylindrical Mac Pro still has internal graphics cards (two of them, no less). You'd figure that if you could, there'd be a WHOLE LOT of people opting for lower-end iMacs and lower-end MacBook Pros and docking them into an eGPU for more graphics power.

As for being burned by MacBook Pros with discrete video, I understand the woes. Every 15" and 17" MacBook Pro from the early days of the Intel switch through the "Late 2011" models had a discrete GPU that was prone to failures and/or heating issues of some kind. That said, the trend appears to have stopped with the NVIDIA GeForce GT 650M used in the Mid 2012 and Early 2013 15" MacBook Pros, which has no widespread reliability issues. Then again, Kepler was renowned for its thermal efficiency.
 

winston1236

macrumors 68000
Original poster
Dec 13, 2010
1,902
319
You can't get an eGPU. Thunderbolt (1 or 2) doesn't yet offer anywhere near the bandwidth of the PCIe x16 slots that GPUs tend to require. That's why you're seeing a lot of Thunderbolt versions of PCIe x1 and x4 devices, but not of GPUs. That's also why the cylindrical Mac Pro still has internal graphics cards (two of them, no less). You'd figure that if you could, there'd be a WHOLE LOT of people opting for lower-end iMacs and lower-end MacBook Pros and docking them into an eGPU for more graphics power.

As for being burned by MacBook Pros with discrete video, I understand the woes. Every 15" and 17" MacBook Pro from the early days of the Intel switch through the "Late 2011" models had a discrete GPU that was prone to failures and/or heating issues of some kind. That said, the trend appears to have stopped with the NVIDIA GeForce GT 650M used in the Mid 2012 and Early 2013 15" MacBook Pros, which has no widespread reliability issues. Then again, Kepler was renowned for its thermal efficiency.


I was wondering why I hadn't heard of that before. I'm kind of leaning toward the Iris Pro now but I'm still not 100% sure on this.
 

Yebubbleman

macrumors 603
May 20, 2010
6,005
2,585
Los Angeles, CA
I was wondering why I hadn't heard of that before. I'm kind of leaning toward the Iris Pro now but I'm still not 100% sure on this.

I would get the higher-end model with Iris Pro and the GeForce GT 750M. That way the extra power is there if you need it, and it'll allow your computer to age more gracefully if you don't.

Plus, if you price out the lower-end model with 512GB of storage (which most would need nowadays anyway), the price difference between the two narrows to the point where it's silly to consider the lower-end model.
 

Maxx Power

Cancelled
Apr 29, 2003
861
335
You can't get an eGPU. Thunderbolt (1 or 2) doesn't yet offer anywhere near the bandwidth of the PCIe x16 slots that GPUs tend to require. That's why you're seeing a lot of Thunderbolt versions of PCIe x1 and x4 devices, but not of GPUs. That's also why the cylindrical Mac Pro still has internal graphics cards (two of them, no less). You'd figure that if you could, there'd be a WHOLE LOT of people opting for lower-end iMacs and lower-end MacBook Pros and docking them into an eGPU for more graphics power.

As for being burned by MacBook Pros with discrete video, I understand the woes. Every 15" and 17" MacBook Pro from the early days of the Intel switch through the "Late 2011" models had a discrete GPU that was prone to failures and/or heating issues of some kind. That said, the trend appears to have stopped with the NVIDIA GeForce GT 650M used in the Mid 2012 and Early 2013 15" MacBook Pros, which has no widespread reliability issues. Then again, Kepler was renowned for its thermal efficiency.

The bandwidth issue just isn't a big deal. According to the latest PCI-E bandwidth scaling tests (from x16 to x1, Gen 3 to Gen 1.1), the loss averages about 15% across all games and scenarios considered when the link drops from x16 Gen3 to x4 Gen2 (the most Thunderbolt offers).

See http://www.techpowerup.com/reviews/NVIDIA/GTX_980_PCI-Express_Scaling/22.html. That's high-resolution gaming, AA enabled, on a GTX 980 (the card most likely to be starved of bandwidth). I quote: "Real performance losses only become apparent in x8 1.1 and x4 2.0, where the performance drop becomes noticeable with around 15%." This has been the consensus in the PC community for quite a few years now, since motherboards with x4 PCI-E slots were available with the Core 2 Duos. Of course, newer games will take bigger hits if they use a lot of post-processing, as the review points out. Even then, though, the trade-off is much faster frame rates than one could muster otherwise.
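For reference, here's a small sketch of where those link widths sit, using the nominal per-lane rates and line encodings from the published PCIe specs (Thunderbolt 2's 20 Gbit/s is its total link bandwidth, shared with DisplayPort traffic):

```python
# Nominal PCIe bandwidth by generation and lane count, vs Thunderbolt 2.
# Per-lane raw rates and line encodings are from the published PCIe specs.
GEN = {  # generation -> (GT/s per lane, encoding efficiency)
    "1.1": (2.5, 8 / 10),     # 8b/10b encoding
    "2.0": (5.0, 8 / 10),     # 8b/10b encoding
    "3.0": (8.0, 128 / 130),  # 128b/130b encoding
}

def link_gbps(gen: str, lanes: int) -> float:
    """Effective one-direction bandwidth of a PCIe link in Gbit/s."""
    rate_gts, efficiency = GEN[gen]
    return rate_gts * efficiency * lanes

for gen, lanes in [("3.0", 16), ("2.0", 4)]:
    print(f"PCIe {gen} x{lanes}: {link_gbps(gen, lanes):6.1f} Gbit/s")
print("Thunderbolt 2:   20.0 Gbit/s (total, shared with DisplayPort)")
# -> PCIe 3.0 x16:  126.0 Gbit/s
# -> PCIe 2.0 x4:    16.0 Gbit/s
# TB2 lands near PCIe 2.0 x4, roughly a sixth of a full Gen3 x16 slot.
```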

The biggest hurdles, from what I know, to getting Thunderbolt GPU enclosures to work are driver issues and a smooth hand-off for video and GPU switching. Something like AMD's Enduro or NVIDIA's Optimus GPU-switching software took a long time (a few years) to develop and perfect, and that's with a lot of motivation behind them. There just isn't as much motivation behind discrete GPUs housed over Thunderbolt, for now.

My 2 cents...
 

Anonymous Freak

macrumors 603
Dec 12, 2002
5,578
1,333
Cascadia
Cinema 4D is the only thing where I'm a little torn. It makes extensive use of OpenGL drawing for its viewports: if you assign a shader to one of your models, there's one shader used for offline rendering and another used for shading objects in the viewport.

This is the best answer in the thread. For the vast majority of your workloads, the difference is negligible. The other apps don't use the GPU enough to make more than a pure benchmarking difference; you won't notice the day-to-day difference between the Iris Pro and the GeForce 750M in Illustrator or Photoshop.

But in Cinema 4D you might. If you find yourself using GPU rendering often, and will use it in a stationary setting when you do, then you might even be able to get away with just the Iris Pro and an external Thunderbolt-to-PCIe cage with a real compute-heavy GPU in it (as dollystereo alludes to), used not as a display GPU but purely as a compute GPU. And if Apple releases a new MBP with ATI graphics and your workload only supports NVIDIA GPU compute, this would be your only option for GPU acceleration.


Don't be swayed by individual biases for or against particular configurations: no, discrete GPUs are not inherently more likely to fail; no, there is no mass rash of discrete GPU failures in the Retina MacBook Pros; and yes, the Thunderbolt connection is plenty fast to run an external GPU, even if it's for rendering/compute as opposed to gameplay.


Your best bet is to look into each piece of software you use and see whether it supports CUDA and/or OpenCL for GPU compute. If it does (and does so for the workloads you use it for!), then go for the NVIDIA. You can always use an app such as gfxCardStatus to manually force the computer onto the Iris Pro if you find you don't need the NVIDIA, to save battery and/or keep heat levels down.
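One way to see exactly which compute devices your software could target (a sketch, assuming the third-party pyopencl package is installed; none of these apps require it) is to enumerate the OpenCL platforms and devices yourself:

```python
# Enumerate the OpenCL platforms and devices visible on the system.
# Assumes the third-party pyopencl package: pip install pyopencl
import pyopencl as cl

for platform in cl.get_platforms():
    print(f"Platform: {platform.name}")
    for dev in platform.get_devices():
        print(f"  Device:        {dev.name}")
        print(f"  Compute units: {dev.max_compute_units}")
        print(f"  Global memory: {dev.global_mem_size // 2**20} MiB")
# On a dual-GPU 15" MacBook Pro you would expect to see both the
# Iris Pro and the GeForce GT 750M listed as separate devices.
```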
 

Yebubbleman

macrumors 603
May 20, 2010
6,005
2,585
Los Angeles, CA
The bandwidth issue just isn't a big deal. According to the latest PCI-E bandwidth scaling tests (from x16 to x1, Gen 3 to Gen 1.1), the loss averages about 15% across all games and scenarios considered when the link drops from x16 Gen3 to x4 Gen2 (the most Thunderbolt offers).

See http://www.techpowerup.com/reviews/NVIDIA/GTX_980_PCI-Express_Scaling/22.html. That's high-resolution gaming, AA enabled, on a GTX 980 (the card most likely to be starved of bandwidth). I quote: "Real performance losses only become apparent in x8 1.1 and x4 2.0, where the performance drop becomes noticeable with around 15%." This has been the consensus in the PC community for quite a few years now, since motherboards with x4 PCI-E slots were available with the Core 2 Duos. Of course, newer games will take bigger hits if they use a lot of post-processing, as the review points out. Even then, though, the trade-off is much faster frame rates than one could muster otherwise.

The biggest hurdles, from what I know, to getting Thunderbolt GPU enclosures to work are driver issues and a smooth hand-off for video and GPU switching. Something like AMD's Enduro or NVIDIA's Optimus GPU-switching software took a long time (a few years) to develop and perfect, and that's with a lot of motivation behind them. There just isn't as much motivation behind discrete GPUs housed over Thunderbolt, for now.

My 2 cents...

This is the best answer in the thread. For the vast majority of your workloads, the difference is negligible. The other apps don't use the GPU enough to make more than a pure benchmarking difference; you won't notice the day-to-day difference between the Iris Pro and the GeForce 750M in Illustrator or Photoshop.

But in Cinema 4D you might. If you find yourself using GPU rendering often, and will use it in a stationary setting when you do, then you might even be able to get away with just the Iris Pro and an external Thunderbolt-to-PCIe cage with a real compute-heavy GPU in it (as dollystereo alludes to), used not as a display GPU but purely as a compute GPU. And if Apple releases a new MBP with ATI graphics and your workload only supports NVIDIA GPU compute, this would be your only option for GPU acceleration.

Don't be swayed by individual biases for or against particular configurations: no, discrete GPUs are not inherently more likely to fail; no, there is no mass rash of discrete GPU failures in the Retina MacBook Pros; and yes, the Thunderbolt connection is plenty fast to run an external GPU, even if it's for rendering/compute as opposed to gameplay.

Your best bet is to look into each piece of software you use and see whether it supports CUDA and/or OpenCL for GPU compute. If it does (and does so for the workloads you use it for!), then go for the NVIDIA. You can always use an app such as gfxCardStatus to manually force the computer onto the Iris Pro if you find you don't need the NVIDIA, to save battery and/or keep heat levels down.

In response to both comments: again, there's a reason you don't see more people strapping PCIe graphics cards to their otherwise IGP-only Macs via a Thunderbolt-to-PCIe adapter; BECAUSE IT DOESN'T WORK EFFECTIVELY ENOUGH! If that weren't the case, you'd see TONS more people doing it and a whole lot fewer people caring about Apple's higher-end models (which all tend to have better graphics across the board).

To the OP: it sounds like you use video-intensive applications. It also sounds like some of those applications might have a preference for CUDA support. With an NVIDIA dGPU it sort of doesn't matter, as you have both OpenCL and CUDA. Could you get by on Iris Pro alone? Yes, probably. But your mileage will be better with a 15" MacBook Pro that ALSO includes the GeForce GT 750M. Incidentally, unless you only want 256GB of storage, a model with the GT 750M also happens to be the better buy. End of story.
 

dollystereo

macrumors 6502a
Oct 6, 2004
907
114
France
I was wondering why I hadn't heard of that before. I'm kind of leaning toward the Iris Pro now but I'm still not 100% sure on this.

You get up to 80% of the performance of a GeForce GTX 780 over Thunderbolt, so I would say it's very powerful (go to the Tech Inferno forums and read for yourself).
The GeForce GT 650M has some problems; some 2012 MBPs are starting to have heat issues, probably down to the thermal design of the logic board (hopefully solved in the Retina MacBook Pro).
The Iris Pro is a very powerful GPU, and the external discrete option is not that expensive (around $400 for a GTX 760). Anyway, I think the GT 750M is such a slow GPU that it doesn't justify the price difference and the potential hassle.

----------

Incidentally, unless you only want 256GB of storage, a model with the GT 750M also happens to be the better buy. End of story.

That's true.

----------

This is the best answer in the thread. For the vast majority of your workloads, the difference is negligible. The other apps don't use the GPU enough to make more than a pure benchmarking difference; you won't notice the day-to-day difference between the Iris Pro and the GeForce 750M in Illustrator or Photoshop.

But in Cinema 4D you might. If you find yourself using GPU rendering often, and will use it in a stationary setting when you do, then you might even be able to get away with just the Iris Pro and an external Thunderbolt-to-PCIe cage with a real compute-heavy GPU in it (as dollystereo alludes to), used not as a display GPU but purely as a compute GPU. And if Apple releases a new MBP with ATI graphics and your workload only supports NVIDIA GPU compute, this would be your only option for GPU acceleration.

Don't be swayed by individual biases for or against particular configurations: no, discrete GPUs are not inherently more likely to fail; no, there is no mass rash of discrete GPU failures in the Retina MacBook Pros; and yes, the Thunderbolt connection is plenty fast to run an external GPU, even if it's for rendering/compute as opposed to gameplay.

Your best bet is to look into each piece of software you use and see whether it supports CUDA and/or OpenCL for GPU compute. If it does (and does so for the workloads you use it for!), then go for the NVIDIA. You can always use an app such as gfxCardStatus to manually force the computer onto the Iris Pro if you find you don't need the NVIDIA, to save battery and/or keep heat levels down.

If you happen to do lots of 3D, you should really consider a PC laptop workstation or a desktop with a decent GPU.
 

Maxx Power

Cancelled
Apr 29, 2003
861
335
In response to both comments: again, there's a reason you don't see more people strapping PCIe graphics cards to their otherwise IGP-only Macs via a Thunderbolt-to-PCIe adapter; BECAUSE IT DOESN'T WORK EFFECTIVELY ENOUGH! If that weren't the case, you'd see TONS more people doing it and a whole lot fewer people caring about Apple's higher-end models (which all tend to have better graphics across the board).

I think you misread what I said. Clearly it works well and effectively enough: the speed boost is huge, and there are no real limitations that can't be worked out, bandwidth included. Maybe you meant "impractical"? What is lacking currently is demand and supply, which I suspect is the only real reason behind the scarcity of external GPU solutions. I quote AnandTech: "...pretty much plug and play..." and "The big question is whether such setup reasonably affordable in any way. Currently, the short answer is no." That's from their article on running a GTX 780 Ti over Thunderbolt 2: http://www.anandtech.com/show/7987/running-an-nvidia-gtx-780-ti-over-thunderbolt-2. Way too expensive and niche right now (about $300 USD just to get started), but there have been working solutions since the Thunderbolt 1 and ExpressCard days. The only thing really holding back OS X-only users is that they need to install Windows, last I read.

Demand and supply are going to get better. In fact, Alienware just introduced their own 13-inch gaming laptop with a discrete GPU connected over external PCI-E. So with a bit of momentum in that direction, I think this is going to happen.
 

winston1236

macrumors 68000
Original poster
Dec 13, 2010
1,902
319
You get up to 80% of the performance of a GeForce GTX 780 over Thunderbolt, so I would say it's very powerful (go to the Tech Inferno forums and read for yourself).
The GeForce GT 650M has some problems; some 2012 MBPs are starting to have heat issues, probably down to the thermal design of the logic board (hopefully solved in the Retina MacBook Pro).
The Iris Pro is a very powerful GPU, and the external discrete option is not that expensive (around $400 for a GTX 760). Anyway, I think the GT 750M is such a slow GPU that it doesn't justify the price difference and the potential hassle.

----------



That's true.

----------



If you happen to do lots of 3D, you should really consider a PC laptop workstation or a desktop with a decent GPU.


All good info here, thanks everyone. My workload is maybe 20% 3D at most, and I can render all of that on a separate render machine. I guess I'll go with the dual-GPU model; worst case, it will just have a higher resale value down the road.
 

Yebubbleman

macrumors 603
May 20, 2010
6,005
2,585
Los Angeles, CA
I think you misread what I said. Clearly it works well and effectively enough: the speed boost is huge, and there are no real limitations that can't be worked out, bandwidth included. Maybe you meant "impractical"? What is lacking currently is demand and supply, which I suspect is the only real reason behind the scarcity of external GPU solutions. I quote AnandTech: "...pretty much plug and play..." and "The big question is whether such setup reasonably affordable in any way. Currently, the short answer is no." That's from their article on running a GTX 780 Ti over Thunderbolt 2: http://www.anandtech.com/show/7987/running-an-nvidia-gtx-780-ti-over-thunderbolt-2. Way too expensive and niche right now (about $300 USD just to get started), but there have been working solutions since the Thunderbolt 1 and ExpressCard days. The only thing really holding back OS X-only users is that they need to install Windows, last I read.

Demand and supply are going to get better. In fact, Alienware just introduced their own 13-inch gaming laptop with a discrete GPU connected over external PCI-E. So with a bit of momentum in that direction, I think this is going to happen.

And you DIDN'T read what I posted. No one is doing this. Also, you didn't read the Update 1 section, where it says that Thunderbolt 2 has only a sixth of the bandwidth of a PCIe 3.0 x16 slot and that the reduction in performance can be as much as 50%.

That doesn't even bring up the issues typically associated with using a PC video card in a Mac (namely, not having the EFI flashed, so you get no boot screen or boot options). Again, if this were a viable option, you'd have more Mac users doing it. As it stands today, this isn't a common thing for Mac users without internal dGPUs to be doing.
 

Maxx Power

Cancelled
Apr 29, 2003
861
335
And you DIDN'T read what I posted. No one is doing this. Also, you didn't read the Update 1 section, where it says that Thunderbolt 2 has only a sixth of the bandwidth of a PCIe 3.0 x16 slot and that the reduction in performance can be as much as 50%.

That doesn't even bring up the issues typically associated with using a PC video card in a Mac (namely, not having the EFI flashed, so you get no boot screen or boot options). Again, if this were a viable option, you'd have more Mac users doing it. As it stands today, this isn't a common thing for Mac users without internal dGPUs to be doing.

I have POSTED links where a massive amount of benchmarking has been done on this exact topic you fail to understand, namely the effect on performance as a function of reduced bandwidth. It is SHOWN over and over to not be a big deal. I am fully aware of TB's bandwidth: it is equivalent to PCI-E 2.0 x4, which my references show to be sufficient for nearly all use cases. That is why I linked a few modern, reputable sources demonstrating my point. Furthermore, computing workloads are generally not bandwidth-constrained. Your "can be as much as 50%" is pure fear-mongering FUD and is irrelevant; you have to be very bandwidth-constrained to even come close to that, and then only for one or two of the newest games at the most taxing settings.

What is the real issue here? Using external GPUs isn't common for Mac or PC users, but having a Mac isn't a hindrance to using one, and neither is using a PC video card in a Mac. Who cares about the boot screen for performance GPU work? You can always unplug it to visit that special screen. I have shown you, with modern references, that it is a viable option, and you keep stating that it is not simply because there are not enough current users. Let me remind you that all new technologies are adopted by the masses one user at a time.

PS. Here is the whole quote from that update, which I have read a few times, so we can leave this discussion free of fear:

"...the performance seems to be around 80-90% of the full desktop performance based on synthetic benchmarks (3DMark and Unigine Heaven). Given that Thunderbolt 2 offers only 20Gbit/s of bandwidth while a PCIe 3.0 x16 slot offers 128Gbit/s, getting 80-90% of the performance is a lot more than expected. This will vary depending on the game, as based on our own PCIe scaling tests the PCIe bandwidth may cause little to no difference in some games while in others the drop can be close to 50%"

Anand even mentions that there are already people doing this, and have been since the ExpressCard days:

"As some of you mentioned in the comments, there are cheaper alternatives available that provide about 70-90% of the desktop performance."
 

Yebubbleman

macrumors 603
May 20, 2010
6,005
2,585
Los Angeles, CA
I have POSTED links where a massive amount of benchmarking has been done on this exact topic you fail to understand, namely the effect on performance as a function of reduced bandwidth. It is SHOWN over and over to not be a big deal. I am fully aware of TB's bandwidth: it is equivalent to PCI-E 2.0 x4, which my references show to be sufficient for nearly all use cases. That is why I linked a few modern, reputable sources demonstrating my point. Furthermore, computing workloads are generally not bandwidth-constrained. Your "can be as much as 50%" is pure fear-mongering FUD and is irrelevant; you have to be very bandwidth-constrained to even come close to that, and then only for one game or so.

What is the real issue here? Using external GPUs isn't common for Mac or PC users, but having a Mac isn't a hindrance to using one, and neither is using a PC video card in a Mac. Who cares about the boot screen for performance GPU work? You can always unplug it to visit that special screen. I have shown you, with modern references, that it is a viable option, and you keep stating that it is not simply because there are not enough current users. Let me remind you that all new technologies are adopted by the masses one user at a time.

PS. Here is the whole quote from that update, which I have read a few times, so we can leave this discussion free of fear:

"...the performance seems to be around 80-90% of the full desktop performance based on synthetic benchmarks (3DMark and Unigine Heaven). Given that Thunderbolt 2 offers only 20Gbit/s of bandwidth while a PCIe 3.0 x16 slot offers 128Gbit/s, getting 80-90% of the performance is a lot more than expected. This will vary depending on the game, as based on our own PCIe scaling tests the PCIe bandwidth may cause little to no difference in some games while in others the drop can be close to 50%"

Look, I'm not trying to fear-monger here. I'm saying that you'd have a whole lot more people doing this sort of thing if it were viable for the mainstream. I can tell you with almost complete certainty that I'd never buy another laptop with a discrete GPU again if this were viable, as docking my MacBook Pro to an eGPU would be preferable in damn near every way imaginable.

The fact of the matter is that it's not that viable, because if it were, there'd be a whole lot more people doing it, and I'm sure that Apple (who had no trouble milking the ATI Radeon HD 5870 and 5770 cards for all they were worth, and then some, for three years) would be doing it.

Also, while I appreciate that in some real-world tests you get 80-90% of the performance you'd get with the GPU in an actual PCIe 3.0 x16 slot, the fact that the drop can be close to 50% means the technology still has a way to go.

In short, wake me up when it's more mainstream, because for now, I think it's a bit more sensible to just buy the high-end 15" MacBook Pro with the discrete GPU in tow.

----------


And by "viable" I mean, I plug the damn thing in, install a software package that doesn't break each time I update my OS and/or require me to flash EFI firmware onto the eGPU that was never intended by the manufacturer and I get a boot screen that will allow me to at lease use EFI-level things like the option-boot interface.
 

Maxx Power

Cancelled
Apr 29, 2003
861
335
Look, I'm not trying to fear-monger here. I'm saying that you'd have a whole lot more people doing this sort of thing if it were viable for the mainstream. I can tell you with almost complete certainty that I'd never buy another laptop with a discrete GPU again if this were viable, as docking my MacBook Pro to an eGPU would be preferable in damn near every way imaginable.

The fact of the matter is that it's not that viable, because if it were, there'd be a whole lot more people doing it, and I'm sure that Apple (who had no trouble milking the ATI Radeon HD 5870 and 5770 cards for all they were worth, and then some, for three years) would be doing it.

Also, while I appreciate that in some real-world tests you get 80-90% of the performance you'd get with the GPU in an actual PCIe 3.0 x16 slot, the fact that the drop can be close to 50% means the technology still has a way to go.

In short, wake me up when it's more mainstream, because for now, I think it's a bit more sensible to just buy the high-end 15" MacBook Pro with the discrete GPU in tow.

----------



And by "viable" I mean, I plug the damn thing in, install a software package that doesn't break each time I update my OS and/or require me to flash EFI firmware onto the eGPU that was never intended by the manufacturer and I get a boot screen that will allow me to at lease use EFI-level things like the option-boot interface.

We have a few things we disagree on, but let's just leave it for now. I am keeping an eye on this technology, and maybe it will catch on in the next few years. In the meantime, and on a positive note, cheers to your holidays!
 

Yebubbleman

macrumors 603
May 20, 2010
6,005
2,585
Los Angeles, CA
We have a few things we disagree on, but let's just leave it for now. I am keeping an eye on this technology, and maybe it will catch on in the next few years. In the meantime, and on a positive note, cheers to your holidays!

It's more a philosophical difference than a technical one at this point.

Either way, you let me know when it really takes off because that will be a total game changer. I'm not 100% sure that I'll ditch a dGPU-equipped 15" MacBook Pro in favor of a 13" when that time comes, but it'll sure be tempting.

Otherwise, happy holidays to you as well.
 

MacVidCards

Suspended
Nov 17, 2008
6,096
1,056
Hollywood, CA
We have recently entered into eGPU R&D.

And I can proudly state that we have already made tremendous progress.

Sadly, I can also state that the sorry state of eGPU development has nothing to do with a lack of bandwidth. It has EVERYTHING to do with Intel and Apple protecting their existing product lines.

For instance:

"If that wasn't the case, you'd see TONS more people doing it and a whole lot less people caring about Apple's higher-end models."

Which is exactly backwards. Apple and Intel have been quashing it from the get-go for this very reason. Apple has grown fond of their "better get a new one every 2-3 years, since the old one is horribly outdated and we just intro'd some new software that requires feature 'x' which isn't on that old piece of crap you bought from us 2 years ago" routine. I'm sure the board is thrilled with the never-ending increases in sales as each new Mac moves closer to the throwaway equivalence of a Kleenex.

If you go to the Tech Inferno "DIY eGPU" board, there is even an entire section of the forum devoted to exposing this nonsense.

Here is what I can tell you from just 2 months of playing around with this.

I toyed around with some simple ideas and was able to write a specific eGPU EFI that lets a TB2-connected eGPU behave as if it were inside a TB2 Mac. And I mean full boot-screen support and the ability to run OS X on external displays. On a Mini, this is downright magical: instead of the lily-livered iGPU, you can run a powerful GTX 780. The boot screens pop up, you choose which OS you want, and you boot right into it.

And this, mind you, came from me and my buddy Netkas tossing a couple of ideas around and me spending 15 minutes writing a ROM. Point being, I AM A MORON compared to the engineers at Apple and Intel. If I could make a lovely, functional eGPU that takes my base-level 2014 Mini from "hobbled for life" to "beats the nMP at many OpenGL tasks" with ease, imagine what they could do, IF THEY WANTED TO.

They don't want to, at least not now.

Have a read of that forum. See how, 2 or 3 times now, a wonderful eGPU product was announced and demoed at a trade show, only to vanish from sight after a few lawyers from Apple and/or Intel had a chat with the company. Intel refuses to certify anything remotely useful for this. They even force the few reasonably priced TB-to-PCIe boxes out there to NOT COMPLY with the PCIe spec, which requires 75 watts available at the slot. They make them ship strangled slots with only 25 watts available, since they know that no GPU on earth can run in such a slot.

In short, the very business practices that let Apple and Intel execs buy islands by the chain are forcing them to sell you piddly, worthless TB-to-PCIe boxes that can't run eGPUs. They get rich; you get forced to buy a new computer every 2 years.

Guys, it is 100% feasible; they CAN work great. You are being lied to and manipulated by corporations who don't want you to have good choices, just profitable ones.

We will be offering Barefeats.com an eGPU for testing by the end of January. He will be free to say whatever he likes and run whatever tests he likes. I have no team of lawyers to sic on him or any other way to influence him. When the numbers come out, I think you will be surprised.
 