Having a better GPU doesn't necessarily mean gaming; there are people who use Macs for 3D-related work, and a better GPU lets them finish their jobs faster (the same applies to videographers, photographers, etc.). Also, given how Apple handles GPUs right now, I guess they'll be left behind in any third-party VR/AR development, since those developers will just develop for PC while the Mac doesn't offer the hardware required for that work. If Apple has to play catch-up in the future because nobody developed AR/VR for the Mac, I'd say they deserved it: they put themselves in this corner despite having the resources and financial means to do better.
I agree. Over the next few months you are going to see more and more VR devices and software coming online: Facebook - Oculus Rift; Microsoft - HoloLens, VR Minecraft, 3D Paint; Sony - PSVR; Google - Tilt Brush, Daydream, Cardboard; Samsung - Gear VR; HTC and Steam - Vive. I'm missing several others. Unity 3D works on both Mac and PC, but it seems most development happens on the PC. Now, Tim Cook has stated they're more interested in augmented rather than virtual reality, but I believe supporting eGPUs would let them hedge their bets.
 
The problem is that Apple tends to be really fussy when it comes to hardware support, because they care about the user experience. They wouldn't want someone to be able to plug in something that could completely crash the system, so they'd want to support it officially. And even then, it would likely be only a few select cards through their own interface (and likely costly too).

On top of this, however, there's just not much need for an eGPU at this stage, especially for Apple. Macs aren't gaming PCs, and gaming is the main reason someone would want an eGPU. So the market they'd target would be workstation-quality cards. The problem is that people who need that kind of GPU will have a desktop; it makes no sense to pair one with a notebook, and there's no situation that would benefit from it (and again, cost).

As I said, the only way I see this happening is with an Apple display with a built-in GPU. Now, though, they could go with LG. But it's just not going to happen... Again, though, no doubt you'll be able to plug one in via Boot Camp soon enough.

Bear in mind that currently you could probably build a decent gaming system for the cost of just the eGPU enclosure plus the GPU card...

An eGPU with an Apple system is for those who don't want multiple computers, want a stable operating system for work, and play light to heavy games. In my experience it's really cumbersome to have more than one main computer. The upgradability would also extend the MacBook's already long life even further.

So I think there's definitely a market. Some professionals prefer Nvidia over Radeon because of CUDA cores, so I guess some professionals could prefer this as well.
 
I don't know if the rMBP 15 would work with the Razer Core, as most people who have gotten it working with non-Razer machines like the XPS 15 have had to disable the dGPU and use only the iGPU with the Razer Core. AFAIK, there's no way to disable the dGPU on a MBP running Boot Camp.
 
An eGPU with an Apple system is for those who don't want multiple computers, want a stable operating system for work, and play light to heavy games. In my experience it's really cumbersome to have more than one main computer. The upgradability would also extend the MacBook's already long life even further.

So I think there's definitely a market. Some professionals prefer Nvidia over Radeon because of CUDA cores, so I guess some professionals could prefer this as well.
If only Apple were "innovative" enough, or had enough "courage," to allow eGPUs at reasonable prices.
 
I don't know if the rMBP 15 would work with the Razer Core, as most people who have gotten it working with non-Razer machines like the XPS 15 have had to disable the dGPU and use only the iGPU with the Razer Core. AFAIK, there's no way to disable the dGPU on a MBP running Boot Camp.

A guy in this thread proved that it works with Boot Camp. Of course, that was a 13" MBP, though.
 
It probably won't work until the Razer Core uses the new Thunderbolt 3 controller the Macs use; there's an article about this on the MacRumors front page.
 
An eGPU with an Apple system is for those who don't want multiple computers, want a stable operating system for work, and play light to heavy games. In my experience it's really cumbersome to have more than one main computer. The upgradability would also extend the MacBook's already long life even further.

So I think there's definitely a market. Some professionals prefer Nvidia over Radeon because of CUDA cores, so I guess some professionals could prefer this as well.
I agree. The trouble with existing eGPUs is the need to reboot every time you connect or disconnect one. That's a deal-breaker, especially since Razer has solved that problem.
 
I don't know if the rMBP 15 would work with the Razer Core, as most people who have gotten it working with non-Razer machines like the XPS 15 have had to disable the dGPU and use only the iGPU with the Razer Core. AFAIK, there's no way to disable the dGPU on a MBP running Boot Camp.

You don't need to disable the dGPU to have another GPU running.
 
A guy in this thread proved that it works with Boot Camp. Of course, that was a 13" MBP, though.
Which only has an iGPU. We'll see if it works with the rMBP 15, which actually has a quad-core CPU, something required for best performance in 2016 DirectX 12 games.
You don't need to disable the dGPU to have another GPU running.
Then how come Win10 laptops like the XPS 15 have to do it to make it work? Are rMBPs immune to this problem? Also, this will be the first time anyone tries to make a TB3 enclosure work on an Apple device with a dGPU. I think you may be able to make it work by connecting the Razer Core to an external monitor, but the laptop's LCD likely wouldn't work.
 
Which only has an iGPU. We'll see if it works with the rMBP 15, which actually has a quad-core CPU, something required for best performance in 2016 DirectX 12 games.

Then how come Win10 laptops like the XPS 15 have to do it to make it work? Are rMBPs immune to this problem? Also, this will be the first time anyone tries to make a TB3 enclosure work on an Apple device with a dGPU. I think you may be able to make it work by connecting the Razer Core to an external monitor, but the laptop's LCD likely wouldn't work.

Passthrough back to the internal LCD requires Nvidia Optimus (or Optimus Prime, whatever it's called).
 
Passthrough back to the internal LCD requires Nvidia Optimus (or Optimus Prime, whatever it's called).
I see, so theoretically it should work if you disable the AMD drivers for the built-in GPU and just run with an external monitor. So the only way you could pipe the video back into the laptop from an external GPU would be if Apple used an Nvidia GPU with Optimus enabled, right?
 
A guy in this thread proved that it works with Boot Camp. Of course, that was a 13" MBP, though.

I don't think we've seen actual gaming on it yet, though?

You can get to the "connected device" screens on a Dell XPS before any tinkering, but it doesn't actually work without blue-screening once you start flooding that pipe with data.

Recognising the device happens on pretty much every Windows laptop I've seen... that's not the same as "working".
 
Having a better GPU doesn't necessarily mean gaming; there are people who use Macs for 3D-related work, and a better GPU lets them finish their jobs faster (the same applies to videographers, photographers, etc.). Also, given how Apple handles GPUs right now, I guess they'll be left behind in any third-party VR/AR development, since those developers will just develop for PC while the Mac doesn't offer the hardware required for that work. If Apple has to play catch-up in the future because nobody developed AR/VR for the Mac, I'd say they deserved it: they put themselves in this corner despite having the resources and financial means to do better.

An eGPU with an Apple system is for those who don't want multiple computers, want a stable operating system for work, and play light to heavy games. In my experience it's really cumbersome to have more than one main computer. The upgradability would also extend the MacBook's already long life even further.

So I think there's definitely a market. Some professionals prefer Nvidia over Radeon because of CUDA cores, so I guess some professionals could prefer this as well.

You missed my point. I said the market would be in workstation-quality cards, i.e. Quadro cards, not the gaming cards you went on to suggest.

The problem, again, is that there's no point in pairing a workstation card with a machine like that. It just makes no sense. GPU rendering is still a bit iffy with things like V-Ray and 3DS Max, so you use CPU rendering. The GPU is used primarily for RT (real-time) rendering and viewport rendering. Production renders are still pushed through the CPU, as it has more options for generating the render that the RT renderers still lack. I used to use a Xeon workstation at work; I think it had 12 cores, 32GB RAM, etc. It would still take 45 minutes for production renders, and the RT renderer would produce a lot of artefacts and just wasn't stable enough to get the job done.

So what I'm saying is that there's no benefit to sticking a Quadro in a machine with 4 cores; it doesn't really make anything faster, and nobody who uses a Quadro would pair it with a notebook. So I don't really see a market on the professional side. For photography/graphics work the dGPU can handle these things no problem. That takes me back to the gaming side as the only viable market, and as explained earlier, the costs involved don't make a lot of sense for a gamer.
 
You missed my point. I said the market would be in workstation-quality cards, i.e. Quadro cards, not the gaming cards you went on to suggest.

The problem, again, is that there's no point in pairing a workstation card with a machine like that. It just makes no sense. GPU rendering is still a bit iffy with things like V-Ray and 3DS Max, so you use CPU rendering. The GPU is used primarily for RT (real-time) rendering and viewport rendering. Production renders are still pushed through the CPU, as it has more options for generating the render that the RT renderers still lack. I used to use a Xeon workstation at work; I think it had 12 cores, 32GB RAM, etc. It would still take 45 minutes for production renders, and the RT renderer would produce a lot of artefacts and just wasn't stable enough to get the job done.

So what I'm saying is that there's no benefit to sticking a Quadro in a machine with 4 cores; it doesn't really make anything faster, and nobody who uses a Quadro would pair it with a notebook. So I don't really see a market on the professional side. For photography/graphics work the dGPU can handle these things no problem. That takes me back to the gaming side as the only viable market, and as explained earlier, the costs involved don't make a lot of sense for a gamer.

For photographers and videographers, their applications rely heavily on the graphics card (Nvidia being very effective thanks to its CUDA cores). One of the main complaints about the MacBook is that it doesn't ship with an Nvidia chip, so this would absolutely have a market for those people.
Remember: a laptop's "mobile" GPU is SIGNIFICANTLY slower than a desktop GPU. When you say it can handle photography/graphics work, of course it can, albeit with a much slower workflow.

Everyone values different things. This solution (if it works) can actually save money for people who want a single laptop that's as powerful as a desktop at home while keeping portability, want a stable OS X system, and offset the eGPU setup by buying the cheapest MacBook option.
Apple charges $200 for a barely 50-60% faster chip (the 460), while a desktop GPU at the same price can give a 2x to 4x boost. The difference between the lowest-end and highest-end MacBook Pro is more than $1000, so you do the math on which is the more valuable option (a rough sketch below).
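Here's a minimal back-of-the-envelope version of that math, using only the figures quoted above; the speedups, the midpoint, and the prices are illustrative assumptions, not benchmarks:

```python
# Rough price-per-performance comparison using the figures quoted above.
# All numbers are illustrative assumptions, not measured results.

upgrade_cost = 200          # Apple's Radeon Pro 460 upgrade price (USD)
upgrade_speedup = 1.55      # "barely 50-60% faster" than the base mobile chip
desktop_gpu_cost = 200      # a desktop card at the same price point
desktop_speedup = 3.0       # "2x to 4x" boost, taking the midpoint

def cost_per_extra_x(cost, speedup):
    """Dollars paid per 1x of added GPU performance."""
    return cost / (speedup - 1.0)

print(f"Apple 460 upgrade: ${cost_per_extra_x(upgrade_cost, upgrade_speedup):.0f} per extra 1x")
print(f"Desktop card:      ${cost_per_extra_x(desktop_gpu_cost, desktop_speedup):.0f} per extra 1x")
# Roughly $364 vs $100 per unit of added performance, before even counting the
# >$1000 gap between the lowest- and highest-end MacBook Pro configurations.
```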

Another important factor is that this future-proofs your investment in the laptop as well. Photographers, videographers, gamers, or other users whose applications rely more on the GPU than the CPU will be able to use a MacBook for even longer thanks to the upgradability.

I mean, you can already see people in this thread who want an eGPU. I don't think those people have no idea what they want to invest in; I certainly ran the cost/benefit numbers before considering an eGPU. Heck, even in the Windows community there are huge websites dedicated to eGPUs.
 
Photo editing apps are generally very efficient and need to support many older machines, so there is very little advantage in using a powerful GPU. Even Intel chips can run Photoshop at great speed (especially on Windows). I have benchmarked the hell out of Photoshop across several generations of chips (GT 120, GTX 680, GTX 980, GTX 1070, RX 460, Intel HD 530) and there was almost zero difference.

Video apps, on the other hand, generally need a good card with a decent amount of VRAM. But the biggest gain is using Windows: video acceleration and rendering there just make macOS look like constipationOS.
 
Photo editing apps are generally very efficient and need to support many older machines, so there is very little advantage in using a powerful GPU. Even Intel chips can run Photoshop at great speed (especially on Windows). I have benchmarked the hell out of Photoshop across several generations of chips (GT 120, GTX 680, GTX 980, GTX 1070, RX 460, Intel HD 530) and there was almost zero difference.


Many 2D programs are mainly CPU dependent.
 
Another eGPU option is the Acer Graphics Dock. It only provides a 960M, but that is a huge improvement over the integrated graphics on the 13" MBPs, and it's substantially cheaper and more compact. It would be sweet if they released updates with a 1050 or 1060 in there.

http://www.notebookcheck.net/Acer-Graphics-Dock-with-Nvidia-GTX-960M-Review.175429.0.html

Wow, I didn't know this thing existed; I like it quite a bit. The 960M brings it to Surface Book performance levels, which is pretty awesome for such a small form factor.
 
Wow, I didn't know this thing existed; I like it quite a bit. The 960M brings it to Surface Book performance levels, which is pretty awesome for such a small form factor.


I'd wait before buying... I suspect it's going to need Apple's cooperation for any of these things to work plug-and-play; the driver tweaking is complicated.

I've never known Apple to act like that.
 
Photo editing apps are generally very efficient and need to support many older machines, so there is very little advantage in using a powerful GPU. Even Intel chips can run Photoshop at great speed (especially on Windows). I have benchmarked the hell out of Photoshop across several generations of chips (GT 120, GTX 680, GTX 980, GTX 1070, RX 460, Intel HD 530) and there was almost zero difference.

Video apps, on the other hand, generally need a good card with a decent amount of VRAM. But the biggest gain is using Windows: video acceleration and rendering there just make macOS look like constipationOS.

Have you not heard of the feature called hardware acceleration? These apps effectively use the GPU's CUDA cores, and higher-end Nvidia chips have more CUDA cores and therefore higher performance. (A rough sketch of that scaling is below.)
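As a rough illustration of the "more CUDA cores" point, here's a minimal sketch that queries an Nvidia card's streaming-multiprocessor count, which is what total CUDA core counts are derived from. It assumes a CUDA-capable GPU with PyTorch installed, and the cores-per-SM figure is an assumption that varies by architecture:

```python
# Minimal sketch: report the SM count and VRAM of the first CUDA device.
# Assumes a CUDA-capable GPU and PyTorch installed; purely illustrative.
import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    cores_per_sm = 128  # assumption for Maxwell/Pascal-class chips; differs by architecture
    print(f"{props.name}: {props.multi_processor_count} SMs, "
          f"~{props.multi_processor_count * cores_per_sm} CUDA cores, "
          f"{props.total_memory // 2**20} MiB VRAM")
else:
    print("No CUDA device visible - hardware acceleration would fall back to the CPU.")
```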
 
Have you not heard of the feature called hardware acceleration? These apps effectively use the GPU's CUDA cores, and higher-end Nvidia chips have more CUDA cores and therefore higher performance.

You're talking to someone who has used these apps since the days of SuperPaint and the Quantel Paintbox. You're talking to someone who is the main hardware and software tester on the Mac Pro forum.

There is almost zero CUDA in most photo editing apps, including Photoshop: just a couple of simple plugins that don't even fully utilise all the cores. End of story.
 