It is high time for the Mac owner to stand up to Apple regarding planned obsolescence.

I recently purchased a 4K Retina iMac, which was released in late 2015 and is therefore less than three years old:
https://everymac.com/systems/apple/...-inch-aluminum-retina-4k-late-2015-specs.html

I would take advantage of an eGPU, but Thunderbolt 3 is required to benefit from this feature. I have Thunderbolt 2, therefore I am prohibited from using an eGPU. Remember, this Mac is less than three years old, so it is not unreasonable to expect full compatibility.

Apple are obsessed with USB-C/Thunderbolt 3 to the extent that it is alienating consumers, myself included. Of course, all of this may sound like sour grapes on my part, but I consider myself a victim of blatant planned obsolescence, and I am damned if I am going to purchase the 2017 model to further line Apple's pockets.
Thunderbolt 1 and 2 were always a mess. It's an old rule never to trust any port that the regular PC manufacturers don't use. This has been happening with Apple since the early 2000s with FireWire, but to be fair, it was way better than USB for many things. I don't know whether to call it planned obsolescence or just something against establishing a standard at any cost to performance.

Oh yeah, I do hate their display outputs. Always been nonsense with them, except for the glorious 3 years of just having a frickin regular HDMI port on the rMBP.
 
Does this mean I could play some AAA games (through Windows) with a GTX 1080 Ti using my 2016 MacBook Pro?

Edit: NM... no Nvidia support :(

Yep 100%

Nvidia cards work just fine in Windows on a Mac; in fact, they work better and are easier to get working than AMD cards. Your 2016 is harder to get working than the 2017 models (my 13" 2017 worked out of the box; I just plugged it all in and I was gaming), but it's very doable.

I have done some crude benchmarking of a selection of games here: https://egpu.io/forums/implementati...touch-bar-gtx1070-sonnet-breakaway-box-win10/
Remember, this iMac is less than three years old, having been released in October 2015. It is not unreasonable to expect full compatibility without the need to resort to hacks, that's if one should even become available. Apple have an obsession with USB-C/Thunderbolt 3 which is detrimental to much of their core consumer base.

There are no hacks for getting TB1 or TB2 working in 10.13.4 at the moment.

Three years old isn't exactly old (well, 2.5 really), but this is the fast-moving world of tech; I wouldn't expect brand-new features to be supported by anything but brand-new tech. But yeah, it's still a ****** move from Apple, seeing as TB2 was supported in the betas.
Indeed. And it would be completely non-upgradeable. The GPU would be glued to the case, and it wouldn't be compatible with anything but the latest Macs.

If it looked good and had a small footprint I would not care. Price be damned.
 
I don't know what to say, really. eGPU made sense 5+ years ago when Sony did it with the Z2. At the time, a GPU inside a laptop really taxed the battery and didn't turn out the desired performance either. Today the gap between desktop and laptop has pretty much closed for any prosumer-type workflow that is not strictly GPU oriented.

An eGPU doesn't make your portable computer portable anymore; all of those housings are gigantic to carry around, which defeats the purpose of portability. It's OK if you need a GPU once in a while, but if your workflow is laptop + eGPU, you might as well have bought a desktop or one of those gaming laptops with a proper GPU, like an Aorus or MSI WS.

The future of eGPU is a custom hardware game where manufacturers make GPUs more compact or put a few mobile GPUs together in a tight case. Putting enclosed desktop GPUs in yet another enclosure doesn't make much sense.
 
I mean, gaming on a Mac is hardly ideal... And with an MBP's chip you won't push an Nvidia GPU to its max unless it's mid-range, and that's where AMD sits.

Why are you only talking about laptops? What about iMacs with proper 65W and 91W CPUs? They won't bottleneck high-end GPUs.
 
This is just the infant stages of eGPU support on the Mac. I think we’ll see gaming finally become much more viable on the Mac in the coming years. This was always the reason why I stayed away from Macs despite preferring OS X for everything else. While gaming shouldn’t be the deciding factor when choosing a machine, it’s always nice if you have the ability to play the latest stuff especially when spending over $2k on a machine. MacOS is the superior OS. It’s basically Linux with proprietary software. Yes, I know there is a big difference between a closed and open source kernel.

That's pretty much my thought process to a T. I bought a Mac Mini because I was tired of iCloud Photos ceasing to work on my PC. I basically use it for all of my normal computing tasks, with its primary purpose being storing and archiving my media: Photos downloads my pics and videos, iTunes keeps my music and movies, and Time Machine backs it all up to a 6TB external.

Down the road, it'll likely be an MBP, a FreeSync monitor (since my current G-Sync HP Omen won't jive with the AMD card), and a Thunderbolt cable, lol.
 
Like many I have not purchased a Mac to run Windows. If I had wanted to run Windows I would have purchased a PC.
It very much does matter how recent the Mac is. Here is a hypothetical situation not so different from the one I am faced with: imagine you purchased a 2016 MacBook Pro, only to find key features will not work on it because the manufacturer (in this case Apple) has deemed it acceptable to change the ports on the 2017 MacBook Pro, thereby rendering your 2016 MacBook Pro obsolete in just a year.

How would that be deemed acceptable in any way?

It would be totally acceptable if some new feature introduced with the 2017 MacBook Pro didn't work on a 2016 MacBook Pro. I think that's called progress. You buy a computer for the features it has and the features it was promised by the vendor. Your iMac does not have Thunderbolt 3, and eGPU support was never promised for any computers with Thunderbolt 2. It is also NOT SUPPORTED by the company behind Thunderbolt 2 (as much as you want to whine to Apple about it). The fact that it can be made to work anyway is considered a bonus.
 
Yes, you can. Thanks to the egpu.io community, my eGPU setup, an Aorus Gaming Box (GeForce GTX 1070), works perfectly with my MacBook Pro 15" Late 2016.
I use it to play AAA Steam games on Windows 10 via Boot Camp.

Check egpu.io. You can run CUDA stuff on Macs using an eGPU.


I have a triple-boot macOS, Windows 7, and Linux system on my 2010 MBP. Does an eGPU work well under Windows 7 and Linux?
 
Just did some testing on my system... 2016 MacBook Pro 15" with Touchbar... Mantiz eGPU case with an AMD RX 580.

Under 10.13 I got an OpenCL Score of 124019... Under 10.13.4 it dropped to 49996... Why?
 
CUDA is such a PITA under macOS anyway that it's not worth the trouble, whether it's Nvidia's or Apple's or the community's fault for the lack of support. I've gone down that annoying path with both TensorFlow and PyTorch using a Mac Pro w/ GTX 1060. I'd only do it with a dedicated Linux box; it was a piece of cake to set everything up on my custom build.
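For anyone who still wants to try the eGPU route for ML before giving up on it, a quick sanity check helps separate driver problems from framework problems. This is only a minimal sketch, assuming a CUDA-enabled PyTorch build is installed; it just reports whether the CUDA stack can actually see the card:

```python
# Minimal sanity check: can the CUDA stack (driver + toolkit + framework build)
# actually see the GPU? Assumes a CUDA-enabled PyTorch build is installed.
import torch

print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("Device count:", torch.cuda.device_count())
    print("Device 0:", torch.cuda.get_device_name(0))
```

If that prints False even though the card shows up in System Information, the problem is in the driver/framework layer rather than the enclosure itself.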

I have been waiting for Apple to release a new MBP with a reasonably comfortable keyboard. I don't know if they will release a new one in June. Even so, most likely they will not put an Nvidia GPU in it. Somewhere on the internet I read that even if the computations are done on the GPU, using TensorFlow on the Mac via an eGPU is not a good idea due to the bottleneck of the interface's transfer speed. Is that true? I am also not sure how good (if any) the Linux and Windows support is.

My 2010 MBP is getting old. It cannot drive a 4K display, and I want to display TensorBoard on a 4K monitor. This year, I plan to buy a new lightweight laptop for daily work and basic coding. Is it better to get a Windows laptop with an Nvidia GPU and install Linux, or to get a cheap MBP with a triple boot and build a powerful Linux workstation?
 
I have been waiting for Apple to release a new MBP with a reasonably comfortable keyboard. I don't know if they will release a new one in June. Even so, most likely they will not put an Nvidia GPU in it. Somewhere on the internet I read that even if the computations are done on the GPU, using TensorFlow on the Mac via an eGPU is not a good idea due to the bottleneck of the interface's transfer speed. Is that true? I am also not sure how good (if any) the Linux and Windows support is.

My 2010 MBP is getting old. It cannot drive a 4K display, and I want to display TensorBoard on a 4K monitor. This year, I plan to buy a new lightweight laptop for daily work and basic coding. Is it better to get a Windows laptop with an Nvidia GPU and install Linux, or to get a cheap MBP with a triple boot and build a powerful Linux workstation?
I wouldn't deal with GPU-powered machine learning on laptops in the first place. A Linux desktop workstation is all I've used for that. You can still run TensorBoard on your Mac if you point it at the machine that's running TensorFlow, but I forget how.
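For what it's worth, one common way to do it is to tunnel TensorBoard's port over SSH and view it in the Mac's browser. This is just a sketch; the hostname is a placeholder, and it assumes TensorBoard is already running on the Linux box on its default port 6006:

```python
# Sketch: forward the remote TensorBoard port to the Mac over SSH, then open it
# locally. "user@workstation" is a hypothetical address for the Linux box.
import subprocess
import webbrowser

REMOTE = "user@workstation"  # placeholder: the machine running TensorFlow/TensorBoard
PORT = 6006                  # TensorBoard's default port

# -N: don't run a remote command; -L: forward local PORT to the remote's localhost:PORT
tunnel = subprocess.Popen(["ssh", "-N", "-L", f"{PORT}:localhost:{PORT}", REMOTE])

webbrowser.open(f"http://localhost:{PORT}")  # TensorBoard UI, viewed on the Mac

# tunnel.terminate() when finished
```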
 
With 10.13.4 an eGPU can actually accelerate the internal display; however, the developers of each game/app have to add support for it. No idea how easy or difficult it is to implement that functionality.

Really? I thought the 5k iMac is the one machine that doesn't work that way, because of the way Apple had to build 5k support into the system.

I have not seen anyone on egpu.io successfully get the internal iMac display to be accelerated by an external GPU. The MacBooks, yes (with a big performance hit vs external display), but not the 5k iMac.
 
I don't know what to say, really. eGPU made sense 5+ years ago when Sony did it with the Z2. At the time, a GPU inside a laptop really taxed the battery and didn't turn out the desired performance either. Today the gap between desktop and laptop has pretty much closed for any prosumer-type workflow that is not strictly GPU oriented.

An eGPU doesn't make your portable computer portable anymore; all of those housings are gigantic to carry around, which defeats the purpose of portability. It's OK if you need a GPU once in a while, but if your workflow is laptop + eGPU, you might as well have bought a desktop or one of those gaming laptops with a proper GPU, like an Aorus or MSI WS.

The future of eGPU is a custom hardware game where manufacturers make GPUs more compact or put a few mobile GPUs together in a tight case. Putting enclosed desktop GPUs in yet another enclosure doesn't make much sense.
It makes sense when you only need a powerful GPU at your desk and do not want another computer.
 
I wouldn't deal with GPU-powered machine learning on laptops in the first place. A Linux desktop workstation is all I've used for that. You can still run TensorBoard on your Mac if you point it at the machine that's running TensorFlow, but I forget how.

I am trying to save some money. I need to replace my heavy laptop sooner or later. If I build a Linux workstation, I will install 64-128GB of RAM, perhaps even more for long simulations.
 
Really? I thought the 5k iMac is the one machine that doesn't work that way, because of the way Apple had to build 5k support into the system.

I have not seen anyone on egpu.io successfully get the internal iMac display to be accelerated by an external GPU. The MacBooks, yes (with a big performance hit vs external display), but not the 5k iMac.
Hmmm... it does say iMac, it could be a mistake and Apple meant iMac Pro, or maybe there just hasn’t been the appropriate developer support implemented yet.

https://support.apple.com/en-us/HT208544

It’s near the end, in the “eGPU support in applications” section, the last bullet:

  • Pro applications and 3D games that accelerate the built-in display of an iMac or MacBook Pro. (This capability must be enabled by the application's developer.)
 
Hmmm... it does say iMac, it could be a mistake and Apple meant iMac Pro, or maybe there just hasn’t been the appropriate developer support implemented yet.

https://support.apple.com/en-us/HT208544

It’s near the end, in the “eGPU support in applications” section, the last bullet:

  • Pro applications and 3D games that accelerate the built-in display of an iMac or MacBook Pro. (This capability must be enabled by the application's developer.)

Thanks, that gets me excited!

It seems like that statement is pretty clear about what they claim it can do. IIRC, the pre-2017 iMacs had issues with eGPU driving the internal display. The link you provided says Apple's support is limited to 2017 only, so perhaps it's all good news now on my 2017 iMac!

Meanwhile I'll let others test the heck out of it because it's not like I can find a decently priced GPU right now anyway. ;-)
 
Thanks, that gets me excited!

It seems like that statement is pretty clear about what they claim it can do. IIRC, the pre-2017 iMacs had issues with eGPU driving the internal display. The link you provided says Apple's support is limited to 2017 only, so perhaps it's all good news now on my 2017 iMac!

Meanwhile I'll let others test the heck out of it because it's not like I can find a decently priced GPU right now anyway. ;-)
Hopefully the work necessary to implement this won’t be too burdensome. I’m sure Apple took into account how much developer buy-in they could expect (or surveyed key developers in advance) before they went to the trouble of adding this new capability. Looking forward to developer announcements.
 
I still don't understand the choice of GPU in the 2016 and 2017 MBP; the Radeon Pro 455-560 are honestly so rubbish. A mobile 1060 would have been a hugely better option in my eyes. So the fact that people (like myself) are buying eGPUs says something about Apple's current lineup. It's definitely disappointing that Nvidia cards are still not natively supported, and even more so that Thunderbolt 1/2 have been rendered useless in 10.13.4...


Nvidia GTX 1070 vs AMD Radeon Pro 555
Userbenchmark Effective 3D Gaming GPU Speed: 99.7% vs 19.6%
Nvidia GTX 1070 - 15th / 579
AMD Radeon Pro 555 - 125th / 579

Don't compare the Nvidia 1070 with the AMD 555. The 1070 runs too hot and would not work in an MBP body.
The closest comparable PC laptop to an MBP is the Razer with a 1060 GPU.

The Nvidia 1070 chip can only be used in a big, heavy, dedicated gaming laptop. You need a big body with lots of fans to remove the heat a 1070 will generate.
Even a 1060 Max-Q has about a 70W TDP. The Radeon 560 in the 2017 MacBook Pro is supposed to be 35W. AMD actually has pretty powerful stuff for low wattage applications.

It's all about heat. The ultra-thin design of the latest MBP makes it impossible to have a high-wattage (lots of heat) GPU in it. Want a toaster oven sitting in your lap?

But now with eGPU you have access to a powerful desktop GPU if you want it.
 
Apple should make its own eGPU product.

I told people way back when the speculation started that this would result in $600 for a mid-range card. It seems I was correct. Apple won't make this directly, because they never release this kind of accessory under their own brand.

Not for raw compute, though. And Apple wants to use OpenCL instead of CUDA, so they go with AMD.

Apple seems to be pushing Metal these days. I would still go with CUDA, even though their refusal to release the ISA irritates me.
 
Hmmm... it does say iMac, it could be a mistake and Apple meant iMac Pro, or maybe there just hasn’t been the appropriate developer support implemented yet.

https://support.apple.com/en-us/HT208544

It’s near the end, in the “eGPU support in applications” section, the last bullet:

  • Pro applications and 3D games that accelerate the built-in display of an iMac or MacBook Pro. (This capability must be enabled by the application's developer.)

The problem here is the low bandwidth from the eGPU back to the internal display. You will get low frame rates because of this, whereas there is no bottleneck if a monitor is attached directly to the eGPU box.

You would end up spending a lot of money on the eGPU box plus a desktop graphics card, only to get poor frame rates on your internal display. Apple is recommending the best solution given the bandwidth limits of TB3.

Compute tasks can work well within those bandwidth limits.
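To put very rough numbers on that bottleneck (a back-of-envelope sketch; the usable-link figure is an assumption, not a measurement):

```python
# Back-of-envelope: why sending every rendered frame back to the internal display hurts.
# Assume TB3 exposes roughly a PCIe 3.0 x4 link, ~2.7 GB/s usable after protocol overhead.
usable_link_gb_s = 2.7

# One rendered 4K frame at 4 bytes per pixel (RGBA8):
width, height, bytes_per_px = 3840, 2160, 4
frame_gb = width * height * bytes_per_px / 1e9  # ~0.033 GB per frame

for fps in (30, 60):
    readback_gb_s = frame_gb * fps
    print(f"{fps} fps readback: {readback_gb_s:.2f} GB/s "
          f"(~{readback_gb_s / usable_link_gb_s:.0%} of the link)")
```

The same link also has to carry textures, geometry, and draw commands out to the eGPU, so the round trip leaves far less headroom than rendering straight to a monitor plugged into the enclosure.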
Why are you only talking about laptops? What about iMacs with proper 65W and 91W CPUs? They won't bottleneck high-end GPUs.
iMacs have always used mobile GPUs. (Exception: the new iMac Pro with its redesigned cooling.)

Poor cooling kills GPUs, leading to an early death for your iMac.

That is why low-wattage mobile parts are used in most iMacs. The elegant single-body construction forces compromises on heat removal.
I told people way back when the speculation started that this would result in $600 for a mid-range card. It seems I was correct. Apple won't make this directly, because they never release this kind of accessory under their own brand.



Apple seems to be pushing Metal these days. I would still go with CUDA, even though their refusal to release the ISA irritates me.

As a developer, the lack of any debugging tools on Metal is a very serious problem. CUDA provides excellent debugging support.

My app supports OpenCL, Metal, and CUDA on both AMD and Nvidia GPUs. So I see all of these pros/cons every day.
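As a small illustration of that cross-API juggling, enumerating the OpenCL platforms is a quick way to check whether an eGPU even shows up next to the built-in GPUs before dispatching work to it. This is a hedged sketch, assuming the third-party pyopencl package is installed, and is not tied to any particular app:

```python
# Sketch: list the OpenCL platforms/devices the OS exposes, e.g. to confirm an eGPU
# is visible alongside the built-in Intel/AMD GPUs. Requires the pyopencl package.
import pyopencl as cl

for platform in cl.get_platforms():
    print("Platform:", platform.name)
    for device in platform.get_devices():
        print("  Device:", device.name,
              "| type:", cl.device_type.to_string(device.type))
```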
Well "like 0 ML applications that doesn't use CUDA" - that's pretty nonsense.

My own applications use OpenCL, and FP16 is much better for my training application. YMMV, but your statement is just tosh.

Yes, CUDA is more mature and more widely used, but it's not the only option out there. My Vega Frontier gets me the same performance at my chosen precision level as the ridiculously overpriced Nvidia cards.
WRT TB2 - I'm not defending Apple here at all, but I would imagine it's largely due to drivers/resource focusing.

Getting an eGPU to work (e.g. under Linux) is pretty damn hard work - Thunderbolt standards are a nightmare to work with, so it makes sense, in a way, to focus resources.

That said, there is a hack to make it work, but YMMV.

As another GPU software developer, I think Brian hit the nail on the head. It all boils down to drivers/resources and the size of Apple's Mac GPU developer team.

At the end of the day, Apple is not a GPU company and would rather spend its R&D budget on non-GPU stuff.

But the two biggest trends in the whole computing device market are 1) mobile miniaturization and 2) GPU compute.

The new big push is in AI and self-driving cars. This is all because of the massive power from GPU compute APIs.

It really is in Apple's self interest to not fall behind in the quality of their GPU drivers and platforms.
 
A lot of good stuff like this has happened under Tim, while this same move would have been looked down upon under Steve's regime. I really don't understand why Tim gets so much flak; he is actually one of the best CEOs Apple has ever had. I really appreciate Tim for the eGPU support on the Mac. Every time something new happens, people say the same thing, "Ohh, this may be the first time Apple did something good under Tim," and then he comes up with something new a few months later, like the iMac Pro, the upgradeable Mac (in the pipeline), or the iPhone with Face ID. Tim is an underappreciated CEO.
 
A lot of good stuff like this has happened under Tim, while this same move would have been looked down upon under Steve's regime. I really don't understand why Tim gets so much flak; he is actually one of the best CEOs Apple has ever had. I really appreciate Tim for the eGPU support on the Mac. Every time something new happens, people say the same thing, "Ohh, this may be the first time Apple did something good under Tim," and then he comes up with something new a few months later, like the iMac Pro, the upgradeable Mac (in the pipeline), or the iPhone with Face ID. Tim is an underappreciated CEO.

I really don't want to move this subject into politics, but one thing sticks out really clearly here. Apple really did mess up the Mac Pro 6.1 in an effort to B.S. semi-professionals with a device that cannot be upgraded. But hey, give them some rope. So we did. Now the eGPU would have been the exit door and the repair patch for professionals who paid big price tags for that trash can. But wait! No - those &%$§' stopped the support for Thunderbolt 2 devices. So 6.1 customers will not benefit from any eGPU at all. This is the biggest B.S. Apple has ever produced. They really think people are stupid.
What good will an eGPU do if it can only be used on Thunderbolt 3 computers that have a fairly new GPU anyway?
 
I found out why Apple won't officially support Nvidia anymore.
Apple haven't and won't because of issues with Nvidia in the past over the 860M graphics problems, and the two companies didn't resolve their differences.

Check out the Linus Tech Tips YouTube channel: "AMD strikes back at GPP - WAN Show Mar 30 2018", from 17:45 onwards. He has the reason why! It's on there!
 