It is high time for Mac owners to stand up to Apple regarding planned obsolescence.

I recently purchased a 2015 4K Retina iMac, which was released in late 2015 and is therefore less than three years old:
https://everymac.com/systems/apple/...-inch-aluminum-retina-4k-late-2015-specs.html

I would take advantage of an eGPU, but Thunderbolt 3 is required to benefit from this feature. I have Thunderbolt 2, so I am prohibited from using an eGPU. Remember, this Mac is less than three years old, so it is not unreasonable to expect full compatibility.

Apple are obsessed with USB-C/Thunderbolt 3 to the extent that it is alienating consumers, myself included. Of course, all of this may sound like sour grapes on my part, but I consider myself a victim of blatant planned obsolescence, and I am damned if I am going to purchase the 2017 model to further line Apple's pockets.
 
I believe there is a hack or workaround to get an eGPU working over TB2.
 
Those days, for the most part, are gone. It's sad, because the last truly upgradeable Macs, like the pre-2014 mini, pre-2013 Pro, and pre-2012 MBP, are going to be vintage, retired, or obsolete (whatever Apple calls it); they will be unserviceable, and soon the new OSes will not be compatible.

This year's Mac Pro will be upgradable to some extent. I expect it will have at least one PCIe slot for a GPU. It's not going to be cheap of course (but neither was the 2013, so...).



You realize that not even Intel, you know, the company behind Thunderbolt, officially supported eGPUs until recently. It would be silly to try to "stand up to Apple" for not supporting what Intel wouldn't. And it's also silly to blame Apple for not supporting a future feature that was never promised when the machine was released. If your "planned obsolescence" theory really held water, you wouldn't even be getting software updates. They could've stopped at Yosemite.

In any case, there will be a patch that allows Thunderbolt 2 eGPUs to work. For whatever it's worth, I'm using a similar iMac (2014 5K) right now with an eGPU (with a 1070 in it) to run 3D games and such in Windows.
 
Remember, this iMac is less than three years old, having been released in October 2015. It is not unreasonable to expect full compatibility without the need to resort to hacks, that's if one should even become available. Apple have an obsession with USB-C/Thunderbolt 3 which is detrimental to much of their core consumer base.
 
It doesn't matter how recent the computer is. It has a port that was never intended for eGPUs, by the company that developed the PCIe tech in the port. It *is* unreasonable to expect full compatibility with future tech that was released after you bought your computer. You expect Apple to say "oh well yeah we know Intel never supported this, we know it only has 1/2 the bandwidth, we know people will complain... but we will support it anyway"?

If your iMac had a Thunderbolt 3 port and a chipset with all of the required features, and Apple arbitrarily read your system type and caused eGPUs to silently fail, you'd have an argument.

People have already gotten eGPUs to work with 10.13.4 on 2013 Mac Pros, so I reckon it's only a matter of days before a workaround is documented.

USB-C/Thunderbolt 3 is one of the best things to happen to computers IMO. Transitions are always tough at the beginning, but they're the most versatile and highest-bandwidth ports ever in a consumer computing device. As you already know, you can plug in a graphics card :D
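
For context on the "1/2 the bandwidth" point above, here is the rough link-rate arithmetic as a quick Python sketch. These are nominal figures only; real-world eGPU throughput is lower once protocol overhead and the DisplayPort share of the channel are accounted for:

```python
# Nominal link rates in Gb/s: a back-of-envelope comparison, not a benchmark.
TB2_GBPS = 20            # Thunderbolt 2: two 10 Gb/s channels aggregated
TB3_GBPS = 40            # Thunderbolt 3: carries up to PCIe 3.0 x4 for an eGPU
PCIE3_X4_GBPS = 4 * 8    # ~32 Gb/s: what a TB3 eGPU enclosure exposes at best
PCIE3_X16_GBPS = 16 * 8  # ~128 Gb/s: a desktop GPU slot, for comparison

print(f"TB3 vs TB2:                   {TB3_GBPS / TB2_GBPS:.0f}x")           # 2x
print(f"TB3 link vs desktop slot:     {TB3_GBPS / PCIE3_X16_GBPS:.0%}")      # ~31%
print(f"TB2 link vs desktop slot:     {TB2_GBPS / PCIE3_X16_GBPS:.0%}")      # ~16%
print(f"TB3 usable PCIe x4 vs slot:   {PCIE3_X4_GBPS / PCIE3_X16_GBPS:.0%}") # 25%
```

Even TB3 gives an eGPU only a fraction of a desktop slot's bandwidth, which is part of why enclosures cost some performance; TB2 halves that again.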
 
It's because the Nvidia CEO is arrogant and obnoxious, and Jobs hated him and probably told Cook never to partner with him. It is too bad, though; the 10 series wipes the floor with the AMD stuff.
 
Use Windows through Boot Camp and you should be able to use TB2. TB3 is still pretty new; most machines don't have it at all! It's a big leap from TB2, and eGPU performance is much better over TB3. This is not planned obsolescence. eGPUs will remain a niche market: only certain devs and gamers will be interested in adding high-end GPUs on top of already pricey hardware. For now Apple recommends the RX 570 and 580, which aren't expensive, but if using a 1080 Ti in macOS were supported, you'd be looking at over $1k for the GPU and enclosure. Actually around $1.5k, since you won't find a 1080 Ti anywhere near retail.
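
To put rough numbers on that, here is the street-price arithmetic using figures quoted in this thread (early-2018, mining-boom prices; treat them as illustrative, not current):

```python
# Approximate eGPU cost breakdown, using USD prices mentioned in this thread.
enclosure = 450          # e.g. the Sonnet Breakaway 650W discussed later in the thread
rx580_street = 400       # mid-range RX 580, inflated by mining demand
gtx1080ti_street = 1000  # high-end card, well above MSRP at the time

print(f"Mid-range eGPU setup: ${enclosure + rx580_street}")      # $850
print(f"High-end eGPU setup:  ${enclosure + gtx1080ti_street}")  # $1450
```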
 
Naw, we just need AirDrop for sharing, iTunes for media consumption and iCloud for storage.
The 2017 iMac has a 580 Pro. Are any of the supported eGPUs more powerful?
 
I'm really interested in hooking up an eGPU to my late-2016 15" MBP. The problem is that the RX 580 is really a mid-range card that shouldn't cost anywhere near $400. The only enclosure that really supports 87W charging is the Sonnet Breakaway 650W, which will also have enough overhead for a really high-end GPU. The problem is the $450 price tag, on top of nearly $1k for a great GPU right now.

I also want to see if Apple adds full Nvidia support, updating OpenGL and adding Vulkan. The GPU market also has to go back to normal; the miners have driven prices up to stratospheric levels. An enclosure with an 8GB RX 580 would definitely be much better than the 455 in my MacBook, but it's still nowhere near the 11 TFLOPS you'll find on Nvidia's 1080 Ti. Early adopters always get screwed. Support will improve and prices should come down.
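
For anyone wondering where TFLOPS figures like these come from, the usual back-of-envelope formula is: peak FP32 throughput = 2 FLOPs (a fused multiply-add) per shader per clock. A quick Python sketch; the shader counts and boost clocks below are from memory and approximate:

```python
# Peak FP32 throughput: 2 FLOPs per shader per clock cycle.
def peak_tflops(shaders: int, boost_ghz: float) -> float:
    return 2 * shaders * boost_ghz / 1000.0

# Approximate published specs (illustrative, not authoritative).
cards = {
    'Radeon Pro 455 (15" MBP)': (768, 0.85),
    "RX 580 (8 GB)": (2304, 1.34),
    "GTX 1080 Ti": (3584, 1.58),
}
for name, (shaders, ghz) in cards.items():
    print(f"{name}: ~{peak_tflops(shaders, ghz):.1f} TFLOPS")
# Prints roughly 1.3, 6.2 and 11.3 TFLOPS, matching the figures in this thread.
```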

GPU performance will be superior in Windows anyway due to better drivers, so it might make more sense just to build a PC for not much more than what a good enclosure plus an RX 580 costs, although I would love having more graphical power within macOS.
 
Yes, you will see a huge improvement in performance with an RX 580. The Pro 560 (in the iMac and MBP, to be clear) only has 2 TFLOPS, while the RX 580 has 6 TFLOPS. However, to get the most power out of the card you have to use it in Windows 10 through Boot Camp.

The RX 560 and 570 are really not worth it. I also wouldn't go for the 4GB version of the RX 580.

Don't know if this has anything to do with it, but AMD does have much better open source drivers.

But Nvidia has much better performance. It's like the difference between Intel and AMD CPUs. AMD is so embarrassing that a Sandy Bridge i5-2500K still keeps up with some of the latest AMD stuff. Ryzen is a little better, but the FX stuff is garbage.
 
My iMac has a 580 Pro. Is that different from a RX 580?
 
And in FP64, Quadros blow EVERYTHING AMD out of the water. Even the "workstation" and "compute" Radeon Pro WX 9100 and Radeon Instinct MI25 only have 768 GFLOPS, what a ****ing joke! And did I mention there is like 0 ML applications that doesn't use CUDA?

Yes, the iMac Pro IS gimped: the GPUs are just underclocked gaming Radeons (NOT the actual workstation WX 9100, Radeon "Pro" my arse), and the 8-core CPU is gimped so that people will buy the 10-core (fewer cores with a lower boost frequency, really?). And on top of that, the whole thing STILL thermal throttles under stress (Linus made a video).

Well "like 0 ML applications that doesn't use CUDA" - that's pretty nonsense.

My own applications use OpenCL, and FP16 is much better for my training application. YMMV, but your statement is just tosh.

Yes, CUDA is more mature and more widely used, but it's not the only option out there. My Vega Frontier gets me the same performance at my chosen precision level as the ridiculously overpriced Nvidia cards.
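
The precision-rate arithmetic behind both posts is simple: on Vega-class cards FP16 runs at twice the FP32 rate, while FP64 runs at 1/16 of it, which is where the ~768 GFLOPS figure quoted above comes from. A minimal sketch, assuming an MI25/Vega 64-class part at roughly 12.29 TFLOPS FP32:

```python
# Precision throughput ratios on a Vega-class GPU: FP16 = 2x FP32, FP64 = FP32/16.
FP32_TFLOPS = 12.29      # approximately Radeon Instinct MI25 / Vega 64 class

fp16 = FP32_TFLOPS * 2   # ~24.6 TFLOPS: why FP16 training is attractive here
fp64 = FP32_TFLOPS / 16  # ~0.77 TFLOPS (~768 GFLOPS): the figure quoted above

print(f"FP16: ~{fp16:.1f} TFLOPS, FP64: ~{fp64 * 1000:.0f} GFLOPS")
```
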
WRT TB2 - I'm not defending Apple here at all, but I would imagine it's largely due to drivers/resource focusing.

Getting an eGPU to work (e.g. under Linux) is pretty damn hard work - Thunderbolt standards are a nightmare to work with, so it makes sense, in a way, to focus resources.

That said, there is a hack to make it work, but YMMV.
 
Sorry, I thought I was reading Pro 560. No, the difference between the Pro 580 and the RX 580 is very small: the Pro 580 is more efficient, while the RX 580 is slightly more powerful in some areas, but it's definitely not worth it! You have to go with an Nvidia 1070, 1080, or 1080 Ti to get a significant improvement, and unfortunately those are not supported in macOS, just Windows. The Pro 580 is actually pretty damn good.
 
Like many, I did not purchase a Mac to run Windows. If I had wanted to run Windows, I would have purchased a PC.

It very much does matter how recent the Mac is. Here is a hypothetical situation not so different from the one I am faced with: suppose you purchased a 2016 MacBook Pro, only to find that key features will not work on it because the manufacturer (in this case Apple) deemed it acceptable to change the ports on the 2017 MacBook Pro, thereby rendering your 2016 MacBook Pro obsolete in just a year.

How would that be deemed acceptable in any way?
 
Okay, so there's no *supported* eGPU really any more powerful than a 580 Pro.

The only reason I bought the upgraded chip/card was so I could use VR at some point.

My understanding is that only the strongest 2017 iMac card will support VR.
 
It has less performance than the RX 580.

The difference is minor, like I pointed out above. It's nothing like the improvement he would see with a high-end Pascal card like the 1080 or 1080 Ti.

The 2016 MacBook Pro is the 4th-gen MacBook Pro, so there was no reason for the 2017 to be a different design; they don't change the design mid-generation on their notebooks. Those who have 2015s could be pissed, but they had to have imagined that a new design was on the way, since the original Retina MBP came out in 2012.

Now, I understand your frustration, but TB3 was not ready in 2015. The mid-2017 4K iMac was the first refresh after yours, and they had to include TB3 on it. Some products have a longer lifespan than others and are better buys. The iPad 2 and iPhone 4s are two of these: among the longest-lasting and longest-supported iOS devices, running everything from iOS 4 and 5 all the way to 9.3.5. The last pre-Retina MacBooks could also be upgraded with SSDs and RAM, extending their lifespan. Other products are killed off early, like the 3rd-gen "new" iPad with its 30-pin connector. That was a real insult: introducing a much better Retina iPad that same year with a faster CPU and GPU, better cameras, and the then-new Lightning connector.

Early 4K iMac buyers also paid a lot more for the machine before Apple dropped the prices just a year later with improved specs, as usual. Those who bought the 2016 nTB 13" MBP also got screwed, with a cheaper and better machine introduced half a year later (albeit with half the storage) at $1,299, and it can often be found for $100 or more below that.
 
Does this mean I could play some AAA games (through Windows) with a GTX 1080 Ti using my 2016 MacBook Pro?

Edit: NM... no Nvidia support :(

Yes, you can. Thanks to the egpu.io community, my eGPU setup, an Aorus Gaming Box (GeForce GTX 1070), works perfectly with my late-2016 15" MacBook Pro.
I use it to play AAA Steam games in Windows 10 via Boot Camp.

I need Nvidia to run CUDA stuff. Looks like I need to forget about buying a laptop from Apple for a few years.

Check egpu.io. You can RUN CUDA stuff on Macs using an eGPU.
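
If you go that route, a quick way to sanity-check that CUDA actually sees the eGPU is a short PyCUDA script. A minimal sketch, assuming Nvidia's macOS web driver, the CUDA toolkit, and the third-party PyCUDA package are already installed (none of which this thread covers):

```python
# Minimal CUDA visibility check: lists every CUDA-capable device the driver sees.
import pycuda.driver as cuda

cuda.init()
print(f"CUDA devices found: {cuda.Device.count()}")
for i in range(cuda.Device.count()):
    dev = cuda.Device(i)
    total_mb = dev.total_memory() // (1024 * 1024)
    print(f"  [{i}] {dev.name()} ({total_mb} MB)")
```

If the GTX card shows up in that list, CUDA-based tools should be able to target it.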
 
The difference is about 12% compared to the reference AMD card, and probably 15% for non-reference RX 580s. That doesn't sound like much, but it does make a difference. On the other hand, a GTX 1070 or 1080, or even a Vega 56, would be a significant upgrade.
 
Yes, but eGPU support is limited to AMD, correct? I'm more interested in VR than AAA gaming; I have a PS4 Pro for that, and I can't game on a keyboard anyway.

From what I understand, the 580 Pro should be fine for VR, and the supported external GPU options are not above and beyond the 580 Pro.
 
If AMD is the only manufacturer supported, a Vega 56/64 would be a viable option; that would be a tier or so better than the 580.
 