> Wait so now I can mine cryptocurrency with a MacBook?

Haha. Yes, the RX580 is good for that, but please don't.
> WHY are Nvidia GPUs not supported???

Probably because AMD and Apple have a secret exclusivity deal.
Does this work with Thunderbolt 1 or 2?
Nvidia does support OpenCL, I think Metal too. If we're pointing fingers, Apple is definitely the biggest offender in the GPU space. They've had ridiculously tight control forever, and for that reason nobody uses Macs for graphics-intensive tasks, so ofc Nvidia won't bother making drivers. And Apple still doesn't support OpenGL properly; they want everyone to use their proprietary Metal APIs.
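If you want to see what OpenCL actually exposes on your own machine rather than argue about it, here's a minimal sketch. It assumes the third-party `pyopencl` package might be installed, and just returns an empty list when it isn't:

```python
# Minimal sketch: list the OpenCL platforms and devices the system exposes.
# Assumes the third-party "pyopencl" package; returns [] when it's missing.
import importlib.util


def opencl_devices() -> list:
    if importlib.util.find_spec("pyopencl") is None:
        return []  # pyopencl not installed
    import pyopencl as cl
    return [
        f"{platform.name}: {device.name}"
        for platform in cl.get_platforms()
        for device in platform.get_devices()
    ]


for entry in opencl_devices():
    print(entry)
```

On a Mac this will list whatever Apple's OpenCL framework reports; on a CUDA box it will list Nvidia's OpenCL implementation, which is one quick way to see that both vendors do ship it.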
> No, definitely does not work with TB1 or TB2 at all.

This is unacceptable.
> They support the same or higher OpenCL and OpenGL versions that macOS does. [...]

They don't support them well, though.
> They support the same or higher OpenCL and OpenGL versions that macOS does. [...]

But their consumer cards are gimped for compute.
They support the same or higher OpenCL and OpenGL versions that macOS does. Especially higher for OpenGL. Apple's the one who doesn't support the open standards.
> But their consumer cards are gimped for compute.

Do you have numbers for that? I'm actually not sure because my info is likely outdated. When I was doing ML research a year ago, everyone used GTX, and they were winning in the benchmarks. But many (like me) had no choice because they needed CUDA. For similar reasons, comparisons are hard to find.
I'm not fond of AMD either, but RX580 is still way faster than what you have in the MBP. If you don't need Nvidia-specific features, it's a decent card.
Do you have a source for the speed comparison? I've always seen that Nvidia cards are faster for machine learning computations too. I always thought Apple's reason for using AMD was to fight against CUDA and the rest of Nvidia's anti-open stance.
Why would I buy an eGPU if I can get an Internet GPU? Been streaming games for 2 months now.
> I need Nvidia to run CUDA stuff. Looks like I need to forget about buying a laptop from Apple for a few years.

CUDA is such a PitA under macOS anyway that it's not worth the trouble, whether it's Nvidia's or Apple's or the community's fault for the lack of support. I've gone down that annoying path with both TensorFlow and PyTorch using a Mac Pro w/ GTX 1060. I'd only do it with a dedicated Linux box. It was a piece of cake to set everything up on my custom build.
> Does this work with thunderbolt 1 or 2?

The site http://EGPU.io has more information on this, such as in these articles (https://egpu.io/external-gpu-macos-10-13-4-update/ and https://egpu.io/macos-external-gpu-review/), but in short: not without third-party software.
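If you're not sure which Thunderbolt generation your Mac actually has, you can query it from a script. A small sketch using the built-in `system_profiler` tool (macOS-only; on other platforms it just returns a placeholder):

```python
# Sketch: dump the Mac's Thunderbolt controller info via system_profiler.
# macOS-only; returns a placeholder string on other platforms.
import subprocess
import sys


def thunderbolt_info() -> str:
    if sys.platform != "darwin":
        return "not macOS"
    result = subprocess.run(
        ["system_profiler", "SPThunderboltDataType"],
        capture_output=True,
        text=True,
        check=False,
    )
    return result.stdout or "no Thunderbolt data found"


print(thunderbolt_info())
```

The output names the Thunderbolt controller and link speed per port, which tells you whether you're on TB1/TB2 (and therefore in adapter-plus-third-party-software territory) or TB3.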
> Apple should make its own eGPU product.

They should just build one with the graphics card already integrated - that's the Apple way.
> Do you have numbers for that? I'm actually not sure because my info is likely outdated. [...]

You can look up the chip tables on wikipedia.
Latency. Both will exhibit some form of latency, but streaming will suffer more. Neither will be too bad in single-player games, but multiplayer is where you'll really notice the difference.
I don't give a **** what Apple does with AMD or Nvidia, so long as there is a "pro" setup. All professionals care about is that the damn thing "just works" for what they need. Nvidia or AMD, nobody gives a ****!! So long as it's powerful enough, we don't care; it's not our money!! I have 2x 1080 Ti in my own personal Linux rig, and that works fine. In my PC I have 2x WX9100 working. I also use my iMac Pro 10-core for normal day-to-day work. My laptop is the only one that struggles to keep up with everything else, and even that's not Apple's fault!.. it's called progress..
> You can look up the chip tables on wikipedia.

If you're referring to flops and memory bandwidth, which is the only thing you'll find on Wikipedia, those don't mean much without an application benchmark. The drivers alone are going to throw complications in.
> I don't give a **** what Apple do with AMD or Nvidia .. So long as there is a "pro" setup.. [...] Everyone always has to complain about something, so while they're having a go at Apple they're forgetting how bad others are!!!

Problem is that what "just works" often depends on the vendor, drivers, and other nonsense for GPUs. It's hard to make sure GPU acceleration works for all your pro apps.
> If you're referring to flops and memory bandwidth, which is the only thing you'll find on Wikipedia, those don't mean much without an application benchmark. [...]

It would not mean much if there were not such big differences between comparable cards.
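To illustrate the point about application benchmarks vs. spec-sheet flops: even a toy timing harness tells you more than peak numbers, because it runs through the real driver/library stack. A sketch using NumPy on the CPU (the same idea applies to GPU compute stacks; the sizes and repeat count are arbitrary):

```python
# Toy application benchmark: time an n x n matrix multiply instead of
# quoting peak FLOPs. This exercises whatever BLAS the install ships with.
import time

import numpy as np


def time_matmul(n: int = 256, repeats: int = 5) -> float:
    """Best-of-`repeats` wall-clock seconds for one n x n matmul."""
    a = np.random.rand(n, n)
    b = np.random.rand(n, n)
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        _ = a @ b
        best = min(best, time.perf_counter() - start)
    return best


print(f"best of 5 runs: {time_matmul():.4f} s")
```

Two cards with identical paper flops can land far apart on a loop like this, which is exactly why the Wikipedia chip tables alone don't settle the argument.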
> Yes it does. Not officially supported by apple of course.

I have the same setup. It's been working perfectly since Nov. with a Mac mini 2012 over TB1. Can you confirm that it still works with the official 10.13.4 release? I'm holding off the install until I know for sure.
I have the eGPU dev kit from Apple, the one that comes with the Pulse 580.
To make it work I bought Apple's TB2 to TB3 adapter, which is bidirectional. I plugged the male end of the adapter into the eGPU breakout box, then used a standard Thunderbolt cable from my Mac mini 2012 to the female end of the adapter.
> No nVidia support means I'm not holding my breath for nVidia options in the new Mac Pro.

Good. Screw nVidia with their anticompetitive GPP BS.
I'm glad to see Apple giving AMD some love. I'll never buy another Nvidia-equipped Mac.
> If you're referring to flops and memory bandwidth, which is the only thing you'll find on Wikipedia, those don't mean much without an application benchmark. [...]

Yes, I agree with you. Whatever works works!! I use Avid, DaVinci and FCP and nothing else. For my photography I use Phase One software. Each to their own!! I'm not here to make enemies. I'm a lover, not a fighter!!!