I actually use it for playing StarCraft II, and for that the eGPU works very well. The only catch is that to get the full performance of the eGPU, you have to use an external monitor. That's fine for me, since my external monitor is 21 inches while my MacBook Pro's built-in display is 13 inches. If you drive the MacBook's internal display while gaming on the eGPU, you only get about half the performance, because the rendered frames have to travel back over the same link and you effectively lose half the bandwidth.
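To put rough numbers on that (a minimal sketch; I'm assuming a Thunderbolt 2 link at 20 Gb/s and the 13-inch Retina panel's 2560x1600 resolution, so adjust for your own setup):

```python
# Back-of-the-envelope: how much of the Thunderbolt link the
# eGPU-to-internal-display return path could consume.
# All figures are illustrative assumptions, not measurements.
link_gbps = 20.0                              # Thunderbolt 2: 20 Gb/s
w, h, bytes_per_px, fps = 2560, 1600, 4, 60   # assumed internal panel + frame rate

frame_traffic_gbps = w * h * bytes_per_px * fps * 8 / 1e9
print(f"Return traffic: {frame_traffic_gbps:.1f} Gb/s "
      f"({frame_traffic_gbps / link_gbps:.0%} of the link)")
# ~7.9 Gb/s, i.e. a big chunk of the link is spent shipping finished
# frames back before any textures or draw commands cross to the eGPU.
```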

It doesn't seem to give me a speed advantage in iMovie, though. The last time I tested it, the project I was rendering was on an external USB 2 hard drive that seems to be reaching the end of its life. I had the project over there because I was running out of space on my MacBook's 480 GB SSD, and at the time I wasn't willing to clear anything off. I cleared some space a couple of days ago, so I'll have to give eGPU rendering another try.


Huh, still sounds like it's funky. No surprises there.
I just got tired of Windows again last night, so I'm looking back into the external GPU solution if I can get a 5K iMac. I won't get one until they update again, though. Polaris may not be CUDA-compatible, but it's supposedly a huge jump in performance and should run cooler inside the iMac.
 
According to this article, AMD confirmed that the RX 480M based on Polaris 11 is a > 2 TFLOP chip with a 128-bit memory interface and 16 compute units (CUs), which amounts to 1024 stream processors. It has a TDP of 35W.
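For reference, that 2 TFLOP figure follows directly from the stream processor count and clock, assuming the usual convention of one FMA (two FLOPs) per stream processor per cycle:

```latex
\text{FP32 throughput} = 2 \times N_{\text{SP}} \times f
                       = 2 \times 1024 \times 1.0\,\text{GHz}
                       \approx 2.05\ \text{TFLOPs}
```

So any core clock of roughly 1 GHz or above clears the 2 TFLOP mark.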

Assuming Apple continues to offer dGPUs in slimmed-down MacBook Pros, this could be that GPU.
http://videocardz.com/61064/amd-polaris-10-and-polaris-11-specifications

AFAIK, the current Radeon R9 M370X has a TDP of around 50W. At 2 TFLOPs, AMD is delivering on their promise of console-quality graphics in thin-and-light laptops (the current Xbox One and PS4 are both under 2 TFLOPs, AFAIK).

Based on the 2.5X (now 2.8X) performance/watt claim, I was previously hoping for a 50W part with equivalent performance to the 125W M295X in my 2014 Retina iMac (3.5 TFLOPs). It's not that, but having a lower TDP makes it more likely that a thinner/lighter MBP could get it.
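Spelling out the math behind that hope (my own back-of-the-envelope, taking the 2.8X claim at face value):

```latex
\frac{3.5\,\text{TFLOPs}}{125\,\text{W}} \times 2.8 \approx 78\,\tfrac{\text{GFLOPs}}{\text{W}}
\;\Rightarrow\; 78\,\tfrac{\text{GFLOPs}}{\text{W}} \times 50\,\text{W} \approx 3.9\,\text{TFLOPs}
```

The announced part instead works out to about 2 TFLOPs / 35 W ≈ 57 GFLOPs/W, which is closer to 2X the M295X's ~28 GFLOPs/W than the claimed 2.8X.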

Guess we'll find out in the fall/Q4, presumably after macOS Sierra ships.
 
Nvidia's Pascal architecture is almost twice as energy- and heat-efficient as AMD's Polaris.

What the hell is wrong with Apple...? Why can't you people switch to a real GPU for once...

I love OS X and Apple's entire ecosystem. But for my work I need really powerful video cards for scientific computing. That is a rare/niche market, to be sure, but quality GPUs are good for almost everyone, not just people like me and gamers. And Pascal GPUs would provide HUGE battery life improvements compared to AMD.

Seriously, Apple, just spend a few more bucks and upgrade the Mac lineup to use Nvidia instead of this AMD trash.
 
Nvidia's Pascal architecture is almost twice as energy- and heat-efficient as AMD's Polaris.

What the hell is wrong with Apple...? Why can't you people switch to a real GPU for once...

I love OS X and Apple's entire ecosystem. But for my work I need really powerful video cards for scientific computing. That is a rare/niche market, to be sure, but quality GPUs are good for almost everyone, not just people like me and gamers. And Pascal GPUs would provide HUGE battery life improvements compared to AMD.

Seriously, Apple, just spend a few more bucks and upgrade the Mac lineup to use Nvidia instead of this AMD trash.
Almost twice as efficient? Wtf are you smoking?
 
Nvidia's Pascal architecture is almost twice as energy- and heat-efficient as AMD's Polaris.

What the hell is wrong with Apple...? Why can't you people switch to a real GPU for once...

I love OS X and Apple's entire ecosystem. But for my work I need really powerful video cards for scientific computing. That is a rare/niche market, to be sure, but quality GPUs are good for almost everyone, not just people like me and gamers. And Pascal GPUs would provide HUGE battery life improvements compared to AMD.

Seriously, Apple, just spend a few more bucks and upgrade the Mac lineup to use Nvidia instead of this AMD trash.

Pascal might be better efficiency-wise, but maybe it's more expensive for Apple to buy than Polaris. It seems like these days Apple goes cheap so it can increase its profit margins... There is no more vision or innovation guiding Apple's products. Now it is just "buy cheap and sell premium" to the hipster fashion crowd.
 
AMD Polaris's inefficiency comes from AMD's too-high core clocks. Computerbase.de checked what effect lower voltage has on power efficiency: not only are the GPUs able to maintain their core clocks all of the time at lower voltage, they also consume less power.
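The reason undervolting pays off so disproportionately: to a first approximation, dynamic power scales with the square of voltage at a fixed clock, so a modest voltage cut buys an outsized power saving:

```latex
P_{\text{dyn}} \approx C\,V^{2}\,f
\quad\Rightarrow\quad
\left(\frac{0.9\,V}{V}\right)^{2} = 0.81
```

i.e. a 10% undervolt at the same clock trims roughly 19% of the dynamic power, which is why the cards can then hold their boost clocks within the same power limit.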

And if we compare power efficiency alone, it is not bad. The RX 480 has 4.9 TFLOPs of compute power in a 125W thermal envelope (reference model). The GTX 1060, for example, has 4.4 TFLOPs of compute power in the same thermal envelope.
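Dividing those figures out (reference-card numbers from above):

```latex
\text{RX 480: } \frac{4.9\,\text{TFLOPs}}{125\,\text{W}} \approx 39\,\tfrac{\text{GFLOPs}}{\text{W}}
\qquad
\text{GTX 1060: } \frac{4.4\,\text{TFLOPs}}{125\,\text{W}} \approx 35\,\tfrac{\text{GFLOPs}}{\text{W}}
```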

If you do any scientific work, you look at compute performance, not gaming benchmarks.
 
If you do any scientific work, you look at compute performance, not gaming benchmarks.

I am not a technology expert, but I've never understood this well. Why aren't games considered a reliable indicator of whether a computer or graphics card is powerful?

I mean, if it can run games smoothly and in high res, doesn't that mean it's a powerful computer that can handle all the other heavy tasks, like graphic design and video editing, smoothly too?

I don't get it when people say iMacs are not made for gaming but for professional work. If it's not made for gaming, then how will it handle professional work smoothly?
 
I am not a technology expert, but I've never understood this well. Why aren't games considered a reliable indicator of whether a computer or graphics card is powerful?

I mean, if it can run games smoothly and in high res, doesn't that mean it's a powerful computer that can handle all the other heavy tasks, like graphic design and video editing, smoothly too?

I don't get it when people say iMacs are not made for gaming but for professional work. If it's not made for gaming, then how will it handle professional work smoothly?
Because, for example, you have to factor in the graphics API when judging the performance of particular GPUs. AMD GPUs got a gigantic performance boost just from lifting the CPU overhead in DX12 and Vulkan games compared to OpenGL and DirectX 11. Nvidia didn't gain anything because their drivers already had almost no CPU overhead. All the games we have seen so far were designed with the DX11 API in mind, with a DX12 back end bolted on that lifts the CPU overhead.

In the future, more texture and terrain work will rely on the compute capabilities of GPUs (procedural generation), and every API that came out of Mantle (DX12, Vulkan, and Metal included) leans on the compute pipeline as well.

In DX12 and Vulkan, however, you can think of the games as showing the true capabilities of the GPUs. For example, the Radeon Fury X is 5-10% slower than the GTX 1080 in Ashes of the Singularity. And what are the compute capabilities of both GPUs? 8.6 TFLOPs for the Fury X, and 9 TFLOPs for the GTX 1080.
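Which lines up almost exactly: the raw compute gap alone predicts the benchmark gap:

```latex
\frac{8.6\,\text{TFLOPs}}{9.0\,\text{TFLOPs}} \approx 0.96
```

i.e. the Fury X has about 4-5% less raw compute than the GTX 1080, right in line with it landing 5-10% behind in Ashes of the Singularity.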
 
I am not a technology expert, but I've never understood this well. Why aren't games considered a reliable indicator of whether a computer or graphics card is powerful?

I mean, if it can run games smoothly and in high res, doesn't that mean it's a powerful computer that can handle all the other heavy tasks, like graphic design and video editing, smoothly too?

I don't get it when people say iMacs are not made for gaming but for professional work. If it's not made for gaming, then how will it handle professional work smoothly?
Compute applications not only use different API calls, but entirely different APIs than games. Typically, CUDA and OpenCL (not GL) are used for compute, so they exercise entirely different parts of the GPU silicon than games do. Keep in mind that 'compute' can include applications like photo and video processing.
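To make the distinction concrete, here's a minimal compute sketch using PyOpenCL (a toy vector add, not anything a real photo or video app ships; it assumes you have pyopencl and an OpenCL driver installed). There's no graphics pipeline anywhere, just a kernel run over buffers:

```python
import numpy as np
import pyopencl as cl

# Pick an OpenCL device (GPU if available) and make a command queue.
ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)

a = np.random.rand(1 << 20).astype(np.float32)
b = np.random.rand(1 << 20).astype(np.float32)

mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

# The kernel is plain OpenCL C: no draw calls, no framebuffer.
prg = cl.Program(ctx, """
__kernel void vadd(__global const float *a,
                   __global const float *b,
                   __global float *out) {
    int i = get_global_id(0);
    out[i] = a[i] + b[i];
}
""").build()

prg.vadd(queue, a.shape, None, a_buf, b_buf, out_buf)

result = np.empty_like(a)
cl.enqueue_copy(queue, result, out_buf)
assert np.allclose(result, a + b)  # same math the CPU would do
```

Gaming benchmarks never touch this path, which is why a card can win at one and lose at the other.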

For maybe half a decade AMD has had decidedly better OpenCL performance than Nvidia, often 2 or 3 times faster! At the same time, Nvidia has tended to have slightly better gaming performance (maybe up to ~20%) and cooler-running GPUs with lower power draw.

This may be changing, as Nvidia seems to have improved its OpenCL performance lately. But in many cases AMD is a significantly better option for compute performance, depending on the application. It will be interesting to see how these new flagship AMD GPUs perform when they come out, since Nvidia has been improving its OpenCL performance while still supporting the (significant but apparently dwindling) market for its proprietary CUDA API.


Politics and price negotiations aside, AMD may still be the better GPU choice for machines which are targeted more at creatives and content producers than gamers.
 
But in many cases AMD is a significantly better option for compute performance, depending on the application. It will be interesting to see how these new flagship AMD GPUs perform when they come out, since Nvidia has been improving its OpenCL performance while still supporting the (significant but apparently dwindling) market for its proprietary CUDA API.


Politics and price negotiations aside, AMD may still be the better GPU choice for machines which are targeted more at creatives and content producers than gamers.

Perhaps I'm missing out on other creative circles, but in 3D and animation, AMD is not the way to go. Bunches and bunches of rendering engines and plugins support CUDA only. SIGGRAPH was a heavily CUDA-leaning show, with only one or two vendors talking about OpenCL support.

I've got no idea what it's like for other people, like video editors dealing with tons of live-action 4K footage or people who do film color grading and such. I've got a narrow view of the GPU world based on my own needs, so take this with a grain of salt.
 
Pascal might be better efficiency-wise, but maybe it's more expensive for Apple to buy than Polaris. It seems like these days Apple goes cheap so it can increase its profit margins... There is no more vision or innovation guiding Apple's products. Now it is just "buy cheap and sell premium" to the hipster fashion crowd.

No doubt in my mind. Nvidia is spanking AMD right now and enjoying most of the market share, especially among gamers, so AMD wants to squeeze in where it can. They're a match made in heaven for Apple, who are "margins first, customers second" when it comes to this stuff, so I don't doubt Apple is getting a very sweet price from AMD to go with their GPUs in the new Macs. Nvidia is the clear leader right now, and mobile versions of their new 1060/1070/1080 cards are going to blow away anything seen in a laptop to date. It would be a golden opportunity to get a good Nvidia card into a Mac, but if it costs so much as a few dollars extra, forget about it.
 
No doubt in my mind. Nvidia is spanking AMD right now and enjoying most of the market share, especially among gamers, so AMD wants to squeeze in where it can. They're a match made in heaven for Apple, who are "margins first, customers second" when it comes to this stuff, so I don't doubt Apple is getting a very sweet price from AMD to go with their GPUs in the new Macs. Nvidia is the clear leader right now, and mobile versions of their new 1060/1070/1080 cards are going to blow away anything seen in a laptop to date. It would be a golden opportunity to get a good Nvidia card into a Mac, but if it costs so much as a few dollars extra, forget about it.
https://www.computerbase.de/2016-07/doom-vulkan-benchmarks-amd-nvidia/ - a $250 GPU within 10% of the performance of a $450 GPU.

The Fury X is much faster than that. We see a similar situation with DX12 performance when comparing the GTX 1060 with the RX 480.
[Benchmark charts: ComputerBase Direct3D 12 / Vulkan test results, images 02 through 17]

Perception of brand is everything these days. Except that reality is more complex than the perception of any particular brand.
 
Perception of brand is everything these days. Except that reality is more complex than the perception of any particular brand.

Especially when you start comparing CUDA benchmarks! ;)

In all seriousness, though, could we really expect to see that performance in an iMac? Are these the parts Apple would use in an iMac form factor?
 
Especially when you start comparing CUDA benchmarks! ;)

In all seriousness, though, could we really expect to see that performance in an iMac? Are these the parts Apple would use in an iMac form factor?
Yes. I think it is 100% guaranteed that Polaris 10 will end up in the iMac, and in the Mac Pro (at least in the first two tiers of the Mac Pro's GPU setups).
 
Do they have higher tiers to put in for a bit more oomph? I'm thinking of the top-tier iMac GPU option.
 
Do they have higher tiers to put in for a bit more oomph? I'm thinking of the top-tier iMac GPU option.
Well, what I meant by tiers was, for example, the current three-tier dual-GPU setup in the Mac Pro ;).

IMO, Ellesmere Pro and Ellesmere XT (the chips underlying the RX 470 and RX 480) would end up in the Mac Pro as the Radeon Pro DX300 and DX500 GPUs. AMD is supposed to bring one more design with HBM2 this year.
 
Well, hell. After publicly throwing a hissy fit about AMD and OpenCL on the Mac in regard to their Octane render engine, and laying the blame on Apple's OpenCL, Otoy put this in their Octane 3.0 roadmap page:

Broad cross-platform support: Using OTOY’s CUDA cross-compiler, OctaneEngine and OctaneImager is expected to support all possible CPU and GPU devices and platforms, including support for Mac platforms, and AMD GPUs.

https://home.otoy.com/octanerender-3-and-roadmap-update/

I wonder what in the world is going on there. I think they are exploring new territory, so perhaps the growing pains are significant, but it just makes waiting harder. I'd hate to invest in a Mac with a killer AMD GPU and find that this support suddenly dries up after another unforeseen problem, or that the promised Mac AMD support never comes to fruition.
 
Please bring Nvidia for the dedicated cards. That would be a much better switch.
An Nvidia GTX 1070 or 1080 is what I am hoping for, and expecting, in the new 2016 MacBook Pro. That is the only way for Apple to keep up with the rest of the laptops released this year. Anything less and Apple will still be outdated and playing catch-up, like they have been with the MacBook Pro over the last 5 years.
I absolutely loved the MacBook Pro when it first came out. For years it was the laptop that everyone else tried to copy, but they have milked that product for too long. It's as if they used its profits to create other product lines and never put any of those profits back into improving the product itself. They milked it to the point where they are letting us down and losing customer loyalty.
 