This is a GPU that has had its clocks tuned from 1266 MHz on the RX 480 down to "just" 1243 MHz, and that resulted in 30 W lower power consumption:
http://www.tomshardware.com/reviews/amd-radeon-pro-wx-7100,4896.html
http://media.bestofmicro.com/T/E/633506/original/07-Power-Draw-All-Scenes.png

The RX 480 was consuming around 163 W in the same site's test suite.

To get to 5.5 TFLOPs you have to downclock it to 1200 MHz, so the GPU could actually be a 100 W part.
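Those TFLOPs numbers fall out of simple arithmetic: FP32 throughput = stream processors × 2 ops per clock (one FMA) × clock. A quick sanity check in Python, assuming the 2304 stream processors that Polaris 10 boards like these carry:

```python
# FP32 throughput for a GCN GPU: each stream processor retires one
# fused multiply-add (2 floating-point ops) per cycle.
def fp32_tflops(stream_processors: int, clock_mhz: float) -> float:
    return stream_processors * 2 * clock_mhz * 1e6 / 1e12

SPS = 2304  # RX 480 / RX 580 / WX 7100 (Polaris 10: 36 CUs x 64 SPs)

print(round(fp32_tflops(SPS, 1266), 2))  # RX 480 boost clock -> 5.83
print(round(fp32_tflops(SPS, 1243), 2))  # WX 7100 -> 5.73
print(round(fp32_tflops(SPS, 1200), 2))  # -> 5.53, the "5.5 TFLOPs" point
```

So the ~5.5 TFLOPs figure does indeed line up with a roughly 1200 MHz clock.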
 
So is the 580 in the current-gen iMac noticeably faster than the M295X in OpenCL etc., without having the same heating problems?
 
I had moved away from Windows as well, but wanted a game machine mainly for VR (HTC Vive). A Mac wasn't an option for VR, so I built a nice gaming machine when Ryzen came out. While I like being able to play games occasionally, I regret building a Windows box. There are so many irksome things I'm re-learning. Text on a 4K monitor looks horrible unless you install MacType or similar. Windows doesn't seem to handle high-resolution monitors nearly as well as macOS. For gaming it is great, but you won't want to use it for anything else.

I would buy one of the new iMacs now that VR is coming to macOS and iMacs have real graphics cards. You might think about buying a cheaper iMac and getting a really nice graphics card in an external enclosure. macOS High Sierra finally supports external GPUs, which is really sweet. That being said... you won't be able to play all games. BF1 and GTA 5 are Windows-only... so that may make your decision for you. You could probably use Boot Camp to install Win10 on your iMac, though.

That would be my thinking -- have a desktop for the setup (big monitor, more powerful machine, etc.), but Boot Camp for Windows 10 and games. Much like you're saying, I just don't know that I want to go back to Windows, even though I never disliked it; I just tried the Mac at a time when it made sense for what I was doing. Like a previous poster, I used to build my own rigs too, many years ago -- we're talking Quake 3 Arena era. At this point, I'd be willing to pay a little premium to have a nice, well-done machine with top-shelf hardware.
 
Yes I'm running from a 1TB SSD. My library isn't as huge as some, currently contains 10,000 raw files. Everything is very zippy. I've tried every raw converter under the sun and Lightroom is the best performing and most stable software for me. I did have a brief flirtation with Capture One Pro (I've tried all versions since C1P7) but found it very buggy and had performance issues. A shame as I really like it.

iMac (Retina 5K, 27-inch, Late 2015)
4 GHz Intel Core i7
16 GB 1867 MHz DDR3
AMD Radeon R9 M395 2048 MB

How big is your library? Are you running totally off SSD?
 
Indeed. The "Radeon Pro" parts that Apple gets are cherry-picked and down-clocked versions of the desktop parts.

e.g. the desktop RX560 has 1024 stream processors and 2.4 TFLOPs, with a TDP of ~70 W. The Radeon Pro 560 consumes roughly half the power (the GPUs in the MBP have a TDP of 35 W) but still gives you 1.9 TFLOPs.

Similar story for the RX580 and the Radeon Pro 580 in the iMac. The iMac's Pro 580 gets 95% of the performance of the 185 watt RX580, but there's no way the iMac can cool a 185 W GPU – the iMac part must be using much less power.
The RX 580 uses that much power because AMD pushed its clocks aggressively just to launch a refresh, at the cost of power efficiency. Its predecessor, the RX 480, and its embedded variant are much more reasonable (95-130 W).
 
Surprisingly, my Mac Pro 2013 is quite slow with LR when used with a 4K monitor. How long does a RAW photo take to load for you on the 5K screen? 2 seconds?
 
The fact that 8 GB of video memory is standard and not an extra $200 is music to my ears. It makes you wonder what will happen when Coffee Lake gets released, or, if Apple skips that, Cannon Lake, maybe in 2019. Will Vega be available for the non-Pro iMacs?
 
It certainly doesn't look as good as running on a native 2560x1440 screen.

Why would it look worse? I mean, if you're comparing gaming on a 2560x1440 gaming monitor, then it's possible that the latency and refresh rate might be worse on Apple's, but comparing screens with identical refresh rates and identical latency, the 5K should display 2560x1440 images perfectly.

Everything would be perfectly pixel-aligned for pixel doubling: the 5K screen is precisely 5120x2880 (2x in both dimensions, 4x total pixels), so when displaying a 2560x1440 image it would use 4 pixels for every 1 pixel, precisely 2 pixels tall and 2 pixels wide. There is no reason it should look jaggy; the pixel borders should be perfectly aligned.
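For what it's worth, the 2x case is trivial to check: integer scaling maps each source pixel to an exact 2x2 block, with no fractional pixel positions anywhere. A toy sketch in Python, with plain lists standing in for an image:

```python
# 2x integer upscaling: each source pixel maps to an exact 2x2 block,
# so a 2560x1440 frame fills a 5120x2880 panel with no interpolation.
def upscale_2x(image):
    out = []
    for row in image:
        doubled = [px for px in row for _ in range(2)]  # 2 pixels wide
        out.append(doubled)
        out.append(list(doubled))                       # 2 pixels tall
    return out

src = [[1, 2],
       [3, 4]]
for row in upscale_2x(src):
    print(row)
# [1, 1, 2, 2]
# [1, 1, 2, 2]
# [3, 3, 4, 4]
# [3, 3, 4, 4]
```

Every source pixel lands on a clean 2x2 block, which is exactly why the borders stay aligned and nothing gets blurry.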
 
I tried playing Elite Dangerous on Win 8.1 in 1440p on an iMac 5K (Radeon R9 M290X) and it looked very nice, a lot better than on a 1440p screen, because text, the HUD, and everything else gets scaled up very nicely; it looks nearly like 5K. The rest of the picture does not scale up so well and then looks pretty much like on a non-Retina screen.

But of course nothing looks worse, only partly better. Strangely, I could not select any higher resolution in-game, although in the macOS version you can select everything up to native 5K; I'm not sure if upscaling from 1440p games looks as good there as on Windows. I played ED mainly on Windows because there's no Horizons expansion for macOS. Anyway, soon I can play it on my PS4 Pro, and hopefully soon with PSVR too.

I also wonder if I should upgrade my iMac to the new model with the Radeon Pro 580 8 GB and sell the Late 2014 model with the R9 M290X 2 GB and 3.5 GHz i5.
TB3 also brings better eGPU upgradability than TB2. I mainly wonder how big the GPU performance improvements will be. I would also need to buy Win10, I guess.
Elite Dangerous I ran mostly in 1080p on the TV in the living room on ultra settings, or with PSVR using Trinus PSVR at lower settings. But I hope the PS4 Pro will take over that job.
Anyway, I also use the GPU for Final Cut Pro X, Adobe CC, Unity development, and more.
I prefer to play on console, but often the game I want is not available, like Star Citizen in the future. (Elite arrives soon on PS4.)

I also wonder VERY MUCH why we can't have checkerboard 4K or 5K rendering on the iMac, either in macOS or at least Windows. Does anyone know games which support this?! I only have a Full HD 1080p TV for the PlayStation, but 5K for the iMac, where I would really like to see checkerboard rendering.

And can anyone tell me how much faster the Radeon Pro 580 will be compared to the R9 M290X? It runs E:D in up to 1440p quite well. Maybe 4K/5K would then be possible with the Pro 580?
 
You are right! I didn't do the math and realize it was exactly half the pixels in each dimension :/
 
No worries. I've got a 2013 non-Retina iMac right now and I'm looking forward to the new Retina model I've ordered, so I'm definitely planning to run more demanding titles in 2560x1440 without blurring.
 
Is teraflops a good way of measuring GPU performance? I think the 580 is 5.5 teraflops. The upcoming Vega 56 is 11 teraflops (if I have my stats correct), so Vega would be twice as fast?
 
It's most likely that Vega will be the GPU in the next iMac update. Looks like it should be 6 cores also.
 
And what about FP16? Radeon Pro 580 has 5.5 TFLOPs FP16 performance, Vega 64 has 22 TFLOPs. Is it 4 times faster?

Vega 64 is 11 TFLOPs.

Nobody knows how high the throughput of the GCN cores in the Vega architecture is.

The part with the biggest effect on hardware performance is... software.
 
You mean Vega 64 is 22 TFLOPs while the 56 is 11 TFLOPs?

OK, so we won't know how fast they are until they come out, since software (as in the GPU driver used?) affects their speed.
 
No. There are TWO GPUs in the iMac Pro.

Vega 56 and Vega 64. Vega 64 is 11 TFLOPs FP32/22 TFLOPs FP16. We know nothing about Vega 56. The numbers indicate the number of Compute Units in each GPU.

No. Software performance has nothing to do with drivers here. Badly designed software will always bottleneck the hardware. For instance, Vega GPUs have native FP16 at double the FP32 rate, and without rewriting the software for this feature, it won't be used; everything will run at just FP32 performance (11 TFLOPs).
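The arithmetic behind those numbers, for anyone curious: CU count × 64 stream processors per CU × 2 FP32 ops per clock (one FMA), times the clock, doubled for packed FP16. A small Python sketch; the ~1.343 GHz clock is my assumption, back-solved from the 11 TFLOPs figure above, not an announced spec:

```python
# Vega GPU names give the Compute Unit count: each CU holds 64 stream
# processors, each retiring one FMA (2 FP32 ops) per cycle. Vega's
# packed math runs FP16 at double the FP32 rate.
def vega_tflops(compute_units: int, clock_ghz: float, fp16: bool = False) -> float:
    sps = compute_units * 64
    tflops = sps * 2 * clock_ghz / 1000  # GFLOPs -> TFLOPs
    return tflops * 2 if fp16 else tflops

# Assumed ~1.343 GHz, implied by the quoted 11 TFLOPs FP32 figure:
print(round(vega_tflops(64, 1.343), 1))             # FP32 -> 11.0
print(round(vega_tflops(64, 1.343, fp16=True), 1))  # FP16 -> 22.0
```

So "64" in the name gives 4096 stream processors, and the FP16 number is exactly double the FP32 one only when the software actually uses packed FP16.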
 