Sorry, I thought I had selected those correctly. So, relatively speaking, is there a bigger difference between the two GPU options available for the riMac than when compared with an older MBP?
 
Hello.

This is my first post on this forum, so let me introduce myself. I'm a 3D modeling artist working in the film and automotive industries. My main software is Maya, Mari, Mudbox, and ZBrush, and I also do casual/light gaming like StarCraft 2.

I have a fully specced Late 2013 27" iMac running OS X Yosemite. I'm very happy with this machine: it's fast enough for my pro apps and casual gaming, and it stays cool and quiet in a slim AIO form factor.

Let's make some things clear first. I'm not an Apple fanboy or a PC fanboy, nor an AMD or Nvidia fanboy. I just buy machines that I like and that work well. The iMac is still a PC to me: it's Intel, Nvidia, a Samsung SSD, an LG panel, and so on. The only "magic" here is the design, which I happen to like.

Also, I see a lot of people here confusing OpenGL with OpenCL. Long story short: OpenGL is a graphics library meant to render game engines, 3D/2D viewports, and anything else that is visually oriented, just like DirectX on Windows. OpenCL, on the other hand, is a compute language, meant to run generally CPU-style tasks on the GPU; it's the CUDA equivalent. The difference is that CUDA is tied to Nvidia GPUs only, while OpenCL is royalty free. So OpenCL is not an AMD technology; it started as an Apple concept for an open compute standard and is now developed by the Khronos Group alongside OpenGL. Founding members of the OpenCL effort include Apple, AMD, Nvidia, Intel, Qualcomm, IBM, and so on. So don't say that CUDA is Nvidia and OpenCL is AMD; OpenCL is everyone's.
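To make the distinction concrete, here is a minimal OpenCL C kernel sketch (a plain vector add, purely illustrative and not tied to any app mentioned in this thread). The point is that OpenCL code describes parallel compute work-items, not draw calls:

```c
// Minimal OpenCL C kernel (illustrative sketch only).
// Each work-item adds one pair of elements; the host enqueues one
// work-item per element, so the GPU processes the whole array in parallel.
__kernel void vec_add(__global const float *a,
                      __global const float *b,
                      __global float *out)
{
    size_t i = get_global_id(0); // this work-item's index
    out[i] = a[i] + b[i];
}
```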


I was very disappointed to see a 2-3 year old GPU like the M290X in a 2014 Retina 5K iMac. The M295X is a better option but not the best. I believe a tuned 980MX (a special design for the iMac, just like the 680MX was for the first thin iMac) would have been a much greater solution, or even a stock 980M. Some people find Cinebench relevant for measuring performance, while others prefer Unigine; I have to agree with those who go by Unigine Heaven.

Cinebench uses OpenGL 2.0 with two cars, some lightweight textures, and basic lighting, while Unigine Heaven uses OpenGL 4.0 with high-res textures, heavy displacement maps, animated grass and trees, advanced lighting, etc. That is most relevant to me as a 3D artist. Mari also uses OpenGL 4 with a similar engine. Maya is still OpenGL 2.0, but it can be pushed with high-res geometry and textures, ambient occlusion, AA, depth of field, motion blur, and, on Windows with DirectX 11, heavy displacement maps. Maybe Autodesk will implement OpenGL 4 in Maya so OS X/Linux users can have viewport displacement like on Windows.
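As a rough illustration of why the OpenGL version matters here: hardware displacement in the viewport relies on tessellation, which only exists in OpenGL 4.x. Below is a hedged sketch of how an app could check for it at runtime on OS X; it assumes a current core-profile GL context already created elsewhere, and the surrounding setup is omitted:

```c
#include <stdio.h>
#include <OpenGL/gl3.h>  /* OS X core-profile OpenGL header */

/* Sketch: report the driver's GL version and whether tessellation
   (the OpenGL 4.x feature behind hardware displacement) is exposed.
   Assumes a current core-profile context exists. */
void report_gl_caps(void)
{
    printf("GL_VERSION:  %s\n", (const char *)glGetString(GL_VERSION));
    printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));

    GLint max_patch_vertices = 0;
#ifdef GL_MAX_PATCH_VERTICES
    /* Stays 0 unless tessellation (GL 4.0+) is available. */
    glGetIntegerv(GL_MAX_PATCH_VERTICES, &max_patch_vertices);
#endif
    printf("Max patch vertices: %d\n", max_patch_vertices);
}
```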


Anyway, these are my results in Unigine Heaven using the Nvidia 343.01.01f01 web drivers.

[Screenshot: my Unigine Heaven result]

Compared to this:

https://www.youtube.com/watch?v=gk-0jRqtm6c

28 fps at 106º C for him vs. my 29.1 fps at 83º C. Really?

Not to mention GM204 offers roughly twice the performance per watt of GK104, meaning a GTX 980M could deliver the same fps as the 780M at far lower temperatures, or roughly double the performance in the same thermal envelope. I thought Apple was all about energy efficiency and performance per watt. What just happened? In every single complex test, the GTX 980M wins on performance and wins big on performance per watt.


After some internet digging, I found this:

Desktop AMD Radeon R9 285, Tonga-based with 1792 cores: 3.2 teraflops of compute, rated at 190 W TDP.
"Mobile" AMD Radeon R9 M295X, Tonga-based with 2048 cores: 3.5 teraflops of compute (makes sense), rated at 250 W TDP.

Mobile GeForce GTX 980M, Maxwell-based with 1536 cores: 3.1 teraflops of compute, rated at 85 W TDP.
Desktop GeForce GTX 980, Maxwell-based with 2048 cores: 4.6 teraflops of compute, rated at 165 W TDP.
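For anyone wondering where those teraflop numbers come from, they follow from the usual FP32 rule of thumb (shaders x 2 FLOPs per clock x clock speed). A quick sketch, using approximate launch clocks that are my assumption rather than anything stated in this thread:

```c
#include <stdio.h>

/* Rough FP32 throughput: TFLOPS = shaders * 2 FLOPs/clock * clock (GHz) / 1000.
   Clock speeds below are approximate launch specs (my assumption). */
static double tflops(int shaders, double clock_ghz)
{
    return shaders * 2.0 * clock_ghz / 1000.0;
}

int main(void)
{
    printf("R9 285   : ~%.1f TFLOPS\n", tflops(1792, 0.918)); /* ~3.3 */
    printf("R9 M295X : ~%.1f TFLOPS\n", tflops(2048, 0.850)); /* ~3.5 */
    printf("GTX 980M : ~%.1f TFLOPS\n", tflops(1536, 1.038)); /* ~3.2 */
    printf("GTX 980  : ~%.1f TFLOPS\n", tflops(2048, 1.126)); /* ~4.6 */
    return 0;
}
```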

Remember something! The GPUs inside the newer iMacs are not on an MXM module, meaning there is no MXM TDP limit; those GPUs are soldered directly onto the logic board.

If Apple was able to stick a 250 W TDP GPU inside the iMac, why not a GTX 980, which would have resulted in the best GPU ever put in an iMac, without the 106º C? The GTX 980 is better than the desktop R9 290X, even better than the Titan Black.

Some may say it's because AMD has better OpenCL performance. Well, not this time:

http://www.phoronix.com/scan.php?page=article&item=nvidia_gtx980_opencl&num=1

http://www.anandtech.com/show/8526/nvidia-geforce-gtx-980-review/20

So using the current Radeon graphics is not a great idea. A GTX 980, or a hypothetical special 980MX, would be a better choice. It's obvious: Maxwell is better at everything, meaning better at DirectX/OpenGL game engines and 3D/2D viewports for games and DCC/CAD apps, better at compute using OpenCL/CUDA, better performance per watt, and better overall performance and features.

There is absolutely no reason to use AMD, unless they got a special deal at $1 per GPU. :mad:
 
Barefeats just posted their latest benchmarks, which include testing in Geekbench, FurMark, TessMark, X-Plane 10, and Left 4 Dead 2.
http://barefeats.com/imac5k3.html

The Retina iMac with the M295X was matched up against a nMP with D300s and a 2013 iMac with a 780M. The Retina iMac scored better in every benchmark they tested. In X-Plane 10, it was 2.5x as fast as the nMP/D300s and 2x as fast as the iMac/780M.

I agree with Barefeats: the Retina iMac with the M295X continues to impress.
 
There is absolutely no reason to use AMD, unless they got a special deal at $1 per GPU. :mad:

I suspect there are three reasons why Apple went with AMD instead of Nvidia.

1) Nvidia did not have the 970M and 980M ready in time. Tim Cook is on record that he regrets the delay between announcing the new iMacs in 2012 and when they shipped to customers; he vowed it would not happen again.

2) The 980M is expensive, and Apple had to hit a price point with the riMac. For the last few years Nvidia have been producing better-performing GPUs with lower thermals, while AMD have become known for selling at cheaper prices. Considering Dell think they can charge $2.5k just for a 5K monitor, Apple are probably not making the margin they would like on the riMac. Over its life the margins will increase, much like with the Mac Pro when it was announced (those D700s were really expensive kit).

3) ATI (and now AMD) have a long-standing history of working with OEMs, and that's where they make the bulk of their revenue. Apple are probably happy with how the commercial arrangement for the Mac Pro panned out and are expanding their relationship with AMD. For this reason, and because iMacs stick with one brand of GPU for several revisions (source: http://en.wikipedia.org/wiki/IMac_(Intel-based)#Slim_Unibody_iMac), I think AMD in iMacs is here to stay for a while. Who would contract for just one run?

----------

Barefeats just posted their latest benchmarks, which include testing in Geekbench, FurMark, TessMark, X-Plane 10, and Left 4 Dead 2.
http://barefeats.com/imac5k3.html

The Retina iMac with the M295X was matched up against a nMP with D300s and a 2013 iMac with a 780M. The Retina iMac scored better in every benchmark they tested. In X-Plane 10, it was 2.5x as fast as the nMP/D300s and 2x as fast as the iMac/780M.

I agree with Barefeats: the Retina iMac with the M295X continues to impress.

Yep, things are getting interesting. All I'm waiting for now is some good Windows gaming benchmarks.
 
"Mobile" AMD Radeon M295X Tonga based with 2048 cores - 3.5 teraflop compute power (make sense) rated at 250 W TDP

This is incorrect. The TDP of the R9 M295X is 100 W. If you really believe that it is 250 W and that Apple has somehow managed to shoehorn it into the iMac without the whole thing burning down, then I want to talk to you about a bridge I have for sale.
 
I suspect there are three reasons why Apple went with AMD instead of Nvidia.

1) Nvidia did not have the 970M and 980M ready in time. Tim Cook is on record that he regrets the delay between announcing the new iMacs in 2012 and when they shipped to customers; he vowed it would not happen again.

2) The 980M is expensive, and Apple had to hit a price point with the riMac. For the last few years Nvidia have been producing better-performing GPUs with lower thermals, while AMD have become known for selling at cheaper prices. Considering Dell think they can charge $2.5k just for a 5K monitor, Apple are probably not making the margin they would like on the riMac. Over its life the margins will increase, much like with the Mac Pro when it was announced (those D700s were really expensive kit).

3) ATI (and now AMD) have a long-standing history of working with OEMs, and that's where they make the bulk of their revenue. Apple are probably happy with how the commercial arrangement for the Mac Pro panned out and are expanding their relationship with AMD. For this reason, and because iMacs stick with one brand of GPU for several revisions (source: http://en.wikipedia.org/wiki/IMac_(Intel-based)#Slim_Unibody_iMac), I think AMD in iMacs is here to stay for a while. Who would contract for just one run?

----------



Yep, things are getting interesting. All I'm waiting for now is some good Windows gaming benchmarks.

I'm not sure about number 1 or 2. Maybe not in massive quantities, but they are available. Apple would not hesitate to charge you an arm and a leg for an expensive part, and I doubt Apple expects to ship the Retina 5K in huge quantities, considering a fully decked-out one is like $3,500. Laptops that cost much less are already shipping with the 980M and 970M.

I think someone said AMD probably worked with Apple on getting the 5K display working, and I think that's the most likely explanation.

I think the big question mark here is that the desktop Tonga chip is probably seen as a disappointment. It's a bit more power efficient, but it's not more powerful than the 280. We haven't really seen a performance-oriented Tonga card yet.
 
Now, now, people, don't be too hasty with me. I'm trying to solve the mystery here :). I'm not blaming anyone, and again, I'm not a fanboy.

Let's apply some logic here.

The FirePro D700 is an underclocked W9000/HD 7970, based on the full Tahiti chip.

The FirePro W9000 / Radeon HD 7970 is Tahiti-based with 2048 cores and a 384-bit bus, with a 250 W TDP and 4 teraflops of compute. So it makes sense for the D700 to deliver 3.5 teraflops within a 130 W TDP. With Tahiti chips it goes from more to less.

But Tonga is different. It physically can't go from less to more while having a lower TDP.

You can't make, on the same node (Tonga, 28 nm), a 100 W TDP part with 2048 cores and 3.5 teraflops while its desktop sibling has 1792 cores, 3.2 teraflops, and a 190 W TDP. It's impossible.

It's also impossible for a fan-cooled 100 W TDP chip to hit 106º C and throttle. The GTX 780M also has a 100 W TDP, and apparently it didn't hit 106º C.

The difference between my Unigine results and those in the video is due to heavy throttling.

Keep in mind that TDP (Thermal Design Power) is not the real power draw, so it's not directly tied to the power supply.

@steve23094

1. I'm not sure. Maxwell has been out for a while; remember the GTX 750 Ti, which was the first Maxwell part in the wild. Tonga came a little later. I'm 100% sure that OEMs (especially Apple) get sample chips well before release.

2. It makes some sense, but even so, having a GPU that's going to throttle in a demanding app doesn't make any sense. Why have a cheap "powerful" GPU that you can't use properly because of the thermal design?

3. Agree.

I think it would be a lot easier to let users choose the GPU.
 
It doesn't matter now. What matters is that the high-end 295X is better in every way than the previous high-end iMac's 780M.

Yes, it could be better... we can always say that, no matter what.
 
Agreed! Let's hope for next year. Maybe AMD will have a killer GPU with a good power design, or Apple will switch to Nvidia. I don't have a problem with Apple using Nvidia or AMD, as long as the hardware is not in danger of throttling. But this time, AMD is the wrong GPU in that regard.
 
Ubisoft just released the recommended specs for their next-gen Assassin's Creed, and the R9 290X is listed as recommended. What do you guys think?

http://www.polygon.com/2014/10/23/7053397/assassins-creed-unity-heavy-duty-pc-specs-revealed

Of course it won't play at that resolution, but wouldn't it be possible to play at a lower, 2K resolution?

The R9 M290X is, in fact, a Radeon R9 270X, or the AMD FirePro D300 from the Mac Pro. The minimum spec is a Radeon HD 7970, which is in fact a Radeon R9 280X, or the AMD FirePro D700 from the Mac Pro.

See the difference? ;)
 
Ubisoft just released the recommended specs for their next-gen Assassin's Creed, and the R9 290X is listed as recommended. What do you guys think?

http://www.polygon.com/2014/10/23/7053397/assassins-creed-unity-heavy-duty-pc-specs-revealed

Of course it won't play at that resolution, but wouldn't it be possible to play at a lower, 2K resolution?

Are you thinking that the R9 290X is in the base model iMac? The base model has an R9 M290X; the 'M' is important, it denotes a mobile GPU.
 
Now, now, people, don't be too hasty with me. I'm trying to solve the mystery here :). I'm not blaming anyone, and again, I'm not a fanboy.

Let's apply some logic here.

The FirePro D700 is an underclocked W9000/HD 7970, based on the full Tahiti chip.

The FirePro W9000 / Radeon HD 7970 is Tahiti-based with 2048 cores and a 384-bit bus, with a 250 W TDP and 4 teraflops of compute. So it makes sense for the D700 to deliver 3.5 teraflops within a 130 W TDP. With Tahiti chips it goes from more to less.

But Tonga is different. It physically can't go from less to more while having a lower TDP.

You can't make, on the same node (Tonga, 28 nm), a 100 W TDP part with 2048 cores and 3.5 teraflops while its desktop sibling has 1792 cores, 3.2 teraflops, and a 190 W TDP. It's impossible.

It's also impossible for a fan-cooled 100 W TDP chip to hit 106º C and throttle. The GTX 780M also has a 100 W TDP, and apparently it didn't hit 106º C.

The difference between my Unigine results and those in the video is due to heavy throttling.

Keep in mind that TDP (Thermal Design Power) is not the real power draw, so it's not directly tied to the power supply.

@steve23094

1. I'm not sure. Maxwell has been out for a while; remember the GTX 750 Ti, which was the first Maxwell part in the wild. Tonga came a little later. I'm 100% sure that OEMs (especially Apple) get sample chips well before release.

2. It makes some sense, but even so, having a GPU that's going to throttle in a demanding app doesn't make any sense. Why have a cheap "powerful" GPU that you can't use properly because of the thermal design?

3. Agree.

I think it would be a lot easier to let users choose the GPU.

The mystery is only in your mind. If a 2048-core Tahiti XT can have a 130 W TDP, then a 2048-core Tonga can have 100 W.

Check the Tahiti Pro TDP: over 225 W with 1792 cores and a 384-bit VRAM bus.

Tonga is 1792 GCN cores with a 256-bit VRAM bus and a 190 W TDP. That's over a 35 W difference.

What TDP does the M295X have? 100 W (approximately). Does the math add up?
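For what it's worth, here's a quick back-of-the-envelope efficiency comparison using only the TFLOPS and TDP figures already quoted in this thread (all nominal ratings, none measured); take it as a sketch, not a verdict:

```c
#include <stdio.h>

/* Nominal efficiency in GFLOPS per watt, from the figures quoted earlier. */
static double gflops_per_watt(double tflops, double tdp_w)
{
    return tflops * 1000.0 / tdp_w;
}

int main(void)
{
    printf("HD 7970 / W9000 (desktop Tahiti): %.0f GFLOPS/W\n", gflops_per_watt(4.0, 250.0)); /* ~16 */
    printf("FirePro D700 (binned Tahiti)    : %.0f GFLOPS/W\n", gflops_per_watt(3.5, 130.0)); /* ~27 */
    printf("R9 285 (desktop Tonga)          : %.0f GFLOPS/W\n", gflops_per_watt(3.2, 190.0)); /* ~17 */
    printf("R9 M295X (at a claimed 100 W)   : %.0f GFLOPS/W\n", gflops_per_watt(3.5, 100.0)); /* ~35 */
    return 0;
}
```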
 
The R9 M290X is, in fact, a Radeon R9 270X, or the AMD FirePro D300 from the Mac Pro. The minimum spec is a Radeon HD 7970, which is in fact a Radeon R9 280X, or the AMD FirePro D700 from the Mac Pro.

See the difference? ;)

Oh, I see now. Why are these things so complicated? It almost seems like companies want to confuse us about the quality of their GPUs :p

So I guess I'll order the cheapest Retina MacBook, keep playing on consoles, and forget about this whole GPUs-on-Macs mess.

Look at all the discussion in this thread; it's just not worth it.
 
Oh, I see now. Why are these things so complicated? It almost seems like companies want to confuse us about the quality of their GPUs :p

So I guess I'll order the cheapest Retina MacBook, keep playing on consoles, and forget about this whole GPUs-on-Macs mess.

Look at all the discussion in this thread; it's just not worth it.

It's complicated only if you don't read carefully, without understanding what you're reading ;).

However, about gaming you may have a point. Lately I went through the good Windows/console and Mac games, and the only ones I would install on a Boot Camp partition are Mass Effect 3 and Assassin's Creed 3.

The rest are on consoles or OS X.

What funny times...
 
There is absolutely no reason to use AMD, unless they got a special deal at $1 per GPU. :mad:
I see you are using the website Phoronix; do ask that question over there ;) Be prepared to get flamed like there's no tomorrow. When it comes to UNIX/Linux and drivers, there's only AMD and Intel. Search YouTube for Linus Torvalds' opinion on Nvidia; that should say it all :)

There is more to it than performance. Proper support, stable drivers, and so on are very important, and Nvidia may have lost out in those areas.
 
It's complicated only if you don't read carefully, without understanding what you're reading

"R9 M290X is, in fact, Radeon R9 270X, or AMD FirePro D300 from Mac Pro. Minimal is Radeon HD7970 which is in fact Radeon R9 280X or AMD FirePro D700 from Mac Pro."

M290x = 270x = D300

very straightforward and not confusing at all... I see..

It's so "not confusing", that the thread has more then 1000 replies and you guys still can't agree if it is or it isn't a good card...

Now sorry to bother you guys with your hobby, bye
 
"R9 M290X is, in fact, Radeon R9 270X, or AMD FirePro D300 from Mac Pro. Minimal is Radeon HD7970 which is in fact Radeon R9 280X or AMD FirePro D700 from Mac Pro."

M290x = 270x = D300

very straightforward and not confusing at all... I see..

It's so "not confusing", that the thread has more then 1000 replies and you guys still can't agree if it is or it isn't a good card...

Now sorry to bother you guys with your hobby, bye

Well, it was said flat out that it is a good card. The question is: for what? ;)

For Full HD it's plenty. For anything beyond that, it's questionable.
 
It doesn't matter now. What matters is that the high-end 295X is better in every way than the previous high-end iMac's 780M.

Yes, it could be better... we can always say that, no matter what.

Well, we rarely get to say "it could be better" about something available right now, considering the ongoing assumption is that the 980M is both more power efficient and more powerful than the 295X. A win-win.

Considering there is basically no news of the 295X in the wild, I'd imagine the 980M is available in higher quantities than the AMD parts. Another assumption.
 
So if this is true: M290x = 270x = D300

Then is this also true?

Those on a budget looking for a midrange card will be well-served by the AMD Radeon R9 270, which is our new Editors' Choice for midrange graphics cards. The R9 270X, while a balanced option at a fair price, doesn't have a dramatic advantage over its cheaper cousin.
 