...and the Thunderbolt 2 ports, and the new 4GHz i7.

Yeah, for sure. Specifically meant for gaming; the two you mention are a no-brainer for my other workloads. I just think everyone is taking a pretty negative view on the GPU. If it's not good enough, wait for the year-two refresh and get something with a little more capability.
 
Hm, I think it's quite funny to read this thread.

On one side you have people who have bought the iMac and are defending the M295X.

On the other side you have people who are bashing the GPU, calling it lackluster, etc. Or maybe not bashing, but you get the idea.

For me, it's baffling that you are comparing this GPU to the 780M, which is two generations old now. I get that you compare it to the latest iMac GPU, but I don't get why you don't compare it to what's available. Of course this iMac will have a better GPU when the old one was two years old.

However, I'm quite disappointed to see the M295X being so far behind.

I'm also weirded out by people posting 3DMark scores when we don't know the GPU being used. It's already been pointed out on the Notebook Review forum (which is into gaming) that the 3DMark result posted here is NOT from an M295X.

People need to get their facts in order. Nvidia has been heavily outperforming AMD in the desktop and mobile markets for the last year and a half.

Is the M295X better than the 780M? Yes, by a small margin. But then again, it's worse than the GTX 980M and even the GTX 970M, which is very disappointing, especially considering the TDP of the M295X (110W) vs the GTX 980M/970M (85W/65W).
 
People need to get their facts in order. Nvidia has been heavily outperforming AMD in the desktop and mobile markets for the last year and a half.

Reading sites like AnandTech and Tom's Hardware on a regular basis, I could say that this blanket statement is incorrect, but then I would probably be accused of being an "AMD fanboy" and, at my age, these schoolboy "discussions" don't really interest me.
 
The M295X was going to be launched in Windows laptops, but the rumours went cold over the summer. Now it's quite obvious that AMD has worked with Apple to get a 5K panel working over current DisplayPort tech on a single link.

I had just hoped that Apple would ask Nvidia to help them out instead, so we could get a faster GPU.
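Rough numbers show why the 5K link was the hard part. A quick sketch (assuming 24-bit colour and ignoring blanking overhead, which only widens the gap):

```python
# Rough sanity check: can one DisplayPort 1.2 link carry 5120x2880 at 60 Hz?
h, v, hz, bpp = 5120, 2880, 60, 24      # 24-bit colour assumed

required = h * v * hz * bpp / 1e9       # ~21.2 Gbit/s of raw pixel data
dp12_raw = 4 * 5.4                      # 4 lanes x HBR2 = 21.6 Gbit/s
dp12_usable = dp12_raw * 8 / 10         # 8b/10b encoding -> ~17.3 Gbit/s

print(f"required: {required:.1f} Gbit/s, usable: {dp12_usable:.1f} Gbit/s")
# 21.2 > 17.3, so a stock single DP 1.2 link can't drive the panel,
# hence the custom timing-controller work inside the iMac.
```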

----------

Reading sites like AnandTech and Tom's Hardware on a regular basis, I could say that this blanket statement is incorrect, but then I would probably be accused of being an "AMD fanboy" and, at my age, these schoolboy "discussions" don't really interest me.

My intention was never to start a war over this, but after Maxwell, AMD has nothing to answer with. I guess there's no point in discussing it here, though.
 
I get the sense that the M295X chips Apple is using have been tuned to work better at higher resolutions. Running standard benchmark programs at 1080p may be CPU-limited, since CPU tech is comparable between 2012 and 2014. I won't know for sure until I get mine and load Boot Camp, or AnandTech releases their surely exhaustive review, but my gut tells me we are so far not pushing the system in the manner in which it is most capable.

Either way, if you don't have a 2013 780M Nvidia iMac, then this system is a no-brainer if you're looking for a Mac desktop solution. When pricing the two out, the Retina was ~$300 more than a comparable non-Retina system from 2013; that's a reasonably small cost to justify for a 5K screen and the fastest single-core processor on the market, in my opinion.
 
The M295X was going to be launched in Windows laptops, but the rumours went cold over the summer. <snip>

----------

My intention was never to start a war over this, but after Maxwell, AMD has nothing to answer with. <snip>

No worries. Your first bit made me think, though... I would imagine that cost and margins have a lot to do with these decisions, looking at the debate over why AMD "beat" Nvidia in the battle to supply APUs for the next-gen consoles.
 
Yes, the AMD M295X is behind the 980M, but with new drivers later this year it will be better than the 970M for sure.

For people upgrading to the M295X from a 680MX or earlier, it's a significant upgrade. You can compare for yourself; I think it's around 35% faster than the 680MX, and more compared to earlier GPUs.
 
Yes, the AMD M295X is behind the 980M, but with new drivers later this year it will be better than the 970M for sure. <snip>

People, the architecture behind the M295X is not an unknown factor.

Just look at reviews (this one from AnandTech, for example) comparing the Radeon HD 7970 and Radeon R9 285 to the GeForce GTX 970 and 980.

No amount of driver tinkering will change the fact that the GM204 chip is inherently faster than Tonga, no matter how you slice it. The rest is just wishful thinking.
 
Pitting AMD's M295X against the 980M (each manufacturer's top-of-the-line mobile part) is just embarrassing for AMD. Nvidia have stomped all over AMD, and then dropped a piano on them just for good measure.
 
I think today's dGPU ranking is:
- 980M
- M295X
- 970M, possibly on par with the 780M
- 680MX
etc.
 
Hm, I think it's quite funny to read this thread.

On one side you have people who have bought the iMac and are defending the M295X.

On the other side you have people who are bashing the GPU, calling it lackluster, etc. Or maybe not bashing, but you get the idea.

For me, it's baffling that you are comparing this GPU to the 780M, which is two generations old now. I get that you compare it to the latest iMac GPU, but I don't get why you don't compare it to what's available. Of course this iMac will have a better GPU when the old one was two years old.

<snip>

Is the M295X better than the 780M? Yes, by a small margin. But then again, it's worse than the GTX 980M and even the GTX 970M, which is very disappointing, especially considering the TDP of the M295X (110W) vs the GTX 980M/970M (85W/65W).

For me, it is useful to see how the 780M compares to the M295X because that is what we have available. You can complain about not having the 980M in the iMac, but that doesn't mean it is going to magically start appearing in it. And do we think that, after switching back to AMD, an iMac with the 980M will magically appear in the next couple of months?

If you want an iMac _today_, your choices are the 780M or the M295X. If you want to wait to see what cards are put into future iMacs, then go right ahead. It only makes sense that a system purchased 6 months, 1 year, or 2 years from now will be better than the one bought today.
 
I think today's dGPU ranking is:
- 980M
- M295X
- 970M, possibly on par with the 780M
- 680MX
etc.

Are you sure about that? I know there is a lot of info still to come in and we need some proper benchmarking done, but preliminary results are just not showing that.

I have put together a spreadsheet showing that under OS X the R9 is on average 14% faster than the 780M. Under Windows the 970M is on average 156% faster than the 780M. There is no record of the R9 being tested under Windows, so I can't present that info.

My spreadsheet includes source links. Let me know if you can't access the sheet or some of my numbers look wrong.

Here is the link:
https://docs.google.com/spreadsheets/d/1xu1XjJ22ddiN5MZx_7qeueDMENrvgPH5Bq2wLI25jrM/edit?usp=sharing
 
Are you sure about that? I know there is a lot of info still to come in and we need some proper benchmarking done, but preliminary results are just not showing that. <snip>

I'm wondering if I should remove the Driver Overhead test from the results?

GFXBench's explanation is: 'Measures the OpenGL driver's CPU overhead by rendering a large number of simple objects one-by-one, changing the device state for each item. The frequency of the state changes reflects real-world applications. To see how your device performs at its native resolution, run the Onscreen test. To compare your scores to other devices, use the Offscreen version that runs at 1080p on all devices.'

What do you think, remove the Onscreen results or both?
 
I'm wondering if I should remove the Driver Overhead test from the results? <snip>

Okay, I've added an Offscreen-only (for all tests) column. You guys will have to let me know if it's relevant.
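For anyone curious, the filtering itself is trivial. A minimal sketch (with made-up scores, not the spreadsheet's actual data) of averaging only the Offscreen results and dropping Driver Overhead:

```python
# Hypothetical GFXBench results: (test name, mode, score). Offscreen runs
# render at 1080p on every device, so only those are cross-comparable.
results = [
    ("Manhattan", "Onscreen", 48.0),
    ("Manhattan", "Offscreen", 55.0),
    ("T-Rex", "Onscreen", 120.0),
    ("T-Rex", "Offscreen", 140.0),
    ("Driver Overhead", "Offscreen", 30.0),
]

# Drop the Driver Overhead test (it measures CPU/driver cost, not GPU speed)
# and keep only the Offscreen numbers.
offscreen = [score for name, mode, score in results
             if mode == "Offscreen" and name != "Driver Overhead"]

print(f"Offscreen average (excl. Driver Overhead): {sum(offscreen) / len(offscreen):.1f}")
```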
 
My spreadsheet includes source links. Let me know if you can't access the sheet or some of my numbers look wrong.

Thanks for taking the time to put this together.

For calculating the % difference, the equation should be: (newGPU - oldGPU) / oldGPU.
I guess it's not really an old GPU but rather the reference against which to compare.

Thanks again.
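In code form, the fix is just dividing by the reference card's score. A minimal sketch with illustrative numbers:

```python
def percent_faster(new: float, reference: float) -> float:
    """How much faster `new` is than `reference`, in percent."""
    return (new - reference) / reference * 100

# Illustrative only: a 780M scoring 100 points vs an R9 scoring 114 points
# gives (114 - 100) / 100 = 14% faster, the (new - old) / old form above.
print(percent_faster(114, 100))  # 14.0
```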
 
Thanks for taking the time to put this together.

For calculating the % difference, the equation should be: (newGPU - oldGPU) / oldGPU. <snip>

Thanks for the tip. I have made the adjustments you have recommended.
 
People are posting a lot of benchmarks.

I'm just interested in 3DMark tests, so we can get a real comparison, and also the standard built-in tests of games like:

Battlefield 4
etc.

All at 1080p ultra, just because that's what's commonly used when comparing against the GTX 970M / GTX 980M. All these OS X tests really don't mean a thing.
 
Why are people talking about driver improvements down the road? Does Apple include new GPU drivers in updates for Boot Camp, etc.? Serious question.
 
Here are some benchmarks of the GTX 980M / GTX 970M (see attached files).

Now, to get a sense of how the M295X compares, you should do it properly.

Install Windows 8 and either use an in-game stress test (like the one in Shadow of Mordor) or use FRAPS and measure average fps. It's important to test at 1080p and with the same settings as below. Don't stare at a wall in-game and say the M295X manages 200 fps, for example.

I guess it will take some time for people to do this, but I'm excited to see the results, if anyone ever does it.


Side note #1: the CPU in this test was an i7 4710Q.
Side note #2: GTX 970M SLI looks pretty sweet when the game has good SLI scaling.
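If anyone does run this, turning a FRAPS frametimes log into an average fps number is straightforward. A sketch, assuming a list of cumulative per-frame timestamps in milliseconds (the exact log format is an assumption here):

```python
def average_fps(timestamps_ms: list[float]) -> float:
    """Average fps from cumulative per-frame timestamps in milliseconds."""
    intervals = len(timestamps_ms) - 1                  # frame-to-frame gaps
    elapsed_s = (timestamps_ms[-1] - timestamps_ms[0]) / 1000.0
    return intervals / elapsed_s

# Three 16 ms frame intervals over 48 ms -> 62.5 fps.
print(average_fps([0.0, 16.0, 32.0, 48.0]))
```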
 

Attachments

  • Screen Shot 2014-10-22 at 16.40.44.png
  • Screen Shot 2014-10-22 at 16.40.52.png
  • Screen Shot 2014-10-22 at 16.40.25.png
  • Screen Shot 2014-10-22 at 16.40.32.png