Discussion in 'iMac' started by eroxx, Nov 9, 2014.
How is that possible?
Eh? Which benchmarks?
Help us out, post a link.
All I'm seeing looking at their site says the opposite.
It's not. Maybe you read it wrong?
Ah, you scared me there for a moment, OP!
I was most worried about the Tessellation Factor. Whew, you were mistaken.
"Tessellation is the ability of the GPU to divide a polygon into smaller ones (according to the tessellation level factor) in order to increase polygons density: the higher the tessellation level, the higher the polygon density. A high tessellation level requires much GPU processing power. We chose 16 levels (usually the max that games use) at 2560x1440 Windowed with anti-aliasing disabled. (HIGHER number means FASTEST in Frames Per Second.)"
Thank goodness my polygon density is superior with the 5K! Now I can retire at peace for the evening.
Here's the link I was talking about. Don't know what to make of it:
And check this from macrumors:
Compared to the 2013 Mac Pro lineup, the Retina iMac offers faster single-core performance, but all 2013 Mac Pro models beat out the iMac when it comes to multi-core performance. Results for the Core i7 iMacs are expected to be similar, but according to Poole, the higher-end Retina iMac may be faster than the 4-core Mac Pro.
Ah, you're correct, sorry. But see the link above. The 2013 non-Retina beats it in certain tasks. And what's with the controversy over the upgraded GPU?
Not in the link you provided. Some of those benchmarks are time measured in seconds, where lower is better. Is that what's confusing you?
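To illustrate the point about mixed benchmark directions, here's a minimal sketch of why you can't just compare numbers across tests: time-based benchmarks are "lower is better" while FPS benchmarks are "higher is better." The numbers below are made up for the example, not real iMac results.

```python
# Hypothetical benchmark results: (2013 model, 2014 model, unit).
# All figures are invented for illustration only.
benchmarks = {
    "Render export": (95.0, 82.0, "seconds"),  # lower is better
    "Game at 1440p": (48.0, 56.0, "fps"),      # higher is better
}

for name, (old, new, unit) in benchmarks.items():
    if unit == "seconds":
        winner = "2014" if new < old else "2013"  # less time wins
    else:
        winner = "2014" if new > old else "2013"  # more frames wins
    print(f"{name}: {winner} model wins")
```

Reading both columns as "bigger number wins" would make the 2014 machine look slower on the seconds-based test even though it finished faster.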
It isn't, so stop posting crap.
Also what I am thinking. The '14 model beats the '13 model in every benchmark.
And what graphic card upgrade controversy?
Lots of people are saying the 4GB M295X overheats and throttles down.
Where did you read that?
Here's a little:
What about this page?
There are 3 specific tests in which the i7 + M295X did not come up as the best:
1. Photoshop CC - Noise Reduction: Winner is i5 + M290X
2. World of Warcraft - 2560 x 1440 Ultra: Winner is i7 + G780M
3. X-Plane 10 - 2560 x 1440 FPS: Winner is i7 + G780M
Here's one more:
I would put my money on driver support. It's not uncommon for a newer, better card to underperform early on, especially against a top-end card with months more time on the market.
But that's just my WAG, as I'm certainly not an engineer.
Abusing your computer and then complaining that it's not working well is like,
well... I'm sure there's a joke that explains it better.
So you guys are happy with your retina 5k?
You're being incredibly selective to try and prove a point. There are ten benchmarks on the page, and the M295X came out on top in seven, and was only beaten by 3% in one of the tests it lost. I expect driver improvements would easily swallow that difference.
Also don't forget some benchmarks favour Nvidia or AMD. For example Tomb Raider's tessellation heavily favours AMD so these settings should be disabled during testing, if they are not then results should be taken with a grain of salt.
The reason for the M295X's underwhelming performance, for now, is the lack of drivers on the Windows side. The M295X is an OS X-only card.
Until Windows drivers are released, the performance will remain that way.
Try this one:
It is ABSURD that both AMD and Nvidia continue to label mid-level GPUs as real desktop versions.
The "GTX 780M" is a watered-down GTX 680.
You guys should be mad about this.
I could see 10-20%... maybe 30%, but having the iMac GPUs be 40-50% slower is a criminal act. Especially when you consider that the cMPs are stuck with 2009 tech: PCIe 2, etc.
And yet they still ran rings around the iMac GPUs.
Be mad about what exactly? It's called a 780M for a reason.
"Criminal"? Lighten up. It's just a damn computer. Who cares. They have a 14 day return policy for a reason.
Agree 100% MVC.
I will also add that it's continually absurd that they compare a desktop GPU with a 'mobile' variant. They are incredibly different beasts in terms of performance, period. Like comparing a tiny apple with a Granny Smith. It's simply a marketing ploy so the mobile models match the current desktop range. If you compare the wattages and detailed specs between the two types, it's there for anyone to see.
I'm afraid that until Nvidia and AMD can shrink their silicon process from 28 nanometers, where it has sat for both companies for the past four years, their mobile versions within the TDP envelope of an iMac or notebook will remain a joke. Only when they get down to 20 nanometers will there be any big performance gain on the mobile GPU front.