
Jay9495 (macrumors regular, original poster):
I did not think the 7970M would be this good; this makes me happy.

PS: I know the numbers won't be this good at an iMac's native resolution, but still.


[Four benchmark screenshots attached via Tapatalk]
 
The 7970M is a marvel of modern technology.

The only problem is that the iMac could get the GTX 680M, which scores P4900 in 3DMark 11.

According to rumors (the same ones claiming a Retina iMac), the iMac is getting the 680M.
 
What resolution?

High details (not "ultra"), 2560x1440 native res, and a minimum stable 30 FPS are all I care about.

If the GPU does not handle most modern games at a stable minimum of 30 FPS at native 2560 on the newer iMac, you won't even have the option of scaling down to 1920 a year from now.

So anything under 2560 in benchmarks is completely, utterly irrelevant to me. The only exception is if the 1920 FPS is something like 120 FPS or more, in which case one might reasonably expect it to run OK at native. 80 FPS won't, for example, because the cost is not linear. Some shaders make everything crawl at high resolution but are much easier on the GPU at lower res.

So 2560 it is.
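A quick back-of-the-envelope (my own numbers, not from any benchmark) shows why the scaling point matters. Even under the optimistic assumption that rendering cost grows only with pixel count, 2560x1440 has roughly 1.78x the pixels of 1920x1080:

```python
# Best-case pixel-count scaling; real shader cost usually grows
# faster than the raw pixel count, so these are upper bounds.
pixels_1080 = 1920 * 1080   # 2,073,600 pixels
pixels_1440 = 2560 * 1440   # 3,686,400 pixels
scale = pixels_1440 / pixels_1080   # ~1.78x more pixels to shade

for fps_1080 in (80, 120):
    fps_1440 = fps_1080 / scale     # optimistic linear-cost estimate
    print(f"{fps_1080} FPS at 1920x1080 -> at most ~{fps_1440:.0f} FPS at 2560x1440")
```

Under this best-case linear model, 80 FPS at 1920 leaves only ~45 FPS of headroom at native, which heavier shaders can easily erode below 30; 120 FPS leaves a much safer margin (~68 FPS).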

---

Also note that the card being tested has 2048 MB (2 GB) of video memory.

And the page is here:

http://www.notebookcheck.net/AMD-Radeon-HD-7970M.72675.0.html
 
(...) What resolution? (...) So 2560 it is.

1920x1080

But bear in mind, that's with everything cranked up, including AA, AF, FXAA, etc.

If you play games without post-processing like AA, you'll be able to play every game in the next few years at native resolution with everything set to max.

The 7970M is at least 50% faster than the 6970M, which is a huge upgrade (28 nm + GCN = once in a lifetime).
 
(...) According to rumors (the same ones claiming a Retina iMac), the iMac is getting the 680M.

We'll have to see.

One comforting note: I remember benchmarks showing that recent Nvidia cards have a more stable minimum FPS. So the equivalent ATI GPU may win on some benches, but the quality of the gameplay experience also depends on minimum frame rates. Sorry, I can't find the link right now, but it has a graph of FPS over time that shows this clearly.
 
It really is a hell of a card. I hope they use it. Overclocked and paired with an Ivy Bridge i7, this thing would easily handle any game I want to play in the foreseeable future at native res.
 
(...) The 7970M is at least 50% faster than the 6970M, which is a huge upgrade (28 nm + GCN = once in a lifetime).

Sweet, that'll be a darned sweet upgrade from my ol' 4850M :)

Removing AA sounds fair. I do most of the time.

However, both I and a friend who has a PC with a 27-inch IPS monitor agree that aliasing can be seen even at 2560. It depends on the game, I would say.

AA is nice to have, but the performance cost versus the visual improvement is often not worth it.

That said, there are newer post-process AA solutions appearing that are more efficient and look almost as good. The January ATI drivers added support for some of these (AAA, SSAA, etc.).

More info here:

The Radeon HD 7970 Reprise: PCIe Bandwidth, Overclocking, & The State Of Anti-Aliasing
http://www.anandtech.com/show/5458/the-radeon-hd-7970-reprise-pcie-bandwidth-overclocking-and-msaa/3
 
(...) That said, there are newer post-process AA solutions appearing that are more efficient and look almost as good. The January ATI drivers added support for some of these (AAA, SSAA, etc.).

SSAA is a much older technique than the usual (MS)AA, and it is also slower and always will be. Adaptive anti-aliasing has been an ATI feature since 2006. Neither is really a post-process solution. You must have misread the article ;) The post-process algorithms they are talking about are smart blurring methods (like Nvidia's FXAA algorithm).
 
My bad, I posted what appeared to be a relevant link.

What I had in mind are post-process filters like FXAA, which can be injected via modified DirectX DLLs (such as the FXAA Post Process Injector for Skyrim) when the game doesn't support it natively.

http://www.hardocp.com/article/2011/07/18/nvidias_new_fxaa_antialiasing_technology

And there is also an "injector" for SMAA, though on my ol' 4850M there was no noticeable FPS gain over traditional AA. It may well be faster on more recent cards.

http://www.iryoku.com/smaa/

http://mrhaandi.blogspot.com/p/injectsmaa.html
 
(...) The 7970M is a marvel of modern technology.

The only problem is that the iMac could get the GTX 680M, which scores P4900 in 3DMark 11.

According to rumors (the same ones claiming a Retina iMac), the iMac is getting the 680M.

If this is true, I will not buy the new iMac. A 768-SP Kepler GPU is not enough for a new €2000 device.

Could you give us the source?

A 680M would also mean that the new iMac might not ship until August!
 
Honestly, I can't find the rumor about the 680M in the iMac. The closest I can find is one regarding the MacBook Pro, and even that one notes that it contradicts a prior rumor that Apple is avoiding Kepler because of supply issues. Given the supply constraints of the desktop 680 card, and the lack of any announcement regarding availability of the mobile card, it would surprise me if Apple went that way.

Also, where did you see a benchmark for the 680m? Last I saw, there still wasn't anything at notebookcheck.com in the benchmark lists.
 
http://www.cultofmac.com/167105/the...pple-hands-out-retina-display-upgrades-rumor/

But hold your horses: a guy here says the leaked P4900 results are invalid, and that the GTX 680M breaks P6000 at 75 W (as opposed to the 7970M's 100 W)... that could be interesting.

Thanks, that one wasn't coming up in my Google search.

680 CUDA could be nice. I know OpenCL on those cards hasn't been great, and the CUDA performance is no better than the prior generation's. It could be just what I need for Blender's GPU-based renderer, though.
 