Hey all, just wondering: with the last gen (like mine) having the Nvidia GeForce GT 330M (with 512MB) and the new models having the Radeon HD 6750M with 1GB, roughly how much extra processing power does the new card give relative to the old one? Thanks
There are plenty of benchmarks around here for the 6750. You can find a basic comparison at www.barefeats.com
The 6490 is in between the 320m/8600m and the 9600m while the 6750 is slightly less powerful than the 4850 in the iMac.
9400 < 320 <= 8600 < 6490 < 9600 < 330 << 6750 < 4850
Would be helpful if 3dmark actually meant anything.
aznguyen316 said: Actually the 6490 is better than a 9600GT found in the 2009 models. It's closer to the 330m than we first expected.
Thanks for contributing.
No graphics benchmark "means anything" unless we know what the GPU is being used for. They are strictly for cross-comparisons between hardware.
And 3DMark06/Vantage are still the de facto standard for generic gaming benchmarks.
I have the new 17"; here are my results. COD MW2 at 1920x1200, maxed-out settings: 40-80 fps. Starcraft 2 at 1920x1200, all settings maxed (shadows, AA, etc.): 35+ fps. Really good performance all around, I'm loving it.
I take it SC2 is via Boot Camp too, then?
Either way, I wasn't expecting that fps at such a high resolution. Should be great on my high-end 15" AG.
What?
Real-world performance is the de facto standard, not synthetic benchmarks like 3DMark06.
Just to show you how ridiculous synthetic benchmarks like 3DMark06 are:
3dmark06:
320m = 4155
Intel HD 3000 = 5053
So the Intel HD 3000 is supposedly much better than the 320m, right?
Almost every good review tests hardware with real-world workloads to draw its conclusions about performance.