Weird... Nonsensical Windows Benchmarks Inside...


macrumors regular
Original poster
Aug 16, 2006
Ok... I was doing some of the oft-requested overclock vs stock clock 24" 7600GT XP Gaming Benchmarks... when things began to be weird...

I'm wondering if Apple optimized the firmware or other pieces/parts of the 7600GT for the super-high 1920x1200 resolution of the 24" monitor.

I had chosen QUAKE4 as the test bed for my overclocking experiments...

I decided to throw a LOT at QUAKE4 and see just how bad the frame rates would get.

I wasn't disappointed :D

I had the resolution set to 1920x1200, Wide Screen, Medium Texture Quality, No Vertical Sync, all options on except the top choice (special effects I think), and also 16x FSAA.

That's right, FSAA.

I figured that would show the best benchmark for overclocking :D

I wasn't disappointed.

I chose the opening scene in a new game where the soldier is floating, half blown to bits, in space.

With all the above... I got 15FPS MAX and 13FPS MIN during that scene.
I overclocked using nVidia's autodetect settings, and got the following change:

18FPS MAX and 17FPS MIN.

That's about a 20% increase in FPS... of course, 3FPS isn't much, but when you only have 15 to start with... it's REALLY WELCOME :)
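For anyone checking the math, here's a quick sketch of how the 15 -> 18 FPS jump works out (just the arithmetic, nothing machine-specific):

```python
# Overclock gain from the Quake 4 run: 15 FPS stock MAX vs 18 FPS overclocked MAX.
stock_fps = 15
overclocked_fps = 18

percent_gain = (overclocked_fps - stock_fps) / stock_fps * 100
print(f"{percent_gain:.0f}% increase")  # prints "20% increase"
```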

(Oh, just FYI... the same settings above only with 2x FSAA instead of 16x FSAA gets you a solid 60FPS ;) )

Now... HERE is where it gets weird... and where I thought some of you might find this interesting... since some of you are specifically choosing the 20" for a better *gaming* system, on the theory that the video card won't have to render as many pixels and so should be faster at the lower resolution... esp. if you are not planning on upgrading to the 7600GT.

So... I thought I'd compare the stock clock rates of the video card @1920x1200 to the 1600x900 mode (next resolution down)... ALL OTHER SETTINGS THE SAME... EVEN THE 16x FSAA...

And... drumroll...


:mad: :eek: :confused: :confused: :confused: :confused:

The LARGER resolution was about 2x as fast as the next-smaller resolution...
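To put some numbers on why this is so backwards, here's a quick pixel-count comparison of the two modes I tested (plain arithmetic, no assumptions beyond the resolutions above):

```python
# Pixels per frame at each of the two test resolutions.
high_res = 1920 * 1200  # native 24" iMac resolution
low_res = 1600 * 900    # next resolution down in Quake 4

print(high_res, low_res)            # 2304000 1440000
print(f"{high_res / low_res:.2f}x")  # 1.60x
```

So the higher mode pushes 60% MORE pixels per frame, yet it ran about 2x faster... which is exactly the wrong direction.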

I'm going to do further testing... but would love to see someone else try this and see if it is a bug, or... do you think Apple REALLY optimized the firmware / silicon for the 24" iMac?

I don't believe it can be the drivers, as I'm using the Apple-supplied graphics drivers (which APPEAR to be the generic GENERAL RELEASE nVidia drivers... and the latest NON-BETA versions as well!!!)...

What do you guys think?

EDIT: I have rerun the tests 2X now going back and forth between the resolution settings... it IS repeatable on my machine...



macrumors 65816
Mar 3, 2006
Not at the beach...
MacProGuy said:
What do you guys think?
That pies are horribly overpriced. I mean $9.75 for a dinky little pie!?!?
Come on!
I'm writing my congressmen.

And that it's too bad you don't have a copy of Windows XP and some Windows games hanging around to test as well.
To see if it's drivers or pie.


I mean hardware.


macrumors regular
Original poster
Aug 16, 2006
risc said:
Interesting, so what you are saying is that at either resolution the game is unplayable?

Nope. It's unplayable at 16X FSAA... if you noticed, at 2X FSAA I get a solid 60FPS...

16X FSAA is some serious FSAA... lol... just testing the theoretical limits of the hardware...