After scouring the internet for just such a benchmark - something gauging the iMac's ability to run the most intensive games out there - I had little to no success. So I decided to run my own tests. I'll leave them here in case anybody is interested.

Test system:
- Late '07 aluminum 20" iMac
- ATI Mobility Radeon 2600 XT (or the "HD 2600 Pro," as Apple calls it)
- ATI Catalyst 8.9 drivers (September '08)
- 4 GB RAM
- 64-bit Vista Home Premium Edition
- Crysis 1.21

Now, as you may already know, the ATI Mobility Radeon 2600 XT that Apple put in the iMac is underclocked (meaning Apple tweaked it to run slower than a normal Mobility 2600 XT - presumably for temperature reasons). Here is a comparison:

iMac Mobility 2600 XT: roughly 700 MHz memory / roughly 600 MHz core
Stock ATI Mobility 2600 XT: roughly 750 MHz memory / roughly 700 MHz core

I used AMDGPUClockTool to modify these settings and bring the card up to its non-iMac speeds. Note: overclock your card at your own risk - severely overclocking your GPU can pretty much kill your computer. I kept to very safe levels, considering most manufacturers slightly underclock their GPUs anyway, just to be on the extra, extra, extra safe side. Apple just took this to a whole new level.

Anyway, on to the benchmark. I used CrysisBenchmarkTool 1.05, running the GPU test map (the first level of Crysis) 3 times on each setting.
These are the results in DX10 at 1680x1050 (the max resolution for the 20" iMac), Vsync off:

iMac stock/default GPU clock speeds - 600 core / 700 mem:
Run #1 - DX10 1680x1050 AA=No AA, 64-bit test, Quality: VeryHigh ~~ Overall Average FPS: 4.055
Run #2 - DX10 1680x1050 AA=No AA, 64-bit test, Quality: High ~~ Overall Average FPS: 6.36
Run #3 - DX10 1680x1050 AA=No AA, 64-bit test, Quality: Medium ~~ Overall Average FPS: 14.525
Run #4 - DX10 1680x1050 AA=No AA, 64-bit test, Quality: Low ~~ Overall Average FPS: 29.575

Overclocked to ATI stock speeds - 700 core / 750 mem:
Run #1 - DX10 1680x1050 AA=No AA, 64-bit test, Quality: VeryHigh ~~ Overall Average FPS: 4.705
Run #2 - DX10 1680x1050 AA=No AA, 64-bit test, Quality: High ~~ Overall Average FPS: 8.545
Run #3 - DX10 1680x1050 AA=No AA, 64-bit test, Quality: Medium ~~ Overall Average FPS: 17.37
Run #4 - DX10 1680x1050 AA=No AA, 64-bit test, Quality: Low ~~ Overall Average FPS: 35.39

So, as you can see, there is roughly a 20% performance increase (according to the Crysis benchmarks, anyway) between Apple's underclocked settings and the GPU's actual default settings. That's quite impressive, considering the tests were run at a relatively high resolution in DX10 mode.

You'll get a large performance boost out of Crysis by lowering the resolution to a less demanding setting (say, 1280x800). Most performance junkies will also customize and tweak the individual graphics settings, resulting in a better-looking game without sacrificing too much FPS. I also ran these tests in DX10 mode, which uses more advanced shaders (i.e. it looks a tiny bit nicer) but drops the FPS a bit. If you're running the game in DX9 mode (XP, or 32-bit Vista - which most people will have), you'll get an extra 2-5 FPS over DX10. So yeah, most people will want to add 5 FPS or so to the above results.
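For anyone who wants to sanity-check the "roughly 20%" figure, here's a quick Python snippet (not part of the original tests - the numbers are just the average FPS values listed above) that computes the gain at each quality level:

```python
# Average FPS from the DX10 1680x1050 runs above,
# at iMac stock clocks vs. ATI stock clocks.
stock = {"VeryHigh": 4.055, "High": 6.36, "Medium": 14.525, "Low": 29.575}
overclocked = {"VeryHigh": 4.705, "High": 8.545, "Medium": 17.37, "Low": 35.39}

for quality in stock:
    gain = (overclocked[quality] - stock[quality]) / stock[quality] * 100
    print(f"{quality:>8}: {gain:+.1f}%")
```

The gains work out to about +16% at Very High, +34% at High, and just under +20% at Medium and Low - an average of roughly 22% across the four presets, which lines up with the ~20% ballpark.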
Also, it is probably safe (I use the word "safe" very loosely) to overclock the Mobility 2600 XT even further, past the ATI stock speeds - that would be actual overclocking, instead of just restoring the ATI factory speeds.

EDIT (10/10/08): Here are some quick benchmark results of the Mobility 2600 XT slightly overclocked in DX9, at 750 MHz core / 875 MHz memory:

Run #1 - DX9 1680x1050 AA=No AA, 64-bit test, Quality: High ~~ Overall Average FPS: 9.835
Run #2 - DX9 1680x1050 AA=No AA, 64-bit test, Quality: Medium ~~ Overall Average FPS: 20.25
Run #3 - DX9 1680x1050 AA=No AA, 64-bit test, Quality: Low ~~ Overall Average FPS: 40.79
Run #4 - DX9 1280x960 AA=No AA, 64-bit test, Quality: Custom ~~ Overall Average FPS: 22.58

There is no Very High preset in DX9 mode. The Custom quality setting is what I usually play the game at: everything set to High except shaders and shadows. I run it at a lower resolution (1280x800 - the benchmark tool only offers a 1280x960 setting) to get a better boost out of it.

Screenshots:
http://i32.photobucket.com/albums/d44/Doukutsu/Crysis/a-DX9Low.jpg
http://i32.photobucket.com/albums/d44/Doukutsu/Crysis/b-DX10Low.jpg
http://i32.photobucket.com/albums/d44/Doukutsu/Crysis/c-DX9Med.jpg
http://i32.photobucket.com/albums/d44/Doukutsu/Crysis/d-DX10Med.jpg
http://i32.photobucket.com/albums/d44/Doukutsu/Crysis/e-DX9High.jpg
http://i32.photobucket.com/albums/d44/Doukutsu/Crysis/f-DX10High.jpg
http://i32.photobucket.com/albums/d44/Doukutsu/Crysis/g-DX10VeryHigh.jpg
http://i32.photobucket.com/albums/d44/Doukutsu/Crysis/h-DX9Custom.jpg

The last screenshot shows my personal graphics settings (1280x800, everything on High except shaders/shadows, plus a few advanced tweaks). It's the best blend I've yet found between FPS and quality. In hindsight, I probably should have taken the screenshots from a better vantage point. As you can see, DX10 adds very little (if any) noticeable quality improvement.
It does improve motion blur and depth-of-field effects a bit, as well as water reflections and shadows, but all of these are extremely hard to notice on anything except Very High (which is DX10-exclusive anyway). The 2600 in the iMac seems to handle DX10 mode remarkably well compared to other GPUs, though, with only slight FPS drops.