There seems to be a difference in performance under Mac OS X and under Windows XP. From Barefeats themselves:
PC GAMING vs Mac GAMING: We installed a copy of Prey under both Mac OS X and Windows XP Pro. In our first test, we ran at 960x600 in windowed mode, max quality settings, 4X FSAA, 4X anisotropic filtering. Under Mac OS X, the 17" MacBook Pro 2.4GHz scored 37fps. Under Windows XP Pro, it scored 73fps. Hmmm.
And for the killer:
More "VRAM Wars" -- 15" MacBook Pro 2.2GHz (128MB VRAM) versus 2.4GHz (256MB VRAM).

Under Windows XP Pro, I ran 3DMark06 at 1440x900, 4X FSAA, 4X anisotropic filtering:

SM2.0 Gaming
- 128MB = 641 rating
- 256MB = 1279 rating (or 100% faster)

HDR/SM3.0 Gaming
- 128MB = 554 rating
- 256MB = 1063 rating (or 92% faster)

Under Windows XP Pro, I ran Prey 1.3 at 1440x900, 4X FSAA, 4X anisotropic filtering:

- 128MB = 31 fps
- 256MB = 46 fps (or 48% faster)
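
If you want to double-check those speedup percentages (and see the size of the OS X vs Windows gap from the first quote), here is a minimal Python sketch; the numbers are the ones quoted above, and the `percent_faster` helper is just illustrative:

```python
# Sanity check on the quoted speedup figures.
# Benchmark names and scores are taken from the Barefeats quotes above;
# the helper function itself is purely illustrative.

def percent_faster(slow: float, fast: float) -> float:
    """Return how much faster `fast` is than `slow`, as a percentage."""
    return (fast / slow - 1) * 100

results = {
    'Prey, OS X vs XP (17" MBP 2.4GHz)':  (37, 73),
    "3DMark06 SM2.0, 128MB vs 256MB":     (641, 1279),
    "3DMark06 HDR/SM3.0, 128MB vs 256MB": (554, 1063),
    "Prey 1.3, 128MB vs 256MB":           (31, 46),
}

for name, (slow, fast) in results.items():
    print(f"{name}: {percent_faster(slow, fast):.0f}% faster")
```

This reproduces the quoted 100%/92%/48% VRAM figures and shows the Windows-vs-Mac gap in the first Prey test is roughly 97%.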
There is a suggestion that the Windows drivers, written by nVidia, are superior to the Mac OS X drivers, written by Apple, so the extra VRAM simply doesn't show up in OS X games, but does when running games under XP, as the OP intends to do. This is separate from the difference in how the video card is clocked under the two OSes.

