The performance delta between OS X and Windows 7 is large, larger than it is for TF2. Blizzard has said they have a bit of optimization left to do, and there's a chance that 10.6.4's regressions are affecting things as well. But Windows 7 pushing 80% more frames is a heckuva difference.
Hardware
- Mid-2010 Mac Mini (MacMini4,1)
- Intel Core 2 Duo, 2.66GHz ("Penryn" P8800, 3MB L2 cache, 1066MHz FSB)
- 8GB PC3-8500 DDR3 Memory
- 500GB Seagate Momentus XT, 7200rpm, 32MB cache, 4GB SLC NAND
- nVidia GeForce 320M, 256MB VRAM (shared memory)
Settings (Default)
- Display Mode: Fullscreen
- Resolution: 1280x800
- Texture Quality: Medium
- Graphics Quality: Medium
- Shaders: Medium
- Lighting: Low
- Shadows: Medium
- Terrain: Medium
- Reflections: Off
- Effects: Medium
- Post Processing: Medium
- Physics: Low
- Models: High
- Unit Portraits: 3D
- Movies: Low
OS X 10.6.4 (Snow Leopard)
- Cutscenes: 30-35 fps
- "Liberation Day" Mission: 15-20 fps
Windows 7 Ultimate, x64 (Boot Camp)
- Cutscenes: 55-70 fps
- "Liberation Day" Mission: 30-35 fps
Disappointing and not disappointing at the same time. The plus: a Mac Mini, MacBook, or 13" MacBook Pro can play a brand-new game at playable framerates on medium settings. The downer, of course, is that you have to reboot into Windows to get said framerates.
To put this in perspective, here are the 3DMark06 scores of some Macs, and some desktop GPUs for comparison...
- MacBook Pro 15" Core2 Duo, ATi Mobile Radeon X1600 - 1800
- Mid-2010 Mac Mini, nVidia GeForce 320M - 4420
- MacBook Pro 15" Core i5, nVidia GeForce 330M GT - 6080
- iMac 21.5" Core2 Duo, ATi Mobile Radeon 4670 - 6840
- iMac 27" Core i5, ATi Mobile Radeon 4850 - 9950
- Core2 Duo 2.66GHz & GeForce 8800GT 512M - 12,700 (desktop I built in late 2007)
- iMac 27" Core i7, ATi Mobile Radeon 5750 - 14000 (estimate)
- Core i5 & ATi Radeon 5850 - 22180 (desktop that could be built for < $1000)
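To make the relative gaps easier to eyeball, here's a small sketch (same back-of-the-envelope spirit, scores copied straight from the list above) that normalizes everything against the Mac Mini:

```python
# Normalize the 3DMark06 scores against the Mid-2010 Mac Mini.
scores = {
    'MacBook Pro 15" C2D / Mobility Radeon X1600': 1800,
    "Mid-2010 Mac Mini / GeForce 320M": 4420,
    'MacBook Pro 15" Core i5 / GeForce GT 330M': 6080,
    'iMac 21.5" C2D / Mobility Radeon HD 4670': 6840,
    'iMac 27" Core i5 / Mobility Radeon HD 4850': 9950,
    "Late-2007 desktop / GeForce 8800GT": 12700,
    'iMac 27" Core i7 / Mobility Radeon HD 5750': 14000,  # estimate
    "Sub-$1,000 desktop / Radeon HD 5850": 22180,
}

mini = scores["Mid-2010 Mac Mini / GeForce 320M"]
for name, score in sorted(scores.items(), key=lambda kv: kv[1]):
    print(f"{score:>6,}  {score / mini:4.1f}x  {name}")
```

The cheap desktop comes out at roughly 5x the Mini, which is the real story here.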
Keep in mind, DirectX 7 or 8 is when Microsoft got crazy serious about optimizing Windows for games, removing as many hoops as possible for applications to jump through. That was nearly a decade ago. Plus, video card manufacturers optimize far more for Windows than for OS X, since the Windows market dwarfs the Mac one. And game developers know DirectX much better than OpenGL, as DirectX has been the API in favor, by and large, for about seven years now.
The reality is that shrinking the performance gap to something more like 15-20% is going to take a few years of work from Apple, the GPU manufacturers, and game developers.