I'm still puzzled why people think the HD 3000 is inferior to the 320M. Every benchmark I've looked at, including games, suggests the 3000 is slightly faster in most cases. Is it embarrassing that the 3000 is only "slightly" faster than the aging 320M? Sure it is. But is that reason enough to give up Sandy Bridge and Thunderbolt?
Someone show me a current benchmark that demonstrates the 3000 being vastly inferior to the 320M. And then I'll shut up.
I haven't seen those benchmarks. But as a gamer, I'd appreciate it greatly if you could post one of those links here; then I'll shut up.
Certainly the consensus on this forum suggests otherwise.
FWIW, I posted these recent benchmarks (more games tested inside link) on another Mini thread a few days ago. They suggest HD3000 is relatively pants compared to 320M, in both OS X & Windows 7:
Stats below for MacBook Pro (2010) with 320M vs MacBook Pro (2011) with HD 3000, in both OS X & W7:
http://www.techyalert.com/2011/02/25/macbook-pro-2010-vs-macbook-pro-2011/
Overall, the 320M is better than the HD 3000, but what's interesting is how much better Windows 7 64-bit performance is than OS X 10.6.6 with both cards.
For example: In Windows 7 64-bit: Left 4 Dead (SP Campaign: No Mercy)
Low Settings
2011 MBP with Intel HD 3000 (avg fps) = 57
2010 MBP with nVidia 320M (avg fps) = 69
Medium Settings
2011 MBP with Intel HD 3000 (avg fps) = 49
2010 MBP with nVidia 320M (avg fps) = 64
High Settings
2011 MBP with Intel HD 3000 (avg fps) = 36
2010 MBP with nVidia 320M (avg fps) = 43
In OS X 10.6.6: Left 4 Dead (SP Campaign: No Mercy)
Low Settings
2011 MBP with Intel HD 3000 (avg fps) = 51
2010 MBP with nVidia 320M (avg fps) = 48
Medium Settings
2011 MBP with Intel HD 3000 (avg fps) = 41
2010 MBP with nVidia 320M (avg fps) = 44
High Settings
2011 MBP with Intel HD 3000 (avg fps) = 27
2010 MBP with nVidia 320M (avg fps) = 30
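If it helps put those numbers in perspective, here's a quick Python sketch that just takes the averages quoted above and turns them into percentage gaps (nothing beyond the figures in the linked test), so the 320M-vs-HD 3000 and Windows-vs-OS X comparisons are easier to eyeball:

```python
# Average fps from the Left 4 Dead run quoted above,
# keyed by OS, settings tier, and GPU.
fps = {
    "Windows 7 64-bit": {
        "Low":    {"HD 3000": 57, "320M": 69},
        "Medium": {"HD 3000": 49, "320M": 64},
        "High":   {"HD 3000": 36, "320M": 43},
    },
    "OS X 10.6.6": {
        "Low":    {"HD 3000": 51, "320M": 48},
        "Medium": {"HD 3000": 41, "320M": 44},
        "High":   {"HD 3000": 27, "320M": 30},
    },
}

def pct_faster(a, b):
    """How much faster a is than b, as a percentage."""
    return (a - b) / b * 100

# 320M vs HD 3000, per OS and settings tier.
for os_name, tiers in fps.items():
    for tier, cards in tiers.items():
        gap = pct_faster(cards["320M"], cards["HD 3000"])
        print(f"{os_name:17s} {tier:6s}: 320M is {gap:+5.1f}% vs HD 3000")

# Same data, but Windows 7 vs OS X for each card.
for tier in ("Low", "Medium", "High"):
    for card in ("HD 3000", "320M"):
        gap = pct_faster(fps["Windows 7 64-bit"][tier][card],
                         fps["OS X 10.6.6"][tier][card])
        print(f"{card:8s} {tier:6s}: Windows 7 is {gap:+5.1f}% vs OS X")
```

Run that and you can see the 320M's lead in Windows is in the roughly 20-30% range, while under OS X the two trade blows within single digits, and Windows 7 is ahead of OS X on both cards at every tier in this particular test.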