
Serelus (original poster):
Well, the title says it all.
I've been wondering about this for a while, but I'm not technologically inclined enough to judge the specifications myself.

Is it better or not?
 
That's not correct. The GeForce 320M has 48 shader cores, whereas the 9600M GT has 32.

More importantly, I think you're looking at the 3DMark numbers on Notebookcheck for each card, but the 320M score is from 3DMark06 while the 9600M score is from 3DMark05. That's why there's such a large discrepancy.

Using ORB, a 9700M GT gets about 6100 in 3DMark06, and the 320M gets 4700. The 9700M is roughly 10% faster than the 9600M, so by some guesstimation the 9600M would probably score about 5500 in 3DMark06 (there are no 9600M results for 3DMark06 in ORB).
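
Spelling that guesstimate out (a quick Python sketch, using only the ORB figures just quoted):

```python
# ORB figures quoted above (3DMark06 points)
score_9700m_gt = 6100
score_320m = 4700

# The 9700M GT is said to be ~10% faster than the 9600M GT,
# so back the 9600M GT estimate out of the 9700M GT score.
est_9600m_gt = score_9700m_gt / 1.10
gap = (est_9600m_gt - score_320m) / est_9600m_gt

print(f"estimated 9600M GT: ~{est_9600m_gt:.0f}")  # ~5545, i.e. "about 5500"
print(f"320M trails it by about {gap:.0%}")        # ~15%
```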

So I'd say they're pretty comparable. Odds are the GeForce 320M chipset takes a significant clock-speed hit because it needs to be kept cool, and the lack of dedicated memory hurts. But there are 16 extra shaders over the 9600M/9700M to make up for it.

The bottom line is: if you would have been happy with the 9600M GT in the previous 15"/17" MBPs, you shouldn't have a problem accepting the 320M chipset in the 13". They are very close to one another.
 
The 320M also appears to overclock really easily. Plus, since it's already running at low voltage and temperature, there's very little risk. Although there are a few reasons to dislike the 13", the 320M is not one of them.
 
The 9600M GT had dedicated VRAM, which is a big bonus. Macworld showed it to have about a 50% advantage in Call of Duty 4 (60 fps on the 2009 15" 2.66 GHz model vs. 39.1 fps on the 2010 13" 2.66 GHz model).
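
(For what it's worth, checking that figure against the quoted numbers with a trivial Python sketch:)

```python
# Macworld fps figures quoted above
fps_9600m_gt, fps_320m = 60.0, 39.1
print(f"advantage: {fps_9600m_gt / fps_320m - 1:.0%}")  # ~53%, so "about 50%" holds
```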
 
This is probably one of the more telling tests, although the faster processor would account for some of the difference.
 
5340 3DMark06 points here with my 320M overclocked to 730 MHz / 1309 MHz.
The MacBook Pro is running cool to the touch, and Splinter Cell: Conviction and StarCraft II run decently.
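
Against the ~4700 stock ORB figure quoted earlier in the thread, that overclocked score works out to (a rough back-of-the-envelope check in Python):

```python
# 3DMark06 points: stock figure from ORB (quoted earlier) vs. the overclocked run
stock, overclocked = 4700, 5340
print(f"gain from the overclock: {overclocked / stock - 1:.0%}")  # ~14%
```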
 
It's highly dependent on the game and the settings. You can't say, based on one game, how much better the 9600M is, or how much better the 330M is than the 320M.

There are some games under certain settings where a 330M can keep up with an ATI 5650, but in other games under different settings the ATI 5650 completely destroys the 330M (literally twice as fast).
 
That's not true. The 5650 is the better card, but not by much, if you look at the links below:

http://www.notebookcheck.net/NVIDIA-GeForce-GT-330M.22437.0.html
http://www.notebookcheck.net/ATI-Mobility-Radeon-HD-5650.23697.0.html

It depends on the game, but generally the 5650 doesn't get more than a few FPS more.
 
The ATI 5650 destroys the 330M and is sometimes twice as fast, especially compared to the slower 330M that is in the MBP. And the funny thing is, the ATI 5650 uses a lot less power than a 330M :rolleyes:

http://www.notebookcheck.net/Computer-Games-on-Laptop-Graphic-Cards.13849.0.html

Now remember, this 330M version is a faster one than the one used in the MBP.

Bad Company 2: ATI 5650 = 15 fps (ultra), 330M = 9 fps (ultra)
CoD6: ATI 5650 = 40 fps (high), 330M = 30 fps (high)
Sims 3: ATI 5650 = 132 fps (high), 330M = 49 fps (high)
GTA4: ATI 5650 = 43 fps (medium), 330M = 36 fps (medium)
Far Cry 2: ATI 5650 = 44 fps (high), 330M = 34 fps (high)
Crysis: ATI 5650 = 51 fps (medium), 330M = 39 fps (medium)
World in Conflict: ATI 5650 = 37 fps (high), 330M = 29 fps (high)
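
(Running those exact figures through a quick Python sketch to get the per-game ratios:)

```python
# FPS figures exactly as listed above: (ATI 5650, GT 330M)
results = {
    "Bad Company 2 (ultra)":    (15, 9),
    "CoD6 (high)":              (40, 30),
    "Sims 3 (high)":            (132, 49),
    "GTA4 (medium)":            (43, 36),
    "Far Cry 2 (high)":         (44, 34),
    "Crysis (medium)":          (51, 39),
    "World in Conflict (high)": (37, 29),
}

for game, (ati, nv) in results.items():
    print(f"{game}: 5650 is {ati / nv:.2f}x the 330M")

# Output ranges from ~1.2x (GTA4) to ~2.7x (Sims 3); most games sit
# around 1.2x-1.4x, with "twice as fast" only at the extremes.
```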

But there are a few games where the 330M can perform close to the ATI 5650. (Which is why Nvidia sucks: their cards are garbage for what you get in power, price, and performance, and it's why Fermi will fail too. Their top card, the GTX 480, uses the same power as two HD 5870s while performing about the same as a single HD 5870, and it's also more expensive. Nvidia is a bad GPU maker.)

edit: We need to see more games to judge how much faster the 330M and 9600GT are relative to the 320M. As you can see from the ATI 5650 vs. 330M example, it depends on the game.
 
Yeah, it is faster, but the main thing in the MBPs is the automatic graphics switching ;) I'm not sure whether that's possible with the ATI cards. You listed games where there is a difference, but from what I looked at, there were many games where performance was about the same. But yeah, it is the better card.
 