Why are people defending the argument that the 9600M GT isn't that much better than the 8600M GT with random details that are hardly relevant? BareFeats, NotebookReview, and PC Magazine say the 9600M GT is almost twice as fast as the 8600M GT. There are a couple of reasons for this: 1. the GPU is much better; 2. Nvidia chipsets help graphics performance.
 
Why are people defending the argument that the 9600M GT isn't that much better than the 8600M GT with random details that are hardly relevant? BareFeats, NotebookReview, and PC Magazine say the 9600M GT is almost twice as fast as the 8600M GT. There are a couple of reasons for this: 1. the GPU is much better; 2. Nvidia chipsets help graphics performance.

The largest contributing factor for the 9600M GT is the 1600 MHz FSB. That's it. From what I've heard, it would only be about a 3% improvement if it were clocked down to an 800 MHz bus.

That is why you'll see the ~40% increase in performance.
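To put rough numbers on the bandwidth point, here's a quick back-of-the-envelope calculation (the 1400/1600 MHz effective memory clocks are the commonly quoted reference specs for these cards, not necessarily Apple's shipping clocks):

```python
# Back-of-the-envelope memory bandwidth: bus width (in bytes) x effective
# data rate. Clocks below are reference figures, used for illustration only.

def bandwidth_gbs(bus_bits, effective_mhz):
    """Theoretical peak memory bandwidth in GB/s."""
    return bus_bits / 8 * effective_mhz * 1e6 / 1e9

bw_8600m = bandwidth_gbs(128, 1400)  # 8600M GT: 128-bit bus
bw_9600m = bandwidth_gbs(128, 1600)  # 9600M GT: 128-bit bus
print(f"8600M GT: {bw_8600m:.1f} GB/s")
print(f"9600M GT: {bw_9600m:.1f} GB/s ({bw_9600m / bw_8600m - 1:.0%} more)")
```

On these reference clocks that works out to roughly 22 vs 26 GB/s, a ~14% difference, nowhere near a 2x gap.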

The 8600 GT is a great card. I had one in my Windows machine and now in my MacBook Pro. It's just the evolution of tech that some things get faster over time.

I don't know if, for daily use, you'll really see a big improvement. Games and really intense graphical programs are another story.

Fear not, my fellow 8600 users. We still outnumber those 9600 people.

P.S. It's not "much better." It only allows data to travel in and out quicker; the difference in processing power is too small to notice.
 
This is what I see: the laptop can't do SLI because the chips are not the same. Apple set the laptop up so one chip turns on and the other turns off depending on whether the task is gaming or saving battery. SLI in that thin a laptop would give you first-degree burns on your lap and drain the battery fast. I think Nvidia gave Apple the chips practically for free. Oh yeah: 9600M GT = overclocked 8600M GT with stability.

And then it's clocked back down to 8600M GT speeds so it doesn't overheat :p
 
First of all, I'd like to thank everyone in this thread for keeping me on my toes. I had to go back and re-read a ton to arrive at the conclusion that I have. :D

The 2.4 GHz MBP comes with 256 MB and the 2.5 GHz comes with 512 MB. I figured you'd know that since you seem to understand so much on the subject. I know there would be very little difference in gaming performance (but I never said there would be in the first place), but there must be some positive effect from it.

I do know that; I'm saying the difference is from that processor bump and not the card. I personally feel the card is not clocked high enough (along with the aforementioned bus width) to make any use of the extra RAM.

I want everyone to take a look at this article showing the differences between two identical systems, one with a 256 MB 7800 GT and one with a 512 MB 7800 GT.
http://www.pureoverclock.com/article33.html
First off, we need to see the difference between the 7800 GT and the 8600M GT. The 7800 GT has a core clock fairly similar to the later-released 8600M (note that I'm not using the 9600M, as reliable performance numbers are not available at this time; also take note that Apple UNDERCLOCKED the 8600M, so if they do the same with the 9600M GT you will get even less performance than reported), but it also has the ever-important 256-bit bus width.

Here are your results:

[benchmark charts from the linked article]
And yes, I did leave out the charts that didn't support my point. :p
And remember, those WITH performance gains show them because of the bus width, IMO. It's just that this was the best I could find to directly compare two identical systems with only the card being swapped out.

Even with those advantages, even WITH the 256-bit bus width, you only see gains at around 2048x1536 with max settings, and the framerate in most games was not playable anyway. To top it off, the article notes that even with a massive 100 MB dump to system memory, performance didn't take a noticeable hit until afterward. What does that tell me?

1) Even with the 256-bit interface, clock speed is a major factor when it comes to VRAM utilization.
2) Dumping to system memory isn't going to ruin your numbers until you get above 100 MB.
3) The TYPE of memory used (512 MB of GDDR2 vs 256 MB of GDDR3) has an even bigger effect on the card.
4) In order to see the benefits, your other hardware (CPU, RAM, etc.) must exceed what a notebook can provide (without bursting into flames :D)
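For a sense of why that 100 MB spill threshold exists at all, here's a rough comparison of PCIe versus local VRAM bandwidth (standard PCIe 1.x and GDDR3 reference figures, used purely for illustration):

```python
# Why spilling past VRAM into system memory eventually hurts: the card then
# fetches data over PCIe, which is far slower than local VRAM. Figures are
# the standard PCIe 1.x and GDDR3 reference numbers, not measured values.

pcie_x16_gbs = 16 * 250 / 1000     # PCIe 1.x: 250 MB/s per lane, 16 lanes
vram_gbs = 128 / 8 * 1400e6 / 1e9  # 128-bit bus at 1400 MHz effective

print(f"PCIe x16: {pcie_x16_gbs:.1f} GB/s vs local VRAM: {vram_gbs:.1f} GB/s")
```

Once the working set spills, every fetch that misses VRAM runs at roughly a fifth of local speed, which is why performance eventually falls off a cliff rather than degrading immediately.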

With that in mind, I can honestly tell you that the extra RAM in the 9600 will do next to nothing, thanks to the bus width and the clock speed. (This is slightly dependent on Apple's habit of underclocking, but I'd stand to say you wouldn't see a boost even if they didn't.) To answer you directly: unless you are looking at a higher-clocked card coupled with a fast CPU and good RAM, the extra memory will give you next to nothing (<1%). It simply shifts the bottleneck.
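To illustrate why resolution alone doesn't come close to filling 256 MB, here's a rough framebuffer-size estimate (the buffer count and pixel format are simplifying assumptions, not measurements of any particular game):

```python
# Rough framebuffer cost at various resolutions: even at 2048x1536, the
# color and depth buffers are a few dozen MB, so whether 256 vs 512 MB of
# VRAM matters comes down to texture working set, not screen resolution.

def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    # buffers = front + back + depth/stencil (a minimal double-buffered setup)
    return width * height * bytes_per_pixel * buffers / 2**20

for w, h in [(1440, 900), (1920, 1200), (2048, 1536)]:
    print(f"{w}x{h}: {framebuffer_mb(w, h):.1f} MB")
```

Even the highest resolution tested comes in around 36 MB under these assumptions; the rest of VRAM is textures and geometry, which is where the clock speed and bus width bottlenecks bite.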

Wow, I came here to talk about Macs and I end up writing an English paper! :D
 
Why are people defending the argument that the 9600M GT isn't that much better than the 8600M GT with random details that are hardly relevant? BareFeats, NotebookReview, and PC Magazine say the 9600M GT is almost twice as fast as the 8600M GT. There are a couple of reasons for this: 1. the GPU is much better; 2. Nvidia chipsets help graphics performance.

Again, WRONG. They're basically THE EXACT SAME CHIP. Anyone claiming otherwise doesn't have a clue what they're talking about. It's just a die-shrunk 8600 GT.
 
Is it possible that when in 9600 mode it's using hybrid SLI? That could account for the large increase in speed over the 8600...
 
Again, WRONG. They're basically THE EXACT SAME CHIP. Anyone claiming otherwise doesn't have a clue what they're talking about. It's just a die-shrunk 8600 GT.

So basically what you are saying is that 9600 is to 8600 just like Penryn is to Merom.

To simplify it.
 
By having 512 MB of VRAM instead of 256, doesn't that mean the card can hold more textures in memory? Even if it only has a 128-bit bus, the extra "storage" should help a little in gaming? It feels like it can't be completely wasted..
 
Those results are seriously very, very flawed. There's no way a 9600M GT outperforms an 8600M GT by that much.

You have proof of this?

By having 512 MB of VRAM instead of 256, doesn't that mean the card can hold more textures in memory? Even if it only has a 128-bit bus, the extra "storage" should help a little in gaming? It feels like it can't be completely wasted..

Exactly. I feel like there's no way it couldn't make any positive difference. Surely it's beneficial in some way.

So yeah, anyone interested in a Feb. 2008 2.5 GHz MacBook Pro? :D
 
512 MB of VRAM can help in many, many situations. It may not be that important for some games, depending on resolution.

But what if you want to plug in one of those nice new ACDs and run a game at that resolution? The extra VRAM will help tremendously.
 
Is it possible that when in 9600 mode it's using hybrid SLI? That could account for the large increase in speed over the 8600...

Nope. That doesn't happen. Any big difference is due to flawed testing methodology. I'd suspect they're using results from different drivers, or different CPUs, etc. There's some variable there, and it's not the GPU. As someone mentioned, you could overclock an 8600 a bit and blow past a 9600 (not that I recommend doing that...the point is they're the same thing).

I'm really disappointed in how many 8600 parts Nvidia's released, and how the new ones have such deceptive names. The first real upgrade is the 9700GTS.

So basically what you are saying is that 9600 is to 8600 just like Penryn is to Merom.

To simplify it.

Less of a difference. The 8600 and 9600 are virtually the exact same thing. Penryn was actually around 10% faster at the same clock speed on average than Conroe.

You have proof of this?

Look at the specs. It's the same part. I'm sure it's been tweaked some besides the die shrink, but it's effectively the same thing physically.
 
Nope. That doesn't happen. Any big difference is due to flawed testing methodology. I'd suspect they're using results from different drivers, or different CPUs, etc. There's some variable there, and it's not the GPU. As someone mentioned, you could overclock an 8600 a bit and blow past a 9600 (not that I recommend doing that...the point is they're the same thing).

I'm really disappointed in how many 8600 parts Nvidia's released, and how the new ones have such deceptive names. The first real upgrade is the 9700GTS.



Less of a difference. The 8600 and 9600 are virtually the exact same thing. Penryn was actually around 10% faster at the same clock speed on average than Conroe.



Look at the specs. It's the same part. I'm sure it's been tweaked some besides the die shrink, but it's effectively the same thing physically.


Reminds me of the ATI Radeon 9600 and 9700 chipsets from the PowerBook G4 days.

Similar, minuscule evolution.
 
Good point. And those were also deceptively named...the desktop parts were really far apart :(
 
In light of your above information, I would add that doing such heavy lifting on a notebook is a little misguided; desktops can power through that stuff much faster. So while you would be better off with the extra RAM for heavy CAD applications, you'd be even better off spending that money on a desktop in the first place! :p

Very true. For the money of the middle MBP, you can buy a maxed iMac with the 8800, and a 3.06 GHz CPU, which will of course toast the MBP in everything. Or, go a long way towards a nice Mac Pro.

For the comparison, from Wikipedia (I know, I know!):
8600M GT versus 9600M GT
Stream processors: 32 vs 32
Power: 22 W vs 23 W
Core clock: 475 vs 500 MHz
Memory clock: 1400 vs 1600 MHz (effective)
Lithography process: 80 vs 65 nm
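Plugging those Wikipedia figures into a quick ratio check makes the on-paper gap concrete (reference specs, which may not match Apple's shipping clocks):

```python
# On-paper deltas between the 8600M GT and 9600M GT, from the spec list
# above. These are Wikipedia reference figures, not Apple's shipped clocks.
specs = {
    "stream processors": (32, 32),
    "core clock (MHz)": (475, 500),
    "memory clock (MHz)": (1400, 1600),
}
for name, (m8600, m9600) in specs.items():
    print(f"{name}: +{(m9600 / m8600 - 1) * 100:.0f}%")
```

The biggest single on-paper delta is the ~14% memory clock bump; nothing in the list supports anything close to a 2x gap on its own.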

So, yes, based on these numbers, you would conclude that the two cards are almost identical in performance. But we all know the 8600M GT is underclocked considerably in the old MBPs. Maybe, thanks to the smaller lithography process, the 9600M isn't underclocked as much. But whatever; it doesn't really matter. Clearly, from the reviews, there is something that makes the 9600 better than the 8600. Maybe it's just the faster memory and FSB.

But really, we're talking like 50% better in games, max. This is still not going to make Crysis playable on all-High, although it might make all-Medium playable, maybe even at native res. It's better than nothing, but it's still no gaming machine, although as always, it's an excellent all-round machine.

Having 512 MB of VRAM over 256 is going to help in a lot of cases, but the benefit is not going to be noticeable in most of them. Therefore it seems pointless to pay substantially more just for extra VRAM. So if for some reason you're going for an MBP predominantly for gaming, it seems silly to go for anything but the basic version. Of course, for non-gamers, there are extra perks that justify the cost: HDD, CPU, and RAM.
 
But whatever; it doesn't really matter. Clearly, from the reviews, there is something that makes the 9600 better than the 8600.

Far, FAR more likely, the reviews simply don't have a valid methodology to their tests. Most likely the results on the older system were with older drivers.

But really, we're talking like 50% better in games, max.

Impossible. Completely impossible.
 
So wait, I currently have a 2.5 GHz MBP with a 512 MB NVIDIA 8600M GT. If I upgraded, would it be better to go with the 2.4 GHz MBP (256 MB 9600M GT) or the 2.53 GHz (512 MB 9600M GT), since you said VRAM doesn't matter in MBPs?

Also, I could throw the 2.4 GHz MB into the mix, seeing how I only play games like EVE Online, Spore, and The Sims 2 on my notebook when I'm not at my desktop. I do some Photoshop and Dreamweaver CS3 work too. Plus I love the savings.

So do you think I should go 2.4 GHz MB (9400M), 2.4 GHz MBP (256 MB 9600M GT), or 2.53 GHz (512 MB 9600M GT)?
 
Why are people defending the argument that the 9600M GT isn't that much better than the 8600M GT with random details that are hardly relevant? BareFeats, NotebookReview, and PC Magazine say the 9600M GT is almost twice as fast as the 8600M GT. There are a couple of reasons for this: 1. the GPU is much better; 2. Nvidia chipsets help graphics performance.

Would you like to point out where that's stated? I'm very sure you're talking about the 9600 GT vs the 8600 GT (desktop versions).

This thread from Notebook Review says exactly the opposite: http://forum.notebookreview.com/showthread.php?t=307054&9600m


Is it possible that when in 9600 mode it's using hybrid SLI? That could account for the large increase in speed over the 8600...

The 9600M doesn't support Hybrid SLI. :)

Source: http://www.nvidia.com/object/hybridsli_notebook.html


You have proof of this?

A 9600M GT outperforms an 8600M GT by 15-25% at max. It's impossible to achieve double the performance in a 'fair' test, where all variables but the graphics card are constant.

A 9600 GT (desktop) achieves double the performance of an 8600 GT (desktop), as it has 64 vs 32 stream processors and a 256-bit vs a 128-bit memory bus. However, the 9600M GT and 8600M GT (notebook) both have only 32 stream processors and a 128-bit memory bus. With only a marginal boost in clock speeds, it is impossible for the 9600M GT to achieve such a wide performance margin over the 8600M GT in Crysis.
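The desktop-versus-mobile distinction is easy to sanity-check with rough shader throughput numbers (the shader clocks below are the usual reference values, assumed for illustration; Apple's shipped clocks may be lower):

```python
# Rough shader throughput: stream processors x shader clock x 3 FLOPs/cycle
# (the MADD+MUL figure Nvidia quoted for this generation). Shader clocks are
# approximate reference values, assumed here for illustration.

def gflops(stream_processors, shader_mhz):
    return stream_processors * shader_mhz * 3 / 1000

cards = {
    "8600 GT (desktop)":   gflops(32, 1190),
    "9600 GT (desktop)":   gflops(64, 1625),
    "8600M GT (notebook)": gflops(32, 950),
    "9600M GT (notebook)": gflops(32, 1250),
}
for name, g in cards.items():
    print(f"{name}: ~{g:.0f} GFLOPS")
```

Under these assumptions the desktop 9600 GT lands at roughly 2.7x the desktop 8600 GT, while the mobile parts differ by ~30% at most, which is exactly why the desktop benchmarks don't transfer.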


For the comparison, from Wikipedia (I know, I know!):
8600M GT versus 9600M GT
Stream processors: 32 vs 32
Power: 22 W vs 23 W
Core clock: 475 vs 500 MHz
Memory clock: 1400 vs 1600 MHz (effective)
Lithography process: 80 vs 65 nm


So, yes, based on these numbers, you would conclude that the two cards are almost identical in performance. But we all know the 8600M GT is underclocked considerably in the old MBPs. Maybe, thanks to the smaller lithography process, the 9600M isn't underclocked as much. But whatever; it doesn't really matter. Clearly, from the reviews, there is something that makes the 9600 better than the 8600. Maybe it's just the faster memory and FSB.

I'd like to point out that the actual memory clocks of the 8600M and 9600M are 700 and 800 MHz respectively (the 1400/1600 figures are the effective GDDR3 data rates). :p

Also, a similarly clocked 8600M/8700M/9500M/9600M/9650M/9700M GT should perform almost the same (<5% difference at max).
 
So wait, I currently have a 2.5 GHz MBP with a 512 MB NVIDIA 8600M GT. If I upgraded, would it be better to go with the 2.4 GHz MBP (256 MB 9600M GT) or the 2.53 GHz (512 MB 9600M GT), since you said VRAM doesn't matter in MBPs?

Why would you "upgrade" at all? The hardware's virtually identical. Just wait at least one more upgrade cycle, or whenever it is they do a real upgrade (or whenever you need more power).

Also, I could throw the 2.4 GHz MB into the mix, seeing how I only play games like EVE Online, Spore, and The Sims 2 on my notebook when I'm not at my desktop. I do some Photoshop and Dreamweaver CS3 work too. Plus I love the savings.

Okay, that would be a gigantic DOWNGRADE. Why would you do that when you already own an up to date system?
 
Why would you "upgrade" at all? The hardware's virtually identical. Just wait at least one more upgrade cycle, or whenever it is they do a real upgrade (or whenever you need more power).



Okay, that would be a gigantic DOWNGRADE. Why would you do that when you already own an up to date system?

Not to mention that it took Apple SEVEN MONTHS to get their drivers to stop COMPLETELY CRASHING, FREEZING THE ENTIRE MACHINE AND REQUIRING A HARD REBOOT, after the 8600M GT first came out. Seven months!!! (8600M GT debut: 7/6/07. Leopard Graphics Update 1.0: 2/11/08.)

Let me emphasize something: SEVEN MONTHS!

The real reason to upgrade to the new MBP is to have the "beautiful glossy screen" so you can pop your zits without a mirror and the "solid body construction" since that's the only way you're going to get a solid body.

But seriously, the faster RAM and FSB speeds are phenomenal boosts to performance. The ability to easily upgrade the hard drive is freaking awesome (ever tried to do it on the previous ones? hahah lol qq). Sure, the GPU is 25% faster, but then the RAM and FSB and CPU are also faster (esp. the 2.8 GHz model... drool...), so it's a solid upgrade.

What Apple needs to do is make a 17" model that has dual discrete GPUs as an option (such as dual 9600M GT or 9800M GT).
 
Not to mention that it took Apple SEVEN MONTHS to get their drivers to stop COMPLETELY CRASHING, FREEZING THE ENTIRE MACHINE AND REQUIRING A HARD REBOOT, after the 8600M GT first came out. Seven months!!! (8600M GT debut: 7/6/07. Leopard Graphics Update 1.0: 2/11/08.)

Seriously?!? I used one for about a week, and didn't have it crash, but who knows :( (On the Windows side I just installed Nvidia's regular reference drivers and that seemed okay, though I only messed with it for a few hours like that.)

I just want Blu Ray and an upgraded GPU...though I'd even settle just for Blu Ray :(
 
Graphics performance

All this concern over performance, and yet from a graphics perspective a PlayStation 3 has to be 10x quicker than the MBP for a fraction of the cost.

Non-shameful graphics performance in a laptop is a bonus, but it can't be taken seriously as an intensive 3D gaming platform.
 
All this concern over performance, and yet from a graphics perspective a PlayStation 3 has to be 10x quicker than the MBP for a fraction of the cost.

Well...yes and no. Really, the 8600GT is pretty similar to the GPU in the PlayStation 3 overall (and overall better than the 360's GPU). It's got less brute force in some ways, but more programmability, and ends up pretty similar on the PC. Better in some games, worse in others (particularly older games).

EXCEPT, since games get super tailored for a particular piece of hardware on the consoles, a vastly less powerful piece of hardware can do results that end up looking pretty nice, and maybe pretty comparable. I mean even the PS2 is able to do some graphics that look pretty nice (Tomb Raider: Anniversary on my HDTV actually looks pretty amazing...though I still ended up going for the 360 version when that came out!)

So the end result is the PS3/360 will probably end up having much better-looking games than the MacBook Pro can run...but only because developers can target a particular piece of hardware and exploit it for all it's worth, really designing the art and everything around it. On the other hand, my GeForce 4 was more powerful than any of the last-gen systems, and it was usable and looked good for years. It was able to pump out visuals pretty comparable to earlier 360 games...so maybe the era of that type of thing is over...oh well.
 
So wait, I currently have a 2.5 GHz MBP with a 512 MB NVIDIA 8600M GT. If I upgraded, would it be better to go with the 2.4 GHz MBP (256 MB 9600M GT) or the 2.53 GHz (512 MB 9600M GT), since you said VRAM doesn't matter in MBPs?

Also, I could throw the 2.4 GHz MB into the mix, seeing how I only play games like EVE Online, Spore, and The Sims 2 on my notebook when I'm not at my desktop. I do some Photoshop and Dreamweaver CS3 work too. Plus I love the savings.

So do you think I should go 2.4 GHz MB (9400M), 2.4 GHz MBP (256 MB 9600M GT), or 2.53 GHz (512 MB 9600M GT)?

If you are not going to buy the 2.8 GHz processor (and even then...), go with the base MBP if you must buy. In all reality, you're fine with what you have. Buying a MacBook would reduce your performance substantially.
 
Well...yes and no. Really, the 8600GT is pretty similar to the GPU in the PlayStation 3 overall (and overall better than the 360's GPU). It's got less brute force in some ways, but more programmability, and ends up pretty similar on the PC. Better in some games, worse in others (particularly older games).

EXCEPT, since games get super tailored for a particular piece of hardware on the consoles, a vastly less powerful piece of hardware can do results that end up looking pretty nice, and maybe pretty comparable. I mean even the PS2 is able to do some graphics that look pretty nice (Tomb Raider: Anniversary on my HDTV actually looks pretty amazing...though I still ended up going for the 360 version when that came out!)

So the end result is the PS3/360 will probably end up having much better-looking games than the MacBook Pro can run...but only because developers can target a particular piece of hardware and exploit it for all it's worth, really designing the art and everything around it. On the other hand, my GeForce 4 was more powerful than any of the last-gen systems, and it was usable and looked good for years. It was able to pump out visuals pretty comparable to earlier 360 games...so maybe the era of that type of thing is over...oh well.

Well, I see your point, but in terms of raw power:

9600M GT = 120 GFLOPS
PS3 = 6 × 204 GFLOPS per Cell SPU = 1224 GFLOPS (total system: 2 TFLOPS)

So about 10x the raw power.

What do you think? Am I comparing apples and pears? ;)

Sources: Wikipedia
http://en.wikipedia.org/wiki/GeForce_9_Series#Technical_Summary
http://en.wikipedia.org/wiki/PlayStation_3_hardware
 