Alright, all you gfx card gurus out there. wolfpup and eidorian, I'm eyeing you ;)
We know that the 9600M GT is not that much of an improvement over the 8600M GT, but SLI that with the 9400M and does performance go up in leaps and bounds? Would it be worth the upgrade? And would you finally be able to play Crysis at full res on a 30'' display?!

And does anybody know if the new MBP will automatically use both gfx cards when doing intensive graphics like playing a game, and shut down the discrete card when doing normal tasks?

Just to cap things off... Let's clear this up a bit.

First, a briefing for those who didn't know. The previous MacBook Pro's logic board included a chipset from Intel (the Santa Rosa platform), a dedicated graphics processor (the 8600M GT) and the common Intel Core 2 Duo processor.

In the new MacBook Pro, Apple has made some tweaks and switched many of the internal components to cut costs and somewhat increase performance. Instead of the Intel chipset, Apple has chosen to go with NVIDIA's GeForce 9400M (a combined chipset with an integrated video card, filling a similar role to Santa Rosa). To boost graphics performance they've added the 9600M GT GPU, and then thrown in the common Intel Core 2 Duo processor.

According to the slightly misleading keynote, the new MacBook Pro has five times the GPU power that the Intel chipset graphics offers; compared to the 8800M GT, it has approximately 88% of the power, and all this before you even switch on the 9600M GT. OK, this sounds really impressive if you're using the machine for everyday tasks.

However...

I suspect there's a very valid reason why the MacBook Pro 17'' remains unchanged for now. Does anyone know why? I believe it's mainly because some applications are optimised to run better on Intel chipset-based systems, while others will just run at 'X' speed regardless.


Currently only the new MacBook Air, MacBook and 15'' MacBook Pro use NVIDIA's GeForce 9400M; everything else is still very much based on Intel's chipset. Well, I don't know about the rest of you, but I'd choose an Intel chipset for my system any day!
 
I'm still perplexed as to why there wasn't a 9650M. From what I gather it's a drop-in replacement, right?
 
This is what I see: the laptop can't do SLI because the chips are not the same. Apple set the laptop up so one chip turns on and one turns off depending on the task, to game or to save battery. SLI in that thin a laptop would give you first-degree burns on your lap and drain the battery fast. I think NVIDIA gave Apple the chips for practically free. Oh yeah, the 9600M GT = an overclocked 8600M GT with better stability.

The chips not being the same has nothing to do with it, because Hybrid SLI is a new technology from NVIDIA that pairs the NVIDIA integrated mobile GPU with an NVIDIA dedicated graphics card. Naturally, an integrated GPU chip and a dedicated GPU card are not going to be the same. It's based on SLI, but it's not the same thing as SLI on a desktop PC.

Sony has had dual-graphics solutions using the Intel integrated graphics and a dedicated graphics card for a while now, and you're able to switch back and forth between the two using a little switch on the laptop. This is something different, even though it offers the same benefit as far as conserving power goes.

Hybrid SLI® technology, based on NVIDIA’s industry-leading SLI technology, delivers multi-GPU (graphics processing unit) benefits when an NVIDIA® motherboard GPU is combined with an NVIDIA discrete GPU. Hybrid SLI increases graphics performance with GeForce® Boost and provides intelligent power management with HybridPower™.
 
The same-chip argument is not true, as has been stated before in this thread. NVIDIA has had Hybrid SLI technology for some time now, which can combine the power of unequally powerful cards. In this case it can combine integrated and discrete graphics chips.

The question still remains: is this going to be enabled in Mac OS X?

I think it will be possible under XP/Vista via Boot Camp; there is no technological reason why it shouldn't be. It's all about drivers.

Let's wait for detailed reviews and tests before we jump to conclusions.
 
New MB vs. Old MBP.

The 8600M GT should still be noticeably better than the 9400M.

Is this really the case?

I'm basically sitting here, twiddling my thumbs, going back and forth between picking up the old MBP ($1,400) or the new MB ($1,499 w/ student discount), and this is one of the main factors weighing on my decision. I think I'm leaning towards the old MBP, and from the looks of it, the 8600M GT should still edge out the 9400M. Is this really the case? Anyone know how big the marginal difference would be?

Also, are the displays pretty much identical (sans glossy)? Is the 'instant-on' LED a new feature with this line?

In the end, the new design didn't seduce me, the loss of FireWire is annoying, and the DisplayPort and touchpad aren't enough to warrant picking it over the old MBP. What do you guys think?
 
http://www.notebookcheck.net/Mobile-Graphics-Cards-Benchmark-List.844.0.html

The 8600M GT destroys the 9400M... not to mention the 9400M uses shared memory.
 
Max resolution of 9400M vs. 9600M

Am I going to be able to use my 30-inch Dell monitor with the NVIDIA 9400M? Do you think the supported resolution would be greater with the 9600M?

Thanks for your help.
 
In SLI, doesn't it downclock the higher-powered card to the speed of the lower-powered card? Wouldn't that kind of defeat the point in this case, since the 9600M alone would be more powerful than two 9400Ms?

I'm not saying I'm right or that you're wrong. I'm asking a question because I don't know the answer.
 
I am very curious to see whatever it is Barefeats reveals.

I recently got the old MBP, and while I am in no way upgrading, I am just curious to see what kind of real-world performance I would see, as I am not 100% convinced of Apple's claim of 85% better performance.
 
Only a single GPU is used at a time. There is a preference in System Preferences you can use to switch between GPUs. You must log out for the change to take effect.
 
As far as I know it doesn't downclock the faster card. It just distributes what needs to be rendered between the cards according to their power, just as computation can be distributed among multiple processors or processor cores.

That is the potential beauty of the Hybrid SLI solution: on battery, use the integrated GPU; need more power, use the discrete one; need even more power, use both.
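To make the idea concrete, here is a toy sketch of the load-balancing concept behind GeForce Boost: work goes to each GPU in proportion to its throughput, rather than the faster card being held back to match the slower one. The function name and the throughput numbers are made up for illustration; NVIDIA's actual driver scheduling is not public.

```python
# Toy illustration of GeForce Boost-style load balancing: work is split
# between GPUs in proportion to their relative throughput, instead of
# downclocking the faster GPU to match the slower one.
# Throughput figures are made-up placeholders, not real benchmarks.

def split_work(total_units, throughputs):
    """Divide total_units of rendering work proportionally to each
    GPU's throughput; any rounding leftover goes to the fastest GPU."""
    total_throughput = sum(throughputs.values())
    shares = {
        gpu: int(total_units * t / total_throughput)
        for gpu, t in throughputs.items()
    }
    leftover = total_units - sum(shares.values())
    fastest = max(throughputs, key=throughputs.get)
    shares[fastest] += leftover
    return shares

# Hypothetical relative throughputs (arbitrary units):
gpus = {"9400M": 1.0, "9600M GT": 3.0}
print(split_work(1000, gpus))  # the 9600M GT takes roughly 3/4 of the work
```

The point is that both GPUs contribute at the same time, which is what distinguishes GeForce Boost from a simple either/or switch.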
 
I wish.

Only one GPU can be used at a time; they can't work together for extra power. It would have been cheaper for Apple to just let users slow down the clock speed of the 9600M in the preferences rather than having them select between two GPUs.
 
GeForce Boost
GeForce Boost turbocharges the performance of NVIDIA discrete GPUs when combined with NVIDIA motherboard GPUs. Plug any NVIDIA Hybrid SLI-enabled GPU into any NVIDIA Hybrid SLI-enabled motherboard to enjoy additive performance and more for your money.

http://www.nvidia.com/object/hybridsli_notebook.html

Hopefully this will be supported soon, or when Snow Leopard is released. My guess is that Apple opened up the possibility by adding two GPUs and has now passed the ball to NVIDIA.
 
The GeForce Boost feature doesn't use both GPUs; it's just an easier way to stick in a faster dedicated GPU.

If I'm wrong, please correct me. If I'm wrong it'll put a smile on my face :cool:
 
I'm just going to dig up my post about VRAM in case people are still confused on it:

"To explain it further, I'll pull a post straight from the notebook review forums:

"1.Is a GeForce 8600M-GT 256MB DDR2 better than a GeForce 8600M-GT 512MB DDR2?

This is a very common train of thought – more video memory must mean better performance. This is not true – the video card itself matters much more than the memory it has.
In this case, both cards have the same performance. The 8600M-GT DOES NOT HAVE ENOUGH POWER to use more than 256MB of memory. It has a limited 128-bit memory bus. Only cards with a 256-bit bus or greater are going to be able to use more than 256MB of memory. It is not worth spending any extra money on a mid-range card like the 8600M-GT with more memory. There is no performance gain to justify the price.
Why can't it use more memory effectively? Here's a primitive example. An office worker can use a maximum of three computers at a time. If he is given an additional three computers, is he any more productive? No, because he can only use three of them to begin with. The extra three do nothing."

Take note that the 9600 has a 128-bit interface like the 8600."

The 9600 will be faster than the 8600 by a fair bit due to the higher clock speed, but a 9600 with 512MB of VRAM will be no faster than a 9600 with 256MB.
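The bus-width argument above comes down to simple arithmetic: peak memory bandwidth is bus width times effective memory clock, and adding VRAM doesn't change either factor. A quick sketch; the clock figure here is an illustrative ballpark for mobile GDDR3 of that era, not an official spec.

```python
# Peak memory bandwidth = (bus width in bits / 8) * effective memory clock.
# The 1400 MHz effective clock is an illustrative ballpark value, not an
# official specification for any particular card.

def bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    """Peak memory bandwidth in GB/s."""
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * effective_clock_mhz * 1e6 / 1e9

# A 128-bit bus caps how fast the GPU can touch VRAM, no matter how much
# VRAM is soldered on:
print(bandwidth_gb_s(128, 1400))  # 22.4 GB/s
print(bandwidth_gb_s(256, 1400))  # 44.8 GB/s, same clock but a wider bus
```

Notice that VRAM capacity appears nowhere in the formula, which is why doubling the memory on a 128-bit card buys essentially nothing.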
 
Do you have any numbers/benchmarks/etc to back that up, other than a forum post? I mean obviously there will be virtually no speed differences, but no difference at all? This shows otherwise: http://www.macworld.com/article/132330/2008/03/macbookpro_bench.html
 
I would say that modern graphics cards are getting ahead of actual industry needs: the 8600M GT is definitely enough for any game you want to play on your laptop. 1024x768 (or sometimes 800x600) looks fine on a 15-inch screen, so you can even max out the details.
 