
Just curious how this card would compare to an Nvidia 7950 GTX 512 mobile in a Dell XPS.
 
An XPS M1710 with a 7950 GTX and a T7600G would kill the 8600M GT; it's not really comparable. The 8600M is a good card for a 15", though.
 
I figured it would... So how outdated is the X1600 now?
 
The Dell is probably a better raw desktop replacement and also targets gamers more. The MBP can't compete with its graphics card; the 8600M might have DX10 compatibility, but I think people forget that an 8600M won't exactly be able to run DX10 games at 1440x900 (or more) with any kind of AA/AF on. It's not a high-end card, but a solid mid-range card.

The MBP is much more portable though, and I struggled to find a 15" Dell with a dedicated card that wasn't an old ATI card [for the same price as the MBP, not more costly].
 
Don't expect to be playing Crysis at 1920x1200 on an 8600M.

However, it seems to be a decent card, good enough to run current games and probably will be for a while yet. I don't quite understand the 'it won't run DX10 games very well' mentality, because I assume that not every DX10 game has to be as demanding as Crysis or Alan Wake :confused:
 
I have a dumb question.

I'm getting a 15" SR MacBook Pro on Monday. Obviously it has a resolution of 1440x900 pixels. If I play a game, Battlefield 2 for example, and set the resolution 1024x768, will the video card still be under just as much strain as if I set the resolution to 1440x900? Or is it a pretty easy job for the GPU to stretch the 1024x768 image to fill the entire 1440x900 screen?
 
It'll be easier to drive because it's a lower resolution. However, it might not look very good because it's a non-native resolution.
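
For a rough sense of the difference, here's a back-of-envelope pixel-count comparison (just an estimate; it assumes the GPU's fragment work scales roughly with pixel count and ignores geometry, shader cost, and CPU limits):

# Back-of-envelope: pixels the GPU has to shade per frame at each resolution.
# Assumes load scales roughly with pixel count; real games also depend on
# geometry, shader cost, and CPU limits.
native = 1440 * 900      # 1,296,000 pixels
lowered = 1024 * 768     #   786,432 pixels
print(f"native has {native / lowered:.2f}x the pixels of 1024x768")  # ~1.65x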
 
Ah. OK, thanks. Maybe I'll just not full-screen it then...
I doubt you'll worry so much about it once things are moving, though. With a static image upscaled from 1024x768 to 1440x900 you might see that some pixels are "doubled", but in a game where things move around I doubt it's very noticeable. If it increases your frame rate to something you're more comfortable with, I'd say go for it anyway.
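
To put a number on the "doubled" pixels (a sketch only: it assumes plain nearest-neighbour scaling, whereas the actual panel/GPU scaler typically filters, so in practice edges blur rather than hard-double):

# Nearest-neighbour upscale from 1024 to 1440 columns: scale factor 1440/1024 = 1.40625,
# so a fraction of the source columns get drawn twice.
src_w, dst_w = 1024, 1440
hits = [0] * src_w
for x in range(dst_w):
    hits[int(x * src_w / dst_w)] += 1
doubled = sum(1 for h in hits if h == 2)
print(f"{doubled} of {src_w} source columns are drawn twice")  # 416 of 1024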
 
What's the true difference, in percent, between the ATI X1600 and the Nvidia 8600M GT?
 
Barefeats - Santa Rosa 3D Shootout

The new MacBook Pro with the GeForce 8600M runs 3D accelerated games significantly faster than the previous 2.33GHz model with the Mobility Radeon X1600. For example, Quake 4 ran 60% faster, Prey 55% faster, and Doom 3 ran 38% faster. The "Rosa" even beat the 4-Core Mac Pro desktop with a GeForce 7300 GT in 4 out of 5 of our tests.
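
To translate those percentages into frame rates: "X% faster" just means the new frame rate is (1 + X/100) times the old one. A quick illustration (the baseline FPS values below are made up; only the percentages come from the quoted Barefeats results):

# "X% faster" => new_fps = old_fps * (1 + X / 100).
# old_fps values here are hypothetical; percentages are from the quoted Barefeats tests.
for game, old_fps, pct in [("Quake 4", 40, 60), ("Prey", 40, 55), ("Doom 3", 50, 38)]:
    new_fps = old_fps * (1 + pct / 100)
    print(f"{game}: {old_fps} fps (X1600) -> {new_fps:.0f} fps (8600M GT)")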
 
I've run quite a few games at 1280 x 1024 or 1366 x 768 on my iMac 20" Core Duo. They looked just fine stretched.

A lot of newer games come with widescreen resolutions as well. Even if it's slightly lower than native, you won't get the stretched-out look. It only seems to happen in the menus, at that.
 
The Dell is probably a better raw desktop replacement and also targets gamers more. The MBP can't compete with its graphics card; the 8600M might have DX10 compatibility, but I think people forget that an 8600M won't exactly be able to run DX10 games at 1440x900 (or more) with any kind of AA/AF on. It's not a high-end card, but a solid mid-range card.

The MBP is much more portable though, and I struggled to find a 15" Dell with a dedicated card that wasn't an old ATI card [for the same price as the MBP, not more costly].

Yes, but what about running it via an external monitor at 1024x768 or 1280x1024 instead of 1440x900?
 
To give you a very general idea, I'm able to run Dark Messiah of Might and Magic on Vista on my 15-inch MBP at native resolution, with high-quality textures, high-quality models, full reflections and shadows, and HDR lighting enabled, with absolutely no stuttering or problems.

Dark Messiah is based on an upgraded Source engine.

Also, I'm able to run NVIDIA's Mad Mod Mike demo perfectly, and the new 8xxx organic tower one in lite mode.

I can't get 3DMark to run, though.
 
That's why I said "on par with". It wins out in some things and loses in others.
Right, I understand. Really it needs more ROPs and more memory bandwidth.

DX10 is a pretty huge point.
So, um, why is everyone so happy about Apple's support of an MS format? I would think supporting the next version of OGL is the top priority, not whether it supports DX10.
 
Well, some of us are gamers and run Windows.

The other thought that occurs to me is that with OpenGL 3 due later this year, it doesn't hurt to have the hardware in place and ready :)

Another feature of DX10 is the start of the "virtualized GPU" hardware, so it has some potential for helping with software like Parallels and VMware. Of course, that's pretty basic in DX10; it seems more likely that DX11 or DX12 will bring the extensions we really need for high-performance virtualization, but hey, I'll take what I can get.

Beyond that... I'm just happy it's a decent GPU. I wish Apple had gone for 256MB/512MB options, which is where most of the Windows vendors seem to be going with their systems, but hey, at least it's not an X1600...
 
I have another question, this one slightly less dumb. How much graphics processing is required for anti-aliasing in a game? Especially in relation to stuff like high quality textures/geometry (when playing a game, I would probably be willing to sacrifice some texture quality for the sake of better anti-aliasing, as I think AA really improves the gaming experience). And is anti-aliasing more graphics processor-intensive or graphics memory-intensive?
 
Memory bandwidth intensive.
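
A rough illustration of why: with multisample AA the colour and depth buffers are stored (and read/written every frame) per sample, so the cost grows with the sample count. A simplified estimate at 1440x900 (it assumes 32-bit colour plus 32-bit depth/stencil per sample and ignores the compression tricks real GPUs use):

# Simplified MSAA buffer-size estimate; real GPUs compress these buffers heavily.
width, height = 1440, 900
bytes_per_sample = 4 + 4   # 32-bit colour + 32-bit depth/stencil
for samples in (1, 2, 4):
    mb = width * height * samples * bytes_per_sample / (1024 ** 2)
    print(f"{samples}x: ~{mb:.0f} MB of colour+depth to read/write each frame")  # ~10 / ~20 / ~40 MB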

Well, some of us are gamers and run Windows.

The other thought that occurs to me is that with OpenGL 3 due later this year, it doesn't hurt to have the hardware in place and ready :)

Another feature of DX10 is the start of the "virtualized GPU" hardware, so it has some potential for helping with software like Parallels and VMware. Of course, that's pretty basic in DX10; it seems more likely that DX11 or DX12 will bring the extensions we really need for high-performance virtualization, but hey, I'll take what I can get.

Beyond that... I'm just happy it's a decent GPU. I wish Apple had gone for 256MB/512MB options, which is where most of the Windows vendors seem to be going with their systems, but hey, at least it's not an X1600...

Oh, don't get me wrong, I think the GPU is great. I just find it funny that the excitement is over something that, admittedly, is only truly useful in Windows (as in, why buy a Mac if you have to run Windows to take advantage of the features?). I can't wait for OGL 3; I hope it's incorporated into Leopard right off the bat.
 