ATI is hardly behind at the moment. The 4870 (desktop) matches the GTX 260, and the 4870X2 should easily match the GTX 280. The 4 series is so good that Nvidia actually cut prices: the GTX 280 from $650 to $500, and the GTX 260 from $400 to $300. Nvidia had no idea they would be this good.

I would hope for an Nvidia GPU in the next revision, as the 3650 is the 2600 with DirectX 10.1. Now you might say "OSX doesn't use DirectX!", and you're absolutely right. There would be no added benefit. The 9600GT is what we're going to get.

It's funny that some of you guys are convinced it's a horrible card when it's actually a mid-range card. Before I built my current gaming computer, I was using a GeForce2 MX alongside my G4, and an FX 5200 in an iMac. THOSE were garbage cards, even for their time.

Compared to the 8800GTS in my computer, the 8600GT isn't much. But it's miles ahead of where you think it is.

As someone said above me, if the X1600 gets 35FPS at medium settings, that's great. 30FPS is considered easily playable; anything more is just icing on the cake.
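For perspective, the gap between 30FPS and 35FPS is only a few milliseconds of frame time. A quick sketch of the arithmetic (purely illustrative):

```python
def frame_time_ms(fps):
    # A frame rate is just a per-frame time budget: 1000 ms divided by FPS.
    return 1000.0 / fps

print(round(frame_time_ms(30), 1))  # 33.3 ms per frame
print(round(frame_time_ms(35), 1))  # 28.6 ms per frame
```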

Actually, the GTX 280 is only a tad better than the 4870. The only game where the GTX 280 will be noticeably better is Crysis...and even there it's something like 40fps versus 35fps, at $650 versus $299 respectively. In other games you're getting 60fps and above; you won't know the difference. You're dumb if you grab the GTX 280...and I guess most of America ain't dumb, which is why Nvidia is losing sales, contrary to popular belief. So no, ATI is not behind at all. They're just smart enough to use slightly different technology to control cost without getting their behinds bitten by Rambus in the process.
 
Actually, the GTX 280 is only a tad better than the 4870. The only game where the GTX 280 will be noticeably better is Crysis...and even there it's something like 40fps versus 35fps, at $650 versus $299 respectively. In other games you're getting 60fps and above; you won't know the difference. You're dumb if you grab the GTX 280...and I guess most of America ain't dumb, which is why Nvidia is losing sales, contrary to popular belief. So no, ATI is not behind at all. They're just smart enough to use slightly different technology to control cost without getting their behinds bitten by Rambus in the process.

It's actually $400 or so now, but yeah, the performance is virtually identical.
 
Actually, the GTX 280 is only a tad better than the 4870. The only game where the GTX 280 will be noticeably better is Crysis...and even there it's something like 40fps versus 35fps, at $650 versus $299 respectively. In other games you're getting 60fps and above; you won't know the difference. You're dumb if you grab the GTX 280...and I guess most of America ain't dumb, which is why Nvidia is losing sales, contrary to popular belief. So no, ATI is not behind at all. They're just smart enough to use slightly different technology to control cost without getting their behinds bitten by Rambus in the process.

Thanks for agreeing and getting the facts wrong, I guess ;)
 
Erm... even the 9700 GT isn't twice as fast as the 8600. Where are you getting your information?

And the 3870 won't make an appearance in the 15" laptops, as it's for larger ones, meaning 17" or larger... and probably even the 17" is too thin for a 3870. Would be nice though.

Maybe he was thinking of the desktop 9600 GT vs. the desktop 8600 GT, which is about twice as fast, if memory serves.

And the 3870 is in at least one 15.4'' notebook: the Clevo M860TU (aka Sager NP8660).

What's better, the 9600M GT or the 9650M GS?

And the 8600M GT wasn't bad either. I have the 256MB GDDR2 8600M GT in my Inspiron 1520, and it plays Crysis on medium at 30fps, which is perfect. The 9600M GT wouldn't be a huge jump for the MacBook Pros, but at least it's something.

I'm personally hoping for a 3870, though doubting it. Anything 9600M GT and up or Mobility 3670 and up would make me happy.
 
Not correct. The 8600M was the best available mobile GPU for about a year.

Apple will use the best available card that is cool enough and small enough to fit in.

....Are you high?

The 8600 was never the best available Mobile GPU. http://www.notebookcheck.net/NVIDIA-GeForce-8600M-GT.3986.0.html

The GeForce 7900 was faster, and it was introduced a year earlier. It's the 8600 and not the 8800 for a reason: it's a mid-level, mid-cost card, as all of Apple's mobile GPUs have been.

The 2-4 MB ATI Rage LT chips they used in the first PowerBook G3s were mid-range. The ATI Rage 128 chips they used in later PowerBook G3s were mid-range. So were the original Radeons, and so on.

They always use mid-cost solutions.
 
I'M PRETTY CONVINCED THEY WILL USE THREE-WAY SLI AND QUAD CROSSFIRE IN THE NEW MBPS AS STANDARD. I MADE IT UP, BUT THAT'S OK, IT'S TRUE :rolleyes:

Seriously, some of these people have no perspective whatsoever.
 
I want this or a 3850.

How possible is this?
The 3850 runs a lot hotter than an 8600.

There's no way a 3850 or any similar card will go in the MBP, when not even a 45 W Core 2 Extreme is an option (+ 10 W over regular Core 2 Duos). And Apple focuses more on the CPU than the GPU.
 
Apple should never think of using an Nvidia card in their Macs again.
Apple should choose an ATI card.
 
Like I said before, the 3650 is the 2600 with a new name and DirectX 10.1 support. OSX has no idea what DX is, so you're basically using a card that was new in mid 2007. The 9600GT on the other hand, we can agree, has some sort of performance benefit.
 
If they are putting in a 9600, I'm going to get a Mac mini to hold me over until next year's Nehalem MBP!

*Off topic, but out of interest: is it likely they will update the Mac minis along with the MBP, or anytime this year?
 
Bollocks... I don't know what I'll do if all we get is a 9600M GT then.. maybe get a MB and an iMac..
 
Like I said before, the 3650 is the 2600 with a new name and DirectX 10.1 support. OSX has no idea what DX is, so you're basically using a card that was new in mid 2007. The 9600GT on the other hand, we can agree, has some sort of performance benefit.

Well, this is somewhat true. The 3650 has Shader Model 4.1 while the 2600 has 4.0. OpenGL might be able to use 4.1 through extensions right now (it can use 4.0 through extensions), and if not, I'm sure it will at some point down the OpenGL development timeline. But those advantages are minimal.
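For what it's worth, this is roughly how an OpenGL app detects those feature levels: it checks the driver's extension string for names like GL_EXT_gpu_shader4 (the SM4.0-level extension). A minimal sketch; the extension string below is made up for illustration, since the real one comes from glGetString(GL_EXTENSIONS) inside a live GL context:

```python
def has_extension(extensions_string, name):
    # GL 2.x-era drivers report all extensions as one space-separated string;
    # splitting avoids false matches on names that contain other names.
    return name in extensions_string.split()

# Hypothetical string, as a 2600/3650-era driver might report it.
exts = "GL_ARB_shader_objects GL_EXT_gpu_shader4 GL_ARB_texture_float"
print(has_extension(exts, "GL_EXT_gpu_shader4"))  # True -> SM4.0-level features
```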
 
I have the money, that's the thing, I will pay whatever it takes to get a MacBook Pro with an actual decent GPU but Apple won't freaking provide one.
It's not just the GPU. The MacBook Pro can't handle a 45 W CPU, and that, combined with the "GHz Rule," means that quad-core won't go in the MacBook Pro until 2010.
 
Let's be realistic. Apple will not upgrade the GPU, simply because anything better than the 8600 GT would consume much more power and produce much more heat than the MBP can support. I'd rather have poor graphics and decent battery life than 100FPS in Crysis, one hour of battery life, and a cooking grill on my desk.

But I really hope they swap the Nvidia GPU for an equivalent ATI GPU :D
 
Let's be realistic. Apple will not upgrade the GPU, simply because anything better than the 8600 GT would consume much more power and produce much more heat than the MBP can support.
The 8600 GT went through an optical shrink from 80 nm to 65 nm; the 65 nm card is the 9500M GS. So Apple could use a slightly more powerful card than the 9500M GS and still maintain acceptable power usage. The 9650M GS is the 65 nm version of the 8700M GT.

But yeah, anything *8*0 is unrealistic.

I'd rather have poor graphics and decent battery life than 100FPS in Crysis, one hour of battery life, and a cooking grill on my desk.
You know, powerful CPUs and GPUs only give off a lot of heat when you're running them close to 100%. If you ask me, I'd rather be able to have both high performance and low power usage on the same computer.
 
One reason why I think they will stick with nvidia - CUDA... they've already announced that Snow Leopard will have some kind of OS-level implementation of CUDA (aka OpenCL or whatever), so it wouldn't really make sense for them to start adopting ATI (which lacks CUDA support) in new products at this point...

Which also leads me to believe that at some point Apple will go 100% nvidia... how else could they realistically expect OpenCL/CUDA in Snow Leopard to work?

"Hey we have this fancy new feature that uses the power of nvidia GPUs, but our flagship notebooks have ATI GPUs" just doesn't make sense.
 
The 8600 GT went through an optical shrink from 80 nm to 65 nm; the 65 nm card is the 9500M GS. So Apple could use a slightly more powerful card than the 9500M GS and still maintain acceptable power usage. The 9650M GS is the 65 nm version of the 8700M GT.

But yeah, anything *8*0 is unrealistic.

You know, powerful CPUs and GPUs only give off a lot of heat when you're running them close to 100%. If you ask me, I'd rather be able to have both high performance and low power usage on the same computer.

sure it would be nice!

But I think (and I could be wrong) that even with efficient power management, a more powerful dedicated GPU still gives off much more heat than a less capable one, even when not in intensive use. I'm saying this based only on my own experience, though. When watching DVDs I put the MBP on my belly while lying on my bed, and sometimes it gets inconveniently hot compared to any other laptop I've had before. :p
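The battery-life half of this tradeoff is simple arithmetic: runtime in hours is just battery capacity (Wh) divided by average draw (W). A back-of-the-envelope sketch with made-up numbers for a roughly 60 Wh notebook battery:

```python
def battery_hours(battery_wh, avg_draw_watts):
    # Runtime estimate: capacity in watt-hours divided by average draw in watts.
    return battery_wh / avg_draw_watts

print(round(battery_hours(60, 15), 1))  # light load, ~15 W  -> 4.0 h
print(round(battery_hours(60, 55), 1))  # CPU+GPU pegged, ~55 W -> 1.1 h
```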
 