Why are so many people in love with ATI?

We've stopped buying Macs with ATI graphics chips, as their OpenGL support is flaky and breaks under most professional CAD or rendering software.
Even Cover Flow can send the screen nuts under the wrong conditions.
I'm rather fond of the new HD48xx series video cards, which offer the best performance per watt of any video card right now.

Not to mention Crossfire support on the P45 chipset.
 
Does anyone have an idea of what the NVIDIA integrated graphics will be like performance wise against the 8600M?

While it's integrated graphics, I'm interested to see how much juice can be squeezed out of it and how well it'll cope with games.
 
Thoughts?
It all depends on power consumption and price.

MB battery life is longer than the MBP's because it uses an integrated GPU. The MB is aimed at students, and students usually prefer notebooks with longer battery life. My friend is starting to dislike his M1530 because its battery life is short; he doesn't do much GPU-intensive stuff, but because his laptop uses a dedicated GPU, it drains his notebook's battery.
How does a GPU that is not used much drain the battery a lot? In fact, I'm not sure what takes more power: Integrated GPU at ≈50% usage or discrete GPU at ≈10% usage.
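To put rough numbers on it (a back-of-envelope sketch; every wattage figure below is an assumption for illustration, not a spec for any actual chip), the catch is that a discrete GPU has an idle power floor even when it's barely used:

```python
# Back-of-envelope sketch; all wattage figures are assumptions, not specs.
INTEGRATED_MAX_W = 5.0   # assumed peak draw of an integrated GPU
DISCRETE_MAX_W = 20.0    # assumed peak draw of a midrange mobile discrete GPU
DISCRETE_IDLE_W = 5.0    # assumed idle floor of the discrete GPU

integrated_50pct = 0.5 * INTEGRATED_MAX_W                                      # ~2.5 W
discrete_10pct = DISCRETE_IDLE_W + 0.1 * (DISCRETE_MAX_W - DISCRETE_IDLE_W)    # ~6.5 W

print(f"Integrated GPU at ~50% load: ~{integrated_50pct:.1f} W")
print(f"Discrete GPU at ~10% load:  ~{discrete_10pct:.1f} W")
```

Under those (assumed) numbers the lightly used discrete GPU still draws more, which would explain the shorter battery life even without GPU-intensive work.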

And is the OPTION of a more powerful GPU too much to ask, Apple? Or maybe the ability to switch between integrated and discrete?
 
First thoughts when hearing rumours of the new MacBooks using NV MCP79: *4W35SUM*! :D

Though I do find it a bit difficult to justify replacing my barely-a-year-old late-November 2007 MacBook. Besides, looking at the 'leaked' photos, there seem to be some weird design decisions that Apple made:
1. No FireWire? Hmm...
2. IR receiver on the left hand side? That's not going to work...
3. Still two USB ports?! omgwtfbbq!
4. Another proprietary display connector? *sob*sob*

Maybe, just maybe, if that MCP79 graphics really is that good, and there's a higher-resolution 13.3'' screen, possibly matte, that might move my heart. But for now, breathe and calm down. :)
 
Though I do find it a bit difficult to justify replacing my barely-a-year-old late-November 2007 MacBook. Besides, looking at the 'leaked' photos, there seem to be some weird design decisions that Apple made:

1. No FireWire? Hmm...
2. IR receiver on the left hand side? That's not going to work...
3. Still two USB ports?! omgwtfbbq!
4. Another proprietary display connector? *sob*sob*

1. It looks like there is only FireWire 800, no FireWire 400. They'd better include FireWire; I've got a few devices and a mixer that need it.

2. You'd be surprised how good the IR receiver is - I can have the remote round the side of my MacBook Pro and it still works.

3. Yeah, only two USB ports sucks. But then again, at my desk I have a wireless keyboard and mouse, and a USB hub.

4. Another adapter to carry around!
 
Sweet, sweeet, sweeeeeet. For the casual user, it would now make even more sense to go with a MacBook instead of the Pro, especially if the entry price on the MacBook is ~$1000 US. Unless you can find a refurb previous-gen MBP for close to that price.

Now it looks like the iMac refresh might also use an NVIDIA chipset. And if the entry price of the 20-inch iMac is also ~$1000, then it might make sense to buy an iMac with a potential 9800M GT and a MacBook with a 9300M card for close to the price of a MBP.
 
Well, I'm waiting to see if they vamp up the chipset (and it looks like they may do that very well); then I'll have to jump into the MacBook camp :) (My Air is great, but I'd love a gaming-capable portable for when I'm offsite.) :)
 
1. It looks like there is only FireWire 800, no FireWire 400. They'd better include FireWire; I've got a few devices and a mixer that need it.

2. You'd be surprised how good the IR receiver is - I can have the remote round the side of my MacBook Pro and it still works.

3. Yeah, only two USB ports sucks. But then again, at my desk I have a wireless keyboard and mouse, and a USB hub.

4. Another adapter to carry around!

Ah! Just to clarify, I am looking at the MacBook, not the Pro. The new case photos show that there are no FireWire ports (not even FW400) on the new MacBooks.

I take back what I said about having the IR receiver on the side. :eek: I just tried with my Apple Remote, and it does indeed work even round the MacBook's corner! I thought IR worked on line-of-sight only. Perhaps it still works via the reflection of IR light off objects.
 
Ah! Just to clarify, I am looking at the MacBook, not the Pro. The new case photos show that there are no FireWire ports (not even FW400) on the new MacBooks.

Looks like it has FireWire 800 to me - the port down from the MagSafe. Unless it's a Gigabit Ethernet port. However, I'd assume Apple would provide a USB-to-Ethernet adapter rather than throwing away a FireWire port.

[Attached image: 161931.jpg]
 
Looks like it has FireWire 800 to me - the port down from the MagSafe. Unless it's a Gigabit Ethernet port. However, I'd assume Apple would provide a USB-to-Ethernet adapter rather than throwing away a FireWire port.

[Attached image: 161931.jpg]

That's insightful! Almost everyone on the other thread is saying that port is Ethernet, leaving no FireWire for MacBooks. Like so: https://forums.macrumors.com/posts/6393333/

Of course! Since Apple has done away with Ethernet on the MacBook Air, they could always sell us another cable to get Ethernet back.
 
That's insightful! Almost everyone on the other thread is saying that port is Ethernet, leaving no FireWire for MacBooks. Like so: https://forums.macrumors.com/posts/6393333/

Of course! Since Apple has done away with Ethernet on the MacBook Air, they could always sell us another cable to get Ethernet back.

Although, quickly comparing the MacBook and MacBook Pro, the MacBook appears slightly thinner (looking at the space around the USB ports), so maybe that port only looks like an Ethernet port in relation to the MacBook Pro?

Wishful thinking though! I was so going to buy a MacBook, but if it doesn't have FireWire...
 
Looks like there are differing views on NVIDIA. I'm anxious to see how performance will be improved. Is this just a step towards Snow Leopard? By the time that's released next year, they should have any NVIDIA bugs worked out.
 
So are you saying that just because the chipset supports DDR3, Apple is going to put DDR3 in everything? It doesn't seem like the benefits justify that kind of price increase. Maybe on the higher-end machines, but on all of them?
I actually hope Apple puts DDR3-1066 in everything. It seems PC makers are sticking with DDR2-800 for Montevina, so going DDR3-1066 would certainly set Apple apart. The lower power consumption will help too.

In terms of price, I don't think it's as much of an issue, since Apple seems to get preferential treatment from Samsung, probably because of the volumes of flash memory they buy. I'm pretty sure every discrete graphics card Apple has used since the switch to Intel has used GDDR3, at a time when most PC makers stick to cheaper DDR2. That's why the 8600M GT in the MBP is actually faster than it looks: it uses higher-clocked GDDR3, while most PC implementations of the 8600M GT use DDR2. Even the lowly HD 2400 XT used in the iMac gets faster GDDR3 instead of DDR2. I've checked that the GDDR3 in the Mobility X1600 in my MBP is from Samsung, so it wouldn't surprise me if Apple gets better memory prices.
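For a rough sense of why the memory type matters, here's a quick bandwidth sketch (the 128-bit bus and the clock rates are assumptions about typical 8600M GT configurations, so treat the numbers as illustrative only):

```python
def bandwidth_gb_s(transfers_per_sec, bus_width_bits):
    """Peak memory bandwidth: transfers per second times bytes per transfer."""
    return transfers_per_sec * (bus_width_bits / 8) / 1e9

BUS_WIDTH_BITS = 128  # assumed bus width for the 8600M GT

gddr3 = bandwidth_gb_s(1400e6, BUS_WIDTH_BITS)  # assumed ~1400 MT/s effective GDDR3
ddr2 = bandwidth_gb_s(800e6, BUS_WIDTH_BITS)    # assumed ~800 MT/s effective DDR2

print(f"GDDR3: ~{gddr3:.1f} GB/s, DDR2: ~{ddr2:.1f} GB/s ({gddr3 / ddr2:.2f}x)")
```

Same GPU, roughly 1.75x the memory bandwidth under those assumptions, which would line up with the GDDR3 versions benching noticeably faster.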
 
Does anyone have an idea of what the NVIDIA integrated graphics will be like performance wise against the 8600M?

While it's integrated graphics, I'm interested to see how much juice can be squeezed out of it and how well it'll cope with games.

Likewise. I'm not expecting anything amazing, but I am definitely interested in seeing what the 9300/9400 is actually capable of doing.
 
So, it looks as though Apple is going with nvidia due to price point and performance. Hands down, nvidia wins on graphics performance. The hope is that Apple really gets its driver teams to thoroughly test that chipset, as nvidia + Intel has been a bit too much of a drama story in the past, with I/O issues, data corruption, etc. The other part is power consumption and heat. The former is likely to be taken care of, but the latter might still be there. nvidia chipsets do run hot, at least in their ATX mainboard forms. Does anyone know the rated heat output of this chipset? With any luck, nvidia will have put together a chipset as solid as in their NF4 days, and people will be able to rock out on their macbook (pros) :).
 
Sounds awesome. Do you think you could run some year-old games like TF2 at a decent framerate, like 50-60, maybe 70?

ChrisN

Looking at some benchmarks, the NVIDIA 9300M is around 12x faster than Intel's GMA 900, and around 6x faster than Intel's GMA 3100.

Considering people can run TF2 and the like on a GMA 3100, I think you'll be able to play TF2 at least on medium settings.
 
So, it looks as though Apple is going with nvidia due to price point and performance. Hands down, nvidia wins on graphics performance. The hope is that Apple really gets its driver teams to thoroughly test that chipset, as nvidia + Intel has been a bit too much of a drama story in the past, with I/O issues, data corruption, etc. The other part is power consumption and heat. The former is likely to be taken care of, but the latter might still be there. nvidia chipsets do run hot, at least in their ATX mainboard forms. Does anyone know the rated heat output of this chipset? With any luck, nvidia will have put together a chipset as solid as in their NF4 days, and people will be able to rock out on their macbook (pros) :).

I really hope so too
 
What the hell's the point of 70 FPS on an LCD display only capable of 60 Hz?

Typically the "FPS" rating of a game is the average over a standard test run.

During difficult scenes the frame rate is much lower - so a system that benches a game at 100 FPS really is much better than one at 60 FPS. Even though you won't see the difference in the fast scenes, the 100 FPS system can be much better in the slow scenes.
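A toy example of that point, with made-up per-scene numbers: both systems can beat 60 FPS in the easy scenes, but the one with the higher average has far more headroom in the heavy scenes, and it's the dips you actually notice.

```python
# Made-up per-scene frame rates (FPS) for two hypothetical systems on the same run.
fast_system = [150, 120, 100, 80, 50]  # benches at 100 FPS average
slow_system = [90, 75, 60, 50, 25]     # benches at 60 FPS average

for name, fps in [("100 FPS average", fast_system), ("60 FPS average", slow_system)]:
    print(f"{name}: mean {sum(fps) / len(fps):.0f} FPS, worst scene {min(fps)} FPS")
```

Both are capped by the 60 Hz panel in the fast scenes, but only the 100 FPS-average system stays smooth in the worst one.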
 
It makes sense now why Apple waited so long to update to Centrino 2: a new NVIDIA chipset with better performance. The only caveat I have is that this may damage the Intel relationship if true... but then again, Apple doesn't participate in the Centrino program, so the relationship may not be damaged at all.
 