
nyc4lifedt (macrumors member, original poster)

I previously owned the "first generation" unibody MacBook Pro from when it was released back in 2008. I purchased a 24-inch Apple LED Cinema Display with it. The two worked fine together, and I enjoyed the combination.

With the i5 processor upgrade, I thought it was worth the investment to upgrade to the new MacBook Pro. Now there seem to be some issues that users should be concerned about, or at least aware of, especially if you attach an external monitor.

The integrated GPU on the new MacBook Pro is the Intel HD Graphics, which is supposed to run in low-demand situations or on battery to maximize battery life. From information gathered on the internet, Apple adopted automatic switching between the two GPUs it ships with, the Intel HD and the NVIDIA GeForce GT 330M, without logging off, interrupting your workflow, or you even knowing about it.

Switching GPUs without logging off is great, but since it's automatic, there's no way of running strictly on the integrated GPU. You can choose to run strictly on the discrete GPU, but not the other way around.
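
If you want to check for yourself which GPU is active at any given moment, here's a rough Python sketch of my own that shells out to system_profiler (which ships with OS X). The output format varies between OS releases, so treat the parsing as a guess rather than anything official:

Code:
# Rough check of which GPU is driving displays, via the built-in
# system_profiler tool. Output format varies by OS X release, so the
# parsing below is a sketch, not gospel.
import subprocess

def gpu_report():
    out = subprocess.check_output(
        ["system_profiler", "SPDisplaysDataType"]).decode("utf-8", "replace")
    gpus = []  # list of (chipset model, is it driving a display?)
    chipset, driving = None, False
    for line in out.splitlines():
        text = line.strip()
        if text.startswith("Chipset Model:"):
            if chipset:
                gpus.append((chipset, driving))
            chipset, driving = text.split(":", 1)[1].strip(), False
        elif text.startswith("Resolution:"):
            driving = True  # a display section sits under this GPU
    if chipset:
        gpus.append((chipset, driving))
    return gpus

if __name__ == "__main__":
    for chipset, driving in gpu_report():
        print(chipset + ("  <- driving a display" if driving else ""))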

The new integrated GPU is not powerful enough to drive an attached display, or the system simply chooses not to let it. When the LED Cinema Display or any other external display is connected, even in a low-power-consumption situation, the machine runs on the discrete GPU full time. It doesn't matter if all I'm doing is some web browsing in Safari or running Mail or iCal; it runs on the new discrete GPU. This makes the new MacBook Pro generate excessive heat, which makes the cooling fan inside spin faster, creating loud fan noise. This happens when I'm doing just basic work.

I do use Final Cut Pro, DVD Studio Pro, and Photoshop once in a while, but not every day. The discrete GPU worked great; I just changed the setting in the preferences and rebooted or re-logged to run the more demanding applications. (This was with my old MacBook Pro.)

Now, even if I wanted to run on the integrated GPU with the LED Cinema Display attached, I can't do that. It might not be an issue for many, since the discrete GPU is supposed to perform better. And it does: when rendering videos or running demanding applications, the i5 processor is much faster than the old Core 2 Duo. And since you'll be attached to the power adapter, battery usage isn't a factor. I liked the integrated GPU in the previous generation I owned, since it was quiet and didn't create much heat. For everyday work, the integrated GPU was more than enough.

I opened a case with a Genius Bar expert/agent, and I have the option to return it or exchange it for a different machine. I'm debating now, since I like the new i5 processor (again, it does perform better than the Core 2 Duo) but don't like the GPU situation.

Maybe an iMac + lower-end MacBook Pro/Air combination might be an option for me.

Just my two cents.
 
Hmm. Good to know. I have actually been strongly considering getting Apple's Cinema Display, and what you're saying has me thinking I should do a little more homework before I do. Let us know how it works out.
 
That is good to know, thanks for sharing.

You may not want a C2D, but maybe consider last year's 15" MBP with max specs: 512 MB VRAM, SSD, 8 GB RAM. You'll get it at a refurb price and you'll have control of the graphics, plus save some money.

Apple will eventually do something about this, no? Maybe a software update can't fix it, but in the next revision we may see changes again?

Interesting for sure.
 
I have a feeling that eventually, Apple might give in and enable a manual "Always use integrated graphics" mode.

I would definitely rather power the external monitor with the 330M vs. the Intel HD Graphics. The Intel graphics can't even smoothly render Exposé or Stacks on my Core i7 HiRes internal monitor, so pushing more than twice as many pixels at once (both external and internal screens) doesn't sound like it'd work well at all.
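
(Back-of-the-envelope, assuming the HiRes panel is 1680x1050: that's about 1.76 million pixels, and a 24" ACD at 1920x1200 adds about 2.30 million more, so both screens together are roughly 4.07 million pixels, around 2.3x the internal panel alone.)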

I have an iMac 27", and I had been thinking of ditching it for a large monitor to use with my new MBP, but I've decided to keep it. I use Dropbox to keep my important work files synced between the two computers, so I get the best of both worlds. Everything works seamlessly. That's just my 2 cents, though.
 
I've been doing some non-scientific tests over the weekend, running heavy applications and just basic work.

I've come to a decision: I will be returning the new MacBook Pro.

Funny to say, it doesn't have anything to do with the new i5 processor. I loved the performance of the new processor. It was noticeably faster than the Core 2 Duo processor I'd been using for a while.

But the new MacBook Pro + Cinema Display combination killed it. It just didn't make sense to me that basic tasks even an iPad can handle had to run on the discrete GPU. The temperature got extremely hot, which forced the fan to run at full blast. The fan noise kept coming back, and it affected my work. And since the temperature was so high, processor performance started to drag.

And a firmware update won't be able to fix this. As said by the Apple Genius/experts, Intel HD Graphics won't even be able to activate an external monitor on the integrated GPU. (God knows if they're right or if they're just reading from a script.)

I'm looking at an iMac + MacBook Air combination now.
 
I would definitely rather power the external monitor with the 330M vs. the Intel HD Graphics. The Intel graphics can't even smoothly render Exposé or Stacks on my Core i7 HiRes internal monitor, so pushing more than twice as many pixels at once (both external and internal screens) doesn't sound like it'd work well at all.


I would rather my NVIDIA 330M be on all the time.

I have no use for the integrated graphics. I am on battery less than 30 minutes every day, and run my MBP plugged in on a desk either at work or home all the time.
 
This is true... I have the new i7 attached to the 24" Cinema Display, and it runs off the Intel graphics with the display plugged in when I'm not playing games, etc.
 
I don't have these same problems with my display. However, I'm only running a 22" Dell display at 1680x1050, and while the machine gets warm, I don't hear the fans running; it only gets slightly warm, no warmer than usual when not plugged into the external display.

Are you sure this isn't just a problem with this specific machine, or the fact that you're running somewhat intense applications?
 
The claim that the integrated card cannot support an external display is hard to swallow. The integrated card is supposed to be able to drive the high-res built-in monitor, and it does; 1680x1050 is quite a resolution to supply graphics to.

The GT 330M kicks in every time an external display is plugged in, yes; it's been like this for the longest time. Also, on my older MBP with the 9600, as soon as you plug in an external monitor it switches over to dedicated no matter what.
 
Last gen units - the 9600 - always go integrated once an external is plugged in. I'm surprised if they have changed this, and if they did, why? :confused:

* Edit * Looks like I was beaten to the punch on this by Panzo
 
Last gen units - the 9600 - always go dedicated once an external is plugged in. I'm surprised if they have changed this, and if they did, why? :confused:

The GT 330M kicks in every time an external display is plugged in, yes; it's been like this for the longest time. Also, on my older MBP with the 9600, as soon as you plug in an external monitor it switches over to dedicated no matter what.

Since when? I don't know if I'm reading your and Panzo's posts wrong (it's too early still), but I don't recall the 9600M GT ever being switched to when an external is connected, and automatically at that. I've used the 9400M with an external. I'm going to connect it to my TV just to verify, but I'm pretty sure there is no switching on the 9400M + 9600M GT MacBook Pros.

Also, according to Ars Technica, the Intel HD GPU isn't OpenCL capable. Although OpenCL isn't mainstream yet, since developers still need to program for it to take advantage of it, it has to be one reason, if not the reason, why Apple isn't allowing an integrated-only option. The idea of automatic switching is good, but Apple seriously needs to tweak it or find another way. Judging from the threads alone, the 330M basically turns on for just about anything. Apple should seriously try to integrate a solution very close to what Optimus gives: a list of applications that turn on the 330M.

EDIT: Running an external (LCD TV) and still on the 9400M; no switching (I don't think it's even possible to switch automatically).
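
If anyone wants to check the OpenCL claim on their own machine, here's a quick sketch using the third-party pyopencl module (assuming you have it installed; pip install pyopencl). A GPU without OpenCL support simply won't show up in the list:

Code:
# List which GPUs advertise OpenCL support, via the third-party
# pyopencl module. If the Intel HD chip really isn't OpenCL capable,
# it just won't appear among the GPU devices.
import pyopencl as cl

for platform in cl.get_platforms():
    print("Platform: " + platform.name)
    try:
        gpus = platform.get_devices(device_type=cl.device_type.GPU)
    except cl.Error:
        gpus = []  # no GPU devices on this platform
    for device in gpus:
        print("  GPU: " + device.name)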
 
Apple should seriously try to integrate a solution very close to what Optimus gives: a list of applications that turn on the 330M.

Personally, I'm very fond of Apple's solution. Give it time and hopefully Apple will refine the switching heuristic. Independently of this, developers are bound to adjust their code so as not to be incorrectly flagged as needing the discrete GPU. In short, the future looks promising and Apple's solution is only set to get better!

Adam
 
I'm pretty sure it will get better with time, but at the same time I think a lot falls on Apple and less on the developers. The problem isn't really how developers program; it's more that OS X revolves so heavily around Core Image and the Quartz frameworks, which most OS X applications use in some way. With the 330M being activated whenever those frameworks are used, it's no surprise that the 330M is constantly switched to for even the simplest tasks.
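
One crude way to guess whether a particular app will trip the switch is to see which frameworks its binary links against, using otool from the developer tools. The framework names flagged below are my own guesses; Apple hasn't published the actual trigger list:

Code:
# Sketch: list the frameworks an app's binary links against, to guess
# whether it might trip the automatic switch to the 330M. Needs the
# `otool` developer tool; the TRIGGER_HINTS names are my own guesses,
# since Apple hasn't published the real trigger list.
import subprocess, sys

TRIGGER_HINTS = ("OpenGL", "QuartzCore", "CoreVideo")  # guesses, not official

def linked_libraries(binary_path):
    out = subprocess.check_output(["otool", "-L", binary_path]).decode("utf-8", "replace")
    # First line is the binary's own name; each later line starts with a path.
    return [line.strip().split(" ")[0] for line in out.splitlines()[1:] if line.strip()]

if __name__ == "__main__":
    for lib in linked_libraries(sys.argv[1]):
        hint = "  <- possible switch trigger?" if any(h in lib for h in TRIGGER_HINTS) else ""
        print(lib + hint)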
 
Since when? I don't know if I'm reading your and Panzo's posts wrong (it's too early still), but I don't recall the 9600M GT ever being switched to when an external is connected, and automatically at that.

I was confused by Panzo's post and likewise posted confused. I meant integrated, and will reflect that in my edited post. I should know better than to have a liquid lunch (business) and post! :D
 
Are you guys saying that the fan always runs at full speed with the 330M? My friend has the 8600M model, which is really quiet except while gaming, and everyone knows that GPU gets hot... the 330M can't be worse...
 
What's your concern?

1. You won't use it plugged into the power cord all the time, and battery life becomes an issue.

or

2. You think this will "increase GPU usage" and you'll have a shorter computer life.

I want to understand your issue better so I can recommend what's best, IMO.
 
I use mine plugged into a 24" display most of the time, and it's plenty fast and doesn't get hot. I don't know whether it's on the integrated graphics or not, and couldn't care less - when it's plugged into the monitor, it's AC powered!
 
I'm curious if the OP's issue is more related to an individual machine that runs warm and works the fans too much. I mean, who cares which GPU is running as long as the machine performs well and is quiet.

I'm also using a 24" ACD 99% of the time with my 2008 2.53, but I find that either of the GPUs works just fine.
 
I'm curious if the OP's issue is more related to an individual machine that runs warm and works the fans too much. I mean, who cares which GPU is running as long as the machine performs well and is quiet.

I'm also using a 24" ACD 99% of the time with my 2008 2.53, but I find that either of the GPUs works just fine.

It's the way it's built. Since it runs on the discrete GPU full time with the Cinema Display connected, it gets warm, the fans get worked up, and it creates loud noise. I tried monitoring with iStat: the temperatures stayed high, and fan speed goes from the base ~2000 rpm all the way up to ~5000 rpm even with basic work. When the fans kick in, if I idle the computer and don't do anything (so much for a decorative computer), it can lower the temperature relatively quickly. But as soon as even basic work is done on the discrete GPU with the Cinema Display connected, the discrete GPU gets hot and the fans kick in. And there's no way of turning off the discrete GPU; that's the whole point of my original post. And with the excessive heat, performance drags.

And we're not talking about MacBook Pro GPUs in general, past and present, but specifically the new i5 version with automatic GPU switching. I just wanted to give a heads-up to anyone who might be interested. Again, for the majority of people it's not gonna be an issue. It's a really well-made machine, it performs great, and I love the new processor. It's just this GPU-switching thing that could be an issue for some (including myself).

...also on my older MBP with the 9600, as soon as you plug in an external monitor it switches over to dedicated no matter what...

As daneoni said, that's wrong information. Older MacBook Pros with dual graphics cards did not switch GPUs automatically; they were able to drive external monitors on the integrated GPU. The new version ships with Intel HD integrated graphics that cannot do that. Only the new version, as mentioned, switches over to the discrete GPU no matter what.
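
For anyone who wants to log this the way I watched it in iStat, here's a crude sketch. It assumes your OS X build supports pmset -g therm (it may not on every release), and it doesn't read fan rpm from the SMC the way iStat does:

Code:
# Crude thermal logger: poll `pmset -g therm` once a minute with a
# timestamp. Assumes the installed OS X build supports that flag;
# this is only a stand-in for iStat's direct SMC fan/temp readings.
import subprocess, time

while True:
    try:
        out = subprocess.check_output(["pmset", "-g", "therm"]).decode("utf-8", "replace")
        print(time.strftime("%H:%M:%S") + " " + out.strip())
    except subprocess.CalledProcessError as exc:
        print("pmset failed: " + str(exc))
    time.sleep(60)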
 
I would think that switching to the discrete graphics solution is the right thing to do when connected to an external monitor.
 
I connect my 15" i7 to a Dell U2410 24" 1920x1200 display – working in Eclipse/Aptana Studio, Photoshop, and Safari/Firefox – and I haven't noticed heat issues so far.
 