Previously Nvidia's switching did allow both cards to be on.

By the way, I just read in an older article on arstechnica that Apple uses its own implementation of automatic graphics switching, not Optimus. This custom implementation seems to always power down the iGPU/dGPU if the other GPU is active. Optimus, AFAIK, only powers down the dGPU.

http://arstechnica.com/apple/2010/04/inside-apples-automatic-gpu-switching/

It would be nice if Apple would provide an optional manual control over the active GPU as well as the option to have both enabled at the same time. Although gfxCardStatus seems to work nicely, I would prefer a native solution.

----------

True, but what if you're using Boot Camp to run a CAD application? Then you are stuck with the weaker (in the case of CAD) GT 750M card... gfxCardStatus is only a Mac OS X app!

You can use Nvidia's drivers under Windows. They provide the same functionality. Here is an official demo from Nvidia: http://www.youtube.com/watch?v=Zh4HCadTY_A
 
By the way, I just read in an older article on arstechnica that Apple uses its own implementation of automatic graphics switching, not Optimus. This custom implementation seems to always power down the iGPU/dGPU if the other GPU is active. Optimus, AFAIK, only powers down the dGPU.

http://arstechnica.com/apple/2010/04/inside-apples-automatic-gpu-switching/

It would be nice if Apple would provide an optional manual control over the active GPU as well as the option to have both enabled at the same time. Although gfxCardStatus seems to work nicely, I would prefer a native solution.

----------



You can use Nvidia's drivers under Windows. They provide the same functionality. Here is an official demo from Nvidia: http://www.youtube.com/watch?v=Zh4HCadTY_A

This would make sense on the older machines: the iGPU could only do OpenGL, so there was no point in keeping it powered. But on the previous-generation MacBook Pros, where both GPUs can do OpenCL, they're both available. It's possible that an OpenCL request can trigger the powered-down GPU to come back on.
 
Does the 750M still support 3D displays like the 650M of the previous generation did? Does the Iris Pro even support 3D displays?
 
As noted by the above poster, gfxCardStatus is a dream and works beautifully.

Wasn't there an issue where owners of the original 2012 rMBP had to install gfxCardStatus because the automatic graphics switching was causing kernel panics and crashing their machines? I agree that it is a handy piece of software, but it doesn't feel quite right to spend $2,500-plus and have to rely on third-party band-aids.
 
Wasn't there an issue where owners of the original 2012 rMBP had to install gfxCardStatus because the automatic graphics switching was causing kernel panics and crashing their machines? I agree that it is a handy piece of software, but it doesn't feel quite right to spend $2,500-plus and have to rely on third-party band-aids.

Not that I recall, and I owned one of those models from launch and bought another one a few months ago.

I wasn't talking about relying on it as being a "band-aid" as much as I am about it giving you back control—something Apple's historically been against. Sometimes, if you're mobile, you might prefer to use the iGPU to preserve battery life, even if that means some lagging and stuttering in graphics. gfxCardStatus makes that sort of thing possible.
 
From the developer of gfxCardStatus:

[Twitter conversation screenshot: twitter.convo.10.23.13.png]
 
By the way, I just read in an older article on arstechnica that Apple uses its own implementation of automatic graphics switching, not Optimus. This custom implementation seems to always power down the iGPU/dGPU if the other GPU is active. Optimus, AFAIK, only powers down the dGPU.

http://arstechnica.com/apple/2010/04/inside-apples-automatic-gpu-switching/

It would be nice if Apple would provide an optional manual control over the active GPU as well as the option to have both enabled at the same time. Although gfxCardStatus seems to work nicely, I would prefer a native solution.

The control for multiple GPUs is all in the API. It's very easy for a developer to utilise all of the GPUs in the system, or just a particular one, and the same goes for OpenCL; all they need to do is add a couple of lines of code (see the sketch below). The question, though, is whether developers will do it. VLC, for example, still triggers the dGPU for no valid reason. The same was also an issue for Chrome; no idea if they have fixed it by now.
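For anyone curious what those couple of lines look like, here is a minimal sketch in plain C using the standard OpenCL host API. It isn't tied to any particular Mac; the eight-device cap and the choice of devices[0] for the context are just illustrative assumptions.

Code:
/* Minimal sketch: enumerate the OpenCL GPU devices and build a context
   on just one of them, instead of handing every GPU to the app. */
#include <stdio.h>
#ifdef __APPLE__
#include <OpenCL/cl.h>
#else
#include <CL/cl.h>
#endif

int main(void) {
    cl_platform_id platform;
    cl_device_id devices[8];   /* arbitrary cap for the sketch */
    cl_uint num_devices = 0;

    clGetPlatformIDs(1, &platform, NULL);
    /* Ask only for GPUs; on a dual-GPU MacBook Pro this would typically
       return both the integrated and the discrete chip. */
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 8, devices, &num_devices);

    for (cl_uint i = 0; i < num_devices; i++) {
        char name[256];
        clGetDeviceInfo(devices[i], CL_DEVICE_NAME, sizeof(name), name, NULL);
        printf("GPU %u: %s\n", i, name);
    }

    /* The "couple of lines": pick one device and create the context on it
       only. Picking devices[0] here is an assumption; a real app would
       choose based on the device name, memory size, power needs, etc. */
    cl_context ctx = clCreateContext(NULL, 1, &devices[0], NULL, NULL, NULL);
    clReleaseContext(ctx);
    return 0;
}

Whether the OS actually keeps the dGPU powered down when an app only ever touches the integrated device is up to the system; the point is simply that the selection hook is there for developers who bother to use it.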
 
If you plan on upgrading the baseline model's storage to 512GB and then bumping up the processor to the next tier, the baseline becomes the same price as the high-end model that includes the dGPU. I feel like it's almost a no-brainer then to go with the high-end option for the same price and essentially get the dGPU for free. No?
 
Yeah, I'm in a position where I'm not sure whether I need the dGPU, but I would like to upgrade the RAM on the base model. So for an extra $360 I would get twice the storage, a higher clock speed and a dGPU. Is it worth it for a computer I expect to last 5 or so years? I'm leaning towards yes.
 
True, but what if you're using Boot Camp to run a CAD application? Then you are stuck with the weaker (in the case of CAD) GT 750M card... gfxCardStatus is only a Mac OS X app!

Nope. Nvidia Optimus will kick in, just like on a regular PC laptop.
 
Here are some early benchmarks.

The config he used includes 16GB of RAM, 256GB of flash storage and Nvidia GT 750M graphics.

The article does not get into the Iris Pro vs. NVIDIA GT 750M debate, but I thought you guys would appreciate his benchmarks.

Just a summary of the article: benchmarks show slight improvements overall, nothing over 30% from the 2012 model. The PCIe SSD transfers at 309 MBps vs. last year's SSD at 196 MBps.
 
Yes, you are right, the CPU is different. I think 300MHz for $100 is a pretty good deal. Actually, bumping up the CPU in the base model brings some GPU gain as well: the graphics turbo goes from 1.2GHz to 1.3GHz.

There is no way Apple uses HD4600 in the top model. Otherwise, most of the time, when the discrete graphics is switched off, the graphical performance would be far inferior to the base model, which would be ridiculous!


Not quite—there's a $100 gap, which is accounted for by the CPU jump from 2.0 to 2.3. The OP didn't mention that he cared about the CPU increase (nor should he, since that's the least relevant upgrade of all).

----------


Yes, the higher-end model does have BOTH chips. For cost reasons it's more than a bit surprising that Apple elected to do this (I have to think Intel gave them a sweet deal, perhaps for marketing reasons; otherwise using the HD 4600 as the iGPU should have been a no-brainer).
 
Here are some early benchmarks.

The config he used includes 16GB of RAM, 256GB of flash storage and Nvidia GT 750M graphics.

The article does not get into the Iris Pro vs. NVIDIA GT 750M debate, but I thought you guys would appreciate his benchmarks.

Just a summary of the article: benchmarks show slight improvements overall, nothing over 30% from the 2012 model. The PCIe SSD transfers at 309 MBps vs. last year's SSD at 196 MBps.

Cheers for the link!

While this reviewer is obviously a fan, what I take away from the article is that the performance bumps, while still modest, seem to sit closer to the 15% overall mark than the disappointing 6-10% I'd seen in earlier benchmarks.

It might be splitting hairs, but my buyer's confidence certainly improves knowing that there's actually a perceivable difference between this machine and the 1.5-year-old tech it's meant to be replacing.

----------

There is no way Apple uses HD4600 in the top model. Otherwise, most of the time, when the discrete graphics is switched off, the graphical performance would be far inferior to the base model, which would be ridiculous!

Very good point.

I found their decision to include both the Iris Pro and the 750M a bit puzzling, considering they are much closer to each other in raw horsepower than the tandems we've seen in prior generations. But when you look at it from that angle it makes a bit more sense.
 
I just wanted to add that if you work in a lot of Adobe programs, they seem to favor Nvidia when it comes to optimizing graphics-intensive features. Whether this is because the actual cards are better, because it is easier or more beneficial to them, or a byproduct of some sort of business relationship between the two companies, I don't know. I do know Nvidia is the only one of the two with CUDA cores, which is what Adobe relies on for ray-traced rendering performance in After Effects. It can operate on the CPU alone, but it is nearly unusable because of the insane rendering time. Even though the 650M in my rMBP has just about the fewest CUDA cores of any card on Adobe's supported list, it goes from taking minutes to render a frame on the CPU alone to seconds (in ray-traced render mode).

Beyond any of this, though, I am no expert on this stuff, and it sounds like many others in this thread have a lot more knowledge on the subject. I think my final thought would be that if you have to ask whether you need one card or the other, you're probably not using applications that would benefit much from either. I understand the desire to have a future-proof machine, but if you are not really going to benefit from the graphics capabilities of the add-on, why pay more than you have to?
 
True, but what if you're using Boot Camp to run a CAD application? Then you are stuck with the weaker (in the case of CAD) GT 750M card... gfxCardStatus is only a Mac OS X app!

Think about the other side, though: more industrial applications support CUDA on Windows. I don't know whether they counted CUDA when they ran the SolidWorks benchmark (SolidWorks does support CUDA); OpenCL is not the only option in this case.
 
There is no way Apple uses HD4600 in the top model. Otherwise, most of the time, when the discrete graphics is switched off, the graphical performance would be far inferior to the base model, which would be ridiculous!

Not really. It's only dicey from a marketing perspective; from a technological and cost perspective, it would make a ton of sense. The whole point of the iGPU/dGPU split is that the iGPU handles the lighter load. Anything that would strain the iGPU shouldn't be running on the iGPU anyway.

The HD4000 does a perfectly fine job, under Mavericks, of powering day-to-day stuff on the rMBP. The HD4600 would be a great improvement for day-to-day use and would probably get some bigger battery gains, too—at least based on the power utilization numbers we've seen the past few months.
 
Think about the other side, though: more industrial applications support CUDA on Windows. I don't know whether they counted CUDA when they ran the SolidWorks benchmark (SolidWorks does support CUDA); OpenCL is not the only option in this case.

Not exactly... CUDA is mostly used for rendering. The point of interest in the SolidWorks benchmark is the viewport/real-time performance for the purpose of modeling. Almost all 3D applications are vendor-agnostic when it comes to modeling/viewport/OpenGL performance: since they appeal to a very broad spectrum of engineers, designers, etc., they need to cater to all of their different video configurations.
Yes, SolidWorks does support CUDA, but only for rendering; most engineering/CAD applications are focused on 3D modeling.
The benchmark you saw will not benefit from CUDA whatsoever.
 
So is there any reason to go with just the Iris Pro and no dGPU? Even if it's the exact same price?

I am planning on ordering the 15" with 16GB RAM and 512GB of storage. If I upgrade the base model with just Iris Pro, it comes out to $2400. That's the same price and specs as the higher-end model but without the 750M. How does that even make sense/is Apple really trying to screw its customers with that?

I am coming from a 2006 15" MBP that is on its very last legs. So no matter what I buy I know it'll be leagues ahead of my current laptop. I don't do any gaming but do occasionally use AutoCAD and Final Cut, while most of my work comes from Photoshop, InDesign, and Ableton. Knowing all this, which would make the most sense in terms of performance, battery life, and heat management?
 
Do you regularly work in your graphics-intensive applications for 7-9 hours with no outlet nearby? If your answer is no, then the higher-end model is a better bet, assuming you were going to upgrade the RAM, etc.
 
True, but what if you're using Boot Camp to run a CAD application? Then you are stuck with the weaker (in the case of CAD) GT 750M card... gfxCardStatus is only a Mac OS X app!
So far, Boot Camp has enforced the Nvidia GPU anyway. Not sure how it'll be handled on the just-released dual-GPU MBP.
 