There's a simple app called gfxCardStatus that you can use to force only the integrated graphics. At the same price, go with the 750M; you'd be shooting yourself in the foot if you didn't (resale value).
gfxCardStatus cannot help when an external screen is connected, and the switching process also means you need to restart most if not all apps if you want to force the Intel GPU. You attach the machine to a projector, TV, or an external monitor; then you move it away and want power savings on the go, and you need to close most of your apps. That is just annoying.

The 750M is only interesting for gamers in Windows. In games it is quite a bit faster, but in OS X OpenGL counts, and here the Intel drivers seem to be better than the Nvidia drivers. In OpenCL, and basically most of the stuff professional apps use, the Intel GPU either wins or it doesn't matter.
Gamers will want the 750M, but everybody else should, even if money is no issue, think about whether they want to bother.
 
Since the 750M is essentially the 650M, does the 750M still have support for 3D displays? Preferably I would want the 750M, but if it doesn't have support for 3D displays like the 650M does, then I'll probably get the previous gen.
 
I agree with you that it feels kind of stupid not to go with the 750M if it is the same price; the reasons why I am hesitating are:

- Heat/battery consumption. Just the fact that there is third-party software to turn off the dGPU (not recommended by Apple) tells me the problem is real. I actually think this is the most annoying thing that can happen to a notebook; I hate the sound of speeding fans.

http://support.apple.com/kb/TS4044

Since I don't play games, the only reason left is aftermarket value, I guess? Maybe websites will all go 3D in a couple of years and you'll need more power for OpenGL web apps, but I will probably have bought a new generation of MBP by then :)
 
What exactly is the issue with the computer switching between the Iris Pro and the GT 750M? I thought this was a seamless operation, unnoticeable to the user? If it is noticeable, what impact does it have?
It is not, if you don't close applications often. The problem is how Apple implemented it. It doesn't work like Optimus in Windows, where you really can switch the hardware any way you like and nothing breaks down, even in game, because there the program talks to a driver that sits in front of the Intel or Nvidia driver.
In OS X the program more or less has to say which GPU it wants, and then it stays with that. There is no on-demand switching. Say you open VLC and it starts up the dGPU: that GPU stays active even if the program sits in the background doing absolutely nothing at all. You still have the added power draw for nothing. Programs also need to know which procedures to load based on the active GPU, and therefore they usually need to restart for full performance after the GPU switches back to Intel. Some of Apple's own apps deal with that while running, but most other apps don't. Switching over to the dGPU always works, but not back.
Another issue is that there is no real way to manually choose a GPU. Apple switches based on frameworks and way too often uses the dGPU when not necessary. A simple GUI that uses the wrong framework for some animations loads the dGPU even if those animations are never triggered, because they only apply in a certain situation. Say iPhoto slide shows need the dGPU for some fancy animation, but maybe all you do is import and browse.

A lot of these issues can be fixed with proper care in programming and using all the Apple APIs right (a rough sketch of what that means follows), but very few programmers bother or want to go that extra mile. Apple just made it too difficult.
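To give a concrete idea of what "using the APIs right" means here: an OpenGL app that is happy to run on whichever GPU is active has to opt in, both with the NSSupportsAutomaticGraphicsSwitching key in its Info.plist and by allowing offline renderers when it creates its pixel format. A minimal sketch, written in Swift purely for illustration; the constants are the real AppKit names, everything else is assumption:

```swift
import Cocoa

// Sketch: request an OpenGL pixel format that does NOT demand the
// discrete GPU. Without NSOpenGLPFAAllowOfflineRenderers, merely
// creating the context is what drags the machine onto the 750M.
let attrs: [NSOpenGLPixelFormatAttribute] = [
    NSOpenGLPixelFormatAttribute(NSOpenGLPFAAllowOfflineRenderers),
    NSOpenGLPixelFormatAttribute(NSOpenGLPFAAccelerated),
    NSOpenGLPixelFormatAttribute(NSOpenGLPFADoubleBuffer),
    NSOpenGLPixelFormatAttribute(0) // zero-terminated attribute list
]

// The app's Info.plist also needs NSSupportsAutomaticGraphicsSwitching
// set to YES, or the system will keep the dGPU lit anyway.
if let pixelFormat = NSOpenGLPixelFormat(attributes: attrs) {
    let context = NSOpenGLContext(format: pixelFormat, share: nil)
    print("Context created; it can live on the integrated GPU:", context as Any)
}
```

An app that skips those two steps gets pinned to the discrete GPU for its whole lifetime, which is exactly the VLC behaviour described above.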

E.g., I have Path Finder in my dGPU dependency list, and that is just a Finder replacement with absolutely no need for any sort of GPU speed. It uses some Core Animation features, I think, if you activate them. I don't, but the framework is still loaded and so is the dGPU. I also always have the external screen attached, and although all I do is watch VLC movies or put Spotify on it, or some other normal, undemanding 2D stuff, it always has that extra bit of heat that makes my old 2010 MBP noisier than it needs to be. So it is a heat thing as well: it takes more load to ramp up the fans when the dGPU is off.

In Windows you also have to deal with the dGPU always being active. Usually not a huge deal, as most people boot into Windows for gaming anyway, but still. In Windows, on DirectX with Nvidia's Windows drivers, the 750M is worth it, but in OS X, with that fairly poor OpenGL performance, Intel might actually be better in almost all cases.
 

So basically, I will not notice improvements with the Nvidia unless I do gaming, heavy video editing, or intense 3D rendering stuff? And even though it's the same price as the maxed-out base 15" with Iris Pro, I should go with the Iris Pro one? Will it be nearly just as "future proof"? Thank you so much.
 
For those suggesting using gfxCardStatus to force integrated graphics, please note that there is a dependency list, and if the app is on the list, it will use discrete graphics no matter what:

http://gfx.io/switching.html#integrated-only-mode-limitations

----------

Whether to go Nvidia or not is a personal decision based on the use case.

In my case, I'm not going to be doing heavy rendering or setting games to max.

However, I do computationally heavy work and use a lot of RAM.

With my current MBP, during the day, I'll shuttle my laptop over to colleagues to show them some results and the computer is really hot.

At night, I continue working on my laptop from my lounge chair, and even though I have one of those laptop pillows, I still feel the heat from the laptop.

That's why for me, it is important to have a computer that is as cool and power efficient as possible, even at the cost of graphics power.

----------

I think one of the benefits of going integrated only is the more efficient and even thermal dissipation.

If you look at the teardown of the previous 15" rMBP with discrete graphics, you'll see the CPU in the center and the GPU on the right:

http://d3nevzfk7ii3be.cloudfront.net/igi/vMK4VGFB3NOv5UfZ.large

The right thermal pipe will dissipate the heat from the GPU and the left will dissipate the heat from the CPU. You can sort of assume the CPU and GPU are in thermal equilibrium, so not much heat is passing between them.

However if you look at the teardown of the 13" rMBP without discrete graphics, you'll see just the CPU in the center:

http://d3nevzfk7ii3be.cloudfront.net/igi/F6xjJTqi2sWJwZbv.large

Both the right and left thermal pipes work to dissipate the heat from the CPU, and the back of the case will be evenly heated.

Overall, this means that heat is being transferred more efficiently and you'll likely see lower temperatures and better battery life.
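As a toy model of why sharing the pipes helps (every number made up for illustration): treat each heat pipe as a thermal resistance; with the CPU's heat spread across two pipes in parallel, the effective resistance halves, and so does the CPU-to-case temperature rise for the same load:

```swift
// Toy thermal model; every value here is an illustrative assumption.
let pipeResistance = 0.5   // K per W for a single heat pipe (made up)
let cpuLoadWatts = 35.0    // sustained CPU package power (made up)

// dGPU layout: the CPU's heat leaves through its one dedicated pipe.
let riseOnePipe = cpuLoadWatts * pipeResistance              // 17.5 K

// iGPU-only layout: both pipes carry the CPU's heat (parallel paths).
let riseTwoPipes = cpuLoadWatts * pipeResistance / 2.0       // 8.75 K

print("CPU temperature rise with one pipe: \(riseOnePipe) K")
print("CPU temperature rise with two pipes: \(riseTwoPipes) K")
```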

-----------

Personally, I see the discrete graphics chip going the way of the CD-ROM drive. That's why Apple only keeps it in the highest-end model.

So by the time I get around to selling my new MacBook Pro in 4 years, laptops won't even come with discrete graphics, so I don't think resale will be much of an issue, since people won't think it is weird not to have discrete graphics.

In fact, it might be strange in 4 years to hear that the computer has a discrete card since it will be associated with heat and loud fans.
 

GET THE 750M, you chumps (especially if there is no price difference)... I can't believe we are debating this.

Yes, gfxCardStatus will allow you to permanently disable the dGPU if you ever desire to, so you can squeeze out that extra battery life.

Incorrect. gfxCardStatus cannot force iGPU-only in all cases, e.g. PowerPoint. Look at section 2 of the following link.

http://gfx.io/switching.html

----------

For those suggesting using gfxCardStatus to force integrated graphics, please note that there is a dependency list, and if the app is on the list, it will use discrete graphics no matter what:

http://gfx.io/switching.html#integrated-only-mode-limitations

Wow. You beat me by 5 minutes. :p
 
GET THE 750M, you chumps (especially if there is no price difference)... I can't believe we are debating this.

Yes, gfxCardStatus will allow you to permanently disable the dGPU if you ever desire to, so you can squeeze out that extra battery life.

Well said. Although after following the Waiting for Haswell thread for months, where everything under the sun and on other planets was explored without end, I'm not at all surprised that there is now a debate about whether or not to pass on a dedicated card for free (when factoring in the CPU, RAM, and SSD upgrades from the base 15"). This actually sums up MacRumors rather nicely, I think.
 
Let me clear something up: a dedicated GPU doesn't generate heat unless you start taxing its full power. Under such scenarios, an integrated GPU will produce a lot of heat and noise too.

If you don't believe what I am saying, listen to how loud and hot a MacBook Air or 13-inch MacBook Pro gets while gaming.
 

From my experience, and what I've heard from other people, a dGPU on a Mac pretty much guarantees more heat, more noise, and less battery life, because it turns on even when you really don't need it, for example when giving a presentation in PowerPoint. Even the app linked above that lets you manually switch between iGPU and dGPU doesn't work for all apps, such as Keynote, PowerPoint, etc.
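If you want to see for yourself which GPU is driving the screen at any given moment, one rough check (short of installing gfxCardStatus) is to dump the graphics section of System Profiler; on a dual-GPU machine the connected display is listed under whichever GPU is currently rendering it. A sketch in Swift; the system_profiler path and the SPDisplaysDataType section name are the standard OS X ones, the rest is assumption:

```swift
import Foundation

// Sketch: shell out to system_profiler and print the graphics section.
// Both GPUs appear under "Chipset Model:"; the one with a "Displays:"
// block underneath is the one currently driving the screen.
let task = Process()
task.executableURL = URL(fileURLWithPath: "/usr/sbin/system_profiler")
task.arguments = ["SPDisplaysDataType"]

let pipe = Pipe()
task.standardOutput = pipe

do {
    try task.run()
    task.waitUntilExit()
    let data = pipe.fileHandleForReading.readDataToEndOfFile()
    if let report = String(data: data, encoding: .utf8) {
        print(report)
    }
} catch {
    print("Could not run system_profiler:", error)
}
```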
 
If you get the laptop to game, get the 750M one.

If you're planning on using SolidWorks, Maya, LightWave, or any 3D creation or CAD application, get the Iris Pro one. It is significantly faster for those tasks and not that far behind the GT 650M/750M...

Here are some benchmarks of the Iris Pro 5200: http://www.notebookcheck.net/Intel-Iris-Pro-Graphics-5200.90965.0.html

Nvidia GT 650M (the 750M, which is fairly similar to the 650M, does not have SPECviewperf 11 scores listed for comparison):
http://www.notebookcheck.net/NVIDIA-GeForce-GT-650M.71887.0.html
 
I think it is important to keep in mind that until these computers get benchmarked, this is all educated speculation. There could be some fancy code that drastically affects the standard benchmarks people are using as justification for and against the different GPUs.

Second, if you are trying to decide whether to get the Iris Pro-only or dGPU MBP at the same price point, keep in mind that unless you plan to do intense rendering, editing, etc. on a plane, in a park, or somewhere else that is not near a power outlet, the minor decrease in battery life from the dGPU being engaged is a non-issue.

I do not know many professional film editors who do several hours of dGPU-intense work away from an outlet. If you are somewhere that 6 hours of battery instead of 8 is going to be an issue because of the dGPU (and the hours are a guess), you will need an alternative power solution anyway.
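For rough perspective on those guessed hours (every number below is an assumption, not a measurement): with the 15" rMBP's roughly 95 Wh battery, going from 8 hours down to 6 implies about 4 W of extra average draw, which is plausibly what an idling dGPU costs:

```swift
import Foundation

// Back-of-envelope sketch; all figures are assumptions from the guess above.
let batteryWattHours = 95.0  // approximate 15" rMBP battery capacity
let hoursIGPUOnly = 8.0      // guessed runtime with the dGPU off
let hoursWithDGPU = 6.0      // guessed runtime with the dGPU engaged

let extraDraw = batteryWattHours / hoursWithDGPU - batteryWattHours / hoursIGPUOnly
print(String(format: "Implied extra average draw: %.1f W", extraDraw)) // ~4.0 W
```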

I do not know about gaming, but I can't imagine there are many people playing games in the quad at school for hours on end.
 
This thread has become moot.

I asked Apple directly and they got back to me.

The 2.0GHz model is Iris Pro only, and the 2.3GHz and 2.6GHz models automatically come with Nvidia, regardless of which configuration you start with.

That's why they are the same price. So no 2.6GHz Iris Pro only.
 

Link? You seem to have some very interesting information that no one else on this entire site seems to know about. It really would cause quite a stir!
 

Thanks for the intel, bro, but that majorly sucks!!!!! Is there a way to isolate the dGPU, I wonder... I just need the high-end rMBP for CAD, and the dGPU is in some cases (SolidWorks) half as fast :mad:
 
I'll bet these run cooler.

I actually sold my 2.4GHz Retina 15" because, for one reason, it ran hot doing nothing. I also own the 13" Retina and it runs cool.

So the new base should at least be better than the last version.

Anyway, I was waiting like a lot of you, and will now buy another 15".
 

So to clarify, if I were to pick the base 15" and upgrade the CPU to 2.3, it would auto-add a dGPU?

This does not seem logical based on what the store configurations show...
 

Wow omg if this is true that will be amazing!!!!!!!!!!!!!!
 
So to clarify, if I were to pick the base 15" and upgrade the CPU to 2.3, it would auto-add a dGPU?

This does not seem logical based on what the store configurations show...

It's not logical because it doesn't exist. Although you never know, you and Sdali may be the only ones special enough to have a dGPU w/ your 2.3 and 2.6 base models! :)
 

Please tell us who you asked and if they were credible. Thanks!!
 
It's not logical because it doesn't exist. Although you never know, you and Sdali may be the only ones special enough to have a dGPU w/ your 2.3 and 2.6 base models! :)

No, your logic is flawed :D. If it doesn't exist, it's nothing.
 