I have a naive question. If the dGPU is on, and you are not running a graphically-intense task, does it use less power than when you are e.g. gaming? In other words, is it like a CPU in that if the dGPU is on because of Chrome, it will use less power than if it is on because of a game? Or is it binary on/off? Now that I'm thinking about it I guess it has to be the former but I just wanted to be sure.

Because if the dGPU isn't running full blast all the time I don't see it as being an issue at all...

Misconception on my part from someone who hasn't had a dGPU before.



----------

I was incorrect about why they don't allow you to run on integrated permanently, but the app he is asking about does not force the Mac to use integrated graphics, as stated above.

Thanks dude, I appreciate it. I'm not sure why everyone had to get so hard on your case. We're not all experts (I most certainly am not).
 
I have a naive question. If the dGPU is on, and you are not running a graphically-intense task, does it use less power than when you are e.g. gaming? In other words, is it like a CPU in that if the dGPU is on because of Chrome, it will use less power than if it is on because of a game? Or is it binary on/off? Now that I'm thinking about it I guess it has to be the former but I just wanted to be sure.

Because if the dGPU isn't running full blast all the time I don't see it as being an issue at all...

Misconception on my part from someone who hasn't had a dGPU before.

For me and my MacBook Pro, it doesn't use as much battery. It still uses more than integrated graphics, but it's not as bad. It's not fully loading the processor and hard drive, so it doesn't drain the battery as fast (so, simply: no, it doesn't use the same amount as integrated, but it does use less than when you are heavily using the computer).

Example:
(My MacBook Pro non retina 15")
7 hours with safari and mail on the integrated graphics card
3 hours with final cut and photoshop on the discrete graphics card
5 hours with safari and mail on the discrete graphics card

Thanks dude, I appreciate it. I'm not sure why everyone had to get so hard on your case. We're not all experts (I most certainly am not).

No problem :) and if people want to bash me for not being 100% correct, I'm sure their perfection allows them to correct me like that. :cool: Just trying to help out. :)
 
A dGPU also has power states. It will clock down very far, but even then it still uses a lot more power than an iGPU. The Intel GPU can borrow its memory access from the main memory that is running anyway because the CPU needs it.
The dGPU has to keep the GDDR5 memory online and its own memory controller running. Even in lower power states that is a lot extra, because those are additional modules that have to be at least somewhat active. An iGPU can simply dip much lower in power consumption.
Intel's GPU also runs on 22nm tri-gate transistors, which should be more power efficient, and it is generally more aggressive with power gating and faster at clock switching. So Intel is better at striking the best balance of performance and power savings in low-usage states.
Nvidia's dGPUs are good at full load but much less efficient at medium/low load.
 
A dGPU also has power states. It will clock down very far, but even then it still uses a lot more power than an iGPU. The Intel GPU can borrow its memory access from the main memory that is running anyway because the CPU needs it.
The dGPU has to keep the GDDR5 memory online and its own memory controller running. Even in lower power states that is a lot extra, because those are additional modules that have to be at least somewhat active. An iGPU can simply dip much lower in power consumption.
Intel's GPU also runs on 22nm tri-gate transistors, which should be more power efficient, and it is generally more aggressive with power gating and faster at clock switching. So Intel is better at striking the best balance of performance and power savings in low-usage states.
Nvidia's dGPUs are good at full load but much less efficient at medium/low load.

That makes a lot of sense. Thank you!

So if I use gfxCardStatus to disable the dGPU, that powers the dGPU off completely, including its onboard memory, correct? This is supported by Keeg's battery life stats and by theBrick's hypothesis that the external display output is routed through the dGPU (and thus you cannot have iGPU-only mode when you plug in to an external display).

And if in the worst case gfxCardStatus no longer works in future versions of Mavericks, it seems that the battery life won't take too much of a hit. This is supported again by Keeg's stats, by the fact that Apple still claims 8hrs of battery (though I'm sure that's running only iGPU apps), and by The Verge (who got a dGPU model, though I'm not sure their test stresses the dGPU):

The most impressive part, though, is that all this comes with almost no penalty to battery life. The 15-inch Pro lasted 9 hours, 35 minutes on the Verge Battery Test, which cycles through a series of popular websites and high-res images with brightness set to 65 percent.

I'll wait for the definitive battery life tests from Anandtech though.
 
Apple quotes its battery life only with the iGPU active. That is why they claim the same figure each time.

Yes, if the iGPU is active, all power to the dGPU is cut. It is turned off entirely. The Verge's battery test definitely won't include any dGPU action. It runs in Safari, and Apple's own browser doesn't activate the dGPU unnecessarily. It is just a web-browsing test where a script loads different websites at certain time intervals. It isn't even that realistic, because just scrolling on a website increases power consumption quite a lot compared to what that test does.
If the dGPU is active you can cut probably about 30% from your battery life expectations.
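To put rough numbers on that ~30% figure, here is a minimal sketch. The 95 Wh battery capacity matches the 15" rMBP, but the per-scenario draws are illustrative assumptions, not measurements:

```python
# Rough battery-life estimate from a constant average power draw.
# The draws below are assumed light-load figures, not measured values.
BATTERY_WH = 95.0  # 15" rMBP battery capacity

def hours(avg_draw_watts):
    """Battery life in hours at a constant average draw."""
    return BATTERY_WH / avg_draw_watts

igpu_draw = 11.0   # assumed light-load draw on the iGPU
dgpu_draw = 16.0   # assumed light-load draw with the dGPU active

print(f"iGPU: {hours(igpu_draw):.1f} h")
print(f"dGPU: {hours(dgpu_draw):.1f} h")
print(f"hit:  {1 - hours(dgpu_draw) / hours(igpu_draw):.0%}")
```

With those assumed draws, the dGPU scenario loses roughly a third of the runtime, which lines up with the ~30% rule of thumb above.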

Anandtech's battery life test won't stress the dGPU either, not even the heavy one. The heavy test just plays a movie, but if you use the right player (QuickTime rather than VLC) that will stay on the iGPU. If you want answers, ask the people around here who already have the new MBP. With iStat Pro they can read out the current power draw and easily extrapolate the battery life impact. Reviewers aren't really all that good with this kind of data. They are more reliable than biased users who want their prized possession to look good, but they leave out lots of detail. I always found you can learn the most about new notebooks by checking out forum threads on notebookreview.com or notebookcheck.com.
 
So I finally placed the order for the machine without the 750M. I will not be doing any games on it and will prefer a cooler laptop. Same price, I know, but that's not my criteria.
 
So I finally placed the order for the machine without the 750M. I will not be doing any games on it and will prefer a cooler laptop. Same price, I know, but that's not my criteria.

That's what I got yesterday and it certainly does run cool and silent, even when doing some of the gaming benchmarks.
 
The Retina MBP has an HDMI port and two Mini DisplayPorts. How would those fare when connected to, let's say, two Thunderbolt Displays and a 1080p screen, while also running the built-in display? Those are a LOT of pixels.


All of this on only the iGPU.

I think the dGPU is needed if you are thinking of running external displays.
 
So I finally placed the order for the machine without the 750M. I will not be doing any games on it and will prefer a cooler laptop. Same price, I know, but that's not my criteria.

That's what I got yesterday and it certainly does run cool and silent, even when doing some of the gaming benchmarks.

You guys are killing me :D

Seriously, as soon as I think I've decided someone makes me think otherwise. Maybe I should just buy two of these :p.
 
The Retina MBP has an HDMI port and two Mini DisplayPorts. How would those fare when connected to, let's say, two Thunderbolt Displays and a 1080p screen, while also running the built-in display? Those are a LOT of pixels.


All of this on only the iGPU.

I think the dGPU is needed if you are thinking of running external displays.

Is this true? I want to run an external display with the new rMBP Iris Pro w/o dedicated gpu. Would a user experience slowdowns with just one external display connected?
 
Is this true? I want to run an external display with the new rMBP Iris Pro w/o dedicated gpu. Would a user experience slowdowns with just one external display connected?

I don't think that one external display would make things ugly.
I can't try it with my rMBP, because you can't force the iGPU with an external monitor connected.

But once you go into two monitors higher than 1080p, things could get a little trickier.
 
I just ran Heaven in a window whilst monitoring a Kill A Watt, using the Iris Pro 5200 and the 750M exclusively.
The performance of the 750M is slightly higher whilst drawing around 10% less power from the wall: 80W under Iris power compared to 72W under the 750M.

This wasn't the world most scientific test but might be fodder for further tests or those contemplating whether to order with dGPU or not.

Hopefully some readable screengrabs showing CPU / GPU power consumption via iStat menus.

 
On basic Chrome/Google Docs word-processing work, my wall meter shows Iris consumption at 20-22W and the 750M at 30-33W, so the Iris is much more efficient under low-load conditions. Once things start getting heavy, the 750M becomes more efficient.
 
That makes a lot of sense. Thank you!

So if I use gfxCardStatus to disable the dGPU, that powers the dGPU off completely, including its onboard memory, correct? This is supported by Keeg's battery life stats and by theBrick's hypothesis that the external display output is routed through the dGPU (and thus you cannot have iGPU-only mode when you plug in to an external display).

And if in the worst case gfxCardStatus no longer works in future versions of Mavericks, it seems that the battery life won't take too much of a hit. This is supported again by Keeg's stats, by the fact that Apple still claims 8hrs of battery (though I'm sure that's running only iGPU apps), and by The Verge (who got a dGPU model, though I'm not sure their test stresses the dGPU):



I'll wait for the definitive battery life tests from Anandtech though.

The way that Apple handles GPU switching is that the dGPU is completely off when the iGPU is active.
 
What about powering 4K monitors? Playing 4K video without lag, or running the OS on a 4K monitor without any hiccups: will the Iris Pro be able to do this? How about 2-3 years down the line, will it still be able to? Or is the 750M going to have trouble doing this too? Or can I just use Thunderbolt 2?
 
Yes Iris Pro can handle all those pixels.
[slide from Intel's Haswell presentation deck]

Pixels alone aren't a problem for GPUs these days. Remember, standard resolutions have been handled well by GPUs for ages, and those GPUs used to be many, many times slower. The number of pixels alone means little now. Smartphone chips power 2560x1600 displays, and the resolution they can put out is usually only limited by the HDMI standard they use.
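One way to see why pixel count alone isn't scary: raw scanout bandwidth is just width x height x bytes per pixel x refresh rate. A hypothetical helper, ignoring blanking intervals and any compression:

```python
# Rough scanout-bandwidth estimate for driving a display.
# Ignores blanking intervals and link overhead; 4 bytes/pixel assumes
# 32-bit color. Purely illustrative arithmetic.
def scanout_gbps(width, height, hz, bytes_per_px=4):
    """Raw framebuffer bandwidth in GB/s at the given mode."""
    return width * height * bytes_per_px * hz / 1e9

# 4K at 60 Hz vs the rMBP's own panel at 60 Hz
print(f"3840x2160@60: {scanout_gbps(3840, 2160, 60):.1f} GB/s")
print(f"2880x1800@60: {scanout_gbps(2880, 1800, 60):.1f} GB/s")
```

Even 4K at 60 Hz works out to about 2 GB/s of raw scanout, a small fraction of modern memory bandwidth; rendering load, not pixel count, is what actually stresses the GPU.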

ianj1972 said:
On basic Chrome/Google Docs word-processing work, my wall meter shows Iris consumption at 20-22W and the 750M at 30-33W, so the Iris is much more efficient under low-load conditions. Once things start getting heavy, the 750M becomes more efficient.
That is what I would expect. Nvidia still has the more power-efficient architecture, but it is designed to work best at full load. To save power, all it does is clock lower, and that clocks virtually everything lower. Intel, on the other hand, can shut off (power gate) one half of the 40-EU shader cluster completely, clock the rest down to 200MHz, and power gate even small parts of the GPU very effectively. Under full load it doesn't get enough done to compete, but it is much better at adjusting quickly to the precise load demands. It also can simply dip much lower than a dGPU possibly can.
Though I would suspect that a workload like an actual real game, which puts a bit more load on the CPU while pressuring the GPU, will probably have the 750M peak higher. Anything else wouldn't make much sense.

Since you have the notebook: have you tried Boot Camp yet? I would be interested whether Nvidia's Boost 2.0 works there. In OS X it obviously seems not to.
68 - 50 = 18.
72W at the wall; say 85% power supply efficiency:
~60W
- 18W CPU package
- 15W display and everything else
= ~28W GPU
The 750M should peak much higher under load; if it runs at just 80C, there is room for Nvidia Boost.
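That back-of-the-envelope estimate can be written out in a few lines. The 85% PSU efficiency and the component figures are the post's rough assumptions, not measured values:

```python
# Back-of-the-envelope GPU power estimate from a wall-meter reading.
# All inputs are rough figures; the 85% power-supply efficiency in
# particular is an assumption, not a measurement.
wall_watts = 72.0          # wall-meter reading under Heaven on the 750M
psu_efficiency = 0.85      # assumed power-supply efficiency
cpu_package_watts = 18.0   # estimated CPU package draw
display_etc_watts = 15.0   # display, SSD, board, everything else

system_watts = wall_watts * psu_efficiency  # power actually entering the machine
gpu_watts = system_watts - cpu_package_watts - display_etc_watts

print(f"system: {system_watts:.0f} W, GPU: {gpu_watts:.0f} W")
```

Subtracting the assumed CPU and system overhead from the ~61W entering the machine leaves roughly 28W attributable to the GPU, matching the figure above.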

It sucks that there seems to be just no way to get any access to Quick Sync for video encodes. Those would be so nice and fast.
 
Waiting for the Anandtech reviews on the Haswell RMPB to inform my purchase decision is KILLING me. I expected it would be out by now.
 
Chiming in here. I've got the 15" higher-end model with both Iris Pro and a dGPU, absolutely loving it so far!

I was in the same boat as the OP and many others, where I didn't really need the dGPU but I wanted to upgrade to 16GB RAM and 512GB storage so I could have gone with the Iris Pro-only model. But for the same price, it just didn't make sense. I've noticed that with Mavericks, the automatic graphic switching is much improved. The dGPU only turns on when doing certain tasks in Photoshop or Ableton, etc. I don't game at all but I do a lot of graphic/rendering work so I thought the dGPU might come in handy later.

In terms of battery consumption/heat dissipation, my 15" runs as cool and quiet as can be! Even when on dGPU, it seems much better than the last generation rMBPs. There's no excessive heat or noticeable battery drain. I'm usually averaging 9+ hours on a full charge.
 
Correct, I'm pretty sure it would cause damage to the graphics card. That's why they have discrete mode: if you are running integrated only and, say, you open Photoshop or Final Cut, it informs you that it has switched to discrete and will not allow you to change to integrated until you close those apps.

It doesn't relate to damage, but I can explain it for you. Photoshop and FCPX (not the old FCP) make use of certain OpenCL libraries for specific functions. It doesn't speed up everything. In Photoshop, most of the time you'll never notice the difference, because it's limited to really specific operations.

The other part is OpenGL. This isn't terribly difficult stuff for a modern GPU. They just replaced the old tiled redrawing using whatever set of arrays, which relied primarily on the CPU, with functionality from a library built for the express purpose of highly parallel calculations. It's not a big deal, because these things return specific colors, and OpenGL doesn't have to track normal vectors in Photoshop. An image with a set of layers is just a flat plane with a fixed normal, which can be disregarded. The 15" may have more pixels, but the difference has always amounted to speculation among users who don't understand the limited complexity of the calculations performed there.
 
I have the top (BTO) spec machine and use gfxCardStatus with no problem. It switches just fine.
 
So I finally placed the order for the machine without the 750M. I will not be doing any games on it and will prefer a cooler laptop. Same price, I know, but that's not my criteria.

Really don't get you.
Why not just use gfxCardStatus and shut off the dGPU?
Same battery life, and you have that extra power whenever you want it.
AND your laptop will have a much higher resale value.

----------

According to this post, it is no longer possible to force integrated-only mode. Do you know if that is correct?

It works fine.
 
Hi all. I'm in the same type of predicament with choosing between the two 15" models. This would be my first Mac, so my lack of experience with them makes it tough.

I was wondering if you can use gfxCardStatus to force the iGPU when using Eclipse and XCode?

Also, I was reading a thread or two where the user was running out of battery while charging, since he was doing heavy gaming.

I'm not worried about gaming, but if I were to get the 750M model, could the same thing happen if I had two external Thunderbolt Displays going? Since if you get the 750M, it's not possible to use the iGPU for external displays. This would be during programming/Safari/other basic tasks. I'm guessing it would be OK, since the external displays would have their own power source, but I'd like to be sure before I buy.
 
Hello there,

I got the 15" base model upgraded to a 512 GB SSD, 16 GB RAM and the 2.3 GHz i7.

I don't do 3D and never play games. Iris Pro seems perfect so far; I just noticed little lags when using Mission Control on an external monitor, but that's all, and it's not really annoying.

Furthermore, my display doesn't seem to have the ghosting problem :)

I will often connect my external monitor, which is a 27" running at 2560x1440. Because of this, I wouldn't want the dGPU running every time my external monitor is connected. The fan noise would drive me crazy!

So, I'm quite happy with my iris pro only model for now. :)
 
Also, I was reading a thread or two where the user was running out of battery while charging, since he was doing heavy gaming.

I'm not worried about gaming, but if I were to get the 750M model, could the same thing happen if I had two external Thunderbolt Displays going? Since if you get the 750M, it's not possible to use the iGPU for external displays. This would be during programming/Safari/other basic tasks. I'm guessing it would be OK, since the external displays would have their own power source, but I'd like to be sure before I buy.

Theoretically it's possible, but I think you're unlikely to run into that situation. The whole point of it drawing off the battery while plugged in is that it needs that extra juice to power whatever you're doing. It's not a design flaw.

I truly think getting the dGPU is an absolute no-brainer if you're going for 16/512. People talk about the fan noise, but (A) the fans are not usually throttled all the way up to 6000 rpm even when they do kick in, and (B) the fans on rMBPs are so much quieter than previous generations. If I were powering two TB displays, I would absolutely want that dGPU today. But that's me.
 