I didn't get it either, but video editing software is designed to offload rendering to the dGPU when it's present. Some effects in some software are even dependent on CUDA. I'm not a pro by any means, but I've been editing video/audio for about 15 years, starting with VHS tapes. I used FCP7 on a late 2009 Mac mini for a while and it was fine. The rendering took a while, but I'm not in production so it didn't matter. Then, trying to use FCPX and Motion, I hit the wall. The whole process became extremely painful. When I had a high-end rMBP w/650m, the difference was obviously impressive, but it was overkill. I had 3 different Motion templates rendering at the same time and it could take it.

Yes, sure, modern GPUs are computational powerhouses and can assist you tremendously in these tasks, but these facts don't really help in the whole CPU vs. GPU debate. First of all, the previous Nvidia GPUs (650M/750M) are quite bad at those tasks (because Nvidia optimised the cards more for gaming performance) compared to some older cards (Maxwell is quite fast again). Secondly, the Iris Pro is every bit as capable at processing data as a midrange GPU and is actually much FASTER when random data access is involved (as OpenCL benchmarks clearly show). The only scenario where the Iris Pro comes up short is when memory size/bandwidth becomes the limiting factor, such as rendering very large videos with simple filters (but do you really do that?).

Bottom line: if you try the MBP with Iris Pro, I am sure you will find it just as fast as that laptop with the 675m.
 
Bottom line: if you try the MBP with Iris Pro, I am sure you will find it just as fast as that laptop with the 675m.

Whoa whoa whoa... I was with you until the bottom line... ;) In terms of gaming performance it's not even close. The 675m is no powerhouse, but 1080p gaming is possible in 99% of games with high or better details. Iris Pro is stuck at 900p. I don't want to start a resolution debate on a 15" screen, so I understand the difference isn't great... but it is... it's also frustrating when you can't use all those tasty pixels on a retina screen.

Now like I said, I wouldn't go for a 750m over Iris Pro for any reason. The "up to 15%" improvement isn't worth it IMO. But because a dGPU will help with video editing AND the 850m is on the horizon, it's worth waiting for it IMO. It will greatly increase the life of a new machine.

People think Broadwell and Skylake will bring the end of dGPUs. Broadwell is 2 years away, and who knows with Skylake. Maxwell is here, and it begins the era of low thermals/TDP with high performance. This is only going to get better every year and bring mobile performance closer to desktop standards. By the time Broadwell launches and improves Iris Pro, the next iteration of Maxwell will be out and its dominance will continue. Some people will say Iris Pro is "good enough" so we don't need a dGPU. But until "some people" is almost all, Apple will serve its customers a dGPU.
 
Whoa whoa whoa... I was with you until the bottom line... ;) In terms of gaming performance it's not even close.

Sorry, I should have been clearer. I was only talking about tasks like video editing. If we include gaming, you are absolutely right, of course.

People think Broadwell and Skylake will bring the end of dGPUs. Broadwell is 2 years away and who knows with Skylake.

I don't think that Skylake etc. will bring the end of dGPUs, but they will certainly bring it closer. E.g. with Skylake, the CPU itself gains many properties of the GPU. If this trend continues, Intel can drop the iGPU completely at some point and do all the graphics-related computations using the vector-processing units (or to put it differently, all the iGPU-based technology fuses with the CPU execution units).

And true, Maxwell is a really amazing step up. Honestly, I didn't expect Nvidia to deliver what they did.
 
People think Broadwell and Skylake will bring the end of dGPUs.
Yes.

Broadwell is 2 years away and who knows with Skylake.
Broadwell MBPs are about 12-15 months away.

Some people will say Iris Pro is "good enough" so we don't need a dGPU. But until "some people" is almost all, Apple will serve its customers a dGPU.
Apple have already made the following models dGPU-free:
- all MBAs (2011)
- all 13" MBPs (2011)
- about half of the 15" MBPs (2013)

Do you see a trend?
 
Yes.


Broadwell MBPs are about 12-15 months away.


Apple have already made the following models dGPU-free:
- all MBAs (2011)
- all 13" MBPs (2011)
- about half of the 15" MBPs (2013)

Do you see a trend?
The MBP and MacBook Air never had a dGPU. The 9400M and 320M were integrated graphics in the chipset on the logic board.
 
Yes.


Broadwell MBPs are about 12-15 months away.


Apple have already made the following models dGPU-free:
- all MBAs (2011)
- all 13" MBPs (2011)
- about half of the 15" MBPs (2013)

Do you see a trend?

Yeah, I should have said a year and a half... There was something I read saying that some of the mobile Broadwell chips could be out by the end of the year, but I couldn't find anything that elaborated.

I see the trend, but it made sense to go with Iris in 2013 when Intel made the leap. In 2014 it will make sense to go with Maxwell. The MBAs are entry-level laptops... netbooks... not relevant. Why not include the iMac and Mac Pro lineups then? All have dGPUs. See a trend?
 
The MBP and MacBook Air never had a dGPU. The 9400M and 320M were integrated graphics in the chipset on the logic board.

They were discrete from the CPU.

----------

Why not include the iMac and Mac Pro lineups then? All have dGPUs.
The pressure for and benefits of integration are greater with laptops than with desktops. The desktops will eventually be dGPU-free also, just as desktops became free of discrete floating point processors.

See a trend?
Yes, the trend is toward ever greater integration. I first became aware of the trend when Intel released the 80186. The trend toward greater and greater integration has continued ever since and continues today.
 
A Maxwell GPU makes the most sense.

In 2013, the Iris Pro made significant leaps, so only the highest-end laptop model actually needed something more to stay competitive. However, with the gains made by Maxwell, dGPUs will still make sense. An 850M brings enormous new benefits, and the rMBP is not going to be able to compete if it lacks Maxwell graphics. The Razer Blade got them, and the Dell XPS 15 is rumored to be getting them.
 
Notebook Check has just released detailed gaming benchmarks concerning the 850M:

http://www.notebookcheck.net/NVIDIA-GeForce-GTX-850M.107795.0.html

Perhaps it's sufficient to say that most games can be played at ultra settings at mostly playable frame rates (20-40) and at high settings with fluent frame rates (30-80). On the 3DMark11 benchmark, the GTX 765 is actually 10% slower and it falls only 5% behind the GT 750 SLI (!). It's the 43rd best graphics card reviewed by the website.

That is some serious graphics power right there.
 
The pressure for and benefits of integration are greater with laptops than with desktops. The desktops will eventually be dGPU-free also, just as desktops became free of discrete floating point processors.


Yes, the trend is toward ever greater integration. I first became aware of the trend when Intel released the 80186. The trend toward greater and greater integration has continued ever since and continues today.

You're not the only "old guy" here. So you know people have been saying this for a very long time... yet nVidia keeps on making them... and they're using them in more ways all the time. I'd put my money into nVidia before Intel if I'm thinking of a long-term investment.

----------

Notebook Check has just released detailed gaming benchmarks concerning the 850M:

http://www.notebookcheck.net/NVIDIA-GeForce-GTX-850M.107795.0.html

Perhaps it's sufficient to say that most games can be played at ultra settings at mostly playable frame rates (20-40) and at high settings with fluent frame rates (30-80). On the 3DMark11 benchmark, the GTX 765 is actually 10% slower and it falls only 5% behind the GT 750 SLI (!). It's the 43rd best graphics card reviewed by the website.

That is some serious graphics power right there.

Mother of God. That's even better than I expected given what info was available. Thanks for the heads-up! :D It's definitely time to start saving my pennies!
 
I think it's time to clarify why Apple used Iris Pro graphics.

The eDRAM works as a cache for the CPU. The benefit of it is much bigger than using a higher-clocked CPU. Apple thought they could use Iris as the only GPU, but it was, and will remain, way slower than a dGPU.

CPUs with eDRAM are much faster than those without, especially when we compare them clock-for-clock.

If Iris Pro were faster than a dGPU, Apple would have used it in both specs of the MBP, but they didn't. The GT 750M is faster than Iris Pro in most tasks, and the GTX 850M will wipe it out. In EVERY single way.

In September, we will see a Haswell update in the MBP with a Maxwell GPU.

@mcarling: The 9400M and 320M were not dGPUs. They were part of the chipset placed on the logic board. Another thing: it wasn't Apple that ditched them. Intel wouldn't allow Nvidia to build chipsets for its CPUs. That's the story of why Apple went with Intel iGPUs. And the ONLY reason for doing so. The HD 3000 was rubbish; Nvidia had a much faster GPU, which ultimately morphed into the GT 420M.
 
I think it's time to clarify why Apple used Iris Pro graphics.

The eDRAM works as a cache for the CPU. The benefit of it is much bigger than using a higher-clocked CPU. Apple thought they could use Iris as the only GPU, but it was, and will remain, way slower than a dGPU.

CPUs with eDRAM are much faster than those without, especially when we compare them clock-for-clock.

If Iris Pro were faster than a dGPU, Apple would have used it in both specs of the MBP, but they didn't. The GT 750M is faster than Iris Pro in most tasks, and the GTX 850M will wipe it out. In EVERY single way.

In September, we will see a Haswell update in the MBP with a Maxwell GPU.

@mcarling: The 9400M and 320M were not dGPUs. They were part of the chipset placed on the logic board. Another thing: it wasn't Apple that ditched them. Intel wouldn't allow Nvidia to build chipsets for its CPUs. That's the story of why Apple went with Intel iGPUs. And the ONLY reason for doing so. The HD 3000 was rubbish; Nvidia had a much faster GPU, which ultimately morphed into the GT 420M.

This is why I've been saying that there is no way Apple is pulling dGPUs out of all their 15" MBPs. The Iris Pro is good for resolving the stutter when you hit Exposé or Mission Control in the low-power state. The 2880x1800 resolution was really crushing the HD 4000. Iris Pro made the low-power mode work like butter.

But make no mistake, a GeForce 850M is over twice as fast. It's not even close. We're talking Futuremark on Iris Pro is about 8000, and on an 850M it's like 17500. Couple this with being able to overclock it to 860M specs with Afterburner without changing the thermal envelope much at all, and now we are north of 20000.

The MBP would be a GREAT gaming machine in this configuration, and still be amazing at everything it does today.
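For what it's worth, the "over twice as fast" claim checks out against those rough scores. A quick sanity check in Python, using the approximate figures from the post (not official benchmark numbers):

```python
# Rough Futuremark scores quoted above (approximate, per the post)
iris_pro_score = 8000
gtx_850m_score = 17500

ratio = gtx_850m_score / iris_pro_score
print(f"850M vs Iris Pro: {ratio:.2f}x")  # ~2.19x, i.e. over twice as fast
```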
 
But make no mistake, a GeForce 850M is over twice as fast. It's not even close. We're talking Futuremark on Iris Pro is about 8000, and on an 850M it's like 17500. Couple this with being able to overclock it to 860M specs with Afterburner without changing the thermal envelope much at all, and now we are north of 20000.

The MBP would be a GREAT gaming machine in this configuration, and still be amazing at everything it does today.

The GTX 850M has a 40W TDP IF Turbo Boost is enabled. If not, it's around 22-27W.
The GTX 860M has a 45W TDP IF Turbo Boost is enabled. If not, it's around 25-30W.

You can OC the GTX 850M to the stock clocks of the GTX 860M without going up in wattage. That would be very impressive, because the performance would reach the level of the GTX 680M. The 2.4 GHz 4860HQ CPU has a TDP of 47W but a power draw of 45W. Add 25W for the OC'ed GTX 850M and we end up with 70W of power draw, way less than the power supply can deliver.

And still get 8 hours of battery life ;). Thanks to GPU switching ;).
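The power-budget arithmetic above can be sketched out in a few lines. The CPU and GPU draws are the post's estimates; the 85W adapter rating (Apple's MagSafe 2 unit for the 15" rMBP) is my addition:

```python
# Back-of-the-envelope power budget for a hypothetical 850M-equipped rMBP.
cpu_draw_w = 45   # i7-4860HQ: 47W TDP, ~45W measured draw (per the post)
gpu_draw_w = 25   # overclocked GTX 850M, Turbo Boost off (est. 22-27W)
adapter_w = 85    # 85W MagSafe 2 adapter shipped with the 15" rMBP

total_draw_w = cpu_draw_w + gpu_draw_w
headroom_w = adapter_w - total_draw_w
print(f"CPU + GPU draw: {total_draw_w}W, headroom: {headroom_w}W")
# CPU + GPU draw: 70W, headroom: 15W
```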
 
The GTX 850M has a 40W TDP IF Turbo Boost is enabled. If not, it's around 22-27W.

The GTX 860M has a 45W TDP IF Turbo Boost is enabled. If not, it's around 25-30W.

You can OC the GTX 850M to the stock clocks of the GTX 860M without going up in wattage. That would be very impressive, because the performance would reach the level of the GTX 680M. The 2.4 GHz 4860HQ CPU has a TDP of 47W but a power draw of 45W. Add 25W for the OC'ed GTX 850M and we end up with 70W of power draw, way less than the power supply can deliver.

And still get 8 hours of battery life ;). Thanks to GPU switching ;).


I'm with you dude. :D
 
This is why I've been saying that there is no way Apple is pulling dGPUs out of all their 15" MBPs. The Iris Pro is good for resolving the stutter when you hit Exposé or Mission Control in the low-power state. The 2880x1800 resolution was really crushing the HD 4000. Iris Pro made the low-power mode work like butter.

But make no mistake, a GeForce 850M is over twice as fast. It's not even close.

You seem to be under the impression that just because there's a large performance gap, there is "no way" the dGPU will get killed. Given that Apple faces no hardware competitors, and given that gaming has never been a target niche for them, your logic is questionable.
 
Spot on

It isn't called MBP for no reason. These are not gaming laptops and OpenCL works great on Intel's iGPU.

They've been in that position before, and in the past they kept to their game plan. That is, they don't try to follow others, but roll out updates according to their own schedule.

As for being far behind in GPUs, outside of gaming the impact may not be as far-reaching as some people think. Apple caters its laptops to the general consumer, and the current setup is more than adequate for such sales.
 
You seem to be under the impression that just because there's a large performance gap, there is "no way" the dGPU will get killed. Given that Apple faces no hardware competitors, and given that gaming has never been a target niche for them, your logic is questionable.

Let's see what they release next, and let the loser acknowledge they were wrong. I'm game if you are.
 
Notebook Check has just released detailed gaming benchmarks concerning the 850M:

http://www.notebookcheck.net/NVIDIA-GeForce-GTX-850M.107795.0.html

Perhaps it's sufficient to say that most games can be played at ultra settings at mostly playable frame rates (20-40) and at high settings with fluent frame rates (30-80). On the 3DMark11 benchmark, the GTX 765 is actually 10% slower and it falls only 5% behind the GT 750 SLI (!). It's the 43rd best graphics card reviewed by the website.

That is some serious graphics power right there.

The model tested in these charts is the one with DDR3 memory, and the CPU in the laptop is a dual-core.

With a quad-core CPU and GDDR5, it will be around 30% faster. So that's not the limit of its potential ;).
 
Let's see what they release next, and let the loser acknowledge they were wrong. I'm game if you are.

I'm game. If Apple introduce the 850M (or any other new discrete GPU) into the MBP before Broadwell, then I'll admit that I was wrong.
 
I'm game. If Apple introduce the 850M (or any other new discrete GPU) into the MBP before Broadwell, then I'll admit that I was wrong.

That's not the question. The question is whether Apple will discontinue the dGPU in the next iteration. I have no idea when Apple will put the 850M in the MBP, but my position is that they will, either now or in September. I am being told that my logic is questionable because, despite Apple having an advertisement for gaming on the MBP product page, Apple doesn't care about gamers and is therefore going to discontinue the dGPU.

One of us is pointing to the 15" having had a dGPU since 1998 and believes that they will continue to make one available. Hint: that's me.
 
My hunch is that we will see the next "iteration" of MacBook line introduced at WWDC in June, whatever they are. That is only 3 months away.

I don't understand why people think Apple is waiting until September or later to refresh them...it will be June and soon.
 
That's not the question. The question is whether Apple will discontinue the dGPU in the next iteration.

No, that was not the question. I never claimed that and I don't recall anyone else doing so. I claim that the Skylake MBPs will not have a discrete GPU.
 
No, that was not the question. I never claimed that and I don't recall anyone else doing so. I claim that the Skylake MBPs will not have a discrete GPU.

Sorry, I should have clarified. I was never disputing this one with you, but with John123 above. I haven't seen Skylake, and I don't know how well the GPU side will compete with whatever Nvidia has in the 900 series.
 