
Will the Haswell rMBP be announced in September with a dGPU option?


(Poll closed; 407 total voters.)
The rMBP and cMBP have used the same GPU (including the amount of dedicated VRAM) as the 21.5-inch iMac for the past two years.

This. One thing I haven't looked up though is the respective maturity of each of the GPUs in the iMacs/MBPs over the years.

Since the 750M is essentially a revamped 650M, I'm not sure whether Apple has historically shipped GPUs with little room left to overclock, as would be the case with the 750M.
 

I mean, I'd expect them to be satisfied with a 750M clocked at 755M speeds. There really isn't much of a difference from the 650M to the 660M, and I believe the gap between the 750M and 755M is similarly small - the 755M just needs less overclocking than the 750M to reach that performance level.

We've all already noted that the performance gains this year will probably be minimal (assuming they go the dGPU route), so the overclocking headroom may also be slightly smaller.
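For a rough sense of how small the 750M-to-755M step is, here is a back-of-the-envelope sketch in Python. The clock figures are illustrative placeholders (both chips are believed to be the same GK107 silicon with 384 shader cores; exact shipping clocks vary by vendor), so treat this as a sketch, not a spec:

```python
# Back-of-the-envelope GPU throughput estimate: shader cores x core clock.
# Clock figures below are illustrative placeholders, not official specs.

def relative_throughput(cores, clock_mhz):
    """Naive throughput proxy for chips sharing the same architecture."""
    return cores * clock_mhz

CORES = 384  # GK107 shader core count, shared by the 750M and 755M

gt750m = relative_throughput(CORES, 967)  # hypothetical 750M core clock
gt755m = relative_throughput(CORES, 980)  # hypothetical 755M core clock

gain_pct = (gt755m / gt750m - 1) * 100
print(f"Estimated gain from the clock bump alone: {gain_pct:.1f}%")
```

With identical core counts the estimate reduces to the clock ratio, which is why a mild factory overclock of a 750M effectively *is* a 755M.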
 

Yeah, that's why I'm really interested to see what Apple does, because of:
a) the implications for their GPU selection strategy, and
b) how the dang thing performs in real-life benchmarks, apps and games.
 
So then why did they include Iris Pro in the iMac?
I assume because they could. It is the cheapest recycled i5, and probably not a terribly expensive chip.
Since nobody is really beating down Intel's door, they probably got a good deal. It also keeps the baseline performance quite high. I doubt it is needed, or that it gets much consideration from buyers of the baseline. They couldn't just release a new base model at the same price as the old one, which had a 640M included, and then have to admit that it now has only a basic Intel GPU and nothing else. Bad for marketing more than anything else. The reputation from the whole sticking-with-the-320M episode still hangs over Intel. Non-tech-savvy people hear the brand names Nvidia or Intel and that is where it ends. Who cares about actual graphics demands?
Virtually all the people I know who have iMacs (not that many, only 3) would never need anything more than an HD 4600. Just normal family PCs that need to handle iPhoto, word processing and web browsing, with no gaming. People with normal jobs, like teachers.
With Iris Pro, Apple can claim dGPU-level performance, the people selling them can claim it, and nobody has to explain anything more. People can look it up. No need to explain that, after all the fuss about Apple's GPU focus, it really doesn't matter at all for most people's use cases.
 
For example, there is specialised software that won't run on anything but nVidia or FireGL graphics; Metacast, ORAD and Weatherscape won't run on integrated graphics due to the software being built entirely around the nVidia libraries.

"Built 100% around the nVidia libraries" - so you're talking about CUDA? Well, if they put an AMD chip in there, like they have in the past, then those apps wouldn't have the benefit of CUDA anyway, so I don't think that argument is as strong as you claim.

And for Photoshop / AE / Final Cut Pro... the Iris Pro > Kepler GK107.

The GeForce GPUs are built/optimized for gaming environments, not 'pro' user environments - that's why there are Quadro and Tesla GPUs for real professionals.

One of those benchmarks has an HD 4000 outperforming the 650M.

Yep, as people said, those are desktop chips - look at the wattage. Though I still wouldn't be surprised to see the HD 4000 series beat the 650M in some of the OpenCL tests too.

If Iris Pro were only for pro users, and Iris Pro is better for apps and OpenCL, why did Apple put Iris Pro only in the base model 21.5" iMac? The iMac still carries the Apple logo and still serves pro users; it's not portable like a laptop - it's for office/home use.

Intel's iGPU is not solely for pro users; it's Intel's attempt at taking over the low-end (and, over the next few years, mid-range) GPU market.

It was probably cheaper for them to put in that i5 than a dGPU, and if the user needs gaming performance they can upgrade. Lots of iMacs are not used for gaming, just for family home use or reception use...

When they decided to put an iGPU instead of a dGPU in there, they would definitely have picked Iris Pro, simply because the wording 'Iris Pro' looks fancier than 'HD 5000'. It's sad, but that probably got them a lot more sales.
 

That was the point I was trying to make before: there are only a few use cases where a graphics card would be of absolute benefit, but there are users who both want and need this performance.

Though with that said, if I'm purchasing a new laptop and paying USD $1,500+, then I would expect it to come with a discrete graphics card.
 

I am not disagreeing with you.

Correct about CUDA, though there are specialised apps, as I previously mentioned, like Weatherscape, that clearly won't run on anything but nVidia.

My point is that even though it's an edge case, there are professionals out there who need discrete graphics rather than integrated, and this will affect what they end up choosing if Apple decides to drop discrete graphics from its professional laptop range.

This is part of the reason I am considering a decent refurb or another brand such as HP or Boxxtech if Apple drops discrete cards, because my bread-and-butter application just won't run.
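Since a couple of posts here hinge on why a CUDA-only app cannot run on AMD or Intel silicon, here is a minimal sketch of the backend-dispatch problem. All the names are hypothetical - a real application would probe the driver, not match vendor strings:

```python
# Minimal sketch of GPU backend dispatch, illustrating why an app built
# exclusively on CUDA has no path to run on AMD or Intel hardware.
# All names here are hypothetical; a real app would query the driver.

class BackendUnavailable(RuntimeError):
    """Raised when none of the app's compute backends exist on this GPU."""

# CUDA only exists on Nvidia; OpenCL is implemented by all three vendors.
VENDOR_BACKENDS = {
    "nvidia": ("cuda", "opencl"),
    "amd": ("opencl",),
    "intel": ("opencl",),
}

def pick_backend(vendor, supported):
    """Return the first backend in `supported` that the vendor provides."""
    for backend in supported:
        if backend in VENDOR_BACKENDS.get(vendor, ()):
            return backend
    raise BackendUnavailable(f"no usable backend on {vendor!r} for {supported}")

# A CUDA-only app (like the Weatherscape example above) is fine on Nvidia...
print(pick_backend("nvidia", ("cuda",)))          # cuda
# ...but fails outright on an Iris Pro, because it never shipped an OpenCL path:
try:
    pick_backend("intel", ("cuda",))
except BackendUnavailable as err:
    print(err)
# An app that ships both paths keeps working everywhere:
print(pick_backend("intel", ("cuda", "opencl")))  # opencl
```

The design point: building solely on a vendor-specific library locks the hardware choice, which is exactly the bind those specialised broadcast apps are in.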
 
Yes, I guess if you think about the cost of the machine you would expect a dGPU.

But I think you also need to consider where the expense goes. Many other laptops in this price range can afford a high-end dGPU because they cut costs in other ways, such as plastic construction, a cheap screen (low-res or TN panel), slower SSDs...

Apple's build quality is very high: an IPS panel with a crazy resolution, a nice unibody, faster SSDs, and RAM soldered on, which according to AnandTech gets the closest to true 1600 MHz performance they have ever seen...

So Apple seems to be spending the money making the device better in other ways (+ Apple tax).
 

But this is the MacBook PRO. The MacBook Air is supposed to be the standard consumer version. Pro = power. The review sites aren't going to fall for it either. Anyone spending that kind of money is going to read a review, and the reviews will have a chart comparing GPU performance between the 2012 and 2013 models showing that the 2012 model is 40 percent faster. Not good.
 

I completely understand where you're coming from, with quality equating to the value of a system.

There is more to the overall performance of a machine than just the GPU, though there is an expectation that a professional machine would be both a quality build and inclusive of a dGPU, quality parts, etc.

My 2008 MBP is still going strong after an HDD upgrade and a memory upgrade... The only reason I had to replace it (we'll give it to my wife) was that it was getting too slow for the type of work I do.

Where I'm coming from: without a dGPU, it would be increasingly hard for me to choose that laptop over one in the same price range with a dGPU. I believe there are other users in the same position.
 
So, let me sum up...

We are expecting a 'binned' Iris Pro to have higher compute/OpenCL performance, but lower gaming performance, than the expected Nvidia 750M?

Does the 750M come out ahead when an app offers a CUDA version instead of OpenCL?

Do we really expect Apple to position a 4600 + 750M at the top of the line while the Iris Pro version (presumably cheaper) has faster Photoshop performance?
 

These are all excellent questions. I want to see Apple's 750M against the Iris Pro.
 

That is a seriously good point. I would hope for the entry-level rMBP to have Iris Pro and a high-end BTO option with a 7XXM + Intel.

It actually makes sense.
 
We are expecting an Iris Pro; there is no binning beyond the usual.

That nonsense about exclusive Apple CPUs needs to end.
 

This really is a great summary - the question of whether, in some areas, a lower-end rMBP would actually outperform a high-end rMBP.
 

This is what I've been saying in a couple of other threads, and it's why I'm so confused about the issue. I've been refraining from further discussion, just waiting to see what they do.

To answer that question: technically, if you look at the iMacs, I think you will see that this is the route they have chosen. I think the lower-end 21.5" iMac outperforms the higher-end one in OpenCL.
 
If you ignore the kids who want to play games in Windows, there isn't really a 40 percent difference from the old model. It is more like 30% ahead of a 650M.

Apple doesn't necessarily have to care about DirectX gaming performance in Windows when promoting their Mac products.
 
Image: http://www.macitynet.it/wp-content/uploads/2013/09/MacgTEST1.jpg

Agreed! Not to mention Lightwave, Maya, SolidWorks and pretty much most of the 3D content/engineering applications that the Iris 5200 is much faster at.

http://www.notebookcheck.net/Intel-Iris-Pro-Graphics-5200.90965.0.html - just scroll down to the SPECviewperf 11 section for the comparisons. In SolidWorks, for example, Iris Pro is twice as fast as the GT 650M!
 
Do we really expect Apple to position a 4600 + 750M at the top of the line, but the Iris Pro version (presumably cheaper) has faster Photoshop performance?

Yes.

They have already decided this. The new iMacs are out:

21.5"
Low = Iris Pro
High = 750M

27"
Low = 755M
High = 775M


Apple have so little respect for Iris Pro performance that they don't even offer it on the 27" at all. I think they are right.

----------

If you ignore the kids that want to play games in Windows there isn't really a 40 percent difference to the old model. It is more like 30% ahead of a 650M.

The "kids" who play games have an average age of 30 - i.e. sales stats show the average gamer is a 30-year-old professional, not a child or teenager.

I find it a bit strange how games are treated with such disdain in many of the comments in the Haswell iGPU vs Iris Pro threads. The games market is huge - larger than the film and music industries combined - and a huge number of adults play games as entertainment. Wanting to play games on your very expensive laptop is a perfectly reasonable thing to desire.
 
I think some of those SPECviewperf results on nVidia hardware only hold for Windows. In OS X they should look a bit better, because OS X doesn't have the artificially crippled drivers - just worse drivers in general, as OpenGL shows.
It is really a problem of miserable driver performance more than anything else. The K2000 Quadro shows what the Kepler chips can pull off when they want to.
Intel beats even a 780M in those benchmarks under Windows, but doesn't stand a chance against Quadro or FirePro.

If they released both an Iris Pro version and a dGPU version with the HD 4600, it might turn out that real professionals should pick the Iris Pro while gamers keep going for nVidia.

The "kids" wasn't really meant like that; it was more about when we let the inner kid out. I know there are lots of older gamers out there, and I am guessing the vast majority of buyers don't care about Maya, Photoshop, C4D, AutoCAD or any of that expensive workstation software. Most people use it for stuff an Air could handle, with some gaming on top. This thread, however, has been carried by self-proclaimed professionals (of the Maya, AutoCAD, C4D kind) complaining that MBPs would no longer be an option for their work.
If we are honest, gamers would be better off with non-Mac hardware anyway - they wouldn't buy a Mac for the best gaming performance, given the hassle of most games only running with decent performance in Windows. It isn't the biggest loss if gaming performance takes a bit of a back seat.
The games that rely on OpenGL in OS X will probably not fare too badly in the 650M vs Iris Pro comparison.
 
If they made a 13" with a dGPU, I'd sell my 15" tomorrow and buy one.

When the 13" came out, I was one of those guys sighing deeply when I heard it was going to be iGPU-only. I lug around the 15" rMBP (haha, just kidding - at 4.6 lb it's not exactly a heavyweight). But seriously, I'd buy a 13" with a dGPU tomorrow, with every option save the largest SSD.
 
The rumor on MacGeneration was a 755M, I think. I'd like to see a 765M myself, with 2 GB of memory available at least as an option, to match up with the Razer Blade / Blade Pro.
 

Luckily, I don't think we are going to have that problem. From the new Haswell iMac specs, I think Apple have decided against the iGPU for now. That could change with Broadwell, but by the time we have Broadwell, nVidia will have Maxwell, and that will be a much higher target to reach.
 

Very well said, sir!!
Not to mention that the PS4 and Xbox One are around the corner!
 