
Will the Haswell rMBP be announced in September with a dGPU option?


Poll closed (407 total voters).
Iris Pro is good. Good enough to drive a Retina screen pretty well, which is why I think Apple will consider it fine for the 13" and entry 15" models. But a decent modern GPU, like a 760M, or hopefully something more like the upcoming 9770M, should be able to significantly outperform it. Therefore, still worth having.

Also, I think if Apple were going to lose the dGPU, they would have waited a while longer for the significant case redesign that came with the Retina models, and just stuck the Retina screen on the 'classic' body and components. It seems pointless to design a new case that can take up to 90W of TDP, just to drop the maximum TDP down to 50W only 16 months later.

Interesting. Hadn't looked at the 9770 either.
 

That's true. It would only make sense in order to prioritize battery life...
 
What if Apple is putting Iris Pro 5200 in the 15" for a different reason than just replacing the dGPU? With Iris Pro, the need to switch to the dGPU for anything apart from graphics-intensive programs would be non-existent. Unlike the HD 4000, the Iris Pro will be able to handle the screen and most low-end games and software. Then, when the need arises with a really 3D-intensive program or game, the dGPU kicks in and runs it smoothly. Fewer switches mean better battery savings.

I know people are saying the two are too expensive to work alongside each other, but Apple has stupidly strong buying power at this point. Who's to say they can't dictate their own prices enough to make it viable?
 
No, they are also too close in performance to make it worth it.
The HD 4000 is quite a lot more capable than people often care to admit. It works fine for most programs and even for games that don't ask for much. The HD 4600 is even better. An Air, with a GPU that is slower than an HD 4000, is perfectly fine running most programs and can even handle big external screens.
Apple switches to the dGPU too often not because it needs to, but because the switching is implemented rather stupidly: it doesn't switch when performance is needed, but when the driver thinks it might potentially be needed, and it is wrong very often.

Anywhere you wouldn't want to switch to a dGPU anyway, you wouldn't need an HD 5200 either and would be fine with HD 4600 performance. The benefit of an HD 5200 is that it can power up and down based on actual load, so it always runs at optimal power efficiency.
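To make that complaint about switching concrete, here is a minimal sketch of the two policies. The thresholds, field names, and functions are made up for illustration; this is not Apple's actual driver logic:

```python
# Illustrative sketch of the two switching policies; the thresholds,
# fields, and function names are hypothetical, not Apple's driver code.
from dataclasses import dataclass

@dataclass
class Workload:
    uses_3d_api: bool     # does the app link against OpenGL at all?
    gpu_load_pct: float   # measured iGPU utilisation, 0-100

def heuristic_switch(w: Workload) -> bool:
    """Switch whenever the dGPU *might* be needed: any app that touches
    a 3D API triggers it, even at trivial load (what the post describes)."""
    return w.uses_3d_api

def load_based_switch(w: Workload) -> bool:
    """Switch only when the iGPU is actually saturated."""
    return w.gpu_load_pct > 90.0

browser = Workload(uses_3d_api=True, gpu_load_pct=12.0)
print(heuristic_switch(browser))   # True  -> dGPU lit up for a web page
print(load_based_switch(browser))  # False -> stays on the iGPU
```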

Apple may have the buying power, but they have always been stupidly cheap with it as well. Look at the amount of VRAM they have always used: GDDR5 is fairly expensive, so they only ever used very little. For many things it was enough, but some (pro apps, gamers) would have preferred more, and it wouldn't have been that expensive.
 
The HD 4000 is quite a lot more capable than people often care to admit.

You can't argue that the 4000 can run games decently. Try running Shogun 2 on it and tell me how it handles.
 
Shogun 2 is too demanding even for some mobile dGPUs, though it is more of a CPU problem if you want some fun. At minimum settings with no units on the field anything can handle it, but at high settings and some resolution a dGPU doesn't fare well either.
He was talking about undemanding games like the Hearthstone card game, FarmVille, or some CS.
Low-end games and software were the topic.

My argument is that for everything for which you'd need a 5200 over an HD 4600, there is little sense in not going straight to enabling a 760M. Would you run Shogun 2 off the 5200 if your notebook had a 2GB dGPU as well? Especially with a game so demanding on CPU resources, with big battles and huge unit sizes (which are so much more fun), a dGPU is what you would enable.
 
Does anyone think that Apple putting a dGPU in their iMac and leaving Iris Pro as the base config bodes well for having a dGPU in the rMBP? Not the 13" of course, but the 15"?
 
Yes.

Previously I had thought that having both Iris Pro and dGPU models wouldn't make sense because Iris Pro's list price is so high that it wouldn't save money over the cost of putting in a dGPU.

However, Apple just put Iris Pro in the base model $1299 iMac, with the upgraded model featuring a 750M. I'm guessing Intel is giving them very good pricing on the Iris Pro parts to make that viable.

Now I could definitely see Apple doing a base model 15" with Iris Pro and a higher end model with HD 4600 graphics and a dGPU.

I think the fact that the new iMacs offer an optional GTX 780M with 4GB of VRAM shows that Apple is still serious about at least giving people the option of serious graphics performance.
 

Agreed. Apple likes to save money, but not at the expense of having great products. Their margins on their computers are very high; that is why they can afford to put the HD 5000 in the MBA. That is not a cheap part.
 
Previously I had thought that having both Iris Pro and dGPU models wouldn't make sense because Iris Pro's list price is so high that it wouldn't save money over the cost of putting in a dGPU.
Iris Pro isn't actually that expensive. News sites at first mainly printed the 4950HQ price, but compared to the normal quads it is only $60-80 on top. The lower quads aren't that expensive, and on top of a CPU that costs about $400 to begin with, it isn't all that much.

With the iMacs out, this thread might come to a conclusion quickly. I suppose the MBP line will follow soon.
It is weird, though, how extremely long it took Haswell to roll out. It's not just Apple: Samsung and Asus also haven't updated their high-end products at all. It was obvious that it wouldn't be a fast rollout, but this one was exceptionally slow.
 
Now I could definitely see Apple doing a base model 15" with Iris Pro and a higher end model with HD 4600 graphics and a dGPU.

One issue here is that the Iris Pro eDRAM is a full-featured L4 cache, which works for the CPU as well as for the GPU. Using a CPU without Crystalwell in a 'better' model will actually give you less performance. So it has to be Iris Pro + dGPU, which would be ideal IMO, as it offers the best of both worlds: the strong gaming/3D performance of a dGPU and the strong compute performance of the Iris Pro.
 

But Apple chose to make that same tradeoff in the iMacs. Only the very base model iMac has the L4 cache; the high-end 21.5" model and the 27" models do not.
 
I think the fact that the new iMacs offer an optional GTX 780M with 4GB of VRAM shows that Apple is still serious about at least giving people the option of serious graphics performance.

Agreed. I think it's a very good sign!

A 2GB 765M option for the MacBook Pro, please!
 
One issue here is that the Iris Pro eDRAM is a full-featured L4 cache, which works for the CPU as well as for the GPU.
It is a rather distant, slow cache compared to an SRAM cache like the Xbox 360's. It is a lot faster than main RAM in both latency and bandwidth, but it is not L3 performance. While Intel says it can give as much as a 25% performance boost in some workloads, in many and probably most situations it does fairly little to nothing. I think it roughly averages out: the eDRAM eats into the TDP budget, and the gain isn't so large that it clearly beats the slightly higher-turboing CPUs without this extra cache. Except for the 4960HQ, which is probably a win all round.
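A crude average-memory-access-time sketch shows why. With purely illustrative latencies and hit rates (not measured Crystalwell figures), the eDRAM barely moves the needle for cache-friendly code and only really pays off when the L3 miss rate is high:

```python
# Back-of-the-envelope average memory access time (AMAT), with purely
# illustrative latencies in CPU cycles -- not measured Crystalwell figures.
L3_HIT = 40          # cost of an L3 hit
DRAM_PENALTY = 200   # extra cost of going all the way to main RAM
L4_PENALTY = 80      # extra cost of the eDRAM: closer than RAM, slower than L3

def amat_without_l4(l3_miss_rate):
    return L3_HIT + l3_miss_rate * DRAM_PENALTY

def amat_with_l4(l3_miss_rate, l4_hit_rate):
    # an L3 miss either hits the eDRAM, or pays the eDRAM lookup plus RAM
    penalty = l4_hit_rate * L4_PENALTY + (1 - l4_hit_rate) * (L4_PENALTY + DRAM_PENALTY)
    return L3_HIT + l3_miss_rate * penalty

for miss_rate in (0.02, 0.30):   # cache-friendly vs. streaming workload
    print(miss_rate,
          amat_without_l4(miss_rate),
          amat_with_l4(miss_rate, l4_hit_rate=0.6))
# ~44.0 vs ~43.2 cycles at a 2% miss rate (next to nothing),
# ~100 vs ~88 cycles at 30% (a real but not huge win)
```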
 
Shogun 2 is too demanding even for some mobile dGPUs, though it is more of a CPU problem if you want some fun.
I know it's demanding, and it's one of the few series I play.

The thing about the TW series is that for 1080p maxed with a minimum of 30 fps (not average, minimum), you need a very good overclock on your CPU and a very powerful GPU; usually a 680 did the trick. We shouldn't talk about 1440p, since it's basically forbidden territory unless you are talking about a flashed-vBIOS Titan with a great overclock, and I bet it still won't hold a minimum of 30 fps, especially in 40 vs. 40 battles at huge unit sizes. We shouldn't go for Rome 2 either...

Anyway, since it behaves like a 640M, it should do the trick for medium details and not-so-large battles.
 
I know it's demanding, and it's one of the few series I play.
It is one of my favourites too. I started with the first Medieval, then played Medieval 2 and Empire. But it only looks really cool with huge unit sizes, and that brings notebooks to their knees so quickly. The standard unit sizes are so small that it loses the epic feel.
The ship battles in Empire also looked and sounded great, but they were very heavy on the GPU and I never quite figured them out (strategically). I always lost unless I had the clearly better fleet.
Great series. As soon as I get a new computer I will play one of the newer ones again. Rome 2 seems quite ridiculously demanding, though, with a steep drop-off in quality below the higher settings. I think you need a desktop for an epic-scale battle that isn't a slideshow; on notebooks, probably wait for the 20nm generation. The quality difference is also huge: in StarCraft 2, everything above medium doesn't make a big difference (there is a huge difference between low and medium), but in TW the grass disappears, and arrows, dust clouds, and model detail all change quite a bit. The change in the environment at the lower settings is quite a big blow, IMO; when you cannot even see the grass your units are hiding in, it gets weird.
Maybe just Empire at higher settings, or Shogun 2, is something more reasonable for my next hardware update.

Anyway, an HD 4600 seems to manage medium settings at low resolution in Rome 2 in one benchmark:
http://www.notebookcheck.com/Benchmarkcheck-Total-War-Rome-II.100921.0.html
But I don't think it is worth playing at that setting. Rather play an older title where you can increase the unit size and see the battlefield the way it is supposed to look.
 
I have a 13" rMBP that I 100% intend to replace with a Haswell 15" model the second it's announced, *IF* the new model has a decent dedicated GPU. Unfortunately, I've slowly accepted that Apple may choose to leave it out.

Today's iMac reveal gives me a little hope back, though. Iris in the base, dGPU on the top end. I don't see why they can't do that with the laptops.
 
The iMac model with the 750M uses an HD 4600, in case anyone missed that; there's no version with both Iris Pro and a dGPU.

I still seriously doubt it. Here's the problem from my perspective:
you create two MBP options, one with Iris Pro, and then a "higher end" model with an HD 4600 and a 750M.

So now you have a "higher end" machine that performs worse in day-to-day use (frame rates in web rendering being the big problem) compared to your "base" model. That's not going to look very good. Nor is the fact that the Iris Pro model does better in OpenCL. The only way this could work is if you considered them side-grades (same price point), which is something I don't think Apple has ever done.

On the desktop this is a non-issue. The resolution is much lower on the smaller iMac, and there are no power concerns related to simply keeping the dGPU on constantly.

Most importantly, the GPU switching has always been terrible. It's fine for a desktop machine, where having the dGPU come on for completely nonsensical reasons isn't a concern, but on a laptop it's a battery-destroying switch with no user feedback (unless you install a third-party app to monitor GPU status).
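For what it's worth, you can get a rough answer without a third-party app by parsing `system_profiler` output. A naive sketch; the output format varies across OS X versions, so treat this as an assumption-laden example:

```python
# Naive check of which GPU currently drives the panel on a dual-GPU
# MacBook Pro: system_profiler lists every GPU, and the active one has
# the display (with its Resolution line) nested under it. The parsing
# is deliberately crude and may need adjusting per OS X version.
import subprocess

out = subprocess.check_output(["system_profiler", "SPDisplaysDataType"],
                              universal_newlines=True)

current_gpu = None
for line in out.splitlines():
    if "Chipset Model:" in line:
        current_gpu = line.split(":", 1)[1].strip()
    elif "Resolution:" in line and current_gpu:
        print("Active GPU appears to be:", current_gpu)
        break
```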

The point is, I'm still expecting to see iGPU only.
 
A 2GB 765M option for the MacBook Pro, please!

Feasible, but going from 45 watts to 75 watts might create some issues.
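Rough numbers on what that does to battery life under sustained load. The 95 Wh figure is the current 15" rMBP battery capacity; the two draw figures are the ones from the post above, so treat this as a sketch:

```python
# Back-of-the-envelope runtime under sustained load for two power draws.
BATTERY_WH = 95.0   # 15" rMBP battery capacity

for label, draw_watts in [("45 W, iGPU-only package", 45.0),
                          ("75 W, CPU plus dGPU", 75.0)]:
    print("%-25s ~%.1f h" % (label, BATTERY_WH / draw_watts))
# roughly 2.1 h vs 1.3 h -- why aggressive dGPU switching hurts so much
```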

I threw together this chart showing the iMac/MBP configurations over the last couple years. If history is an indicator, this year's rMBP would get a 750M with 1 or 2GB VRAM.

I'd hope for a 755M with 2GB. :eek:

Although a 76xM, unseen in the iMacs, would be nice too. :D
 

Attachment: 2013_MBP_dGPU.bmp
From SemiAccurate: "Did Intel deliver for Apple?"

I can't read any more than the first paragraph, but my guess is that Apple changed plans and moved the upcoming MacBook Pros away from all-integrated graphics because Intel didn't "deliver" to Apple's satisfaction, so there will be at least one model with discrete graphics.
 

That's quite a guess. :) Does MacRumors have a deal on stuff like this? I want to read that article! :D
 
The point is, I'm still expecting to see iGPU only.

Not a chance.

I threw together this chart showing the iMac/MBP configurations over the last couple years.

That's exactly what I think :). A 755M (probably 1GB though, because :apple:) makes complete sense, but I have a feeling they'll just put a 750M in there, probably clocked higher (which is what a 755M is anyway :confused::confused:). I'd be happy with a 750M/755M, and a 760M would be icing on the cake. 2GB would be a dream come true, but we can't have that now. :D


EDIT: Looking at the iMac configurations on apple.com and then at the 750M and 755M on notebookcheck.com, it really seems logical that they will put the exact same 750M in the rMBPs, with 1GB of RAM. The only difference is that the 750M's memory runs at 2000-5000 MHz, while the 755M's only runs at 5400 MHz. For a laptop it seems likely they'd want the variable clocking, which means the 750M. That's my assumption anyway.

The only other difference is that the 750M can run with DDR3 or GDDR5, while the 755M only runs on GDDR5, but that's irrelevant.
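Those memory clocks are most of the practical difference. A quick peak-bandwidth calculation, assuming the 128-bit memory bus both parts use:

```python
# Peak memory bandwidth = (bus width in bytes) * effective memory clock.
BUS_BYTES = 128 // 8   # both the 750M and 755M use a 128-bit bus

def bandwidth_gb_s(effective_mhz):
    return BUS_BYTES * effective_mhz * 1e6 / 1e9

print(bandwidth_gb_s(2000))  # 32.0 GB/s  -- 750M with DDR3
print(bandwidth_gb_s(5000))  # 80.0 GB/s  -- 750M with GDDR5
print(bandwidth_gb_s(5400))  # 86.4 GB/s  -- 755M
```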
 