What rMBP for Photoshop/StarCraft 2/Diablo 3: Iris, Iris Pro, or Nvidia?

Maccotto

macrumors 6502
Original poster
Oct 6, 2012
260
19
Good morning, I want to buy a MacBook on a financing promotion (12 payments at 0% interest).


My worry is that the financing company will refuse the top 15" rMBP, so I want to ask whether the other rMBP models can do the same things.
I use the notebook only for everyday tasks, Photoshop, and, very rarely, playing small Steam games (Dota 2) and Blizzard games (StarCraft 2 / Diablo 3).


I will only move my MacBook to my small office, when I visit my parents, or (very, very rarely) to my bed in place of a tablet/smartphone.

Note: whichever rMBP I buy will have 16 GB of RAM, because 16 GB should make it a long-lived machine.
 

Blue Sun

macrumors 6502a
Feb 11, 2009
908
190
Australia
If you want to do a fair amount of gaming, the model with the GT 750m is the way to go. However, for D3 and SC2 - the Iris Pro will do the job.

For PS, the discrete chip will be better.
 

thekev

macrumors 604
Aug 5, 2010
6,669
1,745
For PS, the discrete chip will be better.
You have it backwards. For the games you should see a difference with discrete graphics. PS will run off practically anything. Why does everyone think it makes such heavy use of the gpu? It's like this entire board is incapable of reading Adobe's specifications. I read this over and over, and it's 100% false. You do not need to worry about gpus for photoshop. If you want to use the raytracer in after effects, that is a different story.
 

Zodiac.mj

macrumors member
Mar 10, 2013
85
10
For Blizzard games, dedicated is a good choice. They run a lot better on nVidia hardware.
 

phsphoenix

macrumors member
Oct 14, 2013
68
0
For Blizzard games, dedicated is a good choice. They run a lot better on nVidia hardware.
Seriously, sometimes I don't understand what the 750m does when I'm gaming. I run SC2 at 1680x1050, Medium Settings, using both the Iris Pro and the 750m. It's like the dedicated gpu isn't there. The only thing I've noticed is that a LOT of apps want the dgpu, and the stuff I do doesn't even remotely stress any gpu shaders...
 

0983275

Suspended
Mar 15, 2013
472
56
If you're going to be playing just these two games, the Iris Pro will do fine; for anything more than that, I'd recommend you go with the 750M.

They even ran fine on the HD 5000 when I had my MacBook Air.
 

c1phr

macrumors 6502
Jan 8, 2011
352
4
You have it backwards. For the games you should see a difference with discrete graphics. PS will run off practically anything. Why does everyone think it makes such heavy use of the gpu? It's like this entire board is incapable of reading Adobe's specifications. I read this over and over, and it's 100% false. You do not need to worry about gpus for photoshop. If you want to use the raytracer in after effects, that is a different story.
Because Adobe optimizes their applications to take advantage of the additional compute power provided by the GPU. Photoshop CS6 uses Adobe's Mercury Graphics Engine to speed up certain operations.

http://forums.adobe.com/message/4289204

If you do any serious amount of PS work on a machine with a less capable card and then move to one with a faster GPU, the difference is night and day. Basic manipulation work won't show any difference.
 

sabbyp

macrumors regular
Oct 26, 2013
139
1
Seriously, sometimes I don't understand what the 750m does when I'm gaming. I run SC2 at 1680x1050, Medium Settings, using both the Iris Pro and the 750m. It's like the dedicated gpu isn't there. The only thing I've noticed is that a LOT of apps want the dgpu, and the stuff I do doesn't even remotely stress any gpu shaders...
It's not using both the Iris AND the 750M; it's just using the 750M.

OS X switches to the discrete GPU by default for GPU-heavy applications, and it's impossible to use both at once.
 

phsphoenix

macrumors member
Oct 14, 2013
68
0
It's not using both the Iris AND the 750M; it's just using the 750M.

OS X switches to the discrete GPU by default for GPU-heavy applications, and it's impossible to use both at once.
Yes, I understand that. That's why I don't understand what the 750m is doing. If I can run the game at the same settings on BOTH GPUs, either the Iris Pro is incredibly powerful or the 750m is horribly underpowered.
 

thekev

macrumors 604
Aug 5, 2010
6,669
1,745
Because Adobe optimizes their applications to take advantage of the additional compute power provided by the GPU. Photoshop CS6 uses Adobe's Mercury Graphics Engine to speed up certain operations.

http://forums.adobe.com/message/4289204

If you do any serious amount of PS work on a machine with a less capable card and then move to one with a faster GPU, the difference is night and day. Basic manipulation work won't show any difference.
There are a couple of things you're still missing here. I've already seen that page, and I've dealt with plenty of files that had to be saved out as .psb (large document format).

I wouldn't call jamming in filters serious work. Liquify is fast no matter what at this point; it might shave off a second. On Iris it's actually quite fast with OpenCL. Note that PS uses OpenCL, not CUDA; some of the other apps differ. The benchmarks on Liquify are silly, because they create a badly distorted mesh. Warp doesn't take long to render, and it also works fine on Iris; note that it's on the supported list. The rest of these aren't things that anyone uses very often. You could do a much nicer job with other tools than what Lighting Effects offers if you had to put together a comp or layout. Lastly, take a look at the OpenCL difference between Iris and the 750M: even if Adobe were concerned with it, there is practically no advantage. The 3D tools are terrible, so I never reference them. I've used Maya since it was Maya Complete on PowerPC, but even Blender has better 3D tools than PS, which means you don't have to spend anything to get better tools than those available in PS. That's why I never address them.

Adaptive Wide Angle filter (compatible video card required)
Liquify (accelerated by compatible video card with 512 MB VRAM; GPU mode unavailable on Windows XP)
Oil Paint (compatible video card required)
Warp and Puppet Warp (accelerated by compatible video card; GPU mode unavailable on Windows XP)
Field Blur, Iris Blur, and Tilt-Shift (accelerated by compatible video card supporting OpenCL; GPU mode unavailable on Windows XP)
Lighting Effects Gallery (compatible video card required with 512 MB VRAM; unavailable on Windows XP)
New 3D enhancements (3D features in Photoshop require a compatible video card with 512 MB VRAM; unavailable on Windows XP):

Draggable shadows
Ground plane reflections
Roughness
On-canvas UI controls
Ground plane
Light widgets on edge of canvas
IBL (image-based light) controller
 

dusk007

macrumors 68040
Dec 5, 2009
3,383
61
Iris Pro is about 70% faster than Iris, and the 750M is about 30% faster than Iris Pro (in games at least, which is where the biggest difference shows; in other workloads they tend to be closer).

As far as SC2 goes, once you put 4v4 multiplayer and the like into the picture, the bigger problem is really the CPU. Even Iris is fast enough to power SC2 decently. Where the CPU doesn't limit, the 750M is faster in SC2, but 30% is not a night-and-day difference; it's usually just enough to be noticeable if you look for it, and fps counters help.

Back when we were still discussing possibilities, almost everybody ruled out that Apple would pair an Iris Pro with a 750M, because they are too close. The difference between an iGPU and a dGPU used to be 2-3x in performance, and this time it is barely perceptible.

I don't know the exact clock speeds in OS X, but in Windows I have been running the 750M at 1060 MHz (the maximum +135 MHz overclock possible) without any heat problems. Playing a first-person shooter, the CPU isn't heavily loaded and the GPU stays at pretty much the same temperature (75-78°C with Vsync enabled) regardless of clocks. Yet Apple disabled Nvidia's turbo boost and lowered the default clock a little, to 925 MHz. So in Windows it is effectively a 750M about 20% slower than it should be at defaults, which makes the difference to the Iris Pro even smaller. You also need all the overclocking headroom just to get close to what a stock 750M should look like, and nothing past that, although there would be room in CPU-light workloads.

I think a standard quad with the HD 4600 + a dGPU would have made more sense, or they should have been more serious on the GPU side. It is a bit weird the way it is.
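Those percentages compound, which is easy to misjudge. A quick sketch using the ballpark figures quoted in this thread (illustrative forum numbers, not benchmark data):

```python
# Relative-performance sketch using the ballpark numbers quoted in
# this thread (illustrative forum figures, not benchmark data).
iris = 1.0
iris_pro = iris * 1.70    # "Iris Pro is about 70% faster than Iris"
gt750m = iris_pro * 1.30  # "750M is about 30% faster than Iris Pro"
print(f"750M vs plain Iris: {gt750m / iris:.2f}x")  # -> 2.21x

# Apple's 750M defaults to 925 MHz; the +135 MHz overclock mentioned
# above gives 1060 MHz, i.e. the overclock buys back roughly 15%.
print(f"Overclock headroom: {1060 / 925 - 1:.0%}")  # -> 15%
```

So by these rough numbers the 750M is roughly 2.2x a plain Iris, but only 1.3x the Iris Pro it's paired with, which is why the pairing looked unlikely before launch.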
 

GSPice

macrumors 68000
Nov 24, 2008
1,623
76
After all this people still can't differentiate the high-end and low-end rMBPs?

:(
 

phsphoenix

macrumors member
Oct 14, 2013
68
0
Iris Pro is about 70% faster than Iris, and the 750M is about 30% faster than Iris Pro (in games at least, which is where the biggest difference shows; in other workloads they tend to be closer).

As far as SC2 goes, once you put 4v4 multiplayer and the like into the picture, the bigger problem is really the CPU. Even Iris is fast enough to power SC2 decently. Where the CPU doesn't limit, the 750M is faster in SC2, but 30% is not a night-and-day difference; it's usually just enough to be noticeable if you look for it, and fps counters help.

Back when we were still discussing possibilities, almost everybody ruled out that Apple would pair an Iris Pro with a 750M, because they are too close. The difference between an iGPU and a dGPU used to be 2-3x in performance, and this time it is barely perceptible.

I don't know the exact clock speeds in OS X, but in Windows I have been running the 750M at 1060 MHz (the maximum +135 MHz overclock possible) without any heat problems. Playing a first-person shooter, the CPU isn't heavily loaded and the GPU stays at pretty much the same temperature (75-78°C with Vsync enabled) regardless of clocks. Yet Apple disabled Nvidia's turbo boost and lowered the default clock a little, to 925 MHz. So in Windows it is effectively a 750M about 20% slower than it should be at defaults, which makes the difference to the Iris Pro even smaller. You also need all the overclocking headroom just to get close to what a stock 750M should look like, and nothing past that, although there would be room in CPU-light workloads.

I think a standard quad with the HD 4600 + a dGPU would have made more sense, or they should have been more serious on the GPU side. It is a bit weird the way it is.
I think Apple went for better battery life instead of more powerful graphics. Also, I'm not so sure a really powerful GPU would be comfortable in the 15" rMBP form factor; this thing gets hot enough as it is.

----------

After all this people still can't differentiate the high-end and low-end rMBPs?

:(
I don't really like the split between the two stock 15" rMBP options this year. They should have offered the dGPU as a BTO option instead of making it part of the higher-end stock configuration. Personally, I find I spend more time telling my machine NOT to use the dGPU than actually making use of it.
 

GSPice

macrumors 68000
Nov 24, 2008
1,623
76
I don't really like the split between the two stock 15" rMBP options this year. They should have offered the dGPU as a BTO option instead of making it part of the higher-end stock configuration. Personally, I find I spend more time telling my machine NOT to use the dGPU than actually making use of it.
I will admit the "free" dGPU this year is an odd option to market.
 

theSeb

macrumors 604
Aug 10, 2010
6,963
83
Poole, England
Yes, I understand that. That's why I don't understand what the 750m is doing. If I can run the game at the same settings on BOTH GPUs, either the Iris Pro is incredibly powerful or the 750m is horribly underpowered.
How are you running games using the Iris Pro?
 

dusk007

macrumors 68040
Dec 5, 2009
3,383
61
I think Apple went for better battery life instead of more powerful graphics. Also, I'm not so sure a really powerful gpu would feel too comfortable with the rmbp 15" form factor. This thing gets hot enough as it is.
I don't think the Iris Pro chip really helps with battery life at all. At best it doesn't hurt it; at worst it hurts it.

Under high load, Nvidia is more efficient, and at very low/medium load I would assume the normal quads with the HD 4600, which don't have to power the extra eDRAM, will be a tad less power hungry too. Intel claims lots of power gating and the ability to shut off half the shader cluster, but I still doubt the entire Iris Pro package can beat the standard quad. According to Intel, the eDRAM needs 1.5 W at idle and up to 4 W under load; of course that is somewhat offset by the load it takes off the memory controllers and the power states that enables, but I still doubt anything is saved overall.

A standard quad core with a dGPU would have been at least as good in battery life, and potentially better, compared to the Iris Pro + dGPU combination as shipped.

I think Intel didn't find enough buyers and agreed to give Apple a sizable discount just to get those chips out. The HD 5000 in the Air is also not worth the money, and most manufacturers use HD 4400 chips, because while the HD 5000 can win some synthetic benchmarks it is barely any faster than the 4400. It's a waste of quite a lot of die space, and therefore cost; at the TDP those chips operate at, the 40 EUs are just not worth it. It starts making sense at 28 W+.

For Iris Pro there just wasn't any commitment. Intel still has a bad reputation, so Samsung, Asus & co. will still try to fit a dGPU in, if only for marketing, and Iris Pro-only notebooks aren't showing up as quickly. I think Apple gets them really cheap and Intel considers it a marketing cost for the first batch of this kind of chip. Putting them in wasn't an engineering decision by Apple, I think; marketing and supply made that happen. Engineering said: "We can do that if you want."
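To put Intel's quoted eDRAM figure in perspective, here's a back-of-the-envelope runtime estimate. The 95 Wh capacity matches the 15" rMBP's battery spec, but the 12 W light-load baseline draw is an assumption purely for illustration:

```python
# Back-of-envelope: how much runtime does an extra 1.5 W of idle
# draw (Intel's quoted eDRAM idle figure) cost under light load?
# 95 Wh battery (15" rMBP spec); 12 W baseline is an assumption.
battery_wh = 95.0
baseline_w = 12.0
edram_idle_w = 1.5

hours_without = battery_wh / baseline_w                # ~7.9 h
hours_with = battery_wh / (baseline_w + edram_idle_w)  # ~7.0 h
print(f"Costs about {hours_without - hours_with:.1f} h of runtime")
```

Under these assumptions the eDRAM's idle draw alone would cost most of an hour of light-use runtime, unless the savings it enables elsewhere cancel it out, which is exactly the open question in the post above.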
 

phsphoenix

macrumors member
Oct 14, 2013
68
0
How are you running games using the Iris Pro?
You use a third-party utility (such as gfxCardStatus) to force the machine to use the integrated GPU.

----------

I don't think the Iris Pro chip really helps with battery life at all. At best it doesn't hurt it; at worst it hurts it.

Under high load, Nvidia is more efficient, and at very low/medium load I would assume the normal quads with the HD 4600, which don't have to power the extra eDRAM, will be a tad less power hungry too. Intel claims lots of power gating and the ability to shut off half the shader cluster, but I still doubt the entire Iris Pro package can beat the standard quad. According to Intel, the eDRAM needs 1.5 W at idle and up to 4 W under load; of course that is somewhat offset by the load it takes off the memory controllers and the power states that enables, but I still doubt anything is saved overall.

A standard quad core with a dGPU would have been at least as good in battery life, and potentially better, compared to the Iris Pro + dGPU combination as shipped.

I think Intel didn't find enough buyers and agreed to give Apple a sizable discount just to get those chips out. The HD 5000 in the Air is also not worth the money, and most manufacturers use HD 4400 chips, because while the HD 5000 can win some synthetic benchmarks it is barely any faster than the 4400. It's a waste of quite a lot of die space, and therefore cost; at the TDP those chips operate at, the 40 EUs are just not worth it. It starts making sense at 28 W+.

For Iris Pro there just wasn't any commitment. Intel still has a bad reputation, so Samsung, Asus & co. will still try to fit a dGPU in, if only for marketing, and Iris Pro-only notebooks aren't showing up as quickly. I think Apple gets them really cheap and Intel considers it a marketing cost for the first batch of this kind of chip. Putting them in wasn't an engineering decision by Apple, I think; marketing and supply made that happen. Engineering said: "We can do that if you want."
The Iris Pro helps battery life in that automatic graphics switching in Mac OS X is atrocious. I don't need the 750m in order to do php programming in Netbeans or to resize one image in Photoshop, but unless I force the iGPU that's what Mac OS X will give me. So in my case it will save a lot of power to stay on the iGPU.
 

theSeb

macrumors 604
Aug 10, 2010
6,963
83
Poole, England
You use a third-party utility (such as gfxCardStatus) to force the machine to use the integrated GPU.

----------



The Iris Pro helps battery life in that automatic graphics switching in Mac OS X is atrocious. I don't need the 750m in order to do php programming in Netbeans or to resize one image in Photoshop, but unless I force the iGPU that's what Mac OS X will give me. So in my case it will save a lot of power to stay on the iGPU.
That's the answer I was looking for. I'm on my phone now, so I won't link you to benchmarks, but to answer your question: games will run at higher fps using the Nvidia GPU. You may not notice that, depending on the game, the settings, and your eyes.

Hypothetically speaking, a game might run at 40 fps on the Iris Pro and 60 fps on the Nvidia 750M. Many people won't notice that. The 750M will also make the game more playable at higher settings and/or resolutions.
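Frame times make the 40-vs-60 fps comparison concrete (plain arithmetic, nothing Mac-specific):

```python
# Convert fps to the per-frame render budget in milliseconds.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (40, 60):
    print(f"{fps} fps -> {frame_time_ms(fps):.1f} ms per frame")
# 40 fps means 25.0 ms per frame and 60 fps means ~16.7 ms, so in
# this hypothetical the 750M has ~8 ms of extra headroom every frame.
```

That headroom is what gets spent on higher settings or resolution rather than raw fps.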
 

phsphoenix

macrumors member
Oct 14, 2013
68
0
That's the answer I was looking for. I'm on my phone now, so I won't link you to benchmarks, but to answer your question: games will run at higher fps using the Nvidia GPU. You may not notice that, depending on the game, the settings, and your eyes.

Hypothetically speaking, a game might run at 40 fps on the Iris Pro and 60 fps on the Nvidia 750M. Many people won't notice that. The 750M will also make the game more playable at higher settings and/or resolutions.
In the case of SC2 it'll give me a "Your computer is slowing down the game..." warning at the same settings on both GPUs. Can't comment on the threshold for other games.
 

dusk007

macrumors 68040
Dec 5, 2009
3,383
61
The Iris Pro helps battery life in that automatic graphics switching in Mac OS X is atrocious. I don't need the 750m in order to do php programming in Netbeans or to resize one image in Photoshop, but unless I force the iGPU that's what Mac OS X will give me. So in my case it will save a lot of power to stay on the iGPU.
Sure. I was comparing the Iris Pro + 750M model with a potential HD 4600 + 750M model.
You only really get rid of Apple's switching with the Intel-only model, not the other one; my post was about why pairing a big Iris Pro with a 750M makes little sense. The Intel-only model is most likely going to win in real-life battery life simply because its power states switch quickly based on actual, current demand. An HD 4600 + dGPU model might have a slight edge over an HD 5200 + dGPU model, though.
 

dusk007

macrumors 68040
Dec 5, 2009
3,383
61
In the case of SC2 it'll give me a "Your computer is slowing down the game..." warning at the same settings on both GPUs. Can't comment on the threshold for other games.
Turn off all the CPU-heavy stuff: reflections, particles, physics. In multiplayer, the other players' games don't care what your frame rate is, only what your network latency is and whether your CPU can handle all the updates.
The mistake many people make is to check for fluid gameplay in single player, or while only watching their base. Single player is great for testing GPU capabilities, but the CPU only gets challenged in multiplayer. So test in a 4v4 while a big battle happens; Cmd+Opt+F displays an fps counter in SC2. You will want to hit at least 20 fps in such battles. That is the kind of settings test you should run, NOT whatever produces enough frames in single player or the first minutes of a 1v1.
The difference between the first few minutes of a 1v1 and late-game battles in a 4v4 is something like a 2-3x performance gap (assuming Vsync is off).
Keep in mind the computer needs to keep all army movement in sync across all players' machines, even when it is not on screen; the GPU only ever needs to worry about what you actually look at.
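That last point is the heart of it: an RTS like SC2 runs a deterministic simulation of every unit on every player's machine, while rendering only touches what's on screen. A toy cost model of why late-game 4v4 hammers the CPU (the per-unit costs and unit counts are made-up constants, purely to show the scaling):

```python
# Toy cost model: simulation cost scales with ALL units in the match,
# render cost only with units on screen. All constants are made up.
SIM_US_PER_UNIT = 4      # CPU microseconds per unit per tick (assumed)
RENDER_US_PER_UNIT = 10  # GPU microseconds per visible unit (assumed)

def frame_cost_us(total_units, visible_units):
    return (total_units * SIM_US_PER_UNIT,
            visible_units * RENDER_US_PER_UNIT)

# Early 1v1: few units anywhere. Late 4v4: eight big armies, but the
# screen still shows roughly the same number of units.
for label, total, visible in (("early 1v1", 60, 30),
                              ("late 4v4 ", 1400, 80)):
    sim_us, render_us = frame_cost_us(total, visible)
    print(f"{label}: sim {sim_us} us/tick, render {render_us} us/frame")
# CPU-side work grows ~23x while GPU-side work grows under 3x, which
# is why big-battle slowdowns track the CPU, not the GPU.
```

Under this (made-up) model, swapping in a faster GPU barely moves the late-game bottleneck, matching the advice to test in 4v4 battles rather than early 1v1.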
 

phsphoenix

macrumors member
Oct 14, 2013
68
0
Turn off all the CPU-heavy stuff: reflections, particles, physics. In multiplayer, the other players' games don't care what your frame rate is, only what your network latency is and whether your CPU can handle all the updates.
The mistake many people make is to check for fluid gameplay in single player, or while only watching their base. Single player is great for testing GPU capabilities, but the CPU only gets challenged in multiplayer. So test in a 4v4 while a big battle happens; Cmd+Opt+F displays an fps counter in SC2. You will want to hit at least 20 fps in such battles. That is the kind of settings test you should run, NOT whatever produces enough frames in single player or the first minutes of a 1v1.
The difference between the first few minutes of a 1v1 and late-game battles in a 4v4 is something like a 2-3x performance gap (assuming Vsync is off).
Keep in mind the computer needs to keep all army movement in sync across all players' machines, even when it is not on screen; the GPU only ever needs to worry about what you actually look at.
Interesting suggestions. I'll definitely turn on the FPS counter, and you do have a point with the CPU problem. From what I understand SC2 only uses two cores, and from what I can tell the total system temperature never goes above 70°C when I run the game, even in 4v4 over Battle.net. So something's definitely up with all those "Your computer is slowing down the game..." prompts, which I only get in multiplayer.