Status
Not open for further replies.
Could it? Yes. Will it? I don't think so anymore, per the above. (And I admit I have done a 180 on this. A week ago, I thought we'd see separate configurations, with an iGPU in the base and dGPUs in higher end models. I think this is less likely now.)

----------



We saw it happen a couple of years ago, when Apple used an iGPU only in the entry-level MBP while dropping its price.

Honestly? I can't believe it either. But going iGPU across the whole MBP line would be a downgrade. And believe me, lots and tons of people have started gaming on Macs, especially on MBPs.

And to those who say that even so, the Iris Pro will still be better at OpenCL, I must say this: the GTX 760M will wipe the floor with the Iris Pro in those tasks.
 
In what other areas of tech do you see graphics performance drop from one generation to the next for devices at such a premium?

2010 MBA 320M -> 2011 MBA HD 3000.

Apple sold tons of 2011 MBAs in the end anyway, and a whole bunch of people even upgraded from 2010 to 2011.

And this wouldn't even be a drop - it's actually an increase in baseline performance, because the HD 5200 is much better than the HD 4000. So it's actually a day-to-day bump... only offset by what should be totally unnoticeable losses on the productivity side, which would be fractions of the time a machine is in use.

Really - the only loss will be gaming. Which is where the HD 3000 failed over the 320M as well.

Anyone remember - there was even a time Apple sold a product with just the GMA950 onboard. Now that's what you call really bad GPU performance. ;)
 

They did that because they had no choice.

IIRC Intel and Nvidia had a tussle, and then Intel refused to license Nvidia to make chipsets compatible with Intel processors.

Apple then had to resort to using Intel chipsets and processors, which came with the ****** HD 3000 graphics.

You can't have the Intel HD 3000 and an Nvidia 440M at the same time; that would draw way too much power in the 13" Air and MBP.
Thus Apple chose to sacrifice performance for battery life, hence the reduced graphics performance.

----------


well....screw that :(

----------

And that's what I've been trying to say in the other thread for a long time.

Iris Pro is far more power hungry than regular Haswell + HD 4600, so when the dGPU isn't on, Haswell + HD 4600 is more power efficient.

And when the need arises, GT 760M can kick Iris Pro up and down the stairs multiple times.

Regarding the power hungriness... all GPUs nowadays can step their clock speeds up and down, just like CPUs do.

So... it's possible to idle at, let's say, 100 MHz.

I'm actually wondering what the GT 650M is doing while the HD 4000 is active.
 
Thus Apple chose to sacrifice performance for battery life, hence the reduced graphics performance.

I guess they do indeed have a choice right now, but to me the Iris Pro gives them a huge performance bump (day to day) with likely only a small loss in some GPU accelerated productivity tasks outside of a benchmark, which no one will ever notice.

So aside from the gaming loss which has been well covered, going Iris Pro by omitting the GPU switching should actually be a great improvement.

I suspect Haswell/Iris is going to be exactly what the Retina needed, since the HD 4000 just didn't have enough muscle, and you'd otherwise end up with the heat/power consumption of the 650M if you toggled it on constantly... so I can't see how Haswell isn't going to be a huge productivity win (aside from gaming).

The real "pros" really shouldn't have anything to fear, if anything the experience will be much more seamless and result in a better overall rMBP.
 
If they go Iris, they could probably replace the two fans with one larger one, and the space saved could be used for a bigger battery or something else.
 
What sucks about having a dGPU, for me, is that it kicks in whenever you plug in an external monitor, and for basic tasks the dGPU runs way hotter than the iGPU.

Personally, I think an Iris Pro-only 15" rMBP should be the base 15" model at a lower price (with the cMBP discounted), and people who like to play games or do dGPU work could buy a BTO option with a dGPU. :)
 
I guess they do indeed have a choice right now, but to me the Iris Pro gives them a huge performance bump (day to day) with likely only a small loss in some GPU accelerated productivity tasks outside of a benchmark, which no one will ever notice.

So aside from the gaming loss which has been well covered, going Iris Pro by omitting the GPU switching should actually be a great improvement.

I suspect Haswell/Iris is going to be exactly what the Retina needed, since the HD 4000 just didn't have enough muscle, and you'd otherwise end up with the heat/power consumption of the 650M if you toggled it on constantly... so I can't see how Haswell isn't going to be a huge productivity win (aside from gaming).

The real "pros" really shouldn't have anything to fear, if anything the experience will be much more seamless and result in a better overall rMBP.

I just don't get this line of logic at all. Everyday tasks perform just fine on the HD 4000 under ML and Mavericks. They don't really need more horsepower, and when additional power is needed, the dGPU kicks in. Can you tell me what specific functions you think are underpowered today, or are being offloaded to the dGPU because the iGPU couldn't handle them? (I'm discounting "bad" dGPU switches, since with gfxCardStatus you can stop those.)
 
I'm not a heavy gamer, but I don't see the real benefit of an Iris Pro 5200. Negatives: total CPU/GPU cost is higher, the CPU clock is lower, and it has less performance than the previous generation. I don't think the battery argument is relevant, because an HD 4600 consumes even less when the discrete GPU sleeps.

On the pro side: no gfx switching, a less power-hungry system, and more room for the battery or a thinner device. Yeah, thanks for that.
 
Less performance? We saw a Geekbenched 2.4 GHz Haswell processor with 6MB of cache and Iris Pro outperform an IB 2.8 GHz with 8MB cache. Seems like an increase to me.
 
Said Haswell processor is the top-end and also the most expensive CPU that comes with Iris Pro.

So it's actually in the same class as the IB 2.8GHz with 8MB cache.

And it "outperforms" the IB 2.8GHz by something like 0.1%.
 
Can you tell me what specific functions you think are underpowered today, or are being offloaded to the dGPU because the iGPU couldn't handle them? (I'm discounting "bad" dGPU switches, since with gfxCardStatus you can stop those.)

That's the problem - you already know those "bad" dGPU switches happen. This eliminates the switching entirely. You don't always need the horsepower, and instead of an on/off switch, all you get is the frequency of the Iris going up/down as needed.

No switching solution can match the efficiency of avoiding the switching. Hook up an external monitor: it's not like you need the dGPU all the time, but you generate the heat anyway... the switching is just not a good thing.

E.g., if you're using Aperture and Photoshop, most of the time in the app you don't need the dGPU, but some actions benefit from having it engaged. So you're forced into dGPU mode needlessly.

So like I said - you're better off with just one GPU.
 
Said Haswell processor is the top-end and also the most expensive CPU that comes with Iris Pro.

So it's actually in the same class as the IB 2.8GHz with 8MB cache.
And it "outperforms" the IB 2.8GHz by something like 0.1%.

I'm not saying it is a huge performance leap. But clock vs clock Haswell performs better than its IB counterpart, even with Iris Pro. And the increase is 0.7%.
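Percent comparisons like these are easy to mix up, so here is a quick sketch of the arithmetic with made-up, Geekbench-style scores (the actual leaked numbers aren't quoted in this thread); it shows how a tiny raw lead at a lower clock translates into a much larger clock-for-clock gain:

```python
# Hypothetical scores chosen only to illustrate the method;
# not the real leaked benchmark figures.
haswell_score, haswell_ghz = 12000, 2.4   # Haswell + Iris Pro (assumed)
ivy_score, ivy_ghz = 11988, 2.8           # Ivy Bridge 2.8 GHz (assumed)

# Raw difference: barely ahead, despite the lower clock.
raw_gain = (haswell_score - ivy_score) / ivy_score * 100

# Clock-for-clock comparison: normalize each score by its clock speed.
haswell_per_ghz = haswell_score / haswell_ghz
ivy_per_ghz = ivy_score / ivy_ghz
per_clock_gain = (haswell_per_ghz - ivy_per_ghz) / ivy_per_ghz * 100

print(f"raw gain: {raw_gain:.1f}%")              # ~0.1%
print(f"per-clock gain: {per_clock_gain:.1f}%")  # ~16.8%
```

With these assumed numbers, a 0.1% overall lead at 2.4 GHz vs. 2.8 GHz works out to roughly a 17% per-GHz improvement, which is why "it only wins by 0.1%" and "clock vs. clock it's faster" aren't contradictory claims.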
 
That's the problem - you already know those "bad" dGPU switches happen. This eliminates the switching entirely. You don't always need the horsepower, and instead of an on/off switch, all you get is the frequency of the Iris going up/down as needed.

No switching solution can match the efficiency of avoiding the switching. Hook up an external monitor: it's not like you need the dGPU all the time, but you generate the heat anyway... the switching is just not a good thing.

E.g., if you're using Aperture and Photoshop, most of the time in the app you don't need the dGPU, but some actions benefit from having it engaged. So you're forced into dGPU mode needlessly.

So like I said - you're better off with just one GPU.

So I'm trading horsepower because too many Mac users are too stupid to download and use gfxCardStatus? That sucks. Quite the opposite of your final point is true. I am not better off with just one GPU.
 
If you look at the benchmarks, they show it running OS X Mavericks 10.9. Since Mavericks won't be out until this fall, that could mean we won't see the new Retina MacBook Pros until this fall.
 
So I'm trading horsepower because too many Mac users are too stupid to download and use gfxCardStatus? That sucks. Quite the opposite of your final point is true. I am not better off with just one GPU.

Attitude quite a bit you have Johnny. In the book, the vortex was asymmetrically similar.
 
If you look at the benchmarks, they show it running OS X Mavericks 10.9. Since Mavericks won't be out until this fall, that could mean we won't see the new Retina MacBook Pros until this fall.

It could go either way. Mavericks is being widely tested as would be expected. I don't know :confused:

It would be better news for a July update to see the benchmarks running Mountain Lion, but the fact that they are on Mavericks doesn't necessarily preclude a July release. One could talk oneself in circles :rolleyes:
 
Attitude quite a bit you have Johnny. In the book, the vortex was asymmetrically similar.

Like Yoda you speak, RandomEnding. And my attitude is such because a perfectly good solution (HD4600+dGPU) is unlikely to be pursued because of a combination of bad decisions by systems (switching), a bad design decision (not giving users control out of the box), and user ignorance (that they can control the behavior). What a trifecta of stupid.
 
What does it mean? Was it all a mistake, and it was supposed to be IB the entire time? Or are they trying to cover up the leak?

The suppliers supplied the wrong information, which is why they had to check it today, and it's an Ivy Bridge model. So it was a mistake passed from the suppliers to BH's end.
 