Not sure why everyone thinks a difference of 42% constitutes "almost as fast."

It would be quite interesting if there were a comparison of the Iris Pro with the previous-gen AMD 6770M graphics in the 2011 15" MacBook Pro.

My guess: they would be about the same.

I mean "almost" in the sense that it's almost at the point where the benefits (longer battery life, lower temps, possibly a smaller form factor) would outweigh the negatives (a slight-to-moderate graphics performance decrease under heavy loads) enough for Apple to consider it worthwhile to put the Iris Pro in the 15", at the risk of making gamers angry.

I'm sure Apple would be more than happy to trade a decrease in graphics performance for one generation for significant improvements in other areas that we know they care about (heat/noise, battery life, form factor). The focus of the MBP is not gaming, we know that. I would be curious to see the difference in performance in GPU-dependent, non-gaming applications like Photoshop, Illustrator, Final Cut, etc.

However, like I said, I don't think it's quite there yet, which is why I don't think they'll do it.
 
To say that an iGPU is better than a dGPU is a joke. Nvidia has been in the race much longer than Intel, who claims that their "HD" graphics are as good as a dGPU.

In benchmarks, a single Iris 5200 is almost as fast as an Nvidia GT650M, whilst not requiring an additional 45-50 watts of power.

For those 45-50 watts of power, you could fit a second Core i7 in there (assuming Intel did a chipset tweak to enable multi-socket i7s), which would mean (with appropriate firmware) 2x the GPU AND another 4 cores.

At the same power consumption.


Unless Nvidia/AMD can deliver drastically better performance next generation (and Intel doesn't improve by a similar or better amount), it's not the performance that will necessarily kill them off.

It's that Intel will be "good enough" at half the total power consumption.


For me, my Radeon 6750M is bordering on "good enough". A GT650M or similar (i.e., Iris) would be plenty.

I suspect I am not alone.

Btw: Iris actually BEAT a GT650M in some benchmarks, and the drivers are still pre-release. In the real world the two will be a lot closer than integrated and discrete graphics have ever been, and likely closer than these pre-production benchmarks suggest.
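The arithmetic behind the "same performance at half the power" claim is easy to sketch. The wattages below are the rough figures quoted in this thread, not official specs:

```python
# Back-of-the-envelope power comparison using the thread's rough figures.
cpu_tdp = 47          # W: quad-core mobile i7 with Iris Pro on-package (assumed)
dgpu_extra = 47       # W: extra draw attributed to a discrete GPU (~45-50 W)

iris_system = cpu_tdp               # iGPU lives inside the CPU's envelope
dgpu_system = cpu_tdp + dgpu_extra  # CPU plus a separate dGPU

# "Good enough" at half the total power consumption under load:
ratio = iris_system / dgpu_system   # 0.5
```

Whether the dGPU really adds a full 45-50 W under load is disputed later in the thread, which is exactly why the figures here are only illustrative.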
 

IMO it is still inconclusive.
The HD 5200 is relatively new, and we haven't yet seen any laptop benchmarks with it; those benchmark scores came from a reference machine, according to the AnandTech website.

I'd rather wait until more results come out from other websites; then we can decide whether the performance is on par with the GT650M or not.



Unrelated topic:
somehow Intel's pattern looks similar to Nvidia's when they introduced the GT200.
 
I am for ditching the dGPU; I bought my rMBP for mobile raw processing power. It's a work machine.

I used to game and bought a proper gaming laptop three years ago; it has more 3D power and more VRAM, and it cost $1,000 less.

I am replacing that heavy beast with this wafer-thin little beast, which has 30% more processing power and 100% more RAM.
 

In order to have Iris Pro, it has to give up CPU speed. Both the CPUs with and without Iris Pro have the same TDP.

----------


Half the total power consumption AT LOAD. At idle, reading email, etc., the power consumption is the same. If you're taxing the computer at full load all the time, you should probably be plugged in anyway.
 
As to the argument of why Iris Pro is "almost as good":
for all OS X purposes, it is. Nvidia has a better-optimized Windows driver, and Kepler is generally really efficient for gaming workloads.
However, the OS X drivers aren't the same thing, and Kepler is actually worse than GCN and Intel at general compute and some OpenCL. Basically, all these so-called "Pros" who need the Nvidia Kepler won't get anywhere close to the advantage that the game scores seem to suggest.

Kepler is great for all the gamers out there who really only need a fast GPU for games, but for people who use their MBP for work, with gaming as a secondary or tertiary use, Iris Pro would not be a downgrade.

Personally, I think Apple keeps the dGPU,
because of Nvidia's reputation and the inevitable fallout from removing the dGPU for no good reason.
Most importantly, there are probably a lot more people owning and buying rMBPs who need GPU prowess only for gaming and NOT professional work. Most people don't do anything that needs GPU performance.
Just removing the dGPU without redesigning for something a little thinner would be too reasonable.

Apple only ever wanted enough GPU speed to power their animations and fancy GUI. They got there with the 320M; anything since then isn't really needed.

They should offer an HD 5200-equipped rMBP in a thinner redesign, but more likely they won't.
 

Broadwell is supposed to have some big improvements for Iris Pro, like dedicated GDDR5 VRAM. Maybe next gen it'll be ready to overtake a 750M... but not today.
 

That's the age-old problem with buying new tech: there's always something better right around the corner. The only thing that really kept me from pulling the trigger on the current gen was all of the screen and other issues. Somewhat expected for a 1st-gen product, though (never buy a first gen).
 

GDDR5 lacks the flexibility that the eDRAM has; in all probability the eDRAM will just trickle down to more CPUs, especially given that I don't think they will try to raise it so soon.

What's going to happen in Broadwell is finally a revamp of the iGPU; it still uses the same architecture from Sandy Bridge.

That's not a problem; buy it when you need it. 1st-gen issues? The screen was an LG problem; their other IPS panels had similar problems as well.
 

Need? There are very few people who "need" a $2,000 laptop. I'm buying one because I want one, and I'd prefer one that Apple wasn't still working the kinks out of. This includes screen issues with multiple suppliers, customer-service issues relating to screen replacements, and performance issues related to the Retina screen (mostly fixed with updates, but that's my point). Coming from a manufacturing background (and anyone in manufacturing will tell you the same), I can vouch for the fact that 1st-gen redesigned products have more issues than later revisions.
 
If the TDPs of the Iris Pro and non-Iris Pro quads are the same, why is it not reasonable to expect that the performance of both would be the same if the iGPU part of both was not being used?

I'm hoping for either a 760M or the AMD equivalent. I don't think the 750M would be good enough to justify the upgrade, as it appears to just be the 650M clocked a tad higher. So, basically the 650M that's already in the MBP. :rolleyes:

A dual quad-core CPU + dual Iris Pro would be an awesome idea, and Intel could certainly whip it up, but it won't happen. Too expensive (likely WAY too expensive), and it would require Intel to create a whole new line of CPUs AND motherboards, AND require some epic drivers from both Intel AND Apple. We would probably be better off hoping Intel starts working on a new line of mobile CPUs in the 80-90W range that provide close to dual-CPU server-class performance and high-end notebook graphics all in one. But that beast would cost the earth :cool:
 
Buy it when you need it.

We have to see how the 760M behaves on temps; its higher-clocked brother, the 765M, ain't cool.

Regarding the maximum CPU performance, you can't logically expect it to hold the same performance with the different iGPUs, with 20 more EUs and eDRAM on top of that.
 

I'm not sure I agree. If the iGPU isn't being used, then it shouldn't be drawing much power, and as such shouldn't be generating much heat, which in turn should let the rest of the CPU clock up and fill in the power gap.
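The shared-TDP intuition here can be pictured with a toy budget model. The numbers are illustrative assumptions, not measurements:

```python
# Toy model of a shared package TDP: power the iGPU doesn't draw becomes
# headroom the CPU cores can turbo into (illustrative numbers only).
PACKAGE_TDP = 47  # W: assumed rating, the same with or without Iris Pro

def core_budget(igpu_draw):
    """Watts left for the CPU cores after the iGPU takes its share."""
    return PACKAGE_TDP - igpu_draw

idle = core_budget(2)     # iGPU nearly idle: 45 W left for the cores
loaded = core_budget(25)  # iGPU under load:  22 W left for the cores
```

Under this model the two chips should perform about the same when the iGPU is idle, and diverge only when CPU and iGPU are stressed together, which is the distinction drawn in the next reply.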
 

I was agreeing with you for the case where both are under stress, not just the CPU portion.
 
The 750M has a noticeably lower TDP than the 650M (33W vs. ~50W), so if they did go in that direction it would be slightly faster as well as cooler-running than the previous generation. A more likely scenario is the 760M, which is a considerably more serious card than both the 650M and the Iris 5200. The 760M is also a 50W-TDP chip, fitting the existing thermal envelope.

It'll be interesting to see what comes of it, but they wouldn't be silly enough to use a 5200 on the 15" Retina.

Regarding the suggestions of a dual CPU: there are major architectural differences between Intel's consumer chips and its SMP chips; you'll never see one in a laptop.
 

First of all, the 650M in the rMBP isn't using anything like 45-50 watts of power. Actual power use under heavy gaming loads is around 30-35 watts.

The rMBP has an 85-watt charger (with a 45-watt CPU plus the rest of the system, there isn't that much power left for the GPU). Unless Apple ups the charger to 120 watts, you aren't getting a 765M (and probably not a 760M, though I could be wrong).
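That adapter budget works out as simple subtraction. The 85 W and 45 W figures are from this post; the "rest of system" number is a rough assumption for illustration:

```python
# How much power an 85 W adapter leaves for a dGPU (illustrative figures).
adapter = 85         # W: stock rMBP power brick (from the post)
cpu = 45             # W: quad-core CPU under load (from the post)
rest_of_system = 10  # W: display, SSD, RAM, etc. -- rough guess

gpu_headroom = adapter - cpu - rest_of_system  # 30 W, roughly 650M territory
```

On this rough budget a ~30-35 W GPU fits, while a 50 W-class part would need either a bigger adapter or battery assist during sustained load.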
 
Why do you think the 650M is a 50W chip? Most TDP ratings have been nothing but speculation. Based on the kinds of notebooks it showed up in, the 650M at stock clocks is most likely well below 30W, and even at 900MHz it is probably no higher than 35W.

The 750M is technically much higher in TDP because it overclocks itself quite high and ignores the TDP rating if the temps are okay.
 