From what I've seen, the low-voltage (i.e., ultrabook-targeted) Haswell variants are the only ones to get the highest-spec integrated "Iris" GPUs. Which makes sense, as in that segment you're more concerned with power than with ultimate GPU performance.
If you're going for a quad-core Haswell, I suspect you won't be getting high-end Iris graphics this time around.
My bet is that the 15" MBP models will come with a discrete GPU until Broadwell, and maybe later.
----------
Not going to happen. The other posters above proclaiming the 'death' of the dGPU in Apple's portable lineup are wrong. Look up the ridiculous prices Intel is charging for its Iris Pro chips. The one Haswell-with-Iris-Pro chip benchmarked in Geekbench costs Apple the same as a lower-end Haswell without Iris Pro paired with a GT750M. Why would Apple sell a rMBP without discrete graphics for cheaper when it costs them the same to make one with a separate discrete chip?
It WILL happen eventually, and the rest of the mobile market (and later the desktop market) will follow suit, by say 2019-2020, except for some very small niche markets, for a few reasons (none of which involve Intel CPU cost):
- Integrated GPU performance is improving faster than software requirements are at the moment
- Battery life
- Thermal budget
- If recent history is anything to go by, Broadwell's GPU will likely be comparable with the GPUs in the new big consoles
An Iris-capable CPU/GPU uses a max of what... 45 watts? Versus a CPU + GT750M, which will be up near 90 watts total, giving half the battery life and double the heat.
So imagine if you give Intel that same 90-watt max TDP to play with...
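The back-of-the-envelope battery math above can be sketched like this (the wattage and battery figures are this thread's ballpark assumptions, not measured numbers):

```python
# Rough runtime comparison at a fixed battery capacity.
# All figures are the thread's ballpark assumptions, not measurements.

def runtime_hours(battery_wh, load_watts):
    """Hours of runtime if the system draws load_watts continuously."""
    return battery_wh / load_watts

BATTERY_WH = 95  # assumed battery capacity, roughly a 15" rMBP

iris_only = runtime_hours(BATTERY_WH, 45)      # Iris-capable CPU/GPU package
cpu_plus_dgpu = runtime_hours(BATTERY_WH, 90)  # CPU + GT750M, both loaded

print(iris_only / cpu_plus_dgpu)  # → 2.0: roughly double the battery life
```

Under these assumptions the ratio is exactly the 45 W / 90 W split, hence the "half the battery life" claim (in reality idle power, switchable graphics, and display draw dominate, so the real-world gap is smaller).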
If Intel makes Broadwell or a later mobile CPU+GPU multi-socket capable and Apple uses this (i.e., replaces the discrete GPU with a second CPU socket), a machine with dual Iris CPU/GPUs will blow an Intel/Nvidia CPU+GPU combo out of the water in CPU tasks, and probably blow their mobile chips out of the water in GPU tasks as well.
Power management will be simpler as well, as will GPU switching (single driver): on battery, just turn a socket off entirely if required. And there's nothing stopping Intel from scaling to more than two sockets in the desktop market, either. Graphics are "embarrassingly parallel" and easy to scale with more units.
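A toy sketch of why graphics scale so easily across sockets: each pixel depends only on its own coordinates, so a frame can be chopped up across any number of workers with no coordination between them (the `shade` function here is a made-up stand-in for a real shader):

```python
# Each pixel is computed independently, so the frame can be split
# across any number of workers/sockets and the result is identical.
from concurrent.futures import ThreadPoolExecutor

WIDTH, HEIGHT = 64, 64

def shade(pixel):
    """Hypothetical per-pixel shader: depends only on its own coordinates."""
    x, y = pixel
    return (x * 31 + y * 17) % 256  # stand-in for a real lighting calculation

pixels = [(x, y) for y in range(HEIGHT) for x in range(WIDTH)]

# Single-"socket" render.
serial = [shade(p) for p in pixels]

# Four-"socket" render: the pool splits the work, no inter-worker communication.
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel = list(pool.map(shade, pixels))

assert serial == parallel  # same frame regardless of worker count
```

Because there is no shared state between pixels, throughput scales almost linearly with the number of units, which is exactly what makes "just add another socket" plausible for graphics in a way it isn't for most CPU workloads.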
Nvidia and AMD should be very concerned that the discrete GPU market will most likely be a very small niche within 3-5 years (i.e., ultra-high-end workstation GPUs only).
Unless they diversify and come up with a new market, or find some way to compete with what Intel no doubt has in the pipeline, they're both walking corpses.
Intel don't have to be "best". They just need to be more than good enough for the vast majority so that 3D software is written to work well with their hardware.