I've been reading this forum for a few months now, and I still can't figure out where everyone got the idea that Apple is definitely dropping the dGPU. I can't see any hint of it in anything other than blind speculation.
Then, respectfully, you haven't really been reading.
"They wouldn't include both an Iris Pro 5200 and a dGPU, duh!"
That's actually not usually cited as a reason, at least by anyone with a brain. There's no chance they'd ship that combination because it makes no sense, and I'd hope you understand that. But it has no bearing on the dGPU-or-no-dGPU question. There's an outside chance of an HD 4600 paired with a dGPU, but it's not likely.
"Oh, they have a custom Haswell chip, well that guarantees no dGPU!"
Again, not cited by rational people as a reason.
"If they're going to target 12 hours of battery life, there definitely won't be a dGPU."
Actually, battery life is expected to be pretty comparable. The passive draw from Iris Pro should be around or slightly above that of the HD 4000 and HD 4600. Apple has historically not factored the dGPU into the battery statistics in its marketing collateral; those numbers assume normal usage on the iGPU.
So, does anyone have any *actual* reason to be slapping down people who think there will still be a dGPU? Or is it just a case of "my blind prediction of change is better than your prediction of the status quo"?
Do you really need me to write out the whole list of reasons again? I'm almost tempted to, given that people like you keep claiming to have "read" things but either haven't actually done so or skimmed too quickly to absorb the information, if only so I'd have a giant blob to copy-paste over and over again. I'd suggest you look at my last 20 or so posts in this thread, as well as the separate dGPU thread. I agree with most (though not all) of mseth's points on the matter as well.
----------
I don't know what you mean by overall improved performance. The CPU is maybe 0.1% faster. The faster SSD is hard to notice. And Iris is at least 24% slower, except in OpenCL. Maybe that can be noticed, maybe not; we need a real-world test. But this thing already has a memory bandwidth problem. I'm not too optimistic about future software/games. But it's Apple; they'll find a way to release it.
Clock for clock, the CPU is actually quite a bit faster; the current 2.7GHz Ivy Bridge is on par with the leaked 2.4GHz Haswell part. The only thing rendering that irrelevant is that the GT3e parts come in speeds of 2.0, 2.2, and 2.4GHz, which means that, assuming Apple goes with the Iris Pro 5200, we won't see much of a speed increase. Haswell does come in faster mobile speeds; it's just unlikely we'll see them in MBPs paired with the Iris Pro 5200 junk.
Thus, the only real regression will be for gamers and people who depend heavily on OpenGL. For everyone else, performance will stay about on par while battery life increases. That may not be "progress" to everyone, but I find it satisfactory enough. People seem to expect miracles from each new iteration and then become disappointed when they don't happen. Right now we're in an era of evolutionary rather than revolutionary improvement on these machines; folks need to temper their expectations accordingly.