I meant to say that, given how 3DMark assesses performance, a 40 EU GPU might not be quite as good as its 3DMark scores suggest compared to a dedicated GPU, which lets the CPU keep running as fast as it wants.
I doubt there will be fluctuations when the DDR3 is in use. The driver will most likely be written specifically to account for the eDRAM, and in barely any game will it run only off the 128MB. Rather, it will ALWAYS keep the high-priority stuff in the eDRAM and put the textures and all the other big stuff in main memory.
In all cases it will, like every cache, consistently lower the bandwidth requirements on the DDR3 memory. Graphics computing isn't terribly latency-dependent, so it really boils down to bandwidth. The extra cache simply increases the effective bandwidth (rough numbers below).
It will almost always use both the 128MB and a big, probably dynamically sized, chunk of system memory. It can also memcopy between the two really fast, much faster than over PCIe to a dGPU.
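Just to put some back-of-envelope numbers on that (the bandwidth figures here are my assumptions based on typical specs, not confirmed Haswell numbers):

# Back-of-envelope only; these bandwidth figures are assumptions, not confirmed specs.
DDR3_BW  = 25.6   # GB/s, dual-channel DDR3-1600
EDRAM_BW = 50.0   # GB/s per direction, assumed for the on-package eDRAM
PCIE_BW  = 12.0   # GB/s, roughly what a PCIe 3.0 x16 link manages in practice

def ddr3_traffic(total_gpu_demand, edram_hit_rate):
    """DDR3 traffic left over after the eDRAM absorbs its share of accesses."""
    return (1.0 - edram_hit_rate) * total_gpu_demand

def copy_time_ms(size_gb, bandwidth):
    """Time in milliseconds to move size_gb gigabytes at the given GB/s."""
    return size_gb / bandwidth * 1000.0

if __name__ == "__main__":
    # If the GPU wants 40 GB/s and the eDRAM catches 70% of it,
    # the DDR3 only has to deliver 12 GB/s, well within its 25.6 GB/s.
    print(ddr3_traffic(40.0, 0.7))         # -> 12.0
    # Shuffling a 128 MB working set on-package vs. across PCIe to a dGPU:
    print(copy_time_ms(0.128, EDRAM_BW))   # -> ~2.6 ms
    print(copy_time_ms(0.128, PCIE_BW))    # -> ~10.7 ms

Even with those assumed numbers the point stands: the eDRAM takes a big chunk of the traffic off the DDR3, and moving data on-package is several times faster than pushing it across PCIe.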
I think heat is also generally less of a concern. When the hot spots are more spread out, it tends to be easier to cool and move the heat away. The total TDP won't increase and moving the heat away gets easier, so that's a win-win.
Re the 400MHz difference: there is also the 20 EU vs. 40 EU change, so that lower base clock seems to be more because of that than because of the eDRAM. It's also quite little in my opinion. When the GPU doesn't have to deal with much load, the CPU will still turbo up to almost the same level. I don't see much performance loss there.
The base clock is quite high. I doubt it will be the same for the 28W U CPUs.
The dedicated GPUs are still faster, and the best solution at the moment, but I'm nervous that they'll go for the Intel GPU for power-saving reasons.
There aren't really any power savings, though. With switchable graphics one can always disable the dGPU, and then it's the same. Apple only markets battery life numbers with the dGPU switched off anyway. If they go Intel-only, it is because the performance difference isn't worth adding an extra chip, its heat, the board space for the GPU plus its GDDR5 memory, and the cost. Definitely not for power savings (though there would be some, but only because of the idiotic way automatic switching works in OS X).
If the dGPU isn't at least 40% faster it isn't worth it IMHO.
It is unlikely, though, that they will remove it. They might offer one model without the dGPU, as they used to, but with a 5200 instead. I would buy that one if the 5200 actually reaches 2000 points in 3DMark 11.
Less hassle with the crappy switching, and enough speed. Also, Chrome, iPhoto, and the network monitor won't keep the IGP from kicking back in after the machine had to switch to the dGPU for an external screen. Even forcing the IGP doesn't work in OS X when you need an external screen. Also less heat while using an external display. I honestly don't see any reason I would want the dedicated GPU unless it is at least twice as fast. Double performance used to be the minimum difference in the past.