Thanks for the link.
I did some very rough calculations on the GDDR5 750M versus the 850M in terms of speed.
In the link you posted, the average difference between the DDR3 and the overclocked GDDR5 version (I believe Apple uses a slightly overclocked one?) is a staggering 57% in frame rate across the games tested.
While I couldn't find any exact benchmarks for the 850M, it is said to be comparable to the 765M in terms of performance, so I essentially compared the 765M to the 750M.
For the new game Titanfall, the difference on high settings between the 765M frame rate (54.3 fps) and the 750M frame rate (33.3 fps) is 63.1%. Applying the average 57% increase, the slightly overclocked GDDR5 variant would land somewhere around 52 fps. The lowest difference for any game with stats in the link was 43.75%, so even if frame rates only increased by that much with the GDDR5 variant, the 750M would still reach about 48 fps, which would leave the 765M only about 13% faster.
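The arithmetic above can be sanity-checked with a quick script. The two measured frame rates and the uplift percentages come from the post itself; the GDDR5 figures are extrapolations, not measured benchmarks:

```python
# Rough sanity check of the frame-rate estimates.
# Measured figures (from the link / Notebook Check):
fps_765m = 54.3        # 765M, Titanfall, high settings
fps_750m_ddr3 = 33.3   # DDR3 750M, same test

# Measured gap between the 765M and the DDR3 750M
gap = (fps_765m - fps_750m_ddr3) / fps_750m_ddr3
print(f"765M vs DDR3 750M: {gap:.1%} faster")  # ~63.1%

# Estimated GDDR5 750M frame rates, using the average (57%)
# and lowest (43.75%) DDR3-to-GDDR5 uplifts from the linked tests
fps_gddr5_avg = fps_750m_ddr3 * 1.57
fps_gddr5_low = fps_750m_ddr3 * 1.4375
print(f"GDDR5 estimate (avg uplift): {fps_gddr5_avg:.0f} fps")  # ~52
print(f"GDDR5 estimate (low uplift): {fps_gddr5_low:.0f} fps")  # ~48

# Even in the worst case, the 765M's lead shrinks to ~13%
lead = (fps_765m - fps_gddr5_low) / fps_gddr5_low
print(f"765M lead over worst-case GDDR5 750M: {lead:.1%}")  # ~13.4%
```

So the worst-case estimate still closes most of the gap, which is the point of the comparison.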
So if games really are on average that much faster with the GDDR5 version, if the 765M is actually close to the 850M, and if I'm not missing something, then there isn't that much of a difference.
And yes, I checked: the 33.3 fps for the 750M in Titanfall (chosen not because I play it, but because it's a new game) was measured on a 967 MHz 750M, in other words the DDR3 variant, not the GDDR5 one.
(Frame rates not listed in the link were taken from Notebook Check.)