By no means am I blaming you for saying this, but this particular piece of disinformation seems to be deterring a lot of people from buying computers that they would be perfectly happy with.
The framerate issues in the OS X GUI originate entirely in software, specifically in the code that scales apps on the 2880x1800 or 2560x1600 panels to appear the same as they would at one quarter the resolution. If it were a hardware issue, it would stand to reason that forcing the GeForce chip to kick in would alleviate these problems (scrolling in Safari, window animations, etc.), since the GeForce is roughly 10x as powerful as the integrated Intel chip, but there is no difference.
The issue is that in apps that use OpenGL-based frameworks or other forms of graphical hardware acceleration, the image is first rendered, then scaled to quadruple the pixel count, and then re-rendered. As more and more apps become Retina-aware and future releases of OS X improve the scaling algorithm, this issue will cease to be. Don't believe me? Fire up Windows in Boot Camp at full resolution: not a hint of jitter, you just can't read anything without a magnifying glass.
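To make the "quadruple the pixel count" part concrete, here's a rough sketch of the arithmetic behind the scaled modes. The function name and numbers are my own for illustration, not anything from Apple's APIs; the gist is that only the default 2x mode lands exactly on the panel, while the "more space" modes render an even bigger frame and then downscale it every frame:

```python
# Rough sketch of the pixel math behind OS X's HiDPI scaled modes.
# Illustrative only: backing_pixels is a made-up helper, not an Apple API.

def backing_pixels(looks_like, scale=2):
    """Pixels the window server renders before any downsampling."""
    w, h = looks_like
    return (w * scale, h * scale)

panel = (2880, 1800)       # 15" rMBP panel
default_mode = (1440, 900) # default "looks like" resolution

# At the default setting, 2x the logical size lands exactly on the panel,
# so no extra downscale pass is needed:
assert backing_pixels(default_mode) == panel

# Pick the "looks like 1920x1200" scaled mode and the rendered frame
# balloons past the panel, forcing a filtered downscale on every frame:
scaled = backing_pixels((1920, 1200))  # (3840, 2400)
ratio = (scaled[0] * scaled[1]) / (panel[0] * panel[1])
print(scaled, round(ratio, 2))  # (3840, 2400) 1.78
```

That ~1.78x oversized frame plus the per-frame downscale is the software cost the GPU swap can't make disappear.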
Also, no product is perfect, and no QA measure is perfect. Any change to an established design is as likely to introduce new, unforeseen issues as it is to resolve old ones. This is the reason for Intel's "tick-tock" strategy (alternating yearly between a die shrink and a microarchitecture refresh): to minimize the potential complications. Apple's strategy is similar, which is why iPhones have settled into the X/Xs rhythm, and the MacBook Pro is no different. Think of the Haswell MBP as the "rMBP 1s".