My own source didn't even mention the word "crispness", so how could he confirm that you sacrifice crispness for screen real estate? I don't like how you chose where to end that quote. Here's the entire quote:
"By default, the Retina MBP ships in a pixel doubled configuration. You get the effective desktop resolution of the standard 15-inch MacBook Pro's 1440 x 900 panel, but with four physical pixels driving every single pixel represented on the screen. This configuration is the best looking, but you don't actually get any more desktop space. Thankfully Apple exposes a handful of predefined scaling options if you do want additional desktop space."
When he says it's "best looking", he's talking about SIZE, not crispness. He doesn't say anything about this "crispness" problem you're talking about, which just doesn't exist.
While I can see that you do not own a Retina MacBook Pro, I do, and I use it at the scaled 1680 x 1050 HiDPI resolution, not the default 2x scaling. And there are NO problems with crispness; it looks every bit as sharp as the 2x mode.
Also, the "performance and quality" impact he mentions is the fact that it still renders the image at a larger pixel density and then scales it down to whatever scale you choose. The fact that it has to divide the resolution by 1.71x as opposed to 2x doesn't produce any performance penalty on my system, and it's not widely noted elsewhere either. The performance "impact" is so tiny that he doesn't even bother to elaborate, because it isn't that much harder for a modern GPU to divide by 1.71x than by 2x, the impact is solely theoretical. I will perform benchmarks with both scaling options if you would like me to prove this point.
So, this "crispness" problem you mention being an issue with not using whole-integer scalers is not a problem at all, and the "performance impact" is unnoticeable. On a system as powerful as the iMac, with it's dedicated discrete GPU, this would be even less of an issue.