Most people who have never used PCs don't even know what this is.
Most people who switched from PCs to Macs did so precisely to avoid this kind of hassle from the PC world.
Personally, I use both full-time, and I can hardly recommend this. The potential to hose up your Mac, or even permanently damage it, far outweighs the barely noticeable speed increase.
Actually, that's not entirely true. For audio work, for example, you could (back in the day) get a cheap AMD chip that, overclocked, beat an Intel CPU costing 5-10X more, allowing more plug-ins and faster rendering. Sure, it would burn out in three years, but you could replace it.
If you read the ZD article, you'll note that the cheapest 2.8 GHz machine, overclocked, ran FASTER than the top 3.2 GHz machines. That means MUCH faster rendering times, plus, for audio, more PLUG-INS. If the CPU stays relatively cool, regardless of form factor (laptop, mini), the 3.2 might in theory reach 4.0 GHz on air cooling, meaning stock fans.
If you know anything about CPUs and memory, you'll know that the difference between a 2.8 and a 3.2 is sometimes just a multiplier or a setting on the same chip. It's the same with memory: the manufacturer tests how fast a module can run and slaps the label on it (5300); if it fails at that speed, they bin it down until it's stable (4300, et cetera).
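To make the multiplier point concrete, here's a minimal sketch (not vendor tooling; the function name and the FSB/multiplier values are illustrative assumptions) of how the effective core clock falls out of the bus speed and the multiplier:

```python
# Illustrative sketch: the effective CPU clock is just the
# front-side bus (FSB) speed times the CPU multiplier.
def effective_clock_ghz(fsb_mhz: float, multiplier: float) -> float:
    """Return the core clock in GHz for a given FSB and multiplier."""
    return fsb_mhz * multiplier / 1000.0

# A "2.8" and a "3.2" part can be the same silicon with a
# different multiplier setting (hypothetical 400 MHz bus):
print(effective_clock_ghz(400, 7))  # 2.8
print(effective_clock_ghz(400, 8))  # 3.2

# Raising the FSB overclocks a multiplier-locked chip:
print(effective_clock_ghz(500, 8))  # 4.0
```

Same die, different label; overclocking just restores the setting the binning process took away.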
In theory, you could find batches of 2.8 and 3.2 CPUs (not much difference anyway) that both reach the same speeds. With the FSB and voltage tweaked, you could see even higher speeds, and the cool thing is that Apple would have no way of knowing if the CPU crapped out 2.5 years into a 3-year Apple plan (which, by the way, is why 3 years is what's offered: 3 seems to be the stable point for silicon, with 4-5 years being the norm with good use and handling). Eventually, they all crap out.
Tip: turning the machine on and off is actually harder on it than leaving it running. It's better to leave it on and let it sleep than to power-cycle it every night. That's why studios, servers, and businesses keep their machines on, and those machines last for years. High-end printers are another example: they never go fully off. Even drawing just a few milliamps, they stay on, because avoiding the "shock" of powering up reduces wear each and every time.
Peace.
EDIT: I tend to think the 4-5-year crap-out window also has to do with overclocking itself: the board being exposed, moved around, fans checked and added, more static, and so on.
On the flip side, it kills me when I run into some old-school Mac devotees whose systems are slow. They have a G4/G5 and have never run the maintenance scripts or Repair Permissions. One guy's computer had so many permissions to repair that it was going to take more than 1-2 hours on his G4, since he had never done it. I stood there for about 10 minutes watching pages upon pages of errors scroll by.
Amazing. A new generation of more tech-savvy Mac users means faster machines (overclocking), which might force price points down on some of Apple's products (this is what Intel did). And if someone can hack the GMAs and get OpenGL working better, I think we'll see a dedicated GPU in all machines, instead of Apple screwing the pro user (the smallest market base) into buying a Mac Pro or MacBook Pro for any graphics work whatsoever.