Great solution for those troublesome users. I don't see why you should waste any more time.
Which is unfortunately why I've chosen to killfile a very select few unreasonable individuals on MacRumors. Anyone interested in the basis as to why is invited to go do an Advanced Search and read the archives.
I've been putting off getting a basic Kill-a-Watt meter for some time. With an efficient power supply and plenty of vendors going cool and "green", you're not going to sacrifice much performance, if any, by using less power. Tighter power management, hybrid sleep, and idle performance modes are found across the board.
Agreed, although these factors can often still leave power-management shortcomings when it comes to external peripherals...it's a generic advantage of the iMac that it is more fully integrated, even before we consider contributions from 'laptop' CPUs and the like.
For example, per cnet listings, LCD displays typically consume around 0.3 W/in^2, so a generic 24" LCD is going to consume around 75W, which -- if it isn't well integrated into the green management system, or if one simply enjoys watching one's screen saver -- adds to the bill. On non-integrated systems where there's more than one power plug, one has to remember to measure all the plugs (via Kill-a-Watt instrumentation or whatever else). It's a simple enough concept, but easily overlooked even by the well-intentioned.
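As a quick sanity check of that rule of thumb (a minimal sketch; the 0.3 W/in^2 density is the cnet figure above, while the 16:10 aspect ratio is merely my assumption):

    # Quick sanity check: LCD draw from diagonal size, using the
    # ~0.3 W/in^2 rule of thumb cited above (16:10 aspect is an assumption).
    def lcd_watts(diagonal_in, aspect=(16, 10), w_per_sq_in=0.3):
        w, h = aspect
        scale = diagonal_in / (w ** 2 + h ** 2) ** 0.5
        return (w * scale) * (h * scale) * w_per_sq_in  # area x density

    print(round(lcd_watts(24)))  # ~78 W -- in the ballpark of the ~75 W above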
In any event, because of all of these moving parts and the potential variation in duty cycles, a full-blown analysis down through all of the 2nd- and 3rd-order contributors is rarely worth the effort for the cost of a single computer. Instead, a simple rough order of magnitude (ROM) estimate to see if there's potentially an order-of-magnitude difference between computer X and computer Y is generally sufficient, which is all that I did. IIRC, I got a number around $266, which I conservatively rounded down to $250.
This simple ROM approach is to compare the relative magnitudes of the Power Supply Unit (PSU) ratings, assume 24/7 operation at one's current local electrical rates, and then look through the 2nd-order variables to see how profoundly they might push this simple chop higher or lower.
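In code form, the whole ROM is a single line of arithmetic (a sketch; the ~275W differential and $0.11/kWh rate are hypothetical inputs that I've picked purely to show how a number in the ~$266 ballpark can fall out):

    # First-order ROM: watts x hours/year x $/kWh = annual dollars.
    def annual_cost(watts, rate_per_kwh):
        return watts / 1000.0 * 24 * 365 * rate_per_kwh

    # A hypothetical ~275 W rating differential at a hypothetical $0.11/kWh:
    print(round(annual_cost(275, 0.11)))  # ~$265/year on the 24/7 baseline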
For example, for readers here who are running apps such as BOINC, the machine is indeed working closer to the assumed "24/7", which lends credence to the 24/7 baseline value. Similarly, since this is a future-expenses calculation and there will invariably be growth in utility rates, using the present rate is an under-estimation which serves to offset an otherwise high assumption on utilization rates. Also, differences in the form factor of the computers can play a role: since an iMac lacks expansion bays, it discourages the home user from perpetuating the operation of legacy hard drives, so the risks of power consumption growth over the hardware's lifecycle are different: it's a lower risk for the iMac, and the beige box PC gets under-estimated, since each additional legacy 3.5" HD kept running adds another 10W at idle (and 20W when spinning). Plus, as mentioned above, there's the system-level budgeting to make sure that the 2nd plug using up 75W to drive the PC's non-integrated 24" LCD isn't accidentally overlooked...it's another source of under-estimation of the beige box PC that influences the overall lifecycle costs.
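To illustrate how these 2nd-order items move the PC's side of the ledger (a sketch; the tower's base idle draw and the drive count are made-up inputs, while the 10W/drive and ~75W LCD figures come from above):

    # Tallying the easily-overlooked loads onto the beige box's ledger.
    pc_base_idle  = 150   # hypothetical tower idle draw, watts
    legacy_drives = 2     # hypothetical count of old 3.5" HDs left running
    hd_idle_watts = 10    # per-drive idle figure from above
    ext_lcd_watts = 75    # the 24" LCD on its own, second plug

    pc_total_idle = pc_base_idle + legacy_drives * hd_idle_watts + ext_lcd_watts
    print(pc_total_idle)  # 245 W at idle -- vs. ~60 W for the 20" iMac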
Overall, it is reasonable to conclude that the iMac consumes roughly half as much power as a beige box PC.
And that's before we delve into other factors. For example, currently popular gamer PC video cards such as the Radeon HD 4890 or GeForce GTX 285 consume around 50W at idle and 75W-150W when running full tilt, on top of all the other power loads. Thus, it's hardly a surprise to find home "gamer" PCs running 600W-850W PSUs.
In the meantime, Apple lists its base 20" iMac as having an idle power consumption of ~60W, and ~100W at max, which includes driving the LCD display. Granted, many enthusiasts will probably opt for the 24" iMac, which does consume more power, but per Apple, it tops out at essentially only ~200W.
So one can develop one's own model with whatever degree of complexity and whatever utilization values one wants to assume, but the bottom line is that the question isn't whether the iMac is a substantially lower power consumer...the question is merely how large a difference is present for one's specific duty cycle...and for that, we are all always going to be different.
Because of the wide variation in desktop PCs, it's effectively impossible to pick a generic instance, let alone a utilization ratio. For example, a PC could arguably be an 850W PSU-rated gamer system, which is roughly 8x the power of the 20" iMac ... which means that a 20" iMac running full blast 24/7 (no idle or green power savings) will consume roughly the same amount of energy as that gamer box does in only 3 hours (with 21 hours off). If we then model power savings as a simple 50% idle savings on the iMac, then the iMac at an (8 hours on + 16 hours idle)/7 days utilization turns out to be roughly equal to this gamer PC at a (2 hours on + 0 hours idle)/7 days utilization...
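Running that equivalence through explicitly (a sketch using the wattages above; the 50% idle saving is the stated simple assumption, not a measured figure):

    # Daily watt-hours under the two utilization profiles above.
    imac_max  = 100                # 20" iMac at max, per Apple
    imac_idle = imac_max * 0.5     # the simple 50% idle-savings assumption
    imac_day  = 8 * imac_max + 16 * imac_idle   # (8 on + 16 idle) = 1600 Wh

    gamer_day = 2 * 850            # (2 on + 22 off) at the PSU rating = 1700 Wh

    print(imac_day, gamer_day)     # roughly equal, as claimed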
...yet even this is incomplete, since we're ignoring the gamer PC's external LCD monitor's energy costs (which drive its consumption higher) as well as the fact that we're comparing on a PSU rating instead of actual draw (which drives it lower). At least these two ignored factors offset each other, which minimizes their net contribution to this version of the model.
And so on and so on and so on. Feel free to put as many variables into play as you wish, and decide for yourself whether there's really a "garage sized" hole in my very simple first-order ROM or not.
In any event, the next step from here would be the overall lifecycle cost difference, which you get by multiplying whatever differential you decided to go with by the expected system lifespan. YMMV, but for generic households which include "trickle-down" through family members, it's undoubtedly going to be more than a mere 3 years. IMO, it's going to be at least 5 years, and it's not uncommon to see 7 years.
And knowledgeable readers will notice that these 5 & 7 year values happen to align with Apple's current published definitions of "vintage" and "obsolete" hardware.
Gosh, that makes them sound so unreasonable to use, doesn't it?
If we assume a 5-year lifespan and per-year energy differentials of $100-$250, then we're looking at a TCO offset of $500-$1250. Similarly, for 7 years, it's $700-$1750. Regardless of which number you want to use, it is a non-trivial percentage of the original cost of the hardware, so it is hard to justify ignoring it totally.
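Or, spelled out (a sketch; plug in whatever differential and lifespan you settled on above):

    # Lifecycle offset = annual differential x expected lifespan.
    for years in (5, 7):
        low, high = 100 * years, 250 * years
        print("%d years: $%d - $%d" % (years, low, high))
    # 5 years: $500 - $1250;  7 years: $700 - $1750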
But some readers will insist on doing so anyway...it's their choice, but it undermines their credibility as soon as they mention the word "Value".
-hh