Speaking of which, has anyone done the math on how much power (and other costs, direct and indirect) the average computer incurs?
Well, it all depends on the computer, but you can get a rough estimate by monitoring the load over an hour, then multiplying the average draw by your electricity rate to work out the cost.
It's not cheap, I'll tell you that. My home network would cost me $183/mo at 80% utilization. Right now I average 24.8% when I'm home and active, and 8% when I'm away.
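The watts-times-rate math above is easy to script. Here's a minimal sketch; the 300 W draw and $0.15/kWh rate are made-up example figures, not numbers from this thread:

```python
# Rough monthly electricity cost from a measured average load.
# Wattage and rate below are illustrative assumptions.

def monthly_cost(avg_watts: float, rate_per_kwh: float, hours: float = 730.0) -> float:
    """Convert an average power draw (watts) into a monthly dollar cost."""
    kwh = avg_watts / 1000.0 * hours  # energy used over ~one month
    return kwh * rate_per_kwh

# Example: a 300 W average draw at $0.15/kWh
print(round(monthly_cost(300, 0.15), 2))  # -> 32.85
```

Swap in the average wattage you measured over your one-hour sample and your own utility rate.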
A lot of my system is "smart" and saves energy when it can. I programmed much of it myself in Linux to learn my usage patterns, so power is available when I actually need it.
For example, if my house is armed (home) on a Friday at 8pm, the storage drives and servers wake up so movies are available. They spin back down once the movie finishes and nothing has been accessed for 30 minutes, or when I turn off my "movie mode" lighting. Similarly, if it checks the weather report on a Saturday evening and it's raining, I'm usually home, so it does the same.
My lights will also flash for 60 seconds if a severe storm is detected/reported.
If my doorbell rings, all my security cameras switch into recording mode for several minutes rather than motion-sensing mode.
I have two temperature gauges, one for the house and one for the server room. If the server room gets too hot, a notification is sent, fans spin up, and more cool air is fed in. If the house is cool, the server room blows its hot air into the house instead of venting it out the roof, which saves on heating.
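That two-zone decision is basically a small state check. Here's one way it could look as a sketch; the 85°F and 68°F setpoints are invented examples, not the author's actual thresholds:

```python
# Sketch of the two-zone venting logic: if the server room is hot,
# either duct the waste heat into a cool house or vent it out the
# roof. Thresholds are illustrative assumptions.

def vent_decision(server_temp_f: float, house_temp_f: float,
                  server_hot_f: float = 85.0, house_cool_f: float = 68.0) -> str:
    if server_temp_f >= server_hot_f:
        if house_temp_f <= house_cool_f:
            return "duct-to-house"  # reuse the waste heat for home heating
        return "vent-to-roof"       # dump the heat outside
    return "idle"                   # nothing to do

print(vent_decision(90, 65))  # -> duct-to-house
```

The notification and fan spin-up would hang off the same `server_temp_f >= server_hot_f` branch.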
If my alarm goes off, certain events trigger, and I get a constant stream of emails telling me which zone is being breached.
Smart homes are very cool when you find actual uses for them rather than having them just to have them. I'm working on voice activation with responses based on which room I'm in; right now all my responses play through every speaker.