Speaking of which, has anyone done the math on how much power consumption (and other costs, directly and indirectly) the average computer would incur?

Well, it all depends on which computer, but you can find the load by monitoring it over a one-hour period, then multiply that rough estimate by your electricity rate to determine the cost.

It's not cheap, I'll tell you that. My home network would cost me $183/mo at 80% utilization. Right now I average 24.8% when I'm home/active and 8% when I'm away.

A lot of my system is "smart" and will save energy when needed. I programmed a lot in Linux to learn my usage patterns, and it makes sure power is available when needed.

For example, if my house is armed (home) on a Friday at 8pm, the storage drives and servers wake up to make movies available. They spin back down once the movie is complete and hasn't been accessed for 30 minutes, or when I turn off my "movie mode" lighting. Or if it checks the weather report on a Saturday evening and it's raining, I'm usually home, so it'll do the same.

My lights will also flash for 60 seconds if a severe storm is detected/reported.

If my doorbell rings, all my security cameras go into recording mode for several minutes rather than motion-sensing mode.

I have two temperature gauges, one for the house and one for the server room. If the server room gets too hot, a notification is sent, fans spin up, and more cool air is fed in. If the house temperature is cool, then the server room blows the hot air into the house instead of venting out the roof. This saves on heating.
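That server-room logic boils down to a couple of threshold checks. Here's a minimal sketch of the idea; the function names, sensor values, and thresholds are all hypothetical stand-ins for whatever home-automation setup is actually in use:

```python
# Hypothetical sketch of the server-room temperature logic described above.
# The thresholds and action names are made up for illustration; a real setup
# would call into its own sensor and vent/fan control APIs.

ALERT_TEMP_C = 30.0   # server room "too hot" threshold (assumed)
COOL_HOUSE_C = 20.0   # house below this counts as "cool" (assumed)

def manage_server_room(server_temp_c: float, house_temp_c: float) -> list[str]:
    """Return the list of actions to take for the given sensor readings."""
    actions = []
    if server_temp_c > ALERT_TEMP_C:
        # Too hot: notify, spin up fans, and feed in more cool air
        actions += ["send_notification", "spin_up_fans", "increase_cool_air"]
    if house_temp_c < COOL_HOUSE_C:
        # House is cool: reuse the server room's waste heat
        actions.append("vent_exhaust_into_house")
    else:
        actions.append("vent_exhaust_through_roof")
    return actions
```

The nice part of keeping it as a pure decision function like this is that the same logic can be driven by a cron job, a sensor callback, or a test harness without touching any hardware.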

If my alarm goes off, certain events happen, and I get a constant stream of emails telling me which zone is being intruded upon.

Smart homes are very cool when you can find actual uses for them, not just have them for the sake of it. I'm working on voice activations and responses based on which room I'm in. Right now all my responses carry through all the speakers.
 
Speaking of which, has anyone done the math on how much power consumption (and other costs, directly and indirectly) the average computer would incur?

Oh, that's the dark side of folding. The CPU ideally runs at full throttle, same as the GPUs. But it is very specific to your system setup and ambient temperature. Best to plug a watt-meter between the wall and the computer and measure.

I pull a constant 700W from the wall feeding two CPUs and two GPUs, every hour, 24/7. Depending on what you pay per kWh this can be substantial, and it's the "main" part of the contribution. One of the reasons I want to replace my GTX 780 with a GTX 970: an amazing boost in performance with lower power.
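Turning a watt-meter reading into a monthly bill is simple multiplication. A quick sketch of that arithmetic, using the 700 W figure above; the $0.12/kWh rate is only an example, so plug in your own tariff:

```python
def monthly_cost(watts: float, rate_per_kwh: float, hours: float = 24 * 30) -> float:
    """Energy cost for a device drawing `watts` continuously for `hours`."""
    kwh = watts / 1000 * hours  # watts -> kilowatt-hours
    return kwh * rate_per_kwh

# A constant 700 W draw, 24/7, at an assumed $0.12/kWh:
# 0.7 kW * 720 h = 504 kWh -> $60.48 for a 30-day month
cost = monthly_cost(700, 0.12)
```

At European rates of $0.30/kWh or more, the same 504 kWh lands north of $150/month, which is why the watt-meter measurement matters before committing to 24/7 folding.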
 
Are there any other known corporations / office blocks that do this?

----------

Not to take anything away from Folding@home, as I think their project is great.
But are there any other worthy Internet-based public volunteer computing projects?

I know of the likes of SETI@home:
http://setiathome.berkeley.edu/sah_about.php

Sometimes companies work together with the F@H project, but that hasn't often been shared in public. I think at the beginning of this year they worked with Google. HP was also once involved, using folding to "burn in" a data center.

Other projects: sure. There is the BOINC family of DC projects, and GPUGrid is similar to F@H. I personally prefer F@H, though.
For the last SETI challenge (the WOW! Event) I joined there temporarily. Nothing wrong with it (as long as you find your way home to 3446 :p ). Sometimes we also switch teams for a short period of time. Just come home from time to time ...
 
Mac GPU folding question

This article got me back on the F@H trail again after a 5-year break. It's amazing to see how much more efficient computers are now. Folding so many more units with way less heat and a bit less power; very cool (I started a new username and jumped to page 9 in less than 24 hours, currently 859). It also seems like they have their act together more in terms of sharing how we are making a difference. I stopped before due to increased energy bills and the fact that it seemed pointless, as there was no news of folding actually making any difference in the science. I'm happy to be back and putting my idle Hackintosh cycles to good use!

Anywho, my question is about getting GPU folding working. It looks like only Nvidia cards are currently supported on the Mac. I have a nice AMD Radeon HD 6870 2GB that I would love to use, but instructions on getting it started seem to be missing. Is this possible?
 
This article got me back on the F@H trail again after a 5-year break. It's amazing to see how much more efficient computers are now. Folding so many more units with way less heat and a bit less power; very cool (I started a new username and jumped to page 9 in less than 24 hours, currently 859). It also seems like they have their act together more in terms of sharing how we are making a difference. I stopped before due to increased energy bills and the fact that it seemed pointless, as there was no news of folding actually making any difference in the science. I'm happy to be back and putting my idle Hackintosh cycles to good use!

Anywho, my question is about getting GPU folding working. It looks like only Nvidia cards are currently supported on the Mac. I have a nice AMD Radeon HD 6870 2GB that I would love to use, but instructions on getting it started seem to be missing. Is this possible?

Welcome back to the team. You are right: the jump in efficiency and output is amazing, especially with the current GPUs.
Bad news: no GPU folding with Mac OS. The drivers are not in a state where the project can utilize them. GPU folding works only on Windows and Linux (which I prefer).
 
Welcome back to the team. You are right: the jump in efficiency and output is amazing, especially with the current GPUs.
Bad news: no GPU folding with Mac OS. The drivers are not in a state where the project can utilize them. GPU folding works only on Windows and Linux (which I prefer).

Thanks for the info, Christian. So, could I install some Linux distribution to do GPU folding simultaneously via Parallels or something? Has anyone done this, using the default Mac folding app and running Linux GPU folding via emulation? (I've been a Mac guy for many decades, never tried Linux.)
 
Thanks for the info, Christian. So, could I install some Linux distribution to do GPU folding simultaneously via Parallels or something? Has anyone done this, using the default Mac folding app and running Linux GPU folding via emulation? (I've been a Mac guy for many decades, never tried Linux.)

Oh, that's my long-standing dream: virtual folding. It only works with the CPU; I never got it working with ESXi. I assume Parallels will have the same issue. The native driver eventually won't work, and the generic virtual drivers are not supported by FAH. Mac folding at this point in time is CPU only. But that's OK, too.

You could put a GPU in a small ITX board and install CentOS on it as a dedicated box. I love the current GTX 970; it gets me 260 kPPD.

----------

Hey, I just saw we got 79 new members.

http://folding.extremeoverclocking.com/team_summary.php?s=&t=3446

Welcome to all of you ! :)
 
Who are you aiding?

Pharmaceutical corporations that manipulate the media? These psychopaths are responsible for a lot of deaths from the side effects of their products. Those "Ask your Doctor" commercials are a huge part of network TV revenue, so those networks would almost never dare run a story critical of a big pharma drug.

See past the propaganda and stop helping these criminals.
 
with side effects of their products.

Agreed, drugs often (maybe even always) have a number of side effects causing additional health concerns; very visible when patients lose their hair during chemo to fight cancer. Why?

Because the existing drugs are not focused enough: they attack not only the cancer but also impact the healthy parts of the body. Some people compare it to trying to kill one mosquito with a nuke. As a regular IT guy I'm far from understanding the details of all the biochemical interactions of drugs within the organism, but I understand it is highly complex (and fascinating).

That's why I support Stanford's basic research, to allow them to better understand how those interactions work, with the aim of eventually developing focused drugs that reduce the side effects and increase the chance of overcoming a disease.

And as a positive side effect, I can tinker around with my IT stuff, optimize, help teammates, develop software with a purpose, etc. And also learn about diseases and the mechanisms of drugs.

(Disclaimer: I work in IT for a pharmaceutical company; no names or products will be mentioned, to avoid conflicts of interest or non-compliant promotional activities.)
 
Finding better-targeted treatments actually involves two approaches.

One is refining drugs or finding new drugs that can attack a disease while sparing the rest of the body.

The other is determining which drugs are most likely to help which patients, research that has benefited from the sequencing of the human genome (another highly computational effort for which Stanford ran a distributed computing project).

The goal has always been to know which drug will be most safe and most effective for a given patient, and that includes knowing when not to give drugs at all. The key is to increase our understanding of the biology and chemistry of both drugs and humans.
 
Congrats, MR F@H team!

On top of that, you may be getting a new team member very soon. I've recently acquired a handful of "new to me" computers; one is a Dell XPS 210 minitower with a decent Core 2 Duo chip and a Radeon X1300 GPU. It currently runs Windows 7, but I'm thinking seriously about replacing that with Linux (probably Mint). As I would only see occasional use with it otherwise, I could have it running F@H on Low or Medium much of the time (I'll probably avoid High, as this is a SFF computer which has been noted in reviews to run quite hot under load.)

The others are rather low-spec machines. Are Pentium 4, PowerPC G4 or Athlon 64-based PCs even worthy for running F@H at this point?
 
The others are rather low-spec machines. Are Pentium 4, PowerPC G4 or Athlon 64-based PCs even worthy for running F@H at this point?

New members are always welcome!
If those PCs have PCIe slots and you can put some (low-energy) GPUs in them, they are perfect. CPU folding would work (except on the G4) but is not very satisfying from a points point of view.
 
...after a long pause,

I actually joined back. I had a hardware crash a while back and stopped contributing. Now I've reinstalled the screen saver and it runs my system at "medium". That means it runs 8 cores at 74.6% when they are not otherwise in use. Unfortunately, there is only an x86 version for Windows machines. Too bad, since the AMD FX 8350 really shines in x64. Well, at least the Radeon R9 270X runs unhindered.
 
Does anyone know how to tell the F@H app where to store all its temporary data/work units? I don't want this burning through my SSD and would rather it write to my 1TB standard hard disk. I've looked through the FAHControl config section and preferences, but haven't found anything.

Also, is there a way to have the app download 3 or more work units instead of blazing through one at a time? I have moments when I plan to disconnect from the net for long periods on my workstation.
 
...

Also, is there a way to have the app download 3 or more work units instead of blazing through one at a time? I have moments when I plan to disconnect from the net for long periods on my workstation.

No, not unless you have separate processing cores. Each "core" (a high-powered GPU can be its own "core" separate from the CPU) is assigned one work unit at a time. The F@H central server is in full control of work assignments, not the individual clients.

If what you're wanting were allowed, it would open up the risk of unscrupulous "power folder" users hoarding work units for the bonus points, and cause general inefficiency in the distributed computing model. The only time a client is using the 'Net is while sending a finished WU (work unit) or downloading a new WU; it's not a constant data feed, from what I can tell.
 
I finally got around to setting up my old iMac to participate, only to discover that the program requires 64-bit and mine is 32-bit. Oh well, I guess it really is time to recycle it. I really want to do this, so I'll set it up on my mini soon.
 
I've been refurbishing some PCs with Linux, so I haven't been folding as much as when I started, but I plan to start one of those back up soon.

ChristianJapan, do you get better processing times with your Linux machines than your others?
I've refreshed a Dell XPS Core 2 Duo with Linux Mint, and was wondering if it would finish a WU in Linux faster than, say, Windows 7. (No GPU folding, as the included Radeon HD 2400 is unfortunately blacklisted, and I don't have a spare whitelisted video card that'll fit in the SFF case.)
 
I've been refurbishing some PCs with Linux, so I haven't been folding as much as when I started, but I plan to start one of those back up soon.

ChristianJapan, do you get better processing times with your Linux machines than your others?
I've refreshed a Dell XPS Core 2 Duo with Linux Mint, and was wondering if it would finish a WU in Linux faster than, say, Windows 7. (No GPU folding, as the included Radeon HD 2400 is unfortunately blacklisted, and I don't have a spare whitelisted video card that'll fit in the SFF case.)

I think in general Linux systems are a few points faster compared to Windows. I don't have any figures to back that up, though. For me the big advantage, and the reason I use Linux: I can very easily connect remotely without big overhead, it runs well in a virtual setup, and it costs no Windows license key. Nice scripting built in, too. Windows, on the other side, is better for GPU overclocking (which I don't do).
So in the end, use an OS you feel comfortable with. Recently I switched to CentOS 7 and am quite happy with it (except it doesn't run BOINC).
 
I think in general Linux systems are a few points faster compared to Windows. I don't have any figures to back that up, though. For me the big advantage, and the reason I use Linux: I can very easily connect remotely without big overhead, it runs well in a virtual setup, and it costs no Windows license key. Nice scripting built in, too.

I've run a couple of WUs on the Dell box with Linux Mint these past couple of days, and I'm getting about 10-15% more bonus WU points from CPU folding in Linux than CPU folding on the same machine in Windows 7. As no hardware was changed, I can only attribute this to increases in software/OS efficiency. Perhaps a large chunk of this can be attributed to not having to run real-time antivirus/antimalware processes; Linux is about as secure as OS X, and the user has much more control over what processes are allowed to stay active.

Overall, I'm pretty happy with the Linux Mint install. Linux in general is a lot easier to install these days, as I've come to understand; I've tested over a dozen Linux flavors before settling on Mint. Linux Mint behaves in a more familiar, traditional fashion than Ubuntu, which is known for being somewhat of a resource hog.

** I'm known as dXTC over on the Linux Mint forums now, with the same avatar as here, so if you're in the market for repurposing an older machine-- or building a new one and need a solid, modern, full-featured, no-cost OS free from antivirus and activation codes-- stop by and wave hello!
 