
duckkg5
macrumors member, original poster
Aug 14, 2008
I'm doing some regression analysis and have a variable with the number of years it took each person surveyed to get a college degree. I'm trying to compare how many years it takes public vs. private college students, so I split the variable into public and private using a dummy variable. But now the column of values for, say, public colleges has "0.000" for anyone who didn't go to a public college (obviously), and when I try to average the years for public school, the average counts all of those 0.000s and drags the result down.
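For example (made-up numbers, just to show the problem): if three public-school respondents took 4, 5, and 6 years and two private-school respondents show up as 0.000 in the public column, a plain average gives (4 + 5 + 6 + 0 + 0) / 5 = 3.0 years instead of the 5.0 I actually want.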

Is there a formula in Pages that will average all the numbers other than the 0.000s? Like, can you exclude anything that equals 0.000 from the average?
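One possibility (just a sketch, assuming the Pages table supports the same AVERAGEIF and COUNTIF functions that Numbers does, and assuming the public-school years are in cells B2:B100; the cell range here is only illustrative):

=AVERAGEIF(B2:B100, "<>0")

or, if AVERAGEIF isn't available in that version, dividing the sum by the count of non-zero cells:

=SUM(B2:B100) / COUNTIF(B2:B100, "<>0")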
 