So for the last few days, it has been absolutely beautiful out: sunny, warm, and downright refreshing after that long winter we just had up here in the Northeast. It seems that when it gets warm out, people are happier, more active, and feel better about themselves. I know it's the first warm spell since winter and everyone is on a "high" because they can finally open the windows, wash the car, etc., but even so, I find myself more content and active during the warm, sunny summer season than during the winter. Research has shown that people are more likely to feel depressed or angry on rainy days than on warm, sunny ones. Plus, in the winter, all the vegetation appears dead, the days are shorter, and numb fingers and scraping ice off your windshield can be a pain.

Another example: your health. I am much more motivated to go outside for a run in 60-70 degree sunny weather than in the winter. If I lived in a year-round warm climate, I think I would run every day.

So I'm wondering: are people happier in warmer climates, assuming all other demographics (crime rate, poverty level, income, etc.) were the same? Florida? Southern California? Arizona? I'm wondering if there are any studies on this and what other people think. Around here, summer is almost always a person's favorite season. If "summer" lasted year-round, would somebody's quality of life be better, or are people in colder areas just experiencing a 3-4 month "summer high"?