This is similar to the question I posed a few weeks back about summer being a thing of the past. Watching TV the last few days got me thinking: most of the media treat summer as dead time where nothing happens, as if nothing of note occurs in those three months. It feels like as soon as spring is over, we're supposed to just look ahead to fall and the "new year," when everything starts fresh. And here in Florida, nothing of note really happens anyway, since most people go home for the summer. So even though I used to love summer, I always found myself hoping it would end so we could get to fall and things would start happening again, since fall seemed to be the big season for releasing everything. Having been in school for so long, I'd say fall is still when my year starts, but it's slowly shifting to spring, since that's when baseball starts. People say the year truly begins for them at different times, the most common being fall, New Year's, and spring. I just want to get your thoughts on this.