Saw this on Digg. dailykos.com: "Study proves college makes you liberal..." foxnews.com direct video: "College Skews Political Spectrum"

In the video they ask "how can we fix this?", which I find a little disturbing. If education leads to a certain political opinion, what's wrong with that? Isn't that a good thing? The video makes it sound like a college education is brainwashing students.

Personally, I experienced something like this. I went into college with conservative opinions and came out a liberal. I remember really re-thinking things during my history class.

Have you experienced something like this, or the opposite?