A new study presented by Fox News (watch the vid, peeps) says that going to college makes people’s political opinions lean left, and it suggests that Democratic faculty push their liberal agendas on students. What’s troubling about the study’s conclusion is that college grads become more liberal but NOT more knowledgeable; more than 35% of us can’t even name the three branches of the U.S. government!

[gorillanation id=124117 width=420 height=340]

Watch the latest news video at video.foxnews.com

Now, I can only speak for my school, one public university out of many, but I can definitely back the validity of these findings based on my experiences during the 2008 presidential election, when everyone had Obama fever. I was kind of surprised when my professors brazenly bashed Republican ideals and tooted Obama’s horn during a lecture that was supposed to be about graphing the value of x.

Despite all the propaganda, I made a conscious effort to educate myself about both candidates. I did internet research, read books, and looked up their past political experience. I wanted to form my own opinion, and when Election Day rolled around, I was well informed on the major issues facing the country and knew who had promised to do what about them. After much deliberation, I voted for McCain via my absentee ballot.

I wasn’t annoyed that all of my friends voted for Obama; that’s their right. I was annoyed that they had no idea why they voted for him. When I asked them, I mostly got blank stares and shoulder shrugs. If I was lucky enough to get an actual response, it was either “because he’s young and cool!”, “because he’s black!”, or, from some of my more promiscuous friends, “because he’s pro-choice!”

This study definitely made me think. College is supposed to be a place of higher learning, but what is it really teaching us? We should become more educated and learn to think for ourselves, but is this really what happens? Are people with degrees more liberal, and if so, is college to blame?

What do you guys think?