As children, many of us were told how important a college education is. A degree would help us to obtain an excellent career in any field we choose. It would also guarantee that we will receive the highest pay in that field.

However, is that always the case? I’ve heard of many people who study for years and years, only to owe thousands of dollars and still earn only average pay. Many of them don’t even go into the fields in which they’ve earned a degree. Instead, they find that they’d prefer working in a different field entirely.

On the other hand, I’ve heard from others, and experienced for myself, that a person without a degree can be highly successful both in title and in income. In my previous job, I started from the ground up. Perhaps even below that.

I had no clue what I was doing at the beginning, having never desired that career before. However, before long, the career took off, and opportunities for growth were constantly being thrown in my direction. Within just a few short years, my income had increased to a level higher than that of many of my co-workers who had gone to college specifically for that field of work. I’m not bragging, just trying to make a point.

So that brings me back to the question of whether college is always necessary. With the exception of careers in fields like medicine and law, I don’t believe it is.

A person excels in a career because of their work ethic, their ability to do their job to the highest standard, and their determination to rise to a higher position.

Of course, a person with a degree can do the same. They can put their degree to work, or they can stay idle and go nowhere. Success is up to each individual, with or without a degree.

What do you think? Is college always necessary?

This post is part of Stream of Consciousness Saturday, hosted by LindaGHill.