Anonymous
Favorite Answer
The answer to this is many-fold.
College is important because in today's job market, you can't expect to get a job that will offer freedom, advancement, and decent pay without some form of education past high school. University is one of those routes. Sometimes people get lucky and make billions off a good idea and ingenuity, but you can't plan to be one of those people.
College is also important from a more social standpoint. Societies that foster and encourage scholarly learning and research tend to do better in the long run. College will broaden your mind and teach you how to think, write, and communicate, in addition to imparting some actual bits of knowledge. You'll hopefully come to understand that this world is bigger than you could ever imagine and that studying it in its entirety is a noble but impossible pursuit. You'll grow through contacts you've never had before, teachers with interesting insights, and the personal freedom to choose to bog yourself down with homework every night and destroy your liver on the weekends.
By no means is college/university the only way to grow as a person, but in this modern age it is a convenient one that leaves you with a marketable diploma as well as life experience. If you have the option, the drive, and the skills, it's fairly unlikely that you'll regret it.
Ambrielle
There are quite a few jobs you can't have without going to college. You can't be a doctor, lawyer, engineer, psychologist, chemist, etc. It's not necessary for everything, but it still helps.
iLOL
There are a lot of jobs you can't get unless you have a degree.
Then again, even with a degree, you won't necessarily get a job these days.