2003-03-31 05:09:49 ET
:)
Make no mistake about it, colleges are all about making money. Look at the financial books of any large university and you'll see that there is serious bling to be had. It is nothing more than a business: in exchange for money, you receive an education based on your program's degree requirements.
But here's the secret, and the real reason why college is important. In most fields, it doesn't matter where you went to school or even what you majored in. Employers look at college as more of an endurance test: were you able to complete the task of graduating? Let's face it, most of college is tedious: writing papers, working on projects, etc. Those papers and projects in and of themselves have little value, but the process of learning how to complete them has immense value.
It is true that one can survive in the world without having attended college. In doing so, however, one needs to be prepared to enter one's desired industry at the bottom level. I've thrown out resumes based on the lack of a college education or the failure to complete college. If someone can't finish something as easy as college, how could I feel comfortable assigning them to mission-critical projects?