When Will Colleges & Universities Step Up To the Plate?

It’s time for colleges and universities to stop shirking their responsibilities and give students (and the parents who pay for their kids’ education) their money’s worth.  These institutions have been let off the hook long enough!

Think about it: what other product can you name that is successfully made and sold over a long period of time without a structured marketing component?

None!  Yet most colleges and universities do exactly that: they provide an education and a diploma, but no mandatory semester-long course teaching students how to sell themselves in a highly competitive global job market.

Colleges and universities don’t teach their graduates how to get the most out of their academic efforts; they leave it to chance.


The result is that colleges and universities unleash their new products (college graduates), totally unprepared, into the most competitive job market since the Great Depression, and the fallout is devastating for students and for society at large.  Fear, doubt, and insecurity take center stage in graduates’ lives the moment they leave campus. Ultimately, most settle for less than they expected or deserve, simply because they were never taught how to market their talents, qualifications, and potential effectively and confidently.

Traditionally, colleges and universities have left the “marketing” component to the Department of Labor and the Workforce System ex post facto, a system that is understaffed, overwhelmed, underfunded, and nowhere near as effective as a college or university could be. They simply hand the job off to another source well after the damage has been done. For institutions of higher learning, that doesn’t seem very intelligent!  When will colleges and universities get it?  When will they accept responsibility?  When will they acknowledge that today’s new economic order requires new thinking and new action?

So my question would appear to be a common-sense one (though I learned in college that Voltaire said “common sense is not so common”).  When will ALL colleges and universities step up to the plate and teach students how to navigate today’s challenging job market so they get the most out of their education?  Why is it so difficult to incorporate a mandatory semester course on how to land a good-paying job? I took the SATs and proved that I knew English. Why, then, did I have to take Freshman English? I didn’t major in English. I would have been better served by a course entitled: How to Land the Job You Want at the Pay You Deserve Once You Leave College!

Teaching students how to conduct a positive, empowering, aggressive, and confident job campaign, something they’ll have to do 15 to 20 times over their professional lives, should be MANDATORY!

Beyond the moral obligation, institutions of higher education would benefit financially.  The obvious financial payoff for colleges and universities (their return on investment) would be more consistent and more generous alumni contributions.

Comments and feedback are invited.