During a recent #u30pro chat, I commented that I wished college had taught us how to handle corporate politics. The idea isn’t new for me. When my alma mater ran a survey two years ago, it was one of the things I suggested they add.
While I got a few responses, the one that got me thinking the most came from a former (and favorite) professor…@rdwaters.
He said it was possible that the reason more real workplace scenarios aren’t taught is that students would whine and complain. It really got me thinking: as a junior (when I was in his class), would I have actually wanted to learn more about working in the business world?
His class wasn’t my only one. I was working part-time. I wanted to enjoy myself…I mean, it was college. As Dr. Waters pointed out to me, most students don’t snap to attention until their last year or even their last semester.
I guess there are some things you have to learn through experience, like when to speak up, how to ask for a raise, and how to deal with that one co-worker who just irks you.
Is my hindsight clouding my judgment? Or was there really something I could have learned in college?