I’m sure you’ve asked yourself this question at least once in your life. My number is a little higher than that. I’m nearing my last year in grad school and am starting to figure out what you actually learn in college. Really, nothing that has to do with your job. I guess orientation and training videos wouldn’t exist if you learned about your job in college.
What I feel you learn in school is how to be a professional. You learn time management, social skills, how to be a leader, how to take initiative in your role, and how to interact with others through doing a lot of group work. I asked my dad this question, and his answer rings true: “you learn how to learn your new job.” Makes sense.
You can also look at it as paying your dues. I’ve found that some jobs care about your grades, while others just care about the degree. My advice would be to excel in both, just in case. You go to college for four-ish years to show commitment and determination to better yourself. You go to grad school because you’re committed to going that extra step, and determined to get as high in your career as possible.
You will pick up a few tricks and read a few books that are relevant to your career, but you learn your job starting on day one. College is a necessity because it makes learning your job take less time, so you can excel faster than your peers.