University Education, the Workplace, and the learning gray areas in-between


Many years ago, maybe around 16 years ago, I was sitting in the office of my computer science major advisor, getting my academic plan for the next semester signed off on.  My computer science program was actually an offshoot of the mathematics department, and until recent years (2003?) they were one and the same.  My advisor, while looking at my transcript, noticed that (on average) I was doing better in language courses than in my computer science courses; which was technically true, but many courses designated as CS courses (and ones that were required for my degree) were really math courses, so you needed to do a deeper dive to see what I was actually doing better in.

I never really forgot what he said next.  He said I should switch majors; and it was odd that he didn't offer any suggestions as to how to improve†...  Being a bit stubborn (and relatively close to graduation) I doubled down and completed my major requirements (ha!).  During this chat I told him that I really wished there were more coursework, required in my degree, in additional programming languages, because that is what I would be expected to know for work when I graduated. His response was that I could learn that on the job... needless to say, my 20-year-old self was thinking "so why am I majoring in this now, anyway?"

Fast forward to the recent(ish) past, flashback brought to you courtesy of this post on LinkedIn. I had recently completed my last degree (this time in Instructional Design) and I was having coffee with some good friends (and former classmates). We were a year or so out of school. Two of us already had jobs (at the same institutions as when we were in school) and one was on the hunt. His complaint was that school didn't prepare him for the work environment because he didn't know the software du jour (which at the time were Captivate and Articulate). I did my best not to roll my eyes, because software comes and goes, but theory (for the most part) really underlies what we do as professionals. In class there wasn't a dearth of learning about software, but there were limitations: namely the 30-day trial period of these two eLearning titles.  So we did as much as we could with them in the time we had with them, and we applied what we learned from the theoretical perspective.  No, we didn't spend a ton of time (relatively speaking) on the software, because that sort of practice in a graduate program should really be up to the learner, and it would cost them.  Captivate cost $1100 for a full license, while Articulate costs $999/year to license. That cost is actually more than double what the course cost! Furthermore, it privileges one modality (self-paced eLearning) and two specific eLearning titles. The fact of the matter is that not all instructional designers do self-paced eLearning enabled by these titles. Not all instructional designers are content developers‡. I find the author's following suggestion a bit ludicrous:

To replace the non-value add courses, decision makers can study current open job descriptions, and ignore academic researchers' further suggestions. Programs can then be revolutionized with relevant course topics. These new courses can include relevant production tools (e.g. Storyline, Captivate, Camptasia, GoAnimate, Premier, etc.) and numerous cycles of deliberate practice, where students develop a course on their own, and receive the feedback they need. This will make hiring managers very happy.
While I do see value in learning specific technologies, that's not the point of a graduate degree, and graduate courses should prepare you to be a self-supporting, internally motivated learner.  Courses should give you the staples that you need to further make sense of your world on your own, and to pick up the tools and know-how that you need for specific situations♠.  Focusing a graduate degree on production tools is a sure way to ignore the vast majority of what makes instructional design what it is. Practice is important (i.e., building your learning solutions), but it's not the only thing that's important. I also think that employers need to do a better job when posting instructional designer job descriptions, but that's a whole other blog post.

I do think that if you are new to any field you (as a learner) should be taking advantage of any sort of internship, where the rubber (theory) meets the road.  In some programs internships are required, and in others they are optional, but I do think that internships are an important component for newbies in the field.  When I was pursuing my MA in applied linguistics, in a program that focused on language acquisition and language teaching, the field experience (aka internship) was a requirement.  People with classroom teaching experience could waive the requirement and take another course instead, but for me it was valuable (as much as I had to be dragged to it kicking and screaming).  In hindsight, it gave me an opportunity to see what happens in different language classrooms, something I wouldn't have experienced otherwise.

So, what are your thoughts? What do you think of the LinkedIn article?


Notes:
† I guess this must have been a problem with advising in the college in general, because years later the college of science and mathematics put together a student success center.  They were probably hemorrhaging students.

‡ I suspect this is another, brewing, blog post.

♠ So, yeah... Years later I see some of the wisdom of my advisor.  I think he was partly right, in that I should be able to pick up what I need once I have the basic building blocks, but I think he was wrong to suggest that I change majors, and I do think that less math and more computer science with applied cases would have been better as a curricular package.
