Monday, July 17, 2017

University Education, the Workplace, and the learning gray areas in-between


Many years ago, maybe around 16 years ago, I was sitting in the office of my computer science major advisor, getting my academic plan for the next semester signed off.  My computer science program was actually an offshoot of the mathematics department, and until recent years (2003?) they were one and the same.  My advisor, while looking at my transcript, noticed that (on average) I was doing better in language courses than in my computer science courses; this was technically true, but many courses designated as CS courses (ones that were required for my degree) were really math courses, so you needed to do a deeper dive to see what I was actually doing better in.

I never really forgot what he said next.  He said I should switch majors; and it was odd that he didn't offer any suggestions as to how to improve†...  Being a bit stubborn (and relatively close to graduation) I doubled down and completed my major requirements (ha!).  During this chat I told him that I really wished my degree required more coursework in additional programming languages, because that was what I'd be expected to know for work when I graduated. His response was that I could learn that on the job... needless to say, my 20-year-old self was thinking "so why am I majoring in this now, anyway?"

Fast forward to the recent(ish) past, flashback brought to you courtesy of this post on LinkedIn. I had recently completed my last degree (this time in Instructional Design) and I was having coffee with some good friends (and former classmates). We were a year or so out of school. Two of us already had jobs (at the same institutions as when we were in school) and one was on the hunt. His complaint was that school hadn't prepared him for the work environment because he didn't know the software du jour (which at the time was Captivate and Articulate). I did my best not to roll my eyes, because software comes and goes, but theory (for the most part) really underlies what we do as professionals. In class there was no dearth of learning about software, but there were limitations: namely the 30-day trial period of these two eLearning titles.  So we did as much as we could with them in the time we had, and we applied what we learned from the theoretical perspective.  No, we didn't spend a ton of time (relatively speaking) on the software, because that sort of practice in a graduate program should really be up to the learner, and it would cost them.  Captivate cost $1100 for a full license, while Articulate costs $999/year to license. That cost is actually more than double what the course cost! Furthermore, it privileges one modality (self-paced eLearning) and two specific eLearning titles. The fact of the matter is that not all instructional designers do self-paced eLearning enabled by these titles. Not all instructional designers are content developers‡. I find the author's following suggestion a bit ludicrous:

To replace the non-value add courses, decision makers can study current open job descriptions, and ignore academic researchers' further suggestions. Programs can then be revolutionized with relevant course topics. These new courses can include relevant production tools (e.g. Storyline, Captivate, Camptasia, GoAnimate, Premier, etc.) and numerous cycles of deliberate practice, where students develop a course on their own, and receive the feedback they need. This will make hiring managers very happy.
While I do see value in learning specific technologies, that's not the point of a graduate degree; graduate courses should prepare you to be a self-supporting, internally motivated learner.  Courses should give you the staples you need to further make sense of your world on your own, and to pick up the tools and know-how you need for specific situations♠.  Focusing a graduate degree on production tools is a sure way to ignore the vast majority of what makes instructional design what it is. Practice is important (i.e., building your learning solutions) but it's not the only thing that's important. I also think that employers need to do a better job when posting instructional designer job descriptions, but that's a whole other blog post.

I do think that if you are new to any field, you (as a learner) should be taking advantage of any sort of internship, where the rubber (theory) meets the road.  In some programs internships are required, and in others they are optional.  I do think that internships are an important component for the newbies in the field.  When I was pursuing my MA in Applied Linguistics, in a program that focused on language acquisition and language teaching, the field experience (aka internship) was a requirement.  People with classroom teaching experience could waive the requirement and take another course instead, but for me it was valuable (as much as I had to be dragged to it kicking and screaming).  In hindsight, it gave me an opportunity to see what happens in different language classrooms, something I wouldn't have experienced otherwise.

So, what are your thoughts? What do you think of the LinkedIn article?


Notes:
† I guess this must have been a problem with advising in the college in general, because years later the college of science and mathematics put together a student success center.  They were probably hemorrhaging students.

‡ I suspect this is another, brewing, blog post.

♠ So, yeah... Years later I see some of the wisdom of my advisor.  I think he was partly right, in that I should be able to pick up what I need once I have the basic building blocks, but I think he was wrong to suggest that I change majors, and I do think that less math and more computer science with applied cases would have been a better curricular package.

Monday, July 10, 2017

MOOC CPD & SpotiMOOCdora

Last week (or was it two weeks ago?) I did my rounds on Coursera, edX, MiríadaX, and FutureLearn, and I signed up for a few new MOOCs.  I had also signed up for a course that a colleague was promoting on Canvas (innovative collaborative learning with ICT), but I've fallen behind on that one, not making the time commitment to participate.  The list of missed assignments (ones that I can no longer contribute to) is actually demotivating, even if my initial approach was not to do many assignments (or rather, to play it by ear and decide during the MOOC whether I'd like to do some assignments). Maybe this coming week I'll 'catch up' in some fashion ;-).  The interesting thing is that there is a forum in Greek in that MOOC, and it's motivating to see what my fellow Greeks are doing in the arena of ICT and collaboration. I guess I still have a few more weeks before the MOOC ends...

Anyway, I digress (probably not good practice for the dissertation).  Today's post was spurred by a recent essay on MOOCs on Inside Higher Ed, where the author looked at her prognostications and examined them in light of the information we currently have about MOOCs. It is a little disheartening that the original MOOCs (connectivist MOOCs) are sort of gone (at least I don't really see a ton of connectivist stuff happening these days), and the xMOOC variety seems to be going more and more toward money making.  Even with the MOOCs I've just signed up for, there really isn't an option for a free certificate anymore.  You can still go through the course, which I aim to do in my own sweet time (an opportunity to explore the classics), but even a basic certificate is not free any longer. Another thing going into this mix is thinking about continuing professional development. In the two departments I am mostly connected with (applied linguistics and instructional design), graduates often need PD credits in order to maintain a teaching license, or to continue to hone their skills. Usually this is done through free webinars, in-service training, or taking additional graduate courses (depending on your field, of course). This got me thinking about two things: MOOCs as CPD (which isn't really a new idea), and the all-you-can-eat MOOC (or SpotiMOOCdora, after services like Spotify and Pandora).

My first pondering is this: given that institutions such as Georgia Tech are offering a $10k MA in the MOOC format, why not consider a smaller leap into CPD (continuing professional development courses)?  I know that doing an entire MA might be a bit of a leap for most institutions; heck, even a certificate might be a bit of a leap (aka 'micro-masters' in the MOOC world), but CPD courses have a different set of expectations and requirements, and they are often not available for graduate credit (some are, but most in my experience are not). I think it would make a ton of sense to develop professional development courses in a MOOC format that are available for free to a target audience (let's say teachers of high school biology).  The payment can come in the form of assessment, or an in-person fee for a facilitator who brings together the course content of the MOOC (which people have done previously) in an active learning paradigm.

The second pondering is this: Is there a market for an all-you-can-eat, month-to-month subscription to MOOCs? Examples of this model would be Amazon Prime Video, Netflix, Hulu, Pandora, and so on.  If not all-you-can-eat, how about a model that's more like Audible, where you get a book per month and you can spend your unused tokens any way you want (if you are still working on a book, you can bank the token for another month, for example)?  If either of these models works, then what would be an appropriate price?  Netflix and Spotify are at $10/month; Audible is $15/month, for example.  The reason I am pondering this has to do with the costs of certification.  I don't know what the secret sauce in certification is, but edX is asking me for $200 to get a certified certificate of completion (this sounds redundant).  What does $200 get me?  I don't get college credit for it, and (for me) the joy of learning is internal, so $200 is better spent elsewhere. For instance, $200 gets me a lifetime subscription to my favorite MMORPG... when said subscription is on sale (lots of hours of fun and additional content). Comparatively, the edX certificate seems like a poor value proposition.

What do you think about these ideas?  Does a monthly subscription MOOC make sense?  What is the value proposition?  And, can we resuscitate the cMOOC?   Thoughts?

Monday, July 3, 2017

Academic Identities, Terminal Degrees, power of the network...

It's been a while since I last just sat down to think and write about something (like the good old days when I was cMOOCing...).  These past few weeks have been about conferences, and getting back on track with my dissertation proposal (although I think I am the only one who is keeping score on that at this point).

In my attempt to get back to writing, and engaging with friends and colleagues out there in the wild blue yonder that is the internet, I thought I would pick through my accumulated Pocket list until it's almost empty.  One of the ponderings of interest came by means of an article on Inside Higher Ed titled Academic Identities and Terminal Degrees, where the overall question was: Does one need an academic terminal degree to identify professionally with that discipline? And, as Josh goes on to explicate:

Can only someone with a Ph.D. in economics call herself an economist? Do you need a Ph.D. in history to be a historian? How about sociology and sociologist? Biology and biologist? Anthropology and anthropologist?

My views on the topic have changed in the past fifteen years; I basically compare my views as someone who had just finished a BA to my current views... on the road to earning a doctorate (are we there yet? 😂).  Originally I would have said that someone could call themselves something only if they've earned a degree in that field. I think today I would call that a protected professional title, and a degree or some sort of certification would be a way to demonstrate that you've been vetted into that profession somehow by somebody. Now, which titles (economist, linguist, archaeologist, biologist, etc.) are protected, and which are up for grabs... well... that's a subject for debate! At the time, the only means of obtaining that expertise (in my mind) was through formal degree programs.

Since that time, in addition to completing a few master's programs and discovering new fields and new knowledge, I've also discovered the power of the network, the potency of communities of practice, groups such as Virtually Connecting, and expanding my own learning and practice outside of the classroom.  My current feeling is that it's not really as black and white as my younger self thought.  I do think that obtaining a doctorate in the field is one path to getting there, but it's not the main criterion for developing your identity in that field.  The main criterion that I have (at this point in time, anyway) is practice and expansion of your own skill set in that field. I guess a good way to describe this is through some examples that came to mind while I was trying to tease it out for myself:

Example 1: The non-practicing PhD
A few years ago I was a member of a search committee looking to fill the position of program director for an academic program at my university. Among the requirements for this position was a terminal degree (PhD or EdD, as defined in the job posting).  We got a variety of CVs from interested applicants.  In reviewing them, I noticed an interesting cluster of applicants: those who had earned a terminal degree (four, five, six, even ten years ago) but had no publications (or other academic work) under their name other than their dissertation.  Their dissertation was listed on their CV, but nothing else. I am not saying that publishing in academic journals is the only way to demonstrate academic work. You could, for example, be presenting at conferences, presenting at professional association workshops, or writing for a blog or professional publication (basically translating academese for professionals). These job applicants had none of that, so they were demonstrating a lack of practice and continuous improvement in their field.  They had earned their badge of honor by completing a doctoral program, but there was no follow-through.   For individuals like that, I'd have a hard time calling them an economist, a biologist, a demographer, or whatever.  I'd call them Doctor so-and-so, but they are not, in my mind, an embodiment of what it means to be a ___________ (fill in the blank).


Example 2: Word ambiguity
When I was close to finishing my degree in Applied Linguistics, I came across the podcast and blog of someone who called himself a linguist. I was really happy to find them because I could continue to learn about a topic of interest once I graduated (and also while I was in school), and this was exciting because back then there weren't really that many linguistics blogs or podcasts around.   My working definition of a linguist is a person who studies linguistics (where linguistics is the scientific study of language).  This is how I've always understood linguistics.  The person on the other end of this podcast was not a linguist in that sense.  He was a linguist in the dictionary sense of a person skilled in foreign languages.  Personally, I'd call that a polyglot, not a linguist. It wouldn't have bothered me too much that this person called himself a linguist if he hadn't started to preach in his podcast about the best way to learn a language.  I feel that at that moment he crossed the line into the domain of what I consider linguists: those who are either clinical linguists (for lack of a better term), or teachers of language who take an inquisitive and critical approach to their teaching and share what they've learned through their research (published or not). This individual was neither a teacher nor a linguist (in the scientific meaning). Hence the more accurate term I would use is polyglot, not linguist.


Example 3: The practicing MA graduate
In many fields, completing an MA thesis is the only means of graduating from your master's program.  Even if you don't complete a thesis to graduate, if you've studied research methods, continue to hone your skills of inquiry, and continue to read up on advances in the field, I feel you have the right to call yourself a ________ (fill in the relevant blank), provided of course there isn't a regulatory board for your profession (nursing, medical, legal, accounting, and other professions of that type). There are many smart people out there who do a lot of work, and who diligently work on keeping their knowledge and skills updated.  Some of them even research and publish.  Through their continued efforts, I think they've demonstrated that they are serious enough about their profession to be included in the group that calls themselves a ___________ (fill in the blank).


At the end of the day, for me, an academic identity isn't necessarily tied to a degree earned.  A degree on someone's CV might give you clues as to what their academic identity is, but it's not the only consideration.  I think that practice and application are key considerations when you're deciding whether you are in the group or not.  I think that if a word has a double meaning, as with example #2, the thing to do is stick with the more accepted or widely used meaning, rather than one that isn't really in use.  I think it's the honest thing to do.


Your thoughts?