Thursday, February 23, 2017

Are MOOCs really that useful on a resume?


I came across an article on Campus Technology last week titled 7 Tips for Listing MOOCs on Your Résumé, which cites the CEO of an employer/employee matchmaking firm.  One piece of advice is to create a separate section on your résumé and list the MOOCs you've taken there. This is not all that controversial, since I do the same.  Not on my résumé, but rather on my extended CV (which I don't share with anyone), and it serves more as a means of self-documentation than anything else.

The first part that got me thinking was the advice to "only list MOOCs that you have completed".  The rationale given is as follows:

"Listing a MOOC is only an advantage if you've actually completed the course," Mustafa noted. "Only about 10 percent of students complete MOOCs, so your completed courses show your potential employer that you follow through with your commitments. You should also be prepared to talk about what you learned from the MOOC — in an interview — and how it has helped you improve."  

This bothered me a little bit.  In my aforementioned CV I list every MOOC I signed up for(†) and "completed" in some way, shape, or form. However, I define what it means to have "completed" a MOOC.  I guess this pushback on my part stems from having started my MOOC learning with cMOOCs, where there (usually) isn't a quiz or some other deliverable that is graded by a party other than the learner. When I signed up for specific xMOOCs, I did so for a variety of reasons, including interest in the topic, the instructional approach, the design, the assessment formats, and so on. I've learned something from each MOOC, but I don't meet the criterion of "completed" if I am going by the rubrics set forth by the designers of those xMOOCs.  I actually don't care what those designers set as the completion standards for their MOOCs, because a certificate of completion carries little currency anywhere. Simple time economics dictate that my time shouldn't be spent on activities leading to a certificate that carries no value, if I don't see value in those assessments or activities either. Taking a designer's or professor's path through the course is only worthwhile when there is a valuable carrot at the end of the path. Otherwise, it's perfectly fine to be a free-range learner.

Another thing that made me ponder a bit is the advice about linking to badges and showcasing your work.  Generally speaking, in the US at least, résumés are a brief window into who you are as a potential candidate.  What you're told to include in a résumé is a snapshot of your relevant education, experience, and skills for the job you are applying for.  The general advice I hear (which I think is stupid) is to keep it to 1 page.  I ignore this and go for 1 sheet of paper (two pages if printed on both sides).  Even that is constraining if you have been in the workforce for more than 5 years. The cover letter expands on the résumé, but that too is brief (1 page, single spaced). So, a candidate doesn't really have a ton of space to showcase their work, and external links (to portfolios and badges) aren't really encouraged. At best a candidate can whet the hiring committee's appetite enough to get called in for an interview. This is why I find this advice a little odd.

Your thoughts on MOOCs on résumés?


NOTES:
† This includes cMOOC, xMOOC, pMOOC, iMOOC, uMOOC, etcMOOC...

Wednesday, February 22, 2017

Course beta testing...


This past weekend a story came across my Slashdot feed titled Software Goes Through Beta Testing. Should Online College Courses? I don't often see educational news on Slashdot, so it piqued my interest. Slashdot links to an EdSurge article in which Coursera courses are described as going through beta testing by volunteers (unpaid labor...).

The beta tests cover things such as:

... catching mistakes in quizzes and pointing out befuddling bits of video lectures, which can then be clarified before professors release the course to students.

Fair enough, these are things that we tend to catch in developing our own (traditional) online courses as well, and that we fix or update in continuous offering cycles.  The immediate comparison in this EdSurge article, quite explicitly, is between xMOOCs and traditional online courses.  The article mentions rubrics like Quality Matters and SUNY's openly accessible OSCQR ("oscar") rubric for online 'quality'. One SUNY college is reportedly paying external people $150 per course for such reviews of their online courses, and the overall question seems to be: how do we get people to beta test online courses?

This article did give me a bit of a Janeway facepalm when I read it (and when I read the associated comments). The first reason I had a negative reaction to this article is that it assumes that such checks don't happen.  At the instructional design level there are (well, there are supposed to be) checks and balances for this type of testing. If an instructional designer is helping you design your course, as a faculty member you should be getting critical feedback on that course.  In academic departments where only designers do the design and development (in consultation with the faculty member as the expert), the entire process is run by IDs who should see to this testing and quality control. Even when faculty work on their own (without instructional designers), which is often the case in face-to-face courses, there are checks and balances.  There are touch-points throughout the semester and at the end where you get feedback from your students and can update materials and the course as needed. So, I don't buy this notion that courses aren't 'tested'.†

Furthermore, a senior instructional designer at SUNY is cited as saying that one of the challenges "has been figuring out incentives for professors or instructional designers to conduct the quality checks," but at the same time is quoted as saying "on most campuses, instructional designers have their hands full and don't have time to review the courses before they go live."  You can't say (or insinuate) that you are trying to coax someone into doing a specific task, and then say that these individuals don't have enough time on their hands to do the task you are trying to coax them to do. When will they accomplish it?  Maybe the solution is to hire more instructional designers? Maybe look at the tenure and promotion processes at your institution and see what can be done there to encourage better review/testing/development cycles for faculty who teach. Maybe hire designers who are also subject matter experts to work with those departments.‡

Another problem I have with this beta testing analogy is that taught courses (not self-paced courses, which is what xMOOCs have become) have the benefit of a faculty member actually teaching the course, not just creating course packet material. Even multimodal course materials, such as videos, podcasts, and animations, are in the end a self-paced course packet if there isn't an actual person there tutoring or helping to guide you through that journey.  When you have an actual human being teaching/instructing/facilitating/mentoring the course and the students in it, there is a certain degree of flexibility.  You do want to test somewhat, but there are a lot of just-in-time fixes (or hot-fixes) as issues crop up.  In a self-paced course you do want to test the heck out of the course to make sure that self-paced learners aren't stuck (especially when there is no other help!), but in a taught course, extensive testing is almost a waste of limited resources.  The reason for this is that live courses (unlike self-paced courses and xMOOCs) are meant to be kept up to date and to evolve as new knowledge comes into the field (I deal mostly with graduate online courses).  Hence, spending a lot of time and money testing courses that will have some component change within the next 12-18 months is not a wise way to use a finite set of resources.

At the end of the day, I think it's important to critically query our underlying assumptions.  When MOOCs were the new and shiny thing they were often (and wrongly) compared with traditional courses; they are not the same, and they don't have the same functional requirements.  Now that MOOCs are 'innovating' in other areas, we want to make sure that these innovations are found elsewhere as well, but we don't stop to ask whether the functional requirements and the environment are the same.  Maybe for a 100-level (intro) course that doesn't change often, and that is taken by several hundred students per year (if not per semester), you DO spend the time to exhaustively test and redesign (and maybe those beta testers get 3 credits of their college studies for free!), but for courses that have the potential to change often and have fewer students, this is overkill.  In the end, for me, it comes down to local knowledge and prioritizing limited resources.  Instructional designers are a key element of this, and it's important that organizations utilize their skills effectively for the improvement of the organization as a whole.

Your thoughts?




NOTES:
† Yes, OK, there are faculty out there who have taught the same thing for the past 10 years without any change, even keeping the same typos in their lecture notes! I hope that these folks are the exception in academia and not the norm.

‡ The comparison here is to the library world, where you have generalist librarians and librarians who also have subject matter expertise in the discipline they serve. Why not do the same for instructional designers?

Wednesday, February 15, 2017

Institutional Memory



It's been a long time since I've blogged about something educational, other than my classes, of course.  With one thing down (and a million more to go), I decided to take a little breather and see what has accumulated on Pocket over these past few months.  I saw a post by Martin Weller on Institutional Memory, and it seemed quite pertinent to my day-to-day work existence these past six or so months.  Martin points to a BBC article indicating that the optimal time to stay in a specific job is around 3 years.

This isn't the first time I've heard this.  About 11 years ago (wow!) I was working for my university library.  I was new to the Systems Department (the IT department in a library) and my supervisor was new as well.  When we were getting to know more about each other's work histories (before you could just look at LinkedIn profiles), she told me that she aimed to stay there for a few years and then move on; in her view, people should only stay in their current job for 3 years. At the time I found this advice a little odd; after all, I had stayed with my previous department for 8 years before moving to the library, and even then I stayed within the institution.

From my own experience I can say that if institutions were perfectly running machines, with perfectly documented procedures and good version histories that we could reference to get insight into why things are done the way they are done, then "short" 3-year stays at a job (or an institution) might (in theory) make sense.  You come in, the institution benefits from your expertise, you benefit from the experience, and you (metaphorically) hug and go your separate ways at the end of your tour. However, institutions are complex organisms. The reasons why things are the way they are might not be documented. Sometimes the procedure was a backroom deal between one academic Dean and another.  Sometimes it's the duct tape and paper clips that hold everything together because at the time the organization didn't have the ability to break everything down and rework it from scratch.  Other times it's good ol' fashioned human relationships that make things work (i.e., bypassing parts of the system where things are bottlenecked but no one will change them).

Given this reality, I think 3 years is a rather short time to spend at a job or an institution.  I know that when I've changed jobs it's taken me up to a year to fully "get" all the connections, the people, and the systems in place, not only to do my job but to do it effectively and efficiently. Leaving before you can make a lasting impact at the institution is a little selfish: the employee gets good exposure to new skills and ideas, but leaves before they can really put those to use on anything more than a band-aid†.

Sure, even when you stay at an organization for more than 3 years, after a little while you will reach a plateau of efficiency in what you are doing. It may take you 3 years, it might take you 2, it might take more.  Sooner or later you will get there.  At that point, the organization has a responsibility to keep things fresh for its employees. This benefits both the organization and the employees.  Employees feel challenged in good ways (think of it as a ZPD for work), and organizations get to retain and employ the talent that they've incubated.  If people leave because they feel bored, that's a shortcoming of the organization.

I know from my own experience working at my university (19 years now) that even though my jobs have changed, and my departments have changed, institutional knowledge follows me, and I share it with other people. Just because something might not be of particular use to me right now doesn't mean that it's not useful to another colleague who is newer at the institution.  Having this oral history, and a means of passing it down to others, is valuable.  Leaving your post and contributing to a high turnover rate is detrimental to an institution‡.

Your thoughts?



NOTES:
† Don't get me wrong: private sector companies, especially ones that vehemently refuse union organization and use globalization as a way to use and abuse employees by not paying them a living wage, by not providing good benefits, and by shirking their responsibilities in their social contracts, are not worthy of employee loyalty of this nature. We just can't afford, as people, to say "I am only looking out for myself".

‡ Another thing that came to mind as I was writing this has to do with hiring. Hiring isn't as simple as posting a job at the university's "help wanted" site. Between the time a need for someone arises and the time someone is hired, it can take a very (very) long time.  Just as an example, there are two jobs that come to mind that I applied for.  For my current job, I applied in March, interviewed in December, and started in February.  For my job at library systems, I applied in February (I think), got the call for an interview in November, heard that I got the job in December, and started in January. All of this is considered "fast", so when it takes that long to get hired, I would say that 3 years somewhere is a rather short time.

Saturday, February 11, 2017

EDDE 806 - Post X - it marks the spot!

This past Thursday we had our official EDDE 806 session (on Monday, Norine did a mock proposal defense, which I wasn't able to attend, but luckily it's archived for later viewing). In any case, in this session we heard from Renate, who reported on her ideas for a dissertation topic, and a ton of interesting things about the process were shared by Susan and others.

Renate is looking to do a study to understand the lived experience of pre-licensure (nursing?) students attending their final clinical practicum after they have been exposed to an IPE (interprofessional education) didactic curriculum. To do this she will use a qualitative, phenomenological approach in her research design.  Phenomenology seems to be quite popular among the current cohorts (wonder why). She aims to recruit about 15 research participants from a variety of healthcare professions (in Canada).  I am looking forward to reading this research when it's done. It reminds me a little of other professions where there is professional education, but we haven't necessarily seen whether former students' practices connect with what they have learned, and how well they connect.

In terms of tips for the dissertation process (and the proposal process for that matter), Susan and Peggy Lynn shared the following (my comments are in italics):

  • Get yourself into a routine.  Even if you are not doing much on your proposal (or your dissertation), do spend 10-15 minutes on the document anyway.  Re-read, copy edit, make notes. Just keep the process going, even if you're not actively working on it. I have not been doing this this semester, but I think that next week I'll start.  Maybe grab a cup of coffee and spend 15 minutes editing (and look at what Debra commented on from EDDE805, lol)
  • Once the changes to your dissertation (or dissertation proposal) are made (based on the committee's feedback) and you have an oral defense scheduled, do not edit the document, not even for copy edits!  The committee will use this document as a reference when they quiz you, so it's best if you are all on the same page.
  • Once you pass your dissertation proposal, make a copy of the proposal file for archival purposes.  File it away (I would add, maybe in PDF format!). Then take another copy to build out your dissertation from.  This is good versioning practice, and it allows us to share successful proposals with other cohort members who might want to see a sample of what a good proposal looks like.
  • The runtime for a defense is about 2 hours.  There are three members on the committee, and the order of questioning is: 1) external member, 2) other member from AU, and 3) your supervisor.  Each gets about 15 minutes of Q&A.  Your presentation at the start of this is 20 minutes, so I guess it's good to practice the heck out of our presentations to make sure that we are on the mark with the points we want to make, and on time!
  • The examiners need to see your face when you start your defense, to verify visually that it is you defending.  So... make sure that you wear appropriate clothing and present a professional environment. Also make sure that if you are at home, cats, dogs, birds, and rodents are somewhere else and that they don't provide their own soundtrack to your defense.
  • Finally, a good point by Peggy Lynn: look for articles that report the opposite of what you are proposing. This stuff might come up in your defense, so you need to know how to rebut it!


By the way, if you are reading this and you are in one of the cohorts, please feel free to add to this wiki page. We are putting together a list of topics that we are all working on (or have worked on, in the case of previous cohorts) for our dissertations.  This will give others in future cohorts (as well as our own) a sense of what people have worked on in the past :-)


And, since it was a phenomenology sort of talk... for your learning pleasure, the Muppets!


Friday, January 27, 2017

EDDE 806 - Post IX - About that 'in-process' presentation...

Yesterday evening I presented where I currently am in my dissertation proposal.  I am not sure if Susan was joking or not about 2, 3, 4 years being the 'in process' time to get a dissertation done and defended, but I certainly hope that it's not that long!  I am aiming for May 2019 at the latest for mine.

That said, earlier this week I did a few dry runs for the presentation I gave last evening, and I recorded one of them.  From a timing perspective it's in the ballpark of what I was aiming for (23 minutes).  I have heard that dissertation proposal defenses and dissertation defenses (the presentation portion) run about 20-30 minutes, so I wanted to keep that in mind.  This recorded version is a little rough (it was a tryout, after all), but it gives you an idea of what my current thoughts are on the matter.

What do you think?  I know it's just a small window into this project, but any thoughts would be helpful as I am drafting this beast :)

As an aside, one lesson learned last night is this: hound your advisor, hound your committee, keep on them.  If you are waiting for feedback, go after it like there is no tomorrow ;-)