Thursday, February 28, 2013

Evaluation, some parting thoughts (#oldsmooc)

Today is a new day, and a new topic in OLDSMOOC.  Well, not so much a topic as a winding down of the learning process that has been occurring in this MOOC. On the topic of evaluation, there was an interesting discussion on the Google Group: is it the lifeblood of learning design or the bane of our existence?

My short response was as follows:
I think that there is a happy medium between the two. I do believe that evaluation and iteration (based on evaluation findings) are at the core of good learning design, but, by the same token, I do believe that there are people who can take it to the extreme. This, then, can become that tax (or smelly cod oil, as someone else put it) in the learning design.
I think that if there isn't adequate evaluation (formative and summative), then a lot of learning design effort can go to waste. We are not perfect beings (no matter how much we may think we are ;-) ), so we will not get everything right on the first try. Evaluation is just as important an aspect of ID and LD as the other phases :)

What surprised me in this discussion was the debate (or recommendations, really) over the terms to be used. Evaluation (and assessment) seem to have negative connotations for many people. I guess many people have seen the "stick" end too many times, and not the "carrot" end of things. Whether you call evaluation something like "periodic review" or "formative excellence" or any other phrase or euphemism you want to use, the underlying feeling people have toward evaluation will not change, and they might not feel that great about being patronized with new words for something they view as inherently bad.

I think it's important to retain the current nomenclature (evaluation and assessment) and look underneath all of the fear and uncertainty that is inherent in them, for some people anyway, and try to find ways to address those issues.

Another interesting thing that came up is the dichotomy between instructional designer and instructor. An instructional designer (or learning designer) may view evaluation as something that is at the core of what they do, but instructors may not. Instructors may view it as superfluous frou-frou that administrators impose on them because they don't trust their intuition. This may be true in some cases (and possibly where evaluation gets a bad name), but every professional, in my opinion, needs a periodic evaluation of their work: both an internal evaluation (I evaluate my own work, and question my own assumptions, in a methodical way) and an external evaluation (my peers and fellow SMEs peer review me). I see the internal evaluation happening more frequently than the external evaluation (you want to be mindful and respectful of other people's time), but the evaluation should happen nevertheless.


Saturday, February 23, 2013

Week 7 - Evaluation (OLDSMOOC)

It's week 7 in OLDSMOOC, and as we are winding down we are tackling the topic of Evaluation. I will be switching tracks again, from the Blended Mobile Learning course (which I've been working on for a while) back to the idea of offering the course as a cMOOC. Going through OLDSMOOC I've gotten some good ideas about how to implement my own cMOOC. I've been thinking a lot about the recommended paths that are available in some weeks (the short and the long path). This, in combination with badging and deliverables, is making me think about the assessment aspect of the MOOC; but let's not get sidetracked, let's talk about evaluation.

In terms of evaluation decisions, what immediately comes to mind are these:
  • Should the content for this learning design be expanded, reduced, or remain the same? One of the tricky things about MOOCs is that you will always have critics, since there will be instances (many of them) where the MOOC does not hit the sweet spot for quite a few people. How does one address these issues, and how does one explain the scope of the MOOC to the stakeholders?
  • What is the best way to make the materials open to anyone who may want to facilitate this course? I am thinking about this course as a way to also contribute back to the community, thus I want to make the OER as accessible as possible.
  • What strategies can we identify to maintain our learning design in the future? If the MOOC is going to run more than once, then, like open source software, there needs to be some way of maintaining the learning design and updating it.
  • Should the MOOC offer assessment instruments? If so, what are meaningful assessment instruments and, when implemented, will they have met their goal?
  • What is our engagement goal? Learning does not happen in a vacuum, so how do we encourage people to engage with one another without being overbearing? You can't force engagement and cooperation onto others.
  • How did facilitators work out? How was communication between facilitators and participants?  What worked and what did not?
  • Does one facilitator work? Or should we employ many more facilitators?
  • What is the "sweet spot" for MOOC length? 4 weeks? 6 weeks? 8 weeks? 12 weeks? If students are taking this for credit (like they did with CCK) then their academic semester is defined, and they need to "slog through" it, but what about the open learners?
  • What role should learner supplied resources take in the course if the course is offered for credit? In MobiMOOC 2011 there were participants that posted resources like there was no tomorrow (this was a good thing in my opinion), but what do other participants feel about this?
As I was writing this, I decided to mingle the specific questions in with the decisions to be made.

Your thoughts?

Assessment & Evaluation - a review of terms

This was shared as part of OLDSMOOC, but I thought it would be a good resource for any beginning instructional designer :-)

Friday, February 22, 2013

OLDSMOOC: Peer Review #1

OK, so I (re)discovered that there are some crowd-sourced badges. I knew this in the first week of the course, but somewhere along the line I forgot about them (probably because I wasn't planning on applying for those badges anyway ;-)  )

In any case, if you are participating in OLDSMOOC, do have a look at the badge application page, and help out a fellow oldsmoocer!  I'm doing my part right now!

The first person's materials I examined are Tiffany Crosby's. Tiffany has a project in mind to create a "business psychology course that combines facets of psychology, sociology, business ethics, fraud and forensics, and decision-making." I think this sounds interesting, and it reminds me a bit of the Business Ethics course that I took while I was an MBA student (that was a really nice course!). As part of the deliverables for the badge she is seeking, she submitted the results of her card sort activity and the course map for her course.

From what I saw in the card-sort activity results, I am wondering what format the course will take; will it be online, on-campus, or blended? I see in her course map that there is a Site and Course Orientation in the guidance area, but "site" can refer to an online site or to a physical site where learners are embedded in the learning. One thing I would recommend is disambiguation in this area :) I also see that key areas in this course are authentic learning, research- and practice-based learning, and serious engagement and interaction. I think this is all great (and I really like the element of gamification that she plans to introduce!). The area that I'd like to see a bit more discussion or expansion on is student assessment. I see (in both activities) that self-assessment is a big thing, but I would also like to know where assessment by a subject expert comes in (if at all), why, and why not :)

Still looking for a second person to review in order to earn the Reviewer Badge ;-)

Friday, February 15, 2013


It's week 6 of 9 in OLDSMOOC† and the topic is Curation of content, and one of the major areas is Open Educational Resources (or OER).  I had to go back to last summer, when I was working on the #ioe12 MOOC, and the week that was specifically tackling the topic of OER to see what I wrote then on the topic.

I think my main "complaint" about OER is that, in addition to spending time finding OER resources, you usually also have to spend a lot of time editing and adapting the OER for use in your own classroom. After all, materials are designed with certain uses, users, and criteria in mind, and those uses, users, and criteria may not match your own LD/ID analysis. Thus, the process isn't one of Seeking & Deploying, but rather Seeking, Evaluating, Modifying, Testing, Deploying. The process then can become a rabbit hole: if you jump in too deep, you feel invested in the outcome, and you probably don't want to scrap your current work on an OER, once you've started working on it, just to create your own materials (or use non-open materials such as publisher resources).

I think for my use case (a blended course or MOOC on mLearning) I may just poke around to see what's available; since I don't have a ton of time on my hands, I probably won't spend too long looking around :) What I am mostly hoping to find is OER activities and simulations that may be useful. These are also things that can take too long to develop on one's own, so finding ready-made material would be helpful.

† oddly enough you can get a badge for participating for 6 weeks, but not 9. I wonder why...

Thursday, February 14, 2013

Our loss of wisdom

I came across this pretty interesting TED talk on the loss of wisdom, hidden knowledge and skills, and the reductionism in what we do. It's a must-see :)

Tuesday, February 12, 2013

OLDSMOOC Week 5: all quiet ;-)

This week on OLDSMOOC the subject was prototyping, and all has been very quiet in the MOOC. I am not sure if people are busy prototyping (and thus not talking a lot), or if people have taken a small break from the MOOC :-) Since I decided to focus on the blended version of my mLearning course (and not try to make a MOOC out of it just yet, mostly due to time issues), I took the stand-back-and-see approach this week. I did read the materials (which reminded me of my User Interface Design course - that was a lot of fun!), but I didn't have a ton of time to prototype anything. Instead, I decided to help out and test other people's prototypes. If you need a guinea pig, let me know ;-)

Here is one of the presentations for this week:

Friday, February 8, 2013

MOOC Fail: Tempest in a teapot edition

Last fall, when I was on an xMOOC binge, I decided to sign up for a MOOC called Fundamentals of Online Education: Planning and Application (#foemooc). I knew the subject matter, but I decided to participate so I could compare notes. After all, I am teaching essentially the same course online this semester in a non-MOOC format. I was also curious how it would be done in a MOOC format, because I've been thinking about designing some courses that could work "natively" in the MOOC format, like Connectivism and Connective Knowledge, where some students take it for credit, while others take it just because they are interested in the course.

In any case, work got the better of me, and instead of focusing on xMOOCs, I decided to focus on good ol' cMOOCs, since those are the MOOCs that are pushing the envelope on pedagogy; so I dropped #foemooc in January. It also didn't help that #foemooc was misunderstood by some (and therefore advertised among the blogosphere and twittersphere) as the MOOC to learn how to design MOOCs (my reaction to that).

In any case, even though I wasn't participating in this MOOC, I decided to keep an eye on it. Wow, it did not fail to disappoint, and not in a good way either! It was such an epic fail that someone decided to pull the plug on it! Now, I don't really wish to reiterate in great detail what others have written, but it's worthwhile to note that a week after the epic fail, this is no longer news. It had its moment in the limelight, and now it's gone. What's worth noting are these few points:

1. MOOCs, for better or worse, are now on the radar of mainstream sites, sites like Mashable and the Huffington Post. A slow, progressive understanding of what MOOCs are, the pedagogy behind them, and the continuous refinement of their design and implementation is lost. What sells in these places is sensationalism: big numbers and epic fails. The in-between, what really pushes us forward, does not count. If you really want to know more about MOOCs, pick up an academic journal (the upcoming JOLT issue has a special focus on MOOCs, for example). Don't just listen to the pundits, on academic sites or on popular sites.

2. What was really amiss in this #foemooc fail was the presence of the learner. One of the first things you do in Instructional Design is a learner analysis: who is in your classroom, why are they there, what do they bring to the table, and so on. #foemooc, like many Coursera MOOCs, was not really designed (in my humble opinion) but ported, like software, from one modality to another. Software porting isn't always bad; you can have a great port, and you can have a really bad one that doesn't fit the paradigm of the platform it's on. In Coursera MOOCs, and #foemooc in particular, I don't think a great deal of thought was given to the effect that "massiveness" has on the learning design. A course designed for 20 students is not going to work for more than 20.

Furthermore, the learner was disintermediated in this MOOC. The decision was made to kill the MOOC for improvements because it didn't work well. Well, you can't just do that to learners! You just don't have the right to pull the plug when the learners' content is in your system! In traditional classes, even if the class stinks, no one pulls the plug! Great efforts are made by instructors and instructional designers to salvage things when they go bad, and to move on. I believe that this is what should have been done in #foemooc: fix things while the course is running. Adapt! Shutting down is not the answer.

3. Some struggle is fine! People seem not to be OK with struggle in learning, especially in MOOCs. If you don't struggle a bit, you don't learn. If you struggle a lot, then it's important to reach out to the organizers (something you can't do in xMOOCs) and to more knowledgeable peers (MKPs) for assistance. If the struggle is content related, you might be able to succeed by working through it; if not, some pre-requisite knowledge may be needed. If it's technology related, perhaps a support group can help. The point is that working through the issues is one of the elements that leads to acquisition.

4. Finally, institute some pre-requisites, please! In cMOOCs you generally don't see pre-requisites, and that's fine. Most cMOOCs that I've seen are designed not to be part of an overall curriculum, so maybe they don't have to think about all the pre-requisite knowledge. Then again, you see MOOCs like OLDSMOOC that specify who the target audience for the MOOC is, which at least gives you some idea of the pre-requisite knowledge.

xMOOCs also never seem to have pre-requisites, and this is very odd. Most of these courses come from curricula that have pre-requisites, so why don't the MOOCs have them? There should be an indication to learners of what sort of background knowledge and skills they need in order to be successful in the xMOOC. Without this, you aren't adequately preparing learners to succeed in your xMOOC. Some indication of pre-requisites (and perhaps a full syllabus) would go a long way toward preparing learners before the MOOC begins.

Just some (snowy) Friday thoughts on MOOCs.

Wednesday, February 6, 2013

Some sample TLAs

This week I was messing around with the pedagogical patterns collector (see here) to see what the predefined patterns in the system were (and applied some to my Blended Intro to mLearning course). I didn't think that the patterns would be good for a MOOC environment, but I also didn't have much time to mess around with creating MOOC-appropriate patterns and associated TLAs (teaching and learning activities). So here are two samples of what the machine gave me back based on my inputs (see end of blog post).

All things considered, the PPC was an easy and interesting tool to use. It reminded me a lot of the Absorb, Do, Connect sequence of activities that Horton talks about, something I picked up when I was an Instructional Design student.

SAMPLE 1: Apply a research-based approach in the practice under study

TLA 1 - Introduction
  • Students read through the introductory material explaining the role of research based approaches to Mobile Learning Project Design (independent , group size: 1, Read, Watch etc. -  10 minutes)

TLA 2 - Eliciting preliminary conceptions about research based approach
  • Students answer a series of questions about the value and the role of research based approaches to Mobile Learning Project Design (independent , group size: 1, Practice -  10 minutes)
  • Students produce a short written response to questions about the advantages and issues when adopting research based approaches to the Mobile Learning Project Design (independent , group size: 1, Produce -  15 minutes)

TLA 3 - Drafting a Research Study
  • Students write the design of a research study relating to Mobile Learning Project Design, following a written outline that includes: Question to be investigated; Approach to investigation (method); Issues to consider (independent, group size: 1, Produce - 10 minutes)

TLA 4 - Produce a plan for conducting your study
  • Using the Academic Journals, Books on mLearning, Exploration of Mobile Apps, and Technology Whitepapers produce a plan that can be used to conduct your research study. Once you have finished share it with others. (independent , group size: 1, Practice -  15 minutes)

TLA 5 - Eliciting post-activity conceptions about research based approach
  • Students answer a series of questions about the value and the role of research based approaches to Mobile Learning Project Design (independent , group size: 1, Practice -  10 minutes)
  • Students produce a short written response to questions about the advantages and issues when adopting research based approaches to Mobile Learning Project Design (independent , group size: 1, Produce -  15 minutes)

TLA 6 - Reflection through survey
  • Students spend 5 or so minutes responding to the survey questions below.
  • - What did you learn from this activity?
  • - Which phase of creating a research study did you find most difficult?
  • - Why did you find that phase harder?
  • - What sort of support could have been offered to make devising, designing and developing a research study easier in this lesson?
  • - I was able to develop a research study in this lesson.
  • - Academic Journals, Books on mLearning, Exploration of Mobile Apps, and Technology Whitepapers enabled me to more quickly and easily develop a research study than if I did not have this system.
  • - As a result of this lesson I am more likely to adopt research based approaches to Mobile Learning Project Design
  • - Any other comments: (independent , group size: 1, Produce -  10 minutes)

SAMPLE 2: Relate Theoretical Knowledge to Practice

TLA 1 - Briefing
  • Teacher introduces the importance of focusing on usage patterns and motivation for mLearning, as one of the general principles of Mobile Instructional Design for students to focus on in their data collection task (Read/Watch/Listen - 10 minutes)

TLA 2 - Planning Data Collection
  • Students are grouped into small teams and plan data collection on Mobile Instructional Design (Collaborate - 20 minutes)

TLA 3 - Collecting data
  • The data collection is conducted by one or more members of the team using live user observation, video recordings of mLearning, analysis of app usage patterns, and so on, to collect data about usage patterns and motivation for mLearning in Mobile Instructional Design (Practice - 45 minutes)

TLA 4 - Analysing data and presenting data as evidence
  • The teams select the best examples of evidence in the recorded data and share them with the rest of the group, providing an explanatory summary for each piece of evidence (Collaborate - 15 minutes)
  • All class members explore the collected materials (30 minutes)

TLA 5 - Reflecting on practice using evidence
  • The teacher uses the collected materials as stimuli to facilitate a discussion amongst the whole group about the links between students' own Mobile Instructional Design and the general principles of Mobile Instructional Design (Discuss - 45 minutes)

Monday, February 4, 2013

Photos of my card sorting activity

Even though this was a Week 3 activity, I took the opportunity to mess around a bit more with the Course Features Cards and actually snap photos of the outcomes. For this, I decided to focus more on the blended version of the Introduction to mLearning, and not the MOOC version of the course, since I wanted to focus on something I had a little more hashed out (the MOOC version will require a lot more thinking than I have time for right now :-) )

Activity 1
Directions: Choose a maximum of 12 cards from the pack which define key features of your course or module. The cards are grouped into four categories:
  • Guidance and Support (marked GS / coloured Orange)
  • Content and Experience (marked CE / coloured Green)
  • Communication and Collaboration (marked CC / coloured Blue)
  • Reflection and Demonstration (marked RD / coloured Purple)

Now look at the cards you have selected:
  • Are any colours of cards missing or less well represented? 
  • If so, what might be the impact on the student experience? 
  • Are there any other cards you would like to swap in?
  • What new cards could you add?

It's interesting that in this sort the green colour (Content and Experience) is less well represented (sorry, my Nexus is new and I haven't gotten the hang of the camera yet :-) ), whereas Communication and Collaboration is more represented. Guidance and Support, and Reflection and Demonstration, are equally represented. I was a bit torn by the need to have equal amounts of every category, so I thought about adding a couple more green cards and substituting some of the blue, but I decided to let them stand in the end; I couldn't make more green cards fit in meaningfully.

Activity 2
Directions: Choose 16 cards from the pack which define key features of your course or module. Work individually or in a team to assemble the 16 cards into a diamond shape with the most important feature at the top and the least important at the bottom. This will help you determine the relative importance of each aspect and where your priorities lie. As you build the diamond, note down any design decisions you make.
  • How can the available resources and time be best used?
I'll have to think about the best use of time and resources :) I'd like to use the in-class time for collaborative activities, peer review, and feedback; but since I have not run the course yet, I don't know how long such tasks will take. I am wondering if some of the prep work should be done outside of the synchronous, in-person class.