MOOC participation - open door policy and analytics
The other day I was reading ZML Didaktik on the topic of MOOC participants. One of the big questions in MOOCs is why people lurk instead of participating: if more than 500 people join a MOOC, why do only 10% contribute with any regularity? In a previous post on the same blog, I had commented (it was an open stream of thought, really) that perhaps there should be an open enrollment period, and then, if the system sees that certain participants are below a threshold of activity, it could give them the option to self-identify as lurkers or to un-register from the class.
This line of thought drew on a comparison with the traditional classroom. In a traditional classroom, students register for a course, attend classes for a week, and then decide whether or not they want to stay with it. If they do stay, they know there is a certain amount of "lurking" they can get away with, but they do have to participate a certain amount. This is what makes traditional classroom analytics easier: you know exactly how many people registered, how many dropped the course (early in the semester) or withdrew (late in the semester), and you know how often people participated and the quality of their participation. There is also some artifact involved with their participation that indicates their mastery of the topic.
In contrast, most MOOCs (that I've been a part of) have open enrollment, the dip-in, jump-out pattern seems to be a big thing, there are many lurkers and many people who've signed up but don't come back (not even as lurkers), and for most registered users there is no artifact that shows their mastery. I am not saying that I want to eject people from any MOOC I create (like jupidu, I want to give people an opportunity to participate), but the question is: how does one collect meaningful learner and learning analytics when there are so many no-shows in a MOOC? Perhaps a "snooze" button would be a good way to measure lurkers.
If people don't participate for X period of time, they get a notification by email. They can choose to "snooze" by saying that they are lurkers (and X weeks later they get notified again), or they can cancel the alarm by saying that they have decided to opt out of the course. If they opt out, we can find out why. If they decide to lurk, we can find out how often they lurk, which topics bring them out of lurking, and whether there is a lurker-to-participant or lurker-to-drop-out rate (and why).
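To make the idea concrete, here is a minimal sketch of that snooze/opt-out logic in Python. Everything here is hypothetical: the class and function names, and the choice of three weeks for both the inactivity window and the snooze length, are my own placeholder values for the "X" in the description above.

```python
from datetime import timedelta
from enum import Enum

class Status(Enum):
    ACTIVE = "active"
    LURKER = "lurker"          # self-identified lurker; snoozed
    OPTED_OUT = "opted_out"    # left the course and (ideally) told us why

class Participant:
    def __init__(self, name, last_seen, status=Status.ACTIVE, snooze_until=None):
        self.name = name
        self.last_seen = last_seen        # last recorded activity of any kind
        self.status = status
        self.snooze_until = snooze_until  # set when the participant snoozes

INACTIVITY_WINDOW = timedelta(weeks=3)    # "X period of time" (assumed value)
SNOOZE_WINDOW = timedelta(weeks=3)        # "X weeks later" (assumed value)

def needs_notification(p, now):
    """Should this participant get the 'are you still with us?' email?"""
    if p.status == Status.OPTED_OUT:
        return False
    if p.status == Status.LURKER and p.snooze_until and now < p.snooze_until:
        return False                      # snooze still in effect
    return (now - p.last_seen) > INACTIVITY_WINDOW

def snooze(p, now):
    """Participant replied 'I'm a lurker' -- re-notify them X weeks later."""
    p.status = Status.LURKER
    p.snooze_until = now + SNOOZE_WINDOW

def opt_out(p, reason, exit_log):
    """Participant cancelled the alarm -- record why they left."""
    p.status = Status.OPTED_OUT
    exit_log.append((p.name, reason))
```

The `exit_log` and the snooze timestamps are exactly the data the post is after: how many "inactive" people are lurkers, how many are gone, and why.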
Some questions I will leave you with:
- Should MOOCs continue with this come-and-go-as-you-please policy? What are the implications of either approach?
- Should MOOCs actively interrogate/poll lurkers and drop-outs to figure out why the MOOC isn't to their liking? After all, a MOOC cannot be all things to all people.
- What are ways to conduct a learner analysis in MOOCs? After all, you can't design a course if you don't know your audience.
Comments
What you suggest is that those folks might not be ideal MOOCers, which is a value you are placing on active contributions. Learning from these folks would be useful, but I strongly caution against the assumptions in the approach outlined above.
AK, just because you are a highly participating member does not mean that highly vocal participants are the only meaning-makers. Isn't your suggestion to send an opt-out email exactly the opposite of "open learning"? Wouldn't that be a way of saying to lurkers, "hey, get out if you are not willing to contribute"? I would find that hard to take as a more lurker-ish participant.
Besides, the open policy helps to sustain the number of participants (in their own ways) in a MOOC, which works better with greater numbers because not everyone is available every week.
In true autonomous-learning fashion, why not ask learners to set and declare (or not) their own definition of participation, their levels, and their specific configuration? Eschew privileging one over another. Those craving gold stars for participation can self-award them.
The term "learning analytics" suggests more than the participation data available using data mining and social network analysis tools. Wiki says learning analytics is "the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs."
I don't understand how no-shows and lurkers present a problem for this kind of analytics. #Change11 is planned for 36 weeks. The organizers have already voiced an expectation that people will come and go and that few will be along for the whole 36 weeks - and why should they be? People will pop in and out as time and interest allow.
This coming and going is part of the learning environment. It would be good to map it somehow, even if we aren't sure yet how we could use the data. Perhaps someone with good knowledge of the technical end of networking can suggest ways to do that.
Don't get me wrong, I am not anti-lurker. I just want to know who's actually here, and if you are a lurker, what works well in the MOOC format for you - and if you decided to check out, what didn't work.
What I am questioning is how many people are actually lurkers and how many people have checked out. My rudimentary "snooze" system would be a way to collect data on how many of the "inactive" participants are actually still here. This activity need not be gauged by tweets or blog posts. If each Daily MOOC email has a little piece of tracking code associated with it, then clicks on links from that Daily email can register as participation!
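One common way to implement that "little piece of tracking code" is to rewrite each newsletter link with a per-recipient identifier plus a signed token, so a click can be attributed to a reader without trusting the URL's contents. This is only a sketch of that idea, not the MOOC's actual setup; the secret key, parameter names, and functions below are all assumptions of mine.

```python
import hmac, hashlib

SECRET = b"mooc-server-secret"   # hypothetical server-side signing key

def tracked_link(url, user_id):
    """Rewrite a Daily-newsletter link so a click identifies the reader."""
    token = hmac.new(SECRET, str(user_id).encode(), hashlib.sha256).hexdigest()[:12]
    sep = "&" if "?" in url else "?"
    return f"{url}{sep}uid={user_id}&tok={token}"

def record_click(user_id, token, activity_log):
    """Server side: verify the token, then count the click as participation."""
    expected = hmac.new(SECRET, str(user_id).encode(), hashlib.sha256).hexdigest()[:12]
    if hmac.compare_digest(token, expected):
        activity_log[user_id] = activity_log.get(user_id, 0) + 1
        return True
    return False   # forged or mangled link; don't count it
```

The HMAC signature is there so that a mistyped or forged `uid` doesn't mark the wrong person as active; any click that verifies updates that reader's activity count, which is exactly the "still here" signal the snooze system needs.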
For what it's worth, I do not see a MOOC as a CoP. I see CoPs as not limited by a certain block of time. CoPs can develop out of MOOCs as people develop their PLEs, but a MOOC in and of itself, IMHO, is not a CoP to begin with.
I do agree with you that, as an active participant, I am not the only meaning-maker here - meaning is created by each and every participant who interacts with the people and artifacts of the MOOC. So the meaning created by lurkers is there, but it is not visible to me. It's still meaning, and it's as valid as the meaning I have created - we are in agreement on this.
As far as open learning goes... I think this is going to come down to the perennial Linux "free beer/free speech" distinction at the end of the day ;-) There are shades of open, at least that is MY interpretation, thus there are shades of "open learning". Again, as I said, "I am not saying that I want to eject people from any MOOC" - those people being lurkers. I am looking for more meaningful analytics. If people have checked out of the course (i.e. they are no longer interested in the MOOC, but the MOOC organizers still see them as "part" of it) and they are not removed from the "roster", then you are getting false "engagement" readings (engagement can take many forms, so don't think I am only talking about people blogging like I am or commenting back like you are). The learning is still open, even if people un-register. Change.mooc.ca is available to all, the daily newsletter is still available, and the blog posts and twitter posts are free and open - so I really don't see how the course becomes "closed" if you aren't on a course roster.
I am glad you posted because I am always interested in hearing more from lurkers :-)
As I wrote above and below, I am not anti-lurker; I've lurked a lot in the past, and I still do. I think that simply by opening your Daily MOOC you are participating. That is a good beginning, and just like "open", there are different levels of participation.
I don't expect everyone to create video blogs, podcasts, blogs, or YouTube videos (heck, I don't even do that), but one way to measure whether people are still there is to see if they are reading the Daily Newsletter and visiting any of the links.
I do agree with you when you say "I think we should probably be more concerned if the numbers participating start dropping."
Right now there isn't any sort of metric for how many people are lurkers and how many have left the building. I'd like to know what motivates lurkers (and posts like this one do bring people out - controversial as they might be), and I'd like to know why people "dropped" the course. As someone who will design a MOOC (or MOOCs), I am interested in these things because they tell me whether I am hitting (or missing!) my target audiences. I want to make sure that the number of participants does not drop, and this is one way I thought to make that happen :-)
No need to worry about it! My first MOOC was a rough one as well! Once you figure out what works and what doesn't for you, you will be a much better MOOCer.
One of my issues with "openness" as some people in MOOCs describe it is the lack of direction, or what may be interpreted as "anything goes". I think that if a MOOC wants to be a class and not a conference, there needs to be some structure to help people get started.
I am glad you've replied to this post, hang in there! Things do get better :-) They are indeed overwhelming, but I think one of the tricks is learning to ignore some things (and developing that personal filter). Also, don't be too hard on yourself!
I do agree with you that learning analytics can cover more than posting to a blog, posting a comment, or tweeting. I think people automatically jumped to the conclusion that post = participation. For me, a simple reading of the Daily Newsletter is enough to count you "in" and not send you a notification asking whether you want to continue.
You also make important points about course design. If the course is designed to be dip-in, jump-out, then that is fine, because presumably the pedagogy supports both participants who continue on a week-to-week basis and those who dip in and jump out. If coming and going is part of the environment, it still makes sense to gather learning and learner analytics to (at the very least) see "what sells" (for lack of a better term).
Some MOOCs have had "levels" of participation (mobiMOOC comes to mind), and people can decide where they want to be. As I've said, I have nothing against lurkers, and not everyone participates at the same level throughout the duration of the course. I don't want to be the arbiter drawing the line in the sand of who participates and who does not. I do, however, want to know who is in the MOOC and who is not (lurkers are defined as being "in"), because then as a potential MOOC organizer I can tailor the MOOC to the needs of the participants (and yes, lurkers are participants). If I don't know who is here and who is not, I cannot do that tailoring :-)
Congratulations on sparking such an interesting dialog. I'll admit, when I started reading your post my hackles went up. My thoughts went something like this... To whom is it a big question why lurkers aren't participating? Does AK really want to set rules for what is "good" MOOC behavior? Would he boot people from a MOOC? What is wrong with the dip-in, jump-out factor? Bear with me - this is NOT an attack. In fact, as I continued reading, your questions made increasing sense. I may not agree with the solutions, but you are raising a thought problem and throwing out ideas for feedback.
You state "one of the big questions is why are people lurking and not participating?" I agree that there are many questions that researchers are wondering about "lurkers." And yours is one of them. And there may be a distinction here worth making. MOOCers who are primarily learners may not be asking about lurkers. MOOCers who are learners and researchers may be. Here lies a rub that may account for some of the reactions you are getting.
As a learner AND researcher, I too would like to know who is lurking. I would like to learn about people who dip-in and jump-out, dip-in and float, dip-in and out and in and out, and those who stay on the shore and watch the bathers. I'd like to understand the many patterns of engagement in this experimental learning environment.
As far as analytics go, I think it would be great if we had buttons within the daily that would enable people to provide quick and easy engagement updates (read=I read this daily). In the spirit of openness, it makes sense for this to be optional. But we need to make sure that analytic efforts are not imposing a burden on participants such that participation becomes a hassle.
People are impressed by numbers. The number of enrollees in a MOOC provokes an initial "WOW" followed by a suspicious "But how many really participate?" Maybe we need to shift our attention from the fact that there are 1700 enrollees and be satisfied with the smaller WOW that 200 people have posted this week (or whatever the figure is). Then, rather than trying to weed out the 1000 that might not be doing anything, we could focus on how to see and understand the experience of the 500 that might be having a great time on the shore.
What do you think?
NOTE no humans or animals were harmed in the making of these invented numbers. Please don't quote them as they have no basis in reality.
You are right - I am not really interested in weeding out those 1000 that are not even at the beach. I am interested in "playing host" so to speak and going around to both people obviously engaged in something, and those who are observing the engagement and saying "hey! welcome to the beach party (MOOC) what's cool and exciting (according to you)?"
Those 1000 people who've left the beach are obviously not there, but in our online environments we have no way of making that visual distinction the way we can at our metaphorical beach party. Having those 1000 ghosts makes it hard to go around and say "hey" to the people observing.
I also agree with you that clicking a button that says "I read the daily" can get old quickly. This is why I am thinking of some sort of mechanism like Amazon or Apple have, where the URL you click is automatically associated with the person referring you to that product (so they can get a commission). If I read the daily today, the "read" marker from my computer registers on the MOOC server, marking me as active (so I won't get any annoying emails about renewing my commitment to the MOOC). If 5 or 6 weeks go by without me reading the daily newsletter (and the server will automatically know whether I did or not), I could be considered a ghost - someone who left the party but told no one.
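Given a server that records when each person last opened the daily, the ghost test above is simple to express. This is a sketch under my own assumptions: the six-week cutoff comes from the "5 or 6 weeks" in the comment, and the function names and roster shape (a mapping of user to last-read timestamp, `None` if they never opened one) are invented for illustration.

```python
from datetime import timedelta

GHOST_THRESHOLD = timedelta(weeks=6)   # "5 or 6 weeks" without reading the daily

def classify(last_read, now):
    """A ghost is someone who left the party but told no one."""
    if last_read is None:
        return "ghost"                 # never opened a single daily
    return "ghost" if (now - last_read) > GHOST_THRESHOLD else "present"

def roster_report(roster, now):
    """Split the roster into people still at the party and ghosts.

    roster maps user -> datetime of last daily read (None if never).
    """
    present = [u for u, t in roster.items() if classify(t, now) == "present"]
    ghosts = [u for u, t in roster.items() if classify(t, now) == "ghost"]
    return present, ghosts
```

With a report like this, the "engagement" numbers are computed over the people actually present, rather than over everyone who ever clicked "enroll".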
I have no interest in excluding anyone, after all greater diversity, greater combinations (to paraphrase Commander Spock ;-) ), but as a party organizer I am interested in knowing why people came and if they are having fun (as opposed to looking for ghosts so I can serve them some sangria)...
OK, maybe I took the metaphor a little too far :-)
Some of this is a turn of phrase. For example, "they know that there is a certain amount of "lurking" that they can get away with" suggests that lurkers are doing something they ought not do. Your question "Should MOOCs actively interrogate/poll lurkers and drop-outs to figure out why the MOOC isn't to their liking?" suggests that the reason a person lurks or stops out is that the MOOC isn't to his/her liking. There are many other reasons, and again it suggests that the behavior is a problem.
It might not be what you intended, and I can see that you could be motivated by the desire to count heads and understand how MOOCs work, but the language suggests some strong values around good participation.
At least you got us all into a discussion.
I'm actually right now participating in two MOOCs and pretty much lurking in another. I like to think that it's sort of like the Reddit take on "building karma." Those who lurk, for whatever reason, may not be sharing or contributing in any visible way to the MOOC, but because of their presence they may be building karma that they'll share someplace else.
I'd say you built karma with this post ;-)
All quite frustrating from your perspective, I agree. Volunteer-teaching ESL online, I've experienced similar frustrations when students can't or won't provide enough information. Some do - the best and most motivated learners - but most don't or can't. More can't than won't, I suspect. Many leave, but some stay and get something out of the class. I've noticed similar patterns with developmental students at US community colleges. More significantly, even the less reflective are aware that they have learned, gotten something out of the experience. One wonders how, and whether it is sufficient. When learning is autonomous and not for credit, "sufficient" is subjective and harder to quantify.
There's a closely related thread, "summative evaluations in a MOOC," at MOOC_Research in Google Groups that you might be interested in looking at.
Although he was writing about a site where flaming was common - very different from the Change MOOC - the majority of his comments about lurkers and their reasons for lurking still hold true.
Lurkers may just be reading and not otherwise engaging - but they're also likely to be carrying on the conversations in one-to-one environments or face to face.
In the Change MOOC stats I probably look like a lurker, because I'm not blogging or joining in with live chats or commenting on Facebook. On the other hand, I'm reading the newsletter and lots of the blog posts, discussing them with colleagues, making notes, building up sets of resources and tying these in with a face-to-face discussion group.
For a MOOC to function it needs active participants - so we can't all lurk. On the other hand, for a MOOC to be truly successful it needs to have an impact on how we live our lives offline as well as online - and making that change doesn't necessarily leave us time to blog about it.