Wednesday, January 28, 2015

Research: Process, Ethics, Validation, and Technicianship?

Derby Wharf, Salem, MA - Jan 2015 (Storm: Juno)
I am sure that last one is a word I just made up on the spot. It's been a slow week in 802. I was reading Lisa's reflection on Lurking in 802 (she is in last year's cohort, so she is two courses ahead of us in Cohort 7), and how she viewed 802 at the time as a make-or-break experience for the Ed.D. program. While 801 last semester was a whirlwind tour of Online and Distance Education, this semester is a whirlwind tour (or boot-camp, perhaps) on the topic of Research and how to go about doing it. The textbook, by Cohen and the gang, is something that I have read before (a few years ago), but this doesn't help a ton ;-)  There is a lot to unpack, and this book is dense. Even if one could memorize everything (not a good way to learn, by the way), it's like going to a philosophy course - you are there essentially to argue for and support your stance. You are also called upon to explain your underlying frameworks, or mental constructs and assumptions, for what you want to do. To some extent this is a bit like therapy, where you are called upon to reflect on why you hold those views and to unpack your own biases and unspoken, but held, beliefs.

While this is something that friends, colleagues, and I have been doing for a while, especially with our group in #rhizo14, vis-a-vis online research ethics (hey, remember #massiveteaching courseraGate of 2014 and the discussions on ethics there?), being in class feels different. In an online environment, while you may engage in these discussions, at some point, if you feel like you've had your fill of the discussion, you can choose to take a break and not engage any more. In a course, however, you don't necessarily have that luxury. You may take a small break from the discussion, and perhaps wait for other cohort-mates to step in, but you can't stay out of the arena for too long. Even if you could avoid thinking about such heavy subjects in the live seminars, or the asynchronous discussions, you still have homework to complete, which ensures that you will be thinking and articulating something about such weighty subjects.

Even though we are only about to complete week 4 (what? week 4? that's like 1/3 of the semester! Holy cow!), thus far this is a humbling experience on two levels. First, the readings make my brain hurt (figuratively). I haven't experienced this since Fall of 2010, when I was taking a course on psycholinguistics (which was also a primer on second language acquisition). Even though I had read all of the books and articles over the summer, and I had reviewed them just prior to each class session, I still felt a bit lost with the majority of the readings. It's not that I didn't get them; it's more that they all meshed in my head and only a small amount of distilled knowledge remained on the surface. It took discussion to really get those "a-ha" moments and make connections with readings that remained beneath the surface. The second reason why this is humbling is that in the live sessions I don't feel adequately prepared. This is connected to the first reason (the overwhelming amount of information that is taken in). I usually have something up my sleeve in live sessions (asynchronous discussions allow you to look things up and present your arguments), but I am now in a position (again) to consult my notes, my highlighted text and articles, and the things I scribbled in the margins, and then still say "huh???" It's a bit of an academic rush, but it's humbling.

And now for a change in topic, though it's still related to 802. A little while back I received a notification from LinkedIn for a new discussion. The discussion is about PhD graduates being just technicians - so how can we help them improve? I suspect that by technicians the original poster means that their research is very mechanical in nature and that it follows a cookie cutter approach. If the overall tone of the AU EdD program is like 802, I suspect that none of us will be cookie cutter researchers. That said, I think that cookie cutter researchers exist because the overall environment supports them in some sense. Some in the thread support Post-Doc work as a way to combat this technician mentality, but I think that this, too, is one way that higher education is prolonging academic adolescence amongst learners. In my view, PostDoc positions are really temporary holding cells where people go to do more research because jobs aren't open. Those of you in higher education, if you think I am wrong in this assertion, please let me know. The PostDoc solution is sort of along the same lines as making doctoral students do more coursework before they have an opportunity to submit a proposal for research. This is completely wrong in my view.

First proposed solution: If a student has completed an MA in the field in which they are pursuing a doctorate, then they shouldn't need 12-20 additional courses in order to get to the stage where their proposal is put forth and ready for comment by more seasoned academics. Students should have the most minimal of formal coursework, which will have the effect of getting everyone on the same page and having people gel as a group of co-learners. Courses should not be about content. They should be about ways of thinking and more "brainy" stuff (a more proper word escapes me). Courses contribute to a mechanistic view of education and research because, as learners, we are seeking the path that will get us those good marks and high achievements (that the original poster wrote about), but they won't necessarily push us in the ways we need to be pushed in order to grow. Students should go out and learn what they need to learn on their own, or with a group of co-learners, under the guidance of someone more seasoned. A class might be better for scheduling purposes (I am still an admin at my day-job), but this isn't necessarily what is best for learners.

Second proposed solution: In lieu of coursework, how about some qualifying papers? One of the pieces of advice that I've received from those who already have their degree is to think of my dissertation in chunks that I can pull out and publish separately with little editing. I think that this is probably the making of a bad dissertation (or bad articles). Dissertation topics that can reasonably be transformed from one dissertation into multiple articles are probably few and far between. It seems that the underlying idea is that new PhDs (I am using this as a generic for "doctor" - EdDs and other Ds would fall into this category too) need publications, and the way to get them is through the dissertation. This, in my mind, contributes to the mechanization of research because, again, people are looking to get the most bang for the work that they put in. If, in lieu of courses, students got their hands dirty with smaller research projects - things that culminated in publishable-quality papers - then not only would those PhD students get to experience different research methods and ways of approaching knowledge generation, they would also have articles that they can submit for publication. This means that the dissertation could stand on its own without pulling double duty.

Third proposed solution: This may invite the ire of some recently minted PhDs I know, but it needs to be said: newly minted PhDs (or EdDs, for that matter) should not supervise students or teach in doctoral programs until they have more experience under their belt. I think that this is really important. I don't think that I will know everything I ought to know once I finish my doctorate at AU. This is not to say that I don't think the AU program is a quality one, quite the contrary! What I mean is that it's impossible to have certain knowledge without the benefit of more experience under my belt. Learning is life-long, and the four (or five) years spent in a doctoral program are not sufficient to then turn around and mentor those who are just beginning their doctoral journey. I think that newly minted PhDs do need more time-on-task, and more intellectual brain-teasing, in order to continue to hone their skills and expand their horizons. It is only through greater experience, and an open mind, that we are able to mentor others. Otherwise we fall into methodological and disciplinary traps of our own making, contributing to the echo chamber that we are in instead of busting through those walls. Only after a period of mentorship by more experienced faculty, additional training, and greater time-on-task - in teaching, in service, and in research - should newly minted PhDs supervise new doctoral students (at that point they won't be newly minted any more).

Fourth proposed solution: Alt-Ac Careers! I know people who've gone into PhD programs and completed them, but then the flame, the spark, the passion, the je ne sais quoi, just isn't there any more for research. Don't belabor the point. If someone earned their PhD but doesn't do research, or does it mechanistically because they have to produce something, then help them figure out an alternative to the academic career if that's not what they like to do. Sometimes mechanistic application is due to a lack of training (addressed through points 1-3), and sometimes it's because it's not what motivates people any longer. It's perfectly fine if people's interests change. The important thing is to figure out what you can do with your shiny PhD once you are done if you don't like doing research. I don't think that teaching is the solution. I think that those who do mechanistic research might have issues with sniffing out bad research, and this is a problem for teaching. Teaching and research are two sides of the same coin as far as I am concerned. Your research (or review of the recent research!) points you to things you should be doing in the classroom to help learners pick up new things and to bridge that research-practice gap. If you can't do research well, you might not be able to evaluate it well. It's not a rule, but something to keep an eye out for.

Well, that's all I have for that.  Your thoughts on this? How would you respond to this LinkedIn thread?




Post-script: We've had snow days the past few days, so I've had time to think about this for a while - hence the photo ;-)


Friday, January 23, 2015

Academic writing, but not in English...

One of the nice things about being a language geek and an academic is that you get access to research that has been published in other languages. In addition to English, I fare quite well with research written in French, Italian, and Greek. Even though I don't have any formal experience learning Spanish, I could probably get the gist of Spanish articles based on my familiarity with French and Italian. When it comes to writing (producing language), the process is a bit painful. My academic vocabulary isn't as developed in the other languages, and the stylistic conventions of research publications in those languages are a skill that I don't yet have.

I am actually quite interested in developing that competency just for my own edification, and I would love to be able to publish research in other languages - with practice comes more familiarity and ability to use those languages anyway, and that's a goal that I strive toward. That said, I started pondering the utility of publishing articles in other languages. English seems to be the lingua franca for most things relating to my chosen fields of inquiry. While there are journals and articles published in those other languages, they definitely seem to be the minority. So, inquiring minds want to know: how useful is it to publish in languages other than English, in the broad sense? Would what I write get as big an audience as the materials in English? I'd like to do my part to increase access to research in other languages, so this may be a way.

I am also wondering what the logistics are to translate, and republish, my work in other journals. If I publish in Open Access journals, I suspect that it's easier to translate and re-publish elsewhere (with the caveat that the work is noted as already published elsewhere), but do people really do that? Is this something that is currently acceptable in academia, or would it be too much of a disruption to the current publishing status quo?

International scholars, what are your thoughts?

Thursday, January 22, 2015

Axiology, Ontology, Epistemology, Researchology...

Alright, I made that last one up (probably). This week (Week 2/14) in EDDE 802 we are tackling knowing, ways of knowing, "valid" knowledge and ways of knowing it, frameworks for research, and so on. It's quite a huge topic, and something that even seasoned researchers keep coming back to and analyzing, debating, discussing, and re-evaluating. The prodding question this week, to get our mental gears turning, is one of introspection - looking at our own beliefs and world views and seeing how those fit with the defined ontologies and epistemologies that we are reading about in the textbook.

The nice thing is that when I was preparing to teach my own (MEd-level) research methods course, the textbook was the same one we are using now (a previous edition, but the same nevertheless). Between my own experience as a learner (at the MA level) in research methods, my experience designing and teaching a course, and now the experience of being back in the learner's seat, one thing has become clear to me: regardless of how much one engages with this material, and how much it is discussed and debated, there is always more to come back to, scratch your head over, and ponder some more! This isn't a bad thing; after all, research does not (and should not) apply cookie cutter methods - that approach is simply wrong.

From an ontological point of view, I think that our interactions with animate and inanimate objects give them meaning, but we can't ascribe any old meaning to those objects. There are certain (for lack of a better word) affordances for each object. A stone can be a toy (think game pieces for checkers), it can be a weapon, it can be made into a tool (which I guess is another type of weapon), and it can be a measure or counter-weight. The point is that the stone isn't all these things on its own, but it has the potential to be those things, due to its properties, if you add some human ingenuity.

From an epistemological frame, I used to be squarely (more or less, anyway) on the qualitative side. Since I was never in the hard sciences, the quantitative was never too strong with me. For the first leg of my graduate work (MBA, and MS in Information Technology), most "research" was really there to support decision making. While there were quantitative components, seeing as I focused on Human Resources, I ended up leaning more on the qualitative human factors and less on the scientific stopwatch management.

For the last leg of my studies prior to Athabasca, my applied linguistics department was, again more or less, all critical theory all the time - which is a consequence of having a critical theorist running the department and making hiring decisions while the department was small. Those who weren't critical theorists seemed mostly on the qualitative side of things. I do appreciate critical theory and the empowerment that it can bring. However, I think that too much of one thing makes you blind to other possibilities. Sometimes it seems to me that critical theorists use critical theory research to mask their opinions and rhetoric when there is little or no research involved. I think that this is one of the weak points of critical theory, and it should be addressed in some fashion. Luckily, even though I no longer have daily contact with those critical theory professors (at least from a student/teacher perspective), I still get to flex those critical theory muscles and discuss things with fellow Rhizo14 colleagues :-)

As far as my default goes, if I can call it that, my preferred mode of inquiry seems to always start with the phrase "I wonder what would happen if..." I prefer to look at my collected data, search for patterns (if they exist), and think about what they might signify, instead of starting with a hypothesis and looking to prove or disprove it. Although this, too, might be problematic because, as humans, I think we are by nature looking for patterns, and we might see patterns where there are none. I wonder if all researchers are this introspective...

What are your thoughts on this topic? How do you approach research and your own biases?

Friday, January 16, 2015

I dream of dissertation...

Week 1 of 15, of semester 2 of 8, of doctoral work is about to end! The course that my cohort is focusing on this semester is a research methods course. Luckily, neither I nor, it seems, many of my classmates are that new to research methods. It's nice to have the group (or at least quite a few members of it) exposed to the basics so that we can spend some time critiquing and going deeper (and that's something we did on our cohort's Facebook group this week anyway). I also appreciate the fact that the course isn't set up to allow only one path through it. There are certainly foundational materials that we are expected to read and know, but for presentations it seems like we have a ton of choices in terms of which research methods we choose to present.

I've been thinking about the assignments, and I think I will spend some time exploring research methods that I haven't had a ton of exposure to, or methods that I've been meaning to go much deeper into. I think I will spend some time with Discourse Analysis - I've got a few books on the topic on my bookshelf that need some reading - and I think I will focus on autoethnography. Some members of the #rhizo14 community (and I) are working together on an autoethnographic paper (aka the un-paper) for a special issue of a journal and for an OLC conference presentation. Autoethnography is new to me, so I guess I'm trying to kill two birds with one stone - both the paper for the journal and something for this course. The whole aspect of autoethnography is making me think of the dissertation. I've gone through many potential ideas for a dissertation topic, including using design-based research to convert the course that I teach (INSDSG 684) from a closed, institutional course to an open course. Seeing that the course was cancelled last fall (for low enrollment) and that this semester I don't even have the minimum number of students to run the course, I am not sure that banking on this approach is wise. I may find myself with a re-designed course, fully open, but without learners. No learners means no data, and no data means that there is little to write about.

In doing some initial work on autoethnography (and this is really preliminary at this point), I was thinking of using my own experiences as a MOOC learner (going 4 years strong in 2015) to write a dissertation about my MOOC experiences. I am not sure if I will draw upon the previous 4 years, or if I will spend 18-24 months MOOCing in xMOOCs, cMOOCs, pMOOCs, rMOOCs, and so on, and do much more data gathering than I have done in the previous years. While I have quite a lot of material on this blog about MOOCs (over 200 blog posts...and counting), the current collection of data I have might be considered haphazard in its collection.

With the explosion of MOOC platforms, and the languages available, I am thinking that I could really sit down and learn in the various languages I know (including French, Italian, Greek, and so on). It has been a really long time since I've considered myself an xLL (x = insert language of your choice) Language Learner. One of the areas of research in linguistics is ELLs (English Language Learners) and how students who have another native language learn academic materials in a language that is not their own. When I returned from Greece in 1994 and started high school in the US, I was, in earnest, an ELL. While the seeds for English were in my head (I was born here and spent some years here before I moved to Greece), my language development wasn't the same as that of fellow classmates who spoke only English and had their schooling in English all of their lives. It's obvious, at this point, that English is a language in which I am no longer considered an ELL. However, what about French, and Italian, and even German (my German isn't that great)? I could pick up new knowledge in MOOCs, interact with classmates (in dreaded discussion forums), and not only learn something new, but also improve my capacity in those languages (in theory). I think this might make an interesting dissertation.

The only trepidation I have is the method: autoethnography. While I do acknowledge the importance of critical theory in education and in research, and the validity of the researcher and their experiences as the object of research, part of me is uncomfortable with this. Is studying and researching one's self just a tad bit narcissistic? Also, what about the validity and applicability of the research findings from a scientific point of view? From a humanistic point of view, what I write will be valid as my own lived experience; however, what would my dissertation committee think of this approach? Something to ponder. What do you think?



Saturday, January 10, 2015

Is our current HigherEd setup encouraging prolonged (academic) adolescence?

In a recent posting about doctoral degrees ("academic" versus "professional"), there was one line of thought that I meant to explore but neglected because it didn't quite fit in with the way the post ultimately flowed. In the ACM eLearn article that really got my mental gears going, and to which my post was a response, the professional doctor "is more likely to consume research" (para. 5).

I find this statement problematic on many levels with regard to a doctoral degree, and the false differentiation between a PhD and an EdD, but I also find it problematic when I think of Higher Education in general. My initial thoughts (last week) were that students, at the end of their Master's-level studies, should "consume" research; they shouldn't have to wait until they complete a doctorate in order to consume research. After some time pondering the point, I started wondering if we've come to a place in Higher Education where we are prolonging the academic adolescence of our learners by pushing activities and skills, such as access to research literature, to higher levels of academic accomplishment. As much as I don't like the label, I am at the beginning of the millennial generation. Higher Education was promoted to us, in High School, as the thing to do in order to be set up for a career. So, Higher Education was really a means to get a job, or so it was promoted. Courses on philosophy, and ethics, and English composition - man, those felt like pulling teeth at times because I wasn't prepared to think like that, to think critically, because I always expected to get a knowledge dump in order to do a job. Despite the focus on employment and careers, I think I did well; after all, I still talk about those professors with reverence.

Even at the MA level, some disciplines still seem like they are practicing knowledge dumping, or the filling of "empty" vessels (that's a whole other problem, but it's a topic for another post). Research, and critically analyzing the research of others, isn't always something that we do in our MA-level courses. Again, I count myself as lucky because throughout my graduate education I have had professors who did push us to think critically about what we were reading. It felt like pulling teeth at the time, but I think we're much better for it. This, however, wasn't done systemically, in a curricular way, but rather on a class-by-class, or professor-by-professor, basis.

When it comes to education, the statement above about the consumption of research is wrong on two counts. First, research should not be consumed. Consumption, to me, has a connotation that you are not thinking about what you are reading. You are taking in, and taking at face value, what someone else writes, says, or does. I think it's important, as individuals who have received a certain education, be it a BA, MA, or doctoral education, that we critically analyze what we are reading. Some research may be bunk, some may be good. But even good research needs to be thought about critically. Just because the authors of particular research saw it going one way, it doesn't mean that it can't be applied in other, unforeseen, but equally effective, ways. Thinking about what you are "consuming" is an important aspect of being educated.

The second thing here is that everyone who's been educated, again regardless of the level, should be able to do this. It is not solely the purview of those who complete doctoral work. Granted, those with a doctoral background may have an expanded scope through which to view, review, and think critically about research work, but that comes with experience and a prolonged apprenticeship period; 4 years of higher education for the BA versus 10 or more for those with some doctoral degree. I am wondering if such an attitude toward education (i.e., that the PhD is the domain of research consumption) is prolonging the learner's academic adolescence, not enabling them to be self-sufficient, life-long learners in their respective fields, and thus - to some extent - making them dependent on those with a doctorate to feed them what they need to know and to act as gatekeepers for knowledge.

Thoughts?