Alright, I made that last one up (probably). This week (Week 2/14) in EDDE 802 we are tackling knowing, ways of knowing, "valid" knowledge, frameworks for research, and so on. It's quite a huge topic, and something that even seasoned researchers keep coming back to: analyzing, debating, discussing, and re-evaluating. The prodding question this week, to get our mental gears turning, is one of introspection - looking at our own beliefs and worldviews and seeing how those fit with the ontologies and epistemologies defined in the textbook we are reading.
The nice thing is that when I was preparing to teach my own (MEd level) research methods course, the textbook we used was the same one (a previous edition, but the same nevertheless). So between my experience as a learner (at the MA level) in research methods, my experience designing and teaching a course, and now the experience of being back in the learner's seat, one thing has become clear to me: regardless of how much one engages with this material, and how much it is discussed and debated, there is always more to come back to, scratch your head over, and ponder some more! This isn't a bad thing; after all, research does not (and should not) apply cookie-cutter methods - that approach is simply wrong.
From an ontological point of view, I think that our interactions with animate and inanimate objects give them meaning, but we can't ascribe just any old meaning to those objects. There are certain (for lack of a better word) affordances for each object. A stone can be a toy (think game pieces for checkers), it can be a weapon, it can be made into a tool (which I guess is another type of weapon), it can be a measure or counter-weight. The point is that the stone isn't all these things on its own, but it has the potential to be those things, due to its properties, if you add some human ingenuity.
From an epistemological frame, I used to be squarely (more or less, anyway) on the qualitative side. Since I was never in the hard sciences, the quantitative was never too strong with me. For the first leg of my graduate work (an MBA and an MS in Information Technology), most "research" was really there to support decision making. While there were quantitative components, seeing as I concentrated on Human Resources, I gravitated more toward the qualitative human factors and less toward scientific stopwatch management.
For the last leg of my studies prior to Athabasca, my applied linguistics department was, again more or less, all critical theory all the time - a consequence of having a critical theorist running the department and making hiring decisions while the department is small. Those who weren't critical theorists seemed mostly on the qualitative side of things. I do appreciate critical theory and the empowerment that it can bring. However, I think that too much of one thing makes you blind to other possibilities. Sometimes it seems to me that critical theorists use critical theory research to mask their opinions and rhetoric when there is little or no actual research involved. I think that this is one of the weak points of critical theory, and it should be addressed in some fashion. Luckily, even though I no longer have daily contact with those critical theory professors (at least from a student/teacher perspective), I still get to flex those critical theory muscles and discuss things with fellow Rhizo14 colleagues :-)
As far as my default goes, if I can call it that, my preferred mode of inquiry seems to always start with the phrase "I wonder what would happen if..." I prefer to look at my collected data, look for patterns (if they exist), and think about what they might signify, instead of starting with a hypothesis and looking to prove or disprove it. Although even this might be problematic, because as humans I think we are, by nature, pattern-seekers, and we might see patterns where there are none. I wonder if all researchers are this introspective...
What are your thoughts on this topic? How do you approach research and your own biases?