Is our current HigherEd setup encouraging prolonged (academic) adolescence?
In a recent post about doctoral degrees ("academic" versus "professional") there was one line of thought that I meant to explore but ultimately neglected because it didn't quite fit with the way the post flowed. In the ACM eLearn article that really got my mental gears going, and to which my post was a response, the professional doctor "is more likely to consume research" (para. 5).
I find this statement problematic on many levels with regard to a doctoral degree, and the false differentiation between a PhD and an EdD, but I also find it problematic when I think of Higher Education in general. My initial thoughts (last week) were that students, by the end of their Masters-level studies, should be "consuming" research; they shouldn't have to wait until they complete a doctorate in order to do so. After some time pondering the point, I started wondering if we've come to a point in Higher Education where we are prolonging the academic adolescence of our learners by pushing activities and skills, such as engaging with the research literature, to higher levels of academic accomplishment.

As much as I don't like the label, I am at the beginning of the millennial generation. Higher Education was promoted to us, in high school, as the thing to do in order to be set up for a career. So, Higher Education was really a means to get a job, or so it was promoted. Courses on philosophy, ethics, and English composition - man, those felt like pulling teeth at times because I wasn't prepared to think like that, to think critically; I always expected a knowledge dump in order to do a job. Despite the focus on employment and careers, I think I did well; after all, I still talk about those professors with reverence.
Even at the MA level, some disciplines still seem to be practicing knowledge dumping, or the filling of "empty" vessels (that's a whole other problem, but it's a topic for another post). Reading research, and critically analyzing the research of others, isn't always something we do in our MA-level courses. Again, I count myself as lucky because throughout my graduate education I have had professors who pushed us to think critically about what we were reading. It felt like pulling teeth at the time, but I think we're much better for it. This, however, wasn't done systematically, in a curricular way, but rather on a class-by-class, or professor-by-professor, basis.
When it comes to education, the statement above about the consumption of research is wrong on two counts. First, research should not be "consumed." To me, consumption has the connotation that you are not thinking about what you are reading; you are taking in, at face value, what someone else writes, says, or does. I think it's important that those of us who have received a certain education, be it a BA, an MA, or a doctorate, critically analyze what we are reading. Some research may be bunk, some may be good. But even good research needs to be thought about critically. Just because the authors of a particular study saw it going one way doesn't mean it can't be applied in other, unforeseen, but equally effective, ways. Thinking about what you are "consuming" is an important aspect of being educated.
The second point is that everyone who's been educated, again regardless of the level, should be able to do this. It is not the purview of those who complete Doctoral work. Granted, those with a Doctoral background may have an expanded scope through which to view, review, and think critically about research, but that comes with experience and a prolonged apprenticeship period: four years of higher education for the BA versus ten or more for those with some Doctoral degree. I am wondering whether such an attitude toward education (i.e., that research consumption is the domain of the PhD) is prolonging learners' academic adolescence, not enabling them to be self-sufficient, lifelong learners in their respective fields, and thus - to some extent - making them dependent on those with a Doctorate to feed them what they need to know and to act as gatekeepers of knowledge.
Thoughts?