In any case, one of the first things for this week is to introduce yourself in the forum. I didn't do this because, in my experience, massive environments produce a boatload of "hello" or "introduction" or "Hello from ____" threads, which make it hard to keep track of who is coming in from where at the end of the day. This type of intro activity using course fora might work in smaller group settings but, for me at least, it fails massively in a MOOC. OK, I am poo-poo-ing this approach, but I don't have something to take its place right away, so just file this under "things to work out in MOOC pedagogy." Maybe jumping straight into the content and the discussion is better, and you get to learn about your participants throughout the MOOC, but then again, that might leave out the lurkers who might want to introduce themselves but not go beyond that. Oh well, I've got no answers to this just yet :-)
In any case, let me just jump to the heart of the matter and respond to this week's introduction and to one of the readings (more reactions to readings to come as I read them, or as I find other interesting things to respond to from fellow participants).
This week's intro:
Few systems in society are subject to the bold proclamations of reform that now assault higher education. In most countries, higher education is buffeted by two strong, but opposing trends:
1. Transform higher education by making it more cost effective and increasing learner access,
2. Build a world class university system to advance knowledge, research, and economic competitiveness.
Publicly funded universities have been significantly impacted by the reduction in state funding, with student tuition rising as a result. The for-profit and online learning sectors have flourished over the past decade. This week, we will consider the scope of higher education change and the tensions that exist.
Discussion question: Of the change drivers that you've encountered, which are the most significant?
I have to say that the first reading, "How the American University was Killed in Five Easy Steps," was quite depressing. That said, I am too young to have direct knowledge of what happened in the '60s, but from what I've read, it seems like things were better back then. All I have is my own lived experience, that of someone born in the '80s - some might call me a digital native; I, however, do not. What I offer here are my own opinions and views, based on prior readings, the article linked above, my own experiences, and the questions for CFHE12 week 1 (see above).
My own undergraduate experience was based on going to college in order to get a good job - something pointed to in the blog post, and something that neither I, my family, nor my high school guidance counselors questioned (what is a good job anyway?). In any case, I picked a local state school for my own education due to cost. My financial aid paid for everything because my school was so cheap at that point (around $8k for one year; now, 13 years later, it's about double - travesty!). My father was eager for me not to think about the price, but given how much I don't like to take on debt (even as a kid), I opted for the state school - wise decision.
That being said, I think that money is the first driver for change. I think we need to go back to making education affordable - making education, again, a public good. Your definition of affordable may be different from mine. The author of the blog post was quoting something like $600 at UC Berkeley. My own definition is based on what I lived through: $8k a year may not be considered out-of-pocket affordable, but with things like financial aid and Pell grants, the out-of-pocket expenses can be close to nothing.
How do we reduce that? Get rid of the administrative overhead and severely reduce salaries for admins. While some six-figure faculty members may be the lightning rods for the ire of people who believe that education has gotten too expensive, most faculty are adjuncts who get paid peanuts. The big costs are in admin salaries (and there are a lot of them), and it seems to me that it's like an arms race: you can't make less than what you made before, or less than your neighbor in a comparable job, so salaries keep getting inflated.
The second driver, related to price, is the "practicalness" of the degree - getting prepared for the job market. I've had many discussions with friends who've pursued the same degrees as I have, expressing their annoyance that their Master's degree did not prepare them for the jobs they sought: i.e., they did not learn the software packages that employers have come to expect. I've been in the position of defending the departments by saying that's up to the learner, and that a degree isn't just about learning x-program, but rather about higher-level concepts that can be applied regardless of the specific tool. This pulled me back to my own annoyance with my undergraduate advisor. When I was an undergraduate I expressed dismay that half my computer science curriculum was math, and that I was expecting to learn more programming languages. Don't get me wrong - the part that was CS I loved; the math part, not so much. My advisor informed me that I could learn other languages on my own, and this severely pissed me off ;-) Now I am saying the same thing to others (LOL).
So what is the problem with a practical degree? Well, nothing really! The problem comes in when the degree is (1) only about practical things and not setting you up for lifelong learning, and (2) required for something that doesn't really need a degree. Starting with lifelong learning: a college education should be about getting people acclimated to things that they don't know, and sharpening their reasoning and critical analysis skills, so that as adults they can venture into areas where they aren't experts and get themselves up to speed. If you are just learning Excel, or only accounting principles (for example), without analyzing the underlying assumptions and contributing to the improvement of what you've learned, you are being set up for a need for paid professional development. As time goes by, your degree becomes less and less valuable without demonstrable continuing education (i.e., certificates from some authorized body - which cost money!). So in addition to having paid a lot of money for your degree, it becomes worth less the more time has elapsed since the granting of it.
The second thing is to finally stop requiring a degree for something that does not require a degree. The BA has become the stand-in for pretty much anything. Of course people will expect some sort of job skills if they are paying in both time (4-6 years for a BA, 2-4 for an MA) and money; that is how the argument is framed, and that is what they expect, not knowing any better. My punching bag for this type of example is the ALA and its accredited institutions that provide Master of Library Science degrees. Having worked in a library, I know many librarians who say that the MLIS degree is just a union card to get you in the door. Most didn't pick up anything of importance during their degree that they couldn't have picked up with on-the-job training.
Finally, this situation leads to the de-professionalization of teaching, where teaching has become not something that sharpens your mind and prepares you for dealing with unforeseen things, but rather instruction for dealing with tasks in automated ways. The problem in higher education is that there already is a colonization of the mind of the researcher: the belief that teaching (in their minds, "instructing") is something that happens at a community college, while research is what happens at the University. I call bullshit. Research is all fine and dandy, but there needs to be an acknowledgement that the purpose of the professor is to teach, and research is something that happens as part of continuing professional development. Seeing teaching as something beneath them is one of our problems today in higher education, and who does the cleanup? Low-paid adjuncts, seldom respected by their institutions.
Does this mean that there is no place for research in the University setting? Well, don't be daft! Of course that's not what I mean! I just think that we need a more balanced distribution of faculty, and of what they do. For instance, there ought to be a balance among:
- research faculty (those who spend 70% of their time on research, 20% on teaching, and 10% on governance),
- core faculty (those who spend 10% of their time on research, 30-40% on governance - which includes curriculum - and 50-60% on teaching), and
- clinical faculty (those who spend 90% of their paid time teaching, because they are practitioners in the field who come in to infuse the classroom with their wisdom by teaching a select number of courses, and 10% on governance).
That's all for now - thoughts? :-)