#edcmooc - almost human
Man, it's been a crazy week. I've been jotting down notes for this post from the various viewings, readings, fellow bloggers' posts, and discussion forums. This was meant to be several posts over the week, but it all wrapped up into one big thing. Oh well. Such is life ;-) This week I'm creating some category headers to make things easier to read.
From the week 2 synchronous session
The synchronous Google+ session was pretty interesting, and from it came a few points to ponder. One of the participants in EDCMOOC asked whether it is really necessary for everything to be a game. The question was probably geared toward questioning gamification, with the implication that learning shouldn't need to entice learners to partake in and engage with it. I personally disagree. Everyone finds a reason to participate, or not participate, in a learning venture. For some people learning is a thrill ride, so even when they are down in the dumps and struggling with material they are having fun. For others, struggling may feel like being publicly flogged - not a nice feeling to have. By incorporating game mechanics into learning you aren't just trying to make something more enjoyable; I would argue you are trying to provide additional appropriate supports, like learning scaffolds, and appropriate rewards for meeting certain crucial checkpoints. Focusing only on the badge or the fun aspect really does gamification a disservice.

A related comment had to do with the notion that if we view education as a game, then we will find ways to "beat" it or maybe "cheat" our way through. Personally, I think students already do that, and we don't even treat education in a game-like manner. Students always haggle for one more point on that exam, or look for ways of getting extra credit, or they figure out the professor's preferences and just regurgitate what they think the professor wants to hear. In these cases there may not be any actual learning happening, but rather a way to "cheat" the system. We, as humans, are problem-solving animals. Gamification or not, we'll try to beat the system.

A follow-up comment questioned the necessity of viewing everything through a competitive lens. The implication is that learners work in a solitary manner, in a zero-sum environment, so my win means your loss. So, as the commenter asked, why not work together? Education doesn't have to be zero-sum. We can, and in fact do, work together. But it all comes down to each person's individual goals and motivations for being part of an educational venture. How you, as a learner, traverse the path of the course from start to end will depend on a lot of things, including your own motivations.
Finally, there was an interesting discussion on essays, which came out of the readings on automated and peer grading of essays. Two distinct points emerged:
- Why write it if no one else will read it and interact with it? (Jen?)
- Isn't one of the points of an essay to have that conversation with yourself, taking a conversation and internalizing your conversational partner? (Hamish)
For what it's worth, I think that writing, as a process, has both internal and external motivators. While I think many of us would like to engage with others through our writing, that's not the only motivation for it. There are many blogs out there with few readers, this blog included. I don't necessarily write to have people read my posts and engage intellectually with me; writing is a way of discussing various views with yourself, and by doing it openly you have the opportunity to either involve others in that process or share your understanding with them. In one case (the discussion with one's self) you are pushing yourself to become more knowledgeable, while in the other (discussion and engagement with others) you are potentially engaging in a Vygotskian dialogue where you are the More Knowledgeable Other in some cases and your peers are in others; through common dialogue the overall knowledge and (hopefully) understanding in the network expands.

When it comes to automated essay scoring, I already mentioned in a previous post that the emphasis is placed on the grading, not the process. I would also add, based on this hangout recording, that essay scoring is potentially detrimental to that discussion with one's self, because you are no longer writing to engage with yourself or others; you are writing to beat the algorithm that scores your paper.
From the week 3 viewings
There were a few interesting videos this week to poke the old "meat brain" and make it do some work. They're Made Out of Meat is pretty hilarious, and the World Builder video was quite touching. It reminded me of the Animus in Assassin's Creed and moving around within it, both the reconstructed environments and the blockier "getting your bearings" environments of the gameplay introduction. Specifically, World Builder made me think a bit about what it means to be disabled: if our bodies are incapacitated but the spirit (the ghost in the shell, if you will) is there and ready to participate, what does that mean for the human ways in which we can interact? If there were a separate, but connected, matrix-like reality that linked the minds of people in comatose conditions, what would that do to our definition of human communication?

This brings us to the difficulty of defining what being human is, and to Fuller's TEDx talk on defining humanity. Thinking about that TEDx video reminded me of some really interesting characters I've come across over the years in science fiction shows that are androids, cyborgs, or some other type of artificial intelligence.
Lt. Commander Data, Star Trek
One of the first characters that comes to mind is Lt. Commander Data from Star Trek: The Next Generation. Throughout the series, and the subsequent movies, the audience (and the cast of the show) explore what it means to be an individual and to be human. Data refers to his maker (Dr. Soong) as his father, and Dr. Soong's significant other as his mother. Data keeps up his pursuit of becoming more human by experimenting with art, having a pet (Spot the cat), having romantic relations with a crew member, and, in the series, also creating an android of his own as a way of procreating. When Starfleet engineering wanted to take him apart to learn what makes him tick (and risk damaging him in the process), the whole issue of what it means to be an individual came up again. Finally, at the end of the last TNG movie, Data sacrifices himself to save his friends and the crew. As a fan this annoyed me, but in the comic leading up to the 2009 Star Trek film we see Data back from the dead, through a copy of his memories and experiences transplanted into B-4, a more primitive android of the same make. Eventually B-4's primitive positronic brain adapts to accommodate all of Data's personality. So, what does that mean in terms of who is human? How does it connect to World Builder?
Finally, with Data, what I found interesting was that a Vulcan (arguably another kind of human) told Data in one of the episodes that Vulcans strive all their lives to reach what Data has had since birth, while Data, on the other hand, would gladly give it up to be more human.
Rommie, Andromeda
Next we have another Roddenberry creation: Andromeda. In Andromeda the spaceships have avatars that help the ship's AI communicate with the crew and captain. The relationship between ships' avatars and their captains seems to take on more intimate dimensions from time to time, since it appears that these artificial intelligences aren't cold, calculating war machines, but rather symbiotic beings capable of fondness, caring, or even love. As Data, from Star Trek, would say, "As I experience certain sensory input patterns, my mental pathways become accustomed to them. The inputs eventually are anticipated and even missed when absent." The interesting thing about Andromeda (or "Rommie") is that she exists in at least three places simultaneously: an android body, a holographic projection, and on a computer screen. There have been many times when all three personifications of Andromeda have conversations to work out problems and figure out courses of action. This reminds me a lot of what Hamish mentioned in the hangout about writing in order to have a conversation with one's self and to be able to work out issues. I am currently re-watching Andromeda, partway through season two, since I missed a lot of it when it was originally on television. Who knows what else will come up in terms of being human as the series progresses.
Cameron, Terminator
The next character that comes up is Cameron from Terminator: The Sarah Connor Chronicles. In the movies the terminators are portrayed as cold, heartless machines that do what they are programmed to do, which usually involves lots of killing of humans. However, when John Connor (in the future) captures and reprograms one of them (Arnold Schwarzenegger), that cyborg is turned to protecting his younger self in the past. The television series took a different approach from the movies: not all of the terminators were portrayed as single-minded assassins sent to the past. Cameron, the protector cyborg, shows us a glimpse of what might be happening in that metal head of hers. She doesn't just adapt by picking up new slang and stances and acting "more normal" to fit in. It seems to me that the writers tried to show us her interest in the arts when she practiced ballet on her own in one of the episodes: even after the mission that required ballet knowledge and skills was over, she kept experimenting with it. She also showed an emotional component, a connection to John's younger self. This may have been a preservation mechanism; we don't know. The series was killed off after two seasons, but it would have been interesting to explore further what constitutes being human and what sort of human elements these killer cyborgs can show. Other interesting characters are the T-1001 terminator, Catherine Weaver, and her "son" John Henry. Too much to go into at this moment though. This was a good series ;-)
Dorian, Almost Human
Finally, I was thinking of the Cylons in Battlestar Galactica, both the "chromejobs" (robots) and the "skinjobs" (humanoids made of flesh and blood), and the duality of what it means to be non-human. Both chromejobs and skinjobs were Cylon, the "enemy" of Adama's "ragtag crew." The chromejobs came before the skinjobs, but they were seen as equals... or were they? I don't remember a lot from the series since it's been a while since I've seen it. So I will focus instead on my last case: Dorian, the android from the new series Almost Human. At this point there have only been two episodes, so it's not that easy to say a lot about these characters, but the portrayal of androids in this series, and of Dorian in particular, is pretty interesting.
Dorian was decommissioned and replaced with more sterile, tactical police androids because the DRN (Dorian) model's emotion engine made them "unpredictable." I guess by "unpredictable" the scriptwriters meant to imply that they were quite human and acting "illogically," as a Vulcan would put it. It seems to me that in high-risk situations where police need an android partner, that unpredictability would be a benefit, not a hindrance. From the two episodes we've seen Dorian in, it would appear that he has feelings toward the humans he cares for, but also toward fellow androids. When an android was put down (through no fault of her own, it should be added), Dorian stayed in the lab to ease her passing, in a way. Is this a human trait? Do humans actually do this? I would argue that this, empathy, isn't a universal human trait. I'm quite curious to see what the writers have for us as the season continues.
Finally, back to the TEDx talk: it was interesting to think about the "elevation," or raising, of all humans to a certain level coming in with Christianity, a concern about the poor that wasn't there before. Since I don't have much background in this arena I'll go with it, but it has me thinking about the current rhetoric about "democratizing" education with MOOCs by serving underrepresented student populations or the less advantaged in developing nations.
From the week 3 readings
Finally, there were some interesting articles this week, although I must admit that I didn't find them as interesting as those from the past couple of weeks. There were, however, some interesting points made in The Human Element on InsideHigherEd.com. One of them keeps coming up over and over again in one of the courses I teach. The point is as follows (quote from IHE):

But Hersh believes there is another major factor driving the gap between retention rates in face-to-face programs and those in the rapidly growing world of distance education: the lack of a human touch.

One of the misconceptions that students have coming into their first online course is the expectation that online courses will be a straight replication of the processes and procedures of face-to-face courses. This, of course, is not possible, and having such an expectation leads to an inevitable sense of disappointment. Recording a play and posting it on YouTube will not give you the same feeling and engagement you have when you go to the theater; the audience engages differently in the theater than it does on YouTube. Thus, if you are presenting and attempting to engage in a new medium using another medium's rules and expectations for action and reaction, you will not be very successful at reaching your end goals. Distance education doesn't lack a human touch; it's just a different human touch than people are expecting.
The other thing from this article: I guess I don't understand what his "Human Presence Learning Environment," based on the Moodle LMS, has that is so different from other learning management systems. Just incorporating video doesn't seem like such a great leap forward, and these days you can do this with many external providers, including Google+. It seems to me that they just wanted something new in terms of a name or an acronym to get their 15 minutes of fame. A fellow academic, a couple of decades older than I, told me recently that when he was first starting out he thought new acronyms that viewed something existing from a slightly different angle were silly, but he found out that this was the way to get funding for research. Reviewing, validating, and reframing the existing just isn't sexy enough to get you attention. Too bad for our profession.
Finally, I think that in MOOCs the "human connection," whatever that might be, can help make MOOC "completion" rates higher, however you define completion (I personally am still a skeptic on this front). But I am wondering how one can increase communication when you are in the virtual equivalent of a stadium with many unknown peers, while facilitators move around the crowd in their bright yellow shirts, handing out participation stickers and passing the microphone to someone with a bright idea. There is an idea that has been brewing in this arena since last spring. More on this as I hash it out.
Next was the article Human Touch on EducationNext.org. This article reminded me of a classmate I once had with two kids whom he never allowed to use a computer, watch TV, or play video games. I think this quote from the article perfectly summarizes his position:
A computer can inundate a child with mountains of information. However, all of this learning takes place the same way: through abstract symbols, decontextualized and cast on a two-dimensional screen. Contrast that with the way children come to know a tree–by peeling its bark, climbing its branches, sitting under its shade, jumping into its piled-up leaves. Just as important, these firsthand experiences are enveloped by feelings and associations–muscles being used, sun warming the skin, blossoms scenting the air. The computer cannot even approximate any of this.

Having grown up in what is really a village, with lots of land around me, and dirt, and some farm animals, I think that this is an important part of growing up: the great outdoors, the fresh smells, the dirt (and the subsequent washing up). However, I wouldn't exchange computing for this, nor this for computing. I think that these days there are ways of thinking that need to be nurtured, not just a computer treated like a tool to be learned in your final year. There are measures of creativity that can be accomplished with games and computing that cannot be accomplished in real life, and there is also a lot of real life that cannot be accomplished virtually. This is something I saw as a computer trainer in a previous job: many students had learned the tool mechanically, so when an update came and things moved around, they had difficulty being critical and finding the right information in that giant mountain of information. The skill they had picked up was following a pre-determined critical path, not finding their own critical path through a mountain of information. That skill, to me, is much more important than having someone learn how to use a computer as a tool in their final year.
Of course, computers can simulate experience. However, one of the byproducts of these simulations is the replacement of values inherent in real experience with a different set of abstract values that are compatible with the technological ideology. For example, “Oregon Trail,” a computer game that helps children simulate the exploration of the American frontier, teaches students that the pioneers’ success in crossing the Great Plains depended most decisively on managing their resources. This is the message implicit in the game’s structure, which asks students, in order to survive, to make a series of rational, calculated decisions based on precise measurements of their resources. In other words, good pioneers were good accountants.

I remember playing Oregon Trail on an Apple IIgs when I was in high school. Of course, by that time it was old, but I didn't care because I had not grown up with this technology. I approached the game as a game, not as a way to learn history. I think that games will always fall short of the goals we want them to reach; there is just no way, with today's technology, to reach the levels of sophistication portrayed by Star Trek's holodeck. I think games are a good start for getting a hook into student learning. We can then take that interest and expand upon it with additional information that would benefit students in the long run. You could even tie the great outdoors into this! Give the learners the materials that frontier settlers had, and tell them that they need to solve a problem with these seemingly unconnected materials. Make them junior MacGyvers and help them learn a lot of different skills, not just names, dates, and resource management.
Finally, a funny xkcd comic shared by a fellow participant in week 2.
Simple Answers to Technology (xkcd.com)
What do you all think of these things?