DALMOOC, Episode 4: policy, planning, deployment and fun with analytics
Continuing with my exploration of DALMOOC, we've reached the end of Week 2 (only a few days late ;-) ). I've been playing with Tableau, which I'd describe as pivot tables on steroids. I briefly explored the idea of getting some IPEDS data to mess around with, but that proved to be a bit more challenging than I had anticipated. So, I ended up using the sample course-evaluation data to figure out how Tableau works. Here are some interesting visualizations I made from that data:
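(For anyone without Tableau handy, here's a minimal sketch of the same "pivot tables on steroids" idea in plain pandas. The columns and values are hypothetical stand-ins, not the actual DALMOOC sample data.)

```python
# A rough analogue of a Tableau pivot: average course-evaluation rating
# by department and term. All column names and numbers are made up.
import pandas as pd

evals = pd.DataFrame({
    "department":     ["Biology", "Biology", "History", "History", "CS",  "CS"],
    "term":           ["Fall",    "Spring",  "Fall",    "Spring",  "Fall", "Spring"],
    "overall_rating": [4.2,       3.9,       4.5,       4.4,       3.8,    4.1],
    "enrollment":     [120,       95,        60,        58,        200,    210],
})

# Much like dragging dimensions onto rows/columns in Tableau:
summary = evals.pivot_table(
    index="department",
    columns="term",
    values="overall_rating",
    aggfunc="mean",
)
print(summary)
```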
The one thing I realized as I was playing around with the data is that it's really important to know what your data actually means. I thought I knew what the categories meant, because I assumed that institutions of higher education use similar lingo. The more I played with the data, the more I realized that some things weren't what I was expecting them to be. So, to understand what a visualization is actually describing and portraying, you need to know the underlying data categories really well. The other thing that came to mind is that you can't just produce a visualization and call it a day. A picture may be worth a thousand words, but succinct textual explanations and analyses of the visuals go a long way toward cluing people in to what's happening.
Another aspect of week 2 revolved around policy, planning, and deployment of analytics. This actually came up in my EDDE 801 course as well, since we are discussing an article† on learning analytics. The issue that has come up there is the ethics of analytics. A classmate of ours posted the OU's policy on the Ethical Use of Student Data for Learning Analytics. I have not read it yet (it's short, but it was posted to the course forums as I was writing this post), though it's certainly on my list of things to read. I think the trepidation some learners feel around learning analytics may be due to a perceived Big Brother aspect of the institution. Who are these people looking at my digital footprints, and for what reasons? I think that if any institution is interested in setting up a learning analytics initiative, it would be important to establish protocols at the institutional level for what types of data will be collected, from which sources, for what purposes, and (quite importantly) who has access to that data. These policies should also keep an eye on laws, such as FERPA in the USA, to make sure that data collection and data utilization stay in compliance. I know that institutional research already collects data about various aspects of the university, so coming up with appropriate policies might not be a major issue.
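(Just to make the "what, from where, why, and who" idea concrete, here is a minimal sketch of how such a data-governance decision might be recorded. The field names and values are entirely hypothetical and don't reflect the OU policy or any real institution's practice.)

```python
# Hypothetical record of a learning-analytics data-use decision:
# what is collected, from where, for what purpose, and who may see it.
from dataclasses import dataclass, field

@dataclass
class DataUsePolicy:
    data_type: str        # e.g. LMS click-stream, discussion posts
    source_system: str    # where the data comes from
    purpose: str          # why it is collected
    authorized_roles: list = field(default_factory=list)  # who may access it
    retention_months: int = 12                             # how long it is kept

lms_policy = DataUsePolicy(
    data_type="LMS activity logs",
    source_system="Moodle",
    purpose="Early-warning indicators for course completion",
    authorized_roles=["course instructor", "institutional research"],
    retention_months=24,
)
print(lms_policy)
```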
As far as planning and deployment go, I think the crucial things will be front-end tools (like Tableau, for instance) as well as training for those who use them. Just going in and creating nice graphics isn't enough. There needs to be a firm understanding of what the underlying data is, how it's collected, and what limitations it might have. I've met a number of people in my professional career who seem to have stats in mind without really acknowledging what the stats mean. "We've had fewer enrollments in x-program this year." OK, so what? I might answer. Enrollments are just one metric; what else is happening that might influence them? What role do departments, the physical plant, faculty, and other students play in attracting and retaining students in any given program? We can't just look at raw student enrollment numbers and think we are coming to meaningful conclusions. The same is true of our learning analytics data.
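(A tiny illustrative example of why the raw count isn't the whole story. The program and all the numbers below are made up; the point is only that a single metric can look bad while other signals tell a different story.)

```python
# Hypothetical figures for "x-program": enrollments dropped, but
# completion rate actually improved. Raw counts alone mislead.
this_year = {"enrollments": 85,  "completions": 70, "applications": 140}
last_year = {"enrollments": 100, "completions": 72, "applications": 150}

enrollment_change = (this_year["enrollments"] - last_year["enrollments"]) / last_year["enrollments"]
completion_then = last_year["completions"] / last_year["enrollments"]
completion_now = this_year["completions"] / this_year["enrollments"]
yield_now = this_year["enrollments"] / this_year["applications"]

print(f"Enrollment change: {enrollment_change:.0%}")                    # -15%
print(f"Completion rate: {completion_then:.0%} -> {completion_now:.0%}")  # 72% -> 82%
print(f"Admissions yield this year: {yield_now:.0%}")                   # 61%
```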
SIDENOTES
- The assignment bank is an interesting concept, something that I came across in DS106 a few years ago. The only issue I have with the assignment bank is that I stumbled upon it by accident (here is the link if anyone else is interested). I've submitted one of my blogs for one of the assignments. I only realized today, though, that I was addressing the wrong assignment - #facepalm :)
- ProSolo is interesting; however, there is one thing I stumbled upon last week that I didn't bookmark, and now I can't find it again: the calendar of published materials. There is some sort of "daily-like" notification (see the CCK11 daily as an example) that is part of ProSolo (or it seemed like it). Quite useful if you want to check up on what's occurred in DALMOOC in the previous 24 hours. Where the heck did I find it, though?
- I wonder why weeks 3 through 8 showed up all at once when previous weeks were done one at a time...
- † Macfadyen, L. P. & Dawson, S. (2010). Mining LMS data to develop an "early warning system" for educators: A proof of concept. Computers & Education, 54(2), 588-599.