Same old tired narrative: "Classes were built for the 1900s" 👴
*Old Timey School House (Lego version)*
I came across a post on LinkedIn the other daaaay (read this in a Letterkenny cadence, if you know what that is 😆). Here's a direct link to that post if you'd like to engage with it and its author.
Over the past couple of years, I've been trying to get my mojo back when it comes to discussing issues like this. For a brief time, we had MOOCs (well, cMOOCs) with a daily recap of what was happening on Twitter using a specific hashtag, on blogs, and in other places on the web (Downes' gRSShopper, if anyone remembers it). Now things are diffused through LinkedIn posts, people's blogs or branded Substacks, or on Discord, and there isn't a place that collects this discussion. Anyway, don't mind my "old man yelling at clouds" moment.
So, one of the things I've been observing over the last decade (or more) is that a tried and true narrative pops up any time a new technology comes out. Namely: kids these days do the Techy McTechface thingy, and we all need to adapt to Techy McTechface, because Techy McTechface is the future, and we don't want to leave anyone behind and, by doing so, disadvantage learners. Part of this narrative is that we don't just need the new thing; the old thing is also inefficient, outdated, and plain old boooohhhriiiinnnggg! In this line of argument, the existing assumptions about why things are the way they are, and what the underlying constraints are, are never examined.
Case in point: the post I linked to above. I came across it after George Siemens reposted it on his LinkedIn timeline with a kudos. To be faaaaaair, there are some merits to what's being said, but these critiques are not new, and the prescriptions for what should be done moving forward ignore other elements of the environment that shape how we can design and offer instruction. So let me dissect this post a little bit, from a higher education perspective:
Claim (quote):
Most online courses feel like they were built for a different era.
And learners can tell.
The baseline of what constitutes a “good” online learning experience has shifted. When people are used to fast feedback, responsive systems, and support that adapts to their thinking, static course models start to feel archaic very quickly.
Response:
This seems like a false authority fallacy to me. It positions learners as some kind of experts on what good (online) learning is. Now, don't get me wrong, we all have personal opinions about how we want our courses to run, but as learners we don't know whether that's any good. My main analogy here is having a personal trainer at the gym. I can go to the gym and tell you I want to work on my legs (because that's where my strength is and where I feel confident), and a personal trainer might help me do some leg reps and push me a bit on that front, but what I might really need is stamina training (to reach whatever goal I have); so a good personal trainer won't have me just working on leg strength, but will push me toward activities that meet my stated goal. Just because people are used to things like learner response systems and (what seems like) quick adaptation doesn't mean that this is the only valid sort of pedagogy.
Claim:
What still shows up in many online courses:
📄 Long, text-heavy pages
📚 Too much content, too little thinking
🖱️ Click fatigue and legacy SCORM packages
💬 Unwieldy discussion forums with little direction
🎥 Live sessions bolted on rather than designed in
📦 Resources that stay frozen year after year
That standard used to pass. But not anymore.
Response:
Again, here, don't get me wrong. Information design is important when designing online learning spaces. Long, text-heavy, HTML pages aren't necessarily great for usability. Having the facility to get an alternative format for this is important. If I can download the page as an MP3 and listen and take notes, great. If I can get a PDF so I can place-shift my learning, also great. The length of the content isn't an issue. I do take issue with "too much content, too little thinking" because it pits two things against each other in a way that doesn't make sense. Thinking, assessment, and processing (and coming to a conclusion) require information. Content in a class is the information. You can't have one without the other. And, having common content grounds the exploration for a cohort of students so that everyone is on the same page. This is not an either/or situation.
Shovelware (click fatigue in SCORM) has been an issue since I was a wee learning design padawan in 2008. We discussed this back then, too. You know what hasn't changed? The operating environment we work in. Organizations want click-and-submit eLearning, for better or for worse. This is mostly for compliance. I don't see a reason why these kinds of packages are useful in higher ed, but I assure you, they do exist, and they are most likely there to break up the perceived monotony of an online course. In a corporate setting, learning designers can push back lightly against death by click, but if the organization only values compliance training, you get the cheapest kind of product to produce.
Live sessions that are bolted on rather than designed in - again, this has been an issue since 2008, and what's made it worse, IMO, is the radical flexibility for learners. If live sessions are optional, they will continue to be bolted on.
Finally, resources that are frozen year after year...is this a problem? It can be, but it might not be. There is a nuance here that gets lost. If that article from 1975 still makes the point you wanted the student to get, and it still works as a foundation for something you're trying to impart, is the date of the article really a problem? On the other hand, if your eLearning resources include Adobe Flash...then yes, that's a problem. As an instructional designer, those nuances are important to interrogate.
Claim:
Stronger online course designs are moving toward:
✨ Less content, more sense-making
🧠 Experiential tasks that require context specific judgment, reflection & interpretation
🎬 Videos created during the course, not all upfront
🧩 Shorter, more frequent live sessions designed for connection and synthesis
🔁 Resources & reading lists that evolve with questions, interests, and context
🤝 Visible peer thinking through annotation and shared work
🤖 Integrated AI literacy
🎯 Hyper-specific/bespoke learning tools
👥 Peer learning groups for sustained support
🧪 Assessment that values process, revision, and decision-making
The expectations of quality have increased and weak design choices are more visible than they once were.
But this is also an opportunity to design online courses in a more pedagogically deliberate and responsive way.
Response:
Honestly, I'd love to see citations for some of these things. Again, I don't disagree with some of the broad strokes; and no, they are not totally new, we've been discussing them in the field for the last 20 years or so. What I want to highlight here is the cost associated with some of these things. Creating video content in situ as the semester progresses is very expensive in terms of hours. Any kind of JIT content creation is very, very time-consuming. Creating evergreen content ahead of time is a much more prudent use of time. About 10 years ago, when I first started teaching, I spent A LOT of time creating a weekly course podcast. This would include going through the weekly forums, picking up submissions from students (there were about 120 posts per week), and recording a podcast that highlighted key ideas, contributions, and aha moments, and extended the content for the week. I'd also mention things that I thought students missed. I'd record, edit, produce, and post an episode every Sunday so students could listen on their Monday morning commute. It was lots of fun, but it was a 5-6 hour commitment. As a point of reference, the institution only pays for about 10 hours of teaching labor each week, so those 6 hours creating audio content (which didn't have a transcript back then!) took away time from giving people feedback on their actual assignments. As a result, I ended up putting double the time into the class each week. The students appreciated it (according to the evaluations), but it was not worth my time.

The same problem exists with bespoke tools. Yeah, cool idea. But institutions subscribe to a certain number and type of tools, so there's standardization, both for training and for troubleshooting. Anything bespoke falls upon the instructor to support as de facto tech support, which again can suck up a lot of time.
I paint outside the lines when I teach, but I know that this potentially comes at the cost of my spending a ton of time supporting that tool.
Claim:
Moving from content overload (PDFs, slides, articles) to curated depth and dynamic resources is essential if anything is to stick. Stronger courses focus on fewer concepts explored more deeply, structured comparison, interpretation, and synthesis, & deliberate pauses for sense-making.
Response:
Again, this really depends on context. In an introductory course, or in certain undergrad courses where the goal is to quickly get people up to speed on certain basics, yes. You remove the chaff so that students can focus on the wheat. In other contexts, you do want students to struggle a bit with the readings. No one comes to a new discipline ready to read in that discipline, or ready to critically inquire and analyze. The LLM "era" of the last few years didn't start the content slop; content slop existed before that in the guise of low-quality listicles, unsourced claims on low-quality pages, repeat after repeat after repeat on YouTube of debunked learning theories, and predatory journals that will publish anything for a fee. Learning how to be a critical consumer of information is part and parcel of being a professional in a given field. Furthermore, the depth and breadth of content depend on the intended learning outcomes. Less isn't always more, just as more isn't always less.
Claim:
IMHO this also challenges the rigid, predict-everything-upfront course production model.
We used to spend 6–12 months meticulously planning online courses. Outlining every module, scripting every video, filming and building everything in one intensive production sprint.
The new reality demands a different approach: build, test, iterate, stay responsive.
You can’t always predict what students will need, or what the landscape will look like when they are actually taking the course.
Traditional online course development often confuses thoroughness with quality. You can have a meticulously planned 12-week course that completely misreads what students struggle with in week 3, making everything after that less useful.
The best course designers now work in tight cycles: teach something live, refine it based on real feedback, rework it, then move to the next piece. They build courses that evolve with students rather than aging out before they are finished.
This is better for faculty engagement as well, many of whom hate the design-everything-to-the-nth-degree model.
It is also better for institutions, with lower upfront costs, better resource allocation, faster responsiveness and easier course maintenance.
Response:
Build, test, iterate is a core tenet of good instructional design, IMO. You won't get an argument from me there, but this has very little to do with the original proposition in the first couple of claims, that "courses feel like they were built for a different era." Just because you test and iterate doesn't mean that you don't take your time to plan and execute a project that is as feature-complete as possible when you start. I've seen the 6-12 month figure quoted a lot, and I am sure it's there as rage bait at times 😅. Listen, if faculty had the space to focus only on course development for a sprint cycle, that 6-12 month timeframe would drop drastically. However, faculty (both tenure-track and non-tenure lecturers) have to balance course design with research requirements and committee work. When those things are tallied up into a 40-hour work week (which I suspect few in the faculty ranks actually keep to), you can see how a course design project balloons in terms of how long it takes to complete, even for well-meaning, well-trained course designers. I've tried to source that 6-12 month figure (although in my experience it's also been cited as a 9-12 month lead time), and I've fallen short. I may have seen a tangential mention in a Gilly Salmon book (if I remember correctly), but it was more observation or conjecture than an actual study.
The other issue with the above arguments is that we don't design bespoke courses for each cohort that comes in each semester. We have courses that fit in a curriculum, and based on that planning, we have some potential garden paths to follow. If someone wants to go out in the wilderness on their own, they can (they are adults), but a course always exists within an ecosystem. A good designer/instructor will have additional maps, paths, materials, and activities for folks who want to expand upon what is learned in the class, but a class isn't a free-for-all. Classes should adapt to a certain extent, but if folks are getting stuck in Week 3, that needs to be investigated as part of that build, test, iterate cycle.
As I wrap up, I also want to share my observation that the plan is nothing, but planning is everything. Planning and designing ahead of time is important to know the depth that you will go to. It also illuminates potential pathways. You may choose to get rid of the plan as you teach, but not doing that upfront work is problematic on a lot of systemic levels. And no, AI won't save you if you don't plan ahead.
Thoughts?