Technology

I have run out of interesting things to write about edtech.

Welcome to the new More or Less Bunk. I think this is version #4, if memory serves me well. I redesigned it again because I’ve started guesting in that computer science class I described in this post. Since I knew I was going to have to describe how to build actual web pages, I had to build one myself.  That would be my new landing page, and I had to redesign here at the same time because of the way I structured this site back in 2014.  I have more to do here, but this is yet another example of learning by doing on my part. I remain stunned that this sort of thing is now technically half my job description.

With more actual doing, I’ve become far less interested in pontificating.  It helps that I’ve been writing my next actual history book all summer. Lately, I’ve been doing a deep dive into the history of catsup!  So you’ll understand why I don’t much care about MOOCs or personalized learning or the coming faculty apocalypse (which, of course, JP and I already covered here).  Since I’m running a Faculty Learning Community (a term I picked up from the one and only Adam Croom) for our very incipient Domain of One’s Own project on campus starting in August, I still have to follow this stuff to some degree.  However, I’m pretty sure that I’ve run out of interesting things to write about edtech.*

Still, before I leave this subject for what may be a pretty long while, I thought I’d review where we’ve been over four versions of this blog.  In 2012, a bunch of people in Silicon Valley started claiming that MOOCs were going to disrupt education and make universities obsolete.  I spilled a ton of pixels worrying that they might be right.  It turns out they were wrong.  But the really interesting question from the history of technology standpoint is exactly why they were wrong.  The rather surprising popularity of this post about edtech and refrigerators made me want to review this because maybe it’s not quite as obvious to some people as it is to me.

Disruption theory is built on analogies.  If I remember right, Clayton Christensen invented disruption theory by looking at the computer storage industry, then applying those lessons elsewhere.  Eventually, he applied the same principles to higher education.  The same way that Silicon Valley shills like to pitch things as “Uber for ____,” there are useful and less useful versions of this kind of argument.  Frankly, I’m not sure that this is the correct chronological order, but “Uber for hotels” gets you Airbnb.  On the other hand, disrupting education the same way that Zip Disks disrupted the computer industry during the 1990s gets you a really shitty education – a.k.a. MOOCs.

The obvious reason for this is the degree to which the new thing replicates the old thing.  Storage is storage.  Someone’s house still gives you shelter, just like a hotel.  Someone’s car still gets you where you’re going.  And in all three of these cases, it gets you what you want much, much cheaper.  Reach back to refrigerators, and the new technology was actually a vast improvement over the old one, the icebox.  But it turns out that there isn’t much of a paying market for watching professors lecture and answering a bunch of multiple choice questions, at least among potential college students.

But even if there were, complete disruption isn’t exactly inevitable.  Sometimes the hotel itself is the reason for your visit.  Whether it’s a conference or just the pool and the buffet downstairs, hotels will always have something on Airbnb.  To go back to that refrigerator post again, some people actually prefer going to the laundromat to owning their own washer/dryer – particularly if they don’t have their own house.  Sometimes even if the experience seems better, disruption may take time or might never happen at all because of strange cultural considerations that mere business professors will never bother to contemplate.

So what’s the deal with education technology?  MOOCs were and remain a mostly lousy experience, except for corporate training apparently – perhaps because corporations don’t much care about the quality of the student experience.  Various efforts to disrupt other aspects of the college experience with edtech have met varying receptions.  Sometimes the reception has been good (think textbook rental services, for instance).  Sometimes the reception has been bad (think e-textbooks, for instance).  If the savings are worth the inconveniences of an inferior experience or can somehow provide a better experience, those companies will prosper.  If they aren’t, then we’ll have yet another fad on our hands.

What I’ve learned in my years of studying this topic is that there are actually a ton of really devoted people who are trying to develop and utilize various educational technologies to create useful and – at least in some cases – superior experiences to how colleges and classes operate now.  These efforts are, as you might expect, hugely labor intensive.  Therefore, they seldom appeal to private Silicon Valley companies trying to make a quick buck.  They do, however, appeal to all of us who are in higher education for the long run and are willing to try something new.

I got drafted to teach WordPress in a computer science class because I became one of those people.  What used to be peripheral to my job has moved to the center thanks to learning by doing.  While I may share a few of those experiments in this space moving forward, I’m afraid my days of long-winded pontificating about edtech are over.

Maybe it’s time to try history blogging again.  Anyone want to hear about the history of catsup?

* The one exception to that statement is an article that JP and I have in the hopper.  Actually, I drafted it from one of Poritz’s ideas and he’s been sitting on it for a few weeks now. It may see the light of day eventually, but if you’re reading this JP, I think you know what you have to do in order to make that happen.

Posted by Jonathan Rees in MOOCs, Teaching, Technology, 2 comments

Gophers.

I got exciting news yesterday: I’m becoming a computer science professor! I’m alright. Nobody worry ’bout me. It’s just for three days.

You see, my friend JP is teaching a CS class for pre-college Freshmen this summer and it’s going to start with getting them Reclaim Hosting sites, then teaching them how to control their own domains. Poritz, who codes his own pages like most people write prose, is so far ahead on this he actually needs help explaining this simplified process to ordinary people, so I’m coming in for the first three days to help talk the students through this process. Ironically, I’m hardly the greatest WordPress web designer in the world. [Indeed, THE Jason Jones owes me an e-mail or at least a post on improving one’s WordPress skills so I can redesign this site again as practice.] Nevertheless, over the last few years I’ve become quite good at modeling “Let’s all learn this together” behavior.

This is necessary because this whole concept of “Digital Natives” is complete rubbish. Yeah, I know that’s a rather common sentiment (at least in well-informed circles), but I’d actually go one step further: A lot of old people like me are a lot closer to being digital natives than college students are. After all, I was on a college campus for most of the Nineties. I actually learned (and have now completely forgotten) Gopher in an 80-part e-mail course. By which I mean the Internet protocol, not the animal.

[Gopher GIFs, via GIPHY, appeared here.]

So I literally have decades of experience being uncomfortable on the Internet.

I’d argue that this is a good thing. One of the many things I learned writing a book with Poritz was the origins of the fake word “app.” Yes, I already knew it’s short for “application,” but what JP taught me is that the whole point of applications is to perform a particular function for you so that you don’t have to worry about it. By making things easier, apps make you more likely to hand over your cash, your data, or perhaps both.

As a stereotypical liberal college professor, the whole “Fake News” thing from last year scared the Hell out of me, and would have done so regardless of the outcome of the election. Since the Internet is so important to everyday life and is already (for good or for evil) taking over the college classroom, I’m committed to helping students understand how to think critically about something that’s inevitably such an important part of their lives. With an epidemic of fake Founding Fathers quotes perverting our politics, the relationship between this project and history professing should be obvious.

Or we can all be gophers and climb back into our holes and wait for Bill Murray to blow up the golf course for us. Pardon me if I prefer to be more pro-active.

Posted by Jonathan Rees in Personal, Teaching, Technology, 0 comments

The means of educational production.

I’ve had two articles come out in the last two days, and I think both deserve at least a shout-out here. The first is a Chronicle Vitae “column” about teaching that has been well-received on Twitter. Give it a look if you’re interested in teaching….or trucks.

The second is a collaboration between my co-author Jonathan Poritz and me in the AAUP journal Academe. While it obviously shares some similarities with Education Is Not an App, I like it a lot because it’s such a good collaboration that I can’t tell where my ideas stop and JP’s begin. The one exception to that is the reference to the “shopfloor” in the title of the essay (as I’m the labor historian of the two of us) – and a few very stray references to Marxism/Leninism in the text.

This is the residue of what was the first conclusion to this piece, all of which ended up on the cutting room floor. However, I want to resurrect a bit of it here for the sake of added value. While JP and I were discussing shared governance during the planning process for that article, it suddenly struck me just how unique shared governance is. After all, what other workers besides college professors have even a fraction of the control over the conditions of production that we do? We work alone. As long as we don’t make the mistake of using the learning management system, there are few direct records of our work and our output is almost impossible to measure accurately.

I’m not saying that professors should have completely unfettered control over their workplace. That’s why it’s called shared governance, after all. However, our training and expertise have traditionally bought us far more autonomy than most other workers. Technology is a threat to that autonomy. If you want to see why, look at practically every other post on this blog going back five or six years.

But – and this is where my epiphany comes in – unlike skilled production workers, college professors don’t have to unite with anybody in order to control the means of production. By employing whatever educational technology best suits our needs, we can ride the wave of automation all by ourselves – as my Chronicle Vitae piece suggests, automating the tasks that should actually be automated, and utilizing our skills to handle the edge cases that come up in teaching every day. Because we already control the means of educational production, we don’t have to give it up without a fight.

The problem comes up either when the labor supply expands beyond what the market can absorb – see Marc Bousquet on grad students as a waste product – or when technology enables our employers to try to re-define what learning is. Shared governance is our protection against both these kinds of changes. That’s why fighting for its continuation can be revolutionary all by itself.

Posted by Jonathan Rees in Academic Labor, Shared Governance, Technology, Writing, 1 comment

With or without edtech.

Earlier today, my Twitter friend Jon Becker @ed me a link to an EdSurge essay about edtech and refrigerators, suggesting that I was “the only person qualified to comment on this.” Indeed, I can say with some certainty that I am the only person in the world who has written two books on refrigerators who is also interested in education technology. So when the Clayton Christensen Institute for Cheerleading Disruption of All Kinds throws a slow, hanging softball made especially for me, how can I possibly resist?

For those of you who refuse to read anything produced by the Clayton Christensen Institute on principle (and I have some sympathy for that position these days), let me save you a click. The author, Julia Freeland Fisher, uses research on comparative appliance adoption rates by her colleague, Horace Dediu, to argue that:

[I]t’s becoming increasingly acknowledged that we need to pair investments in edtech tools with investments in professional development. But for the tools and models that least conform to traditional school structures, we’re also likely to need investments in fundamental reengineering—that is, not just developing teachers’ proficiency in using tools but rethinking processes like schedules, evaluations and staffing throughout an entire school building or district.

What do refrigerators have to do with restructuring schools? In order to use a new refrigerator, consumers only had to plug it in. In order to use washing machines, on the other hand, consumers needed plumbers to help them and maybe a whole new set of pipes in their houses. That’s why refrigerators became much more popular, much faster than washing machines, and that’s why you need to change the way schools are structured so that they can best take advantage of all the wonderful new education technology that EdSurge must cover every day.

The first thing that jumped out at me about this article was Fisher’s basic dates in the history of the refrigerator. She says the refrigerator debuted in the 1930s. In fact, the first electric household refrigerators appeared during the 1910s. They were already being mass-produced by the late-1920s. “Refrigerators quickly took hold,” she writes, “gaining over 90 percent adoption by the late 1950s.” I actually used the exact same statistic in my book Refrigeration Nation (p. 179, for all my fellow refrigerator aficionados who want to consult your own copies), but I used it to make the exact opposite point about refrigerators. In 1957, when over 90% of American households had refrigerators, only 12% of French households had refrigerators and less than 10% of English households did. If refrigerators were really that great, why didn’t they too just plug them in and enjoy the bounty?

As a historian, this is where I became really curious about where Fisher got her statistics. While she namechecks her colleague Dediu, there’s no link in the piece to any published study about refrigerators and washing machines. Indeed, the only link in the entire essay is to a general study about technological diffusion. There’s a chart in Fisher’s essay about comparative adoption curves, but there’s no source listed for that either. Other than completely leaving out the bottom left, the curve for refrigerators looks OK to me, but how can I trust her point about washing machines if I don’t know anything about the source? How can I be sure that this isn’t the edtech equivalent of fake news?

That’s why I opened a book. Ruth Schwartz Cowan’s More Work for Mother is a classic in the history of technology and pretty close to the only scholarly work that tackled the history of refrigerators at any length before I did. Since it is a general history of appliances, I figured it might have a little bit about washing machine adoption rates in one of the sections I had forgotten about. So I pulled it down off my shelf, turned to the index and quickly hit the jackpot: “washing machines…diffusion, 195-96.” Here’s the quote:

“[I]n 1941–roughly thirty years after they came on the market, and twenty years after the prices had fallen to more or less reasonable levels as a result of mass production–only 52 percent of the families in the United States owned or had “interior access” to a washing machine. Thus, just under half the families in the land were either still hand rubbing or hand cranking their laundry or using commercial services.”

If you’re wondering, the Fisher/Dediu number is about 10 percentage points lower than the one that Cowan used. Perhaps this can be explained by the difference between owning a washing machine and “accessing” a washing machine in the basement of your apartment building or taking your dirty laundry down the street to a laundromat. But for purposes of Fisher’s overall point about edtech, this distinction means everything.

Can you live without a refrigerator? Most Americans can’t. [Indeed, the refrigerator adoption rate in the modern US is actually 99.5%.] However, French or English people in 1957 still had easy access to fresh meat and produce at large markets.  Many still choose to live that way today because fresh perishable food tastes better. Americans, on the other hand, tend to prefer convenience over taste. That’s why the refrigerator industry was one of only three in the whole United States to grow during the Great Depression.  Anyone who had any money to spend at that time greatly valued the added convenience of electric refrigerators over ice. By 1960, the old ice industry had basically disappeared because it ran out of customers.

Can you live without a washing machine? Of course you can. That’s why there are still coin-operated washing machines and laundromats. Keeping your food in other people’s refrigerators isn’t an option in the United States, but you don’t need constant access to a washing machine in order to get your clothes washed by machine when needed. In other words, owning your own refrigerator is close to the only way to have access to refrigeration, but dragging your dirty clothes to any laundromat is a reasonable way to get access to a washing machine even if there is none in your home or apartment.  In short, refrigerators are close to a necessity. Washing machines are just really, really convenient.

Can you live without edtech? [You just knew I had to get around to edtech here eventually, right?] Shockingly enough, there were actually good schools in the United States long before Bill Clinton and Al Gore decided to put a computer in every classroom. Plenty of teachers and professors offer great classes of all kinds without anything more sophisticated than their voices and a chalkboard. Weirdly enough, just this morning, right after I read that article, I was pitching our dean on starting a digital humanities program in our college. “What about the professors who don’t want to use technology?” he asked me. I said I would never in a million years force any teacher to use technology if they don’t want to, but it’s actually a good thing if students have a wide range of classes in which they can enroll, some of which use educational technology and some of which don’t.

Which brings me to the fundamental problem with the Clayton Christensen Institute for Cheerleading Disruption of All Kinds. The whole assumption behind that article is that one technology will always inevitably drive another technology to extinction: Refrigerators will completely replace ice, washing machines will completely replace washboards and edtech will completely replace conventional teaching. That is only true for the first of those examples (and even then, only really in the United States). Whether teachers want to teach with or without edtech is a cultural decision, not some hard and fast rule determined by the universal laws of technology.

Unless, of course, you have some other axe to grind…

Posted by Jonathan Rees in Refrigeration Nation, Refrigerator, Teaching, Technology, 5 comments

My adventures in digital history.

These are my remarks as written (if not exactly as delivered) in Paul Harvey’s history seminar at the University of Colorado – Colorado Springs this morning:

I recently wrote an essay for the Chronicle of Higher Education called “Confessions of an Ex-Lecturer.” Yet my appearance in this class (well, in the first part of this class anyway) is going to be a lecture. Yes, I’m going to lecture about why and how I stopped lecturing. To get past this enormous contradiction, let me make a distinction between conveying historical content and making a pedagogical argument. You have no reason to memorize anything I say today. There will be no quiz later. Instead, this lecture explains my thinking about teaching history and tries to convince you that I’m right. I’ve adopted a lecture format here because I have to tell the story of how my thinking has changed in order for you to follow along with my reasoning.

My opinions on this subject are not popular in historical circles. As one of my former graduate school acquaintances put it on Twitter the other day: “[T]hey will pry the lecture out of my cold, dead hands.” I sympathize. Old habits die hard. That’s the way I learned history when I was in college. Indeed, I never had a class of any kind in college that had fewer than thirty people in it and the vast majority of those class periods consisted of people lecturing at us. A lot of those professors were really good at what they did – although I did take a class from a political science professor who looked up at the ceiling as he talked, which drove me completely crazy….but that’s a story for another time. The reasons I’ve sworn off lecturing in my own classes are twofold.

First, there’s the advent of the cell phone. These small supercomputers have so permeated daily life that the average person – notice how I didn’t say average student – average person can’t go ten minutes without reaching for their phone at least once. Indeed, stick me in some meeting where someone starts lecturing about something that I’m not particularly interested in and I’ll reach for my phone far faster than that. I could be the most interesting lecturer in the world (which I most certainly am not), and a good number of you would still reach for your phones at some point during the presentation.

Please understand that I’m not blaming millennials here. I’m blaming everybody. For so many of us, the temptations of the Internet are just too hard to resist. “When people say they’re addicted to their phones, they are not only saying that they want what their phones provide,” writes the MIT psychologist Sherry Turkle, “they are also saying that they don’t want what their phones allow them to avoid.” If I’m talking at you in a classroom of any size, it is ridiculously easy for you to avoid me and I’m not going to be able to change that. Therefore, if I have to talk at you, I had better make darn sure that I have something interesting to say.

So what if I give you the opportunity to do something rather than to passively absorb information? What the Internet taketh away, it also giveth. My interest in digital history comes from my interest in finding some alternative to lecturing about historical facts and then testing students on how many of those facts they’ve retained. I know this is sacrilege in most historical circles, but I’m gonna say it anyways: You really can Google anything.

The Internet is well-developed enough that most of the time a discerning consumer of information can get reasonably reliable factual information very quickly with limited effort. But, and this is the second reason I’ve basically given up lecturing, with limited technical knowledge it is now possible for ordinary college students to make useful contributions to the great pool of historical information available online. Not only that, by doing so, they can pick up practical computer skills that will increase their employability upon graduation. With that kind of upside, taking some of the attention in class off of me seemed like a small price to pay.

One of the most interesting things about digital history is that this field lets you make professional use of skills that you probably picked up just by being an active digital citizen. For example, I started blogging right after I got tenure in 2003 because I was a lot less worried about someone threatening my employment because of my political opinions. Oddly enough, I devoted my entire blogging life to one subject: Walmart. I learned WordPress from a guy named Jeff Hess in Cleveland, Ohio via e-mail. Jeff was the administrator of our group anti-Walmart blog.

In 2007, when my department wrote and was awarded a Teaching American History grant from the federal Department of Education, I used those skills in class for the first time. We were funded to take teachers to historic sites on the East Coast over the summer and this was a way that they could write easily from the road and that we could still follow them. So could their relatives, friends and even students, which served as a nice side benefit – a benefit that applies to all sorts of history undertaken on the open web.

Another skill I already had which turns out to have enormous digital history ramifications is some proficiency in social media. Personally, I’m a stonecold Facebook hater, but Twitter has been a godsend to me with respect to digital history – not so much in class but for keeping up with the field. Your professor, for example (if you didn’t already know), is a prolific Tweeter, if more on American religious history than on digital history and things technological. More importantly, my students have used it to reach out to scholars in fields that they’re researching.

It’s also a great tool for publicizing the work you do online. I actually got a book contract thanks to Twitter (although not in history). If you’ve spent any time listening to the Canadian scholar Bonnie Stewart as I have, you’ll understand how social media in general and Twitter in particular are great tools for building communities of interest – and I mean that both in terms of what you enjoy and as a way to fight for what you believe.

With respect to digital history, the turning point for me in particular was the summer of 2014 when I attended an NEH Institute at the Roy Rosenzweig Center for History and New Media at George Mason University in Virginia. Me and a bunch of other folks who never studied this stuff in grad school got a very intensive tour of what’s on the web, web tools and how we might want to integrate them into our classes. Some of it was old hat for me. Unlike a lot of my fellow professors, I had already heard of two-factor authentication and password protection programs.

However, when it came to history-specific web tools almost everything they touched on was brand new to me. One I was already using, but learned to use better, is Zotero, which actually began at the Roy Rosenzweig Center for History and New Media and really ought to be on every historian’s must-use list. Zotero is a notes program that lets you gain intellectual control of your research by allowing you to search it at the word level. That includes content inside digital copies of things that you’ve scanned and uploaded. As someone who wrote his dissertation on 4×6 notecards, I can tell you I am never, ever going backwards on this. That’s why I’m now requiring all my students doing research papers to use it. My students constantly tell me how grateful they are to know about Zotero, and how they wish they had known about it two or three years earlier.

A jaw-dropping research tool for digital historians that I first learned about in Virginia is CamScanner. CamScanner is an app that turns your cell phone camera into a document scanner. If I could show you the huge pile of Xerox copies I made for my dissertation at 25 cents, 50 cents…even a dollar a pop, you’d know why this is so amazing. Having access to free copies of documents from archives makes it easier to acquire information over what is often very limited research time. I had some experience with researching this way when the Manuscripts Division at the Library of Congress installed the greatest book scanners that I had ever seen in order to preserve the physical well-being of their collections (since bending things back for ordinary copying does so much damage). Now I’m swimming in information – information that’s searchable using Zotero. The same is true for my students, as I have them working with local archives in my digital history classes.

The program I settled on for them to use is Scalar, which comes out of the University of Southern California. It’s actually designed as a book publishing program, something that allows books to appear on the web with digital media embedded into them. I’ve been using it in class for web exhibits. Study after study has shown that putting resources up on the web drives traffic to physical archives and libraries rather than taking it away, so I’ve had my students create Scalar projects using local resources and put them up on the web. Here’s a recent example from the Steelworks Center of the West that I liked a lot. Here’s another about a place I think that everyone in this class ought to know well.

Why Scalar? You don’t have to know how to program in order to make it look good. Indeed, as the experience of countless of my students has more than proven, you can learn how to use it within just an hour or two of starting to play with it. Indeed, I have plenty of students who can use Scalar far better than I can because they’ve had far more reason to use more features than I have, since I simply use it to put up a few syllabi (although I have since trained to do more).

Another reason I like Scalar is that students and faculty who use it can host their own Scalars if they go through Reclaim Hosting. This is not the place to argue why faculty and students should take back the web from university administrators and private companies (although I did co-author a book that fits in well with that argument), but one of the best things about the Reclaim-related “Domain of One’s Own” project is that it allows students to keep access to their digital work even after they’ve graduated. The Scalar projects students create through Reclaim can therefore serve as evidence to potential employers that they can do something other than just historicize things. Not that there’s anything wrong with the ability to historicize things, but in this manner digital history might actually be the answer to the age-old question, “What can you actually do with a history degree (besides teach)?”

On a personal level, my digital history experiments have proved much more interesting than standing up and lecturing to uninterested students about the same old things that I had always been lecturing about. In the future, I’m dying to get into digital mapping, as the Steelworks Center of the West has an absolutely astounding collection of mine maps that cover both towns and mines. I imagine a digital project that traces the physical impact of mining on Southern Colorado’s landscape as soon as I have enough theoretical background to pitch it to some funding agency. What’s really great is that thanks to my changes in pedagogy I’ll be able to get my students to pitch in.

When I was at the American Historical Association meeting in Denver a few weeks ago, I attended almost nothing but digital history sessions. I was really struck at those sessions by how willing everyone was to admit that they have no idea what they’re doing – that the whole field of digital history is kind of a running experiment. To paraphrase one scholar I heard at the meeting, digital history blurs the line between research, teaching and service. In my case, I’m having students do historical research and putting it on the web for the benefit of local historical non-profits. I think the benefits of doing this far outweigh whatever harm that gets done to my ego if I’m no longer the center of attention in class.

Posted by Jonathan Rees in Digital Humanities, Teaching, Technology, 0 comments

MOOCs: A Postmortem

MOOCs are dead. “How can I possibly argue that MOOCs are dead?” you may ask. After all, to borrow the stats just from Coursera, they have: 1600 courses, 130+ specializations, 145+ university partners, 22 million learners and 600,000 course certificates earned. More importantly, it appears that Coursera has received $146.1 million over the years. Even though it hasn’t gotten any new funding since October 2015, unless Coursera tries to copy “Bachmanity Insanity” (Is Alcatraz still available for parties?) the company is going to be sticking around for quite a while.

What I mean when I say that MOOCs are dead is not that MOOCs no longer exist, but that MOOCs are no longer competing against universities for the same students. Continuing with the Coursera theme here, in August they became the last of the major MOOC providers to pivot to corporate training. While I did note the departure of Daphne Koller on this blog, I didn’t even bother to mention that pivot at the time because it seemed so unremarkable, but it really is remarkable.

Do you remember Daphne Koller’s TED Talk? Do you remember how incredibly utopian it was?  In truth, it made no bloody sense even then. For example, she suggested back at the height of MOOC Madness that:

[M]aybe we should spend less time at universities filling our students’ minds with content by lecturing at them, and more time igniting their creativity, their imagination and their problem-solving skills by actually talking with them.

I agree with that now. In fact, I agreed with that then too. The problem with that observation to almost anyone who actually teaches for a living remains that talking with students is obviously impossible when you have ten thousand people in your class. More importantly, showing students tapes of lectures (even if they’re broken up into five minute chunks) is still lecturing.

That’s why MOOCs were never going to destroy universities everywhere. There will still be far more than ten universities fifty years from now. Or to put it another way, the tsunami missed landfall.

But just because this blow proved to be glancing doesn’t mean that the punch didn’t leave a mark. For example, a lot of rich schools threw a lot of money out the window investing in Coursera and its ilk. [Yeah, I’m looking at you, alma mater.] Others simply decided to spend tens of thousands of dollars on creating individual MOOCs that are now outdated almost by definition since they’re not designed for corporate training.  Yes, I know that MOOC producers claim that their MOOC experience improved teaching on campus, but think how much better teaching on campus would have been if they had just invested in improving teaching on campus.

At best, MOOCs were a distraction. At worst, MOOCs were a chronic condition designed to drain the patient of life-giving revenue. Instead, those schools could have used that revenue (as well as their initial investments) for other purposes, like paying all their faculty a living wage.

My inspiration for this observation (and this entire post) is the MOOC section of Chris Newfield’s terrific new book, The Great Mistake: How We Wrecked Public Universities and How We Can Fix Them.*  This is from page 227:

MOOCs were not going to leverage public colleges by buying them.  But they could acquire a share of their revenue streams–that combination of student tuition and enrollment-based public funding–whose capture is one of the five key elements of privatization…MOOCs could leverage their private capital with far greater sums flowing [to] colleges and universities without buying anything up front.  This offered the attractive prospect to strapped public colleges of gradually replacing even more tenure-track faculty with technology that could be managed by private MOOC firms off campus, for a reasonable fee.

To make one of my favorite distinctions, this applies to schools that are MOOC-producers (like Arizona State) even if those MOOCs are mainly for internal consumption, and especially to all those colleges and universities that were potential MOOC consumers – any place that considered replacing their humdrum, ordinary faculty with all the “world’s best lecturers.”

In order to capture part of that revenue stream, MOOC providers had to argue that their courses were better than the ones that students were taking already.  That explains all the hating on large lecture courses.  Except, MOOCs were nothing but large lecture courses dressed up with technological doodads.  As Newfield explains on pp. 242-43:

In effect, MOOC advocates were encouraging students to leave their state universities to attend liberal arts colleges, where they could go back to the future of intensive learning in the seminars that typify “colleges that change lives.”  But of course they weren’t.  Advocates were actually claiming MOOC ed tech could create liberal arts colleges all for next-to-no-cost (Koller) or greatly lowering costs (Thrun).  In making this claim, they ignored existing historical knowledge about learning at high-quality institutions, which made the technology seem original, when it was not.

MOOCs may have been cheaper (and Newfield even disputes that), but they certainly weren’t better – even than large lecture classes.

Again, the vast majority of us faculty foresaw this particular Titanic hitting the iceberg (including me, even if it did take me a while). Nevertheless, university administrators who partnered with MOOC providers or (even worse) bought their products trusted Silicon Valley types more than their own faculty. This course of action was a reflection of the same self-loathing that Audrey Watters describes here:

There seems to be a real distaste for “liberal arts” among many [in] Silicon Valley it seems – funny since that’s what many of [these] tech execs studied in college, several of whom now prominently advocate computer science as utterly necessary while subjects like ethics or aesthetics or history are a waste of time, both intellectually and professionally.

Yet at least these Silicon Valley types had enough self-awareness to go into a different field after they left college. What’s the excuse for a university administrator with an advanced degree in the humanities (or anything else for that matter) to hate their educations so much that they spend hundreds of thousands of dollars to deliberately undermine them?  There is none. They should have known better.

Next time Silicon Valley comes up with a new way to “disrupt” education, let’s see if we faculty can invest more time and effort in getting our bosses to listen to common sense.  Instead, as Newfield notes in his postmortem of Koller’s TED Talk on p. 241 of The Great Mistake:

The categorical discrediting of faculty pedagogy made this bypass of faculty expertise and authority seem reasonable and necessary for the sake of progress.

So in the meantime, let’s fight to improve shared governance everywhere so that we’re prepared to fight for quality education if our bosses refuse to accept the obvious.  Some of us becoming temporarily famous was not worth wasting so much money and effort on a technology that was so obviously going to prove fleeting.

* Full Disclosure: Newfield and I have the same publisher even though we publish in entirely different fields.

Posted by Jonathan Rees in Academic Labor, MOOCs, 4 comments

Get your side hustle off.

I’ve been streaming a lot of Simpsons with my son lately. Backwards. Since I quit watching the show regularly sometime in the late-90s, this was the best way that we could both enjoy all-new material. The quality of even the most recent stuff is obviously the good thing about streaming the Simpsons. The bad thing is being locked into watching all those Fox commercials since my cable company (or maybe it’s Fox) won’t let us fast forward. The above Uber commercial has been on full rotation for months. In fact, it sometimes plays twice an episode. I’ve been making so many “earnin’/chillin'” jokes that my son now leaves the room when it comes on.

I thought of that commercial twice while I was at the AHA in Denver last weekend. The first time was when I explained to four historians from Northern California (ironically, the first place that I ever took an Uber) how Uber works. [Key lesson: Always tip your driver!] The second time was when I went to my first digital humanities panel on Saturday morning. The commentator, Chad Gaffield from the University of Ottawa, was talking about how DH makes it possible to break down the false dichotomy between work and play. That spoke to me, because I’ve been having an awful lot of fun teaching all my classes lately. Indeed, I’m going to bring that up the next time I hear someone who teaches like it’s still 1995 start talking about “rigor.”

The other point Gaffield mentioned that I thought was really important was the way that DH blends the traditional roles of teaching, research and service. In my case, I teach students how to research using local resources that help the community once they appear online. However, I suspect there are a million variations to that. In any event, when we fill out our annual performance reviews, we can all include DH work in whichever category we don’t have enough material in already.

In the very early days of this blog, the role of tech critic was something of a side hustle for me. It wasn’t my day job, but my writing nonetheless found an audience. It’s through the conversations that writing inspired that I stumbled into a large, multi-disciplinary pool of scholar/teachers who were trying to utilize the Internet to create unique educational experiences rather than cheap, dumb carbon copies of face-to-face courses. I started teaching online so that I could try to set a positive example for other people who might be reluctant to make the same jump because so much of what’s out there has a justifiably bad reputation. I still have a long way to go, but one of the most refreshing things I got out of all the DH panels I went to last weekend is that so does everybody else. Even historians who get their DH papers onto AHA panels readily admit that their learning curve remains steep.

By the time I left Denver on Sunday, I had decided I’m never going back. I don’t want my conventional courses to be entirely conventional anymore. In other words, I’ve been convinced that the digital needs to be present in every course I teach.

I am hardly the first person to draw such a conclusion. CU-Boulder’s Patty Limerick wrote in the September 2016 issue of AHA Perspectives that:

In innumerable settings, historians in Colorado are stepping up to this challenge. In the process, they are devising practices that transcend the conventional turf fights between “academic history” and “public history,” uniting in the strenuous and satisfying work of “applied history.”

I think you could make a pretty good case that food and refrigerators are relevant today, but it’s my classes which take students into the Steelworks Center of the West and the Pueblo Public Library that fit this definition of “applied history” the best.

While such activities have little to do with my current research, teaching is 50% of my job according to the annual performance review I’ll have to turn in a couple of weeks from now. In short, what was once my side hustle has now become my regular hustle. While there’s still a lot of tech criticism left to write, and I plan to write at least some of it when I have the time, this blog (and why would I have redesigned it if I had intended never to use it again?) is going full pedagogy.

In the meantime, I have another actual history book I want to write…

Posted by Jonathan Rees in Digital Humanities, Teaching, Technology, 0 comments

More mistakes from my first semester teaching the US History survey online.

My first semester as an online instructor is almost over. Who knows where the time goes?

Curating a respectable online survey course experience comes with a lot of responsibility. In my humble opinion, too many online US history survey courses cling to the vestiges of the traditional lecture model. As I’ve explained here and here, mine is more like an English composition class. While I’ve enjoyed teaching it so far, the whole thing is far from perfect. So in the interests of transparency and helping anyone out there who might actually be interested in following my path, I’m going to try to explain more of the mistakes I’ve made (besides this one), as well as all the fixes that I’ll be implementing when I re-write the syllabus over Christmas break for next semester’s students.

1) Many years ago, when I first started at CSU-Pueblo, I asked an Associate Provost whether he thought I should have an attendance policy. “Do it for their own good,” he responded, and I have mostly stuck with that advice. Of course, an attendance policy makes no sense in the context of an asynchronous, entirely online course, but you still need your students to log in to do the work. I can’t tell you the number of times over the years that I have lamented the fact that when I remind students that there is an attendance policy, the people who need to hear it usually aren’t in the room. Telling students to log in and do the work when they never bother to log in is even more frustrating.

That’s why I’m moving to mandatory meetings (in person or via Skype) during the first two weeks of class, when everyone’s working on setting up the various accounts and programs I require. On a human level, I suspect it’s a little harder to abandon a course when the professor is a person rather than a screen presence. On the more practical level, my one question during those meetings is going to be, “How do I reach you if you suddenly disappear?” Yes, I realize that online courses have always had higher dropout rates than their face-to-face alternatives, but that doesn’t mean that I can’t try to make my course something of an exception.

2) Another typical problem I’ve had is with the discussion aspect of the course. For one thing, it proved next to impossible to get and keep a good discussion going with a very small number of students (although things have gotten better towards the end of our time together), even though Slack has worked beautifully for student/teacher communications. A bigger class aside, I think my problem here was requiring too little. I’ve been asking for a question and an answer and a document summary for each two-week unit. In the future, I’m going to up that to once a week, and increase the percentage of the grade that goes for discussion. I also need to recommend the Slack mobile app a little more forcefully, as it has been great for keeping track of those discussions that went well.

3) As part of those unit assignments, I’ve been requiring students to bring in a source from the wider Internet in order to evaluate it. I LOVE the fact that I can conceivably do that well in this format, especially after reading the summaries of that bone-chilling Stanford study about students and fake news. The problem in my class has been that students don’t have any context to evaluate what makes something reliable. Indeed, the best answers I could get all revolved around the origin of the story. “It’s from the New York Times, of course it’s reliable.” Nobody cared where the NYT was getting its info.

My plan is to move the outside sources out of the week-to-week writing assignments and into the pre-exam section of the course, and try to ban them outright for the bi-weekly essays. Too many people were using Google to write their assignments anyway, and not looking at the credible assigned texts. If I move the wider web stuff to the end of the course sections, then they’ll have weeks of assignments and textbook reading that they can compare their outside sources to. If a reliable, already assigned primary source (or even the textbook) corroborates their outside source, then we can all gain a better understanding of what reliability really means.

4) I’m just gonna come out and say it: Online grade books are shit. Yes, the one in Canvas is better than the one in BlackBoard, but if your grading scheme includes something as simple as dropping the lowest grade of any kind of assignment (as mine does) it is impossible to get these systems to do what you want them to do and still have a reliable total at the end of the row of columns. And don’t even ask me about converting letter grades into points. It’ll just make me angry.

This whole problem reminds me of why I resented grade books for so long back when I was only teaching more conventional classes. Students would constantly ask me what their grade was and I’d say, “Do the math.” The math wasn’t that hard, but they were so used to getting a simple running grade total handed to them on a platter that my response made me look like an asshole. Yet there are advantages to not keeping a running total.  For example, I can do crazy things like grade up for improvement over the course of the semester or even curve my results if I decide that my constant pedagogical experiments proved too much for that semester’s students.

So what am I going to do about this? First, I’m going to try to disable the final grade mechanism entirely so that all students can do is read their letter grades. I think that might work if I assign zero points for each assignment and use a separate spreadsheet at the end of the semester (the math involved really is simple – see the sketch after this list). If that doesn’t work, I’m going to stop using a grade book entirely. Sometimes old school is better than dumb school.

5) Read the syllabus:

If you thought reading the syllabus was important for regular courses, it is probably five times as important for online courses because your students don’t get the benefit of listening to you repeat reminders at them all semester. As usual, some students clearly did read it, but others just as clearly didn’t.

What to do here? At first I was thinking about a syllabus quiz, but that’s so boring. My new idea is an online treasure hunt that will force students to go back through the other programs I’m forcing them to use. [What is in the crazy online GIF that I embedded in the first response in the Slack #random channel? Send the answer to me as a Slack direct message.] Stick those commands in random places in the middle of the syllabus (and grade them), and maybe I can kill two birds with one stone.
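Speaking of the grade book math from point 4, here is roughly what I want out of a grading system, sketched in a few lines of Python. To be clear, this is just a minimal illustration with made-up categories, weights and scores, not my actual grading scheme:

def category_average(scores, drop_lowest=True):
    # Average a list of 0-100 scores, dropping the lowest one first.
    kept = sorted(scores)[1:] if drop_lowest and len(scores) > 1 else scores
    return sum(kept) / len(kept)

# A hypothetical student in a hypothetical three-category scheme.
gradebook = {
    "discussion": ([85, 0, 92, 88, 90], 0.25),  # the zero gets dropped
    "essays":     ([78, 84, 91], 0.45),
    "exams":      ([81, 87], 0.30),
}

total = sum(category_average(scores) * weight
            for scores, weight in gradebook.values())
print(f"Running total: {total:.1f}")

That is the entire computation: drop the lowest score in each category, average what’s left, then weight the categories. A spreadsheet can do it in one row. Whether any LMS grade book can be coaxed into doing it reliably is, as I said above, another question entirely.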

Yes, there are a few more mistakes that I know I’ve made, but my issues with Gravatars or my troubles with Hypothes.is groups were in no way pivotal to the success or failure of the class. The mistakes covered here are enough for public consumption. In the meantime, your thoughts and suggestions about what’s here would be much appreciated both by me and my fellow denizens of the CSU-Pueblo Center for Teaching and Learning who are teaching online for the first few times and trying to make their courses better too.

Posted by Jonathan Rees in Online Courses, Teaching, 4 comments

I’ve come out of MOOC retirement.

I remember exactly when I took my first MOOC. It was Fall 2012 (during the run-up to the last presidential election) and I was on sabbatical.  If I didn’t have all that extra time on my hands I never would have finished it.  I’ve signed up for a couple more since then (like the one with edX just so I could see if their syllabi actually had required reading), but I never really did any work on those.  Recently, I started classifying my Coursera e-mail as spam so I wouldn’t even have to think about MOOCs quite so much anymore.

Yet much to my shock, I’ve come out of MOOC retirement. While I’m not doing the work for the University of Houston’s Digital Storytelling MOOC, I have decided to watch the videos because I really want to be able to introduce digital storytelling as a possibility into my next digital history class.  All I really need is some knowledge about the tools with which I can experiment. When I actually teach this stuff we’ll all kind of fake it together.

That’s good, because if I had wanted to do the work in the MOOC and get it graded, I’d have had to pay Coursera for a certificate.  So much for open.

Not only is the grading now a privilege you have to pay for, Coursera is pushing the opportunity to get a certificate at the end of every video. Here’s an exact quote of their nagging ad (at least in week #1) in its entirety:

Purchase a Certificate today, and you’ll support Coursera’s mission to provide universal access to education.

Open access.  We want everyone to have access to the world’s top courses. We  provide financial aid to all learners with need.

New courses. Revenue from Certificates funds new course development.

Of course, the very existence of Coursera’s many investors is never acknowledged.

Going back through my blog archives, it wasn’t hard for me to find the post where I saw this coming. I wrote this in 2014:

[Coursera co-founder Daphne] Koller, and by extension the rest of the MOOC Messiah Squad, are performing a huge intellectual switcheroo by making arguments like this one. They’re replacing the promise of universal higher education with the promise of universal ACCESS to higher education. We’ll let you listen to our superprofessors for free, she is essentially saying, but you have to do the hard work of obtaining an actual education all by yourself.

If you think this change is why Daphne Koller left Coursera, remember that it took her two more years to actually leave. The investors’ desire to monetize Coursera overtook the promise of educating the world long before she actually departed.  At least when you give PBS $50, they’ll give you a free tote bag.

Actually, Coursera’s business plan now reminds me more of a company like Evernote than of public broadcasting. Provide a free service that people find useful, then constantly upsell your customers in the hopes that they might pay up for it. I still use Evernote even after they limited the free service to a total of two devices because it’s useful to me. I haven’t paid them a cent. Evernote is well on its way to going out of business.

I’m well past caring whether any particular Silicon Valley company, be it Evernote or Coursera, is actually making money. What I remain concerned about is the creeping corporatization of higher education.

To explain what I mean here, I’ll pick on my alma mater, the University of Pennsylvania. Here’s Penn President Amy Gutmann from way back in 2012 (when MOOCs were young):

“Penn is delighted to participate in this innovative collaboration that will make high-quality learning opportunities available to millions of people around the world,” Gutmann says. “Expanding access to higher education both nationally and globally remains one of our most critical responsibilities. This initiative provides an invaluable opportunity for anyone who has the motivation and preparation to partake of a world-class education.”

But Coursera isn’t helping Penn provide “high-quality learning opportunities” to “millions of people around the world” anymore. They’re helping Penn provide mostly static content to millions of people around the world and access to low-quality learning opportunities for people with the willingness and resources to pay for it. Heck, they might as well just go back to the old MIT model of taping course lectures there and putting them online. Why not (partially) cut out the middleman and just put your videos up on iTunes U?  Because Penn is an investor in Coursera, that’s why.

MOOCs were never about universal higher education. They were always about making money.  Faculty and students at any university with a MOOC partner ought to recognize that by now, and pressure their schools to un-partner immediately. Then they can develop their own platforms and offer their own MOOCs on any terms they want. Hopefully, those terms will go back to really being open again.

Posted by Jonathan Rees in MOOCs, 7 comments

My favorite mistake.

 

When I decided to teach online for the first time this semester, I was determined to throw out my old survey class and rebuild a new one from the bottom up. My main design concept was to create a US history survey class that didn’t do what the web does badly and took advantage of what the web does well. The class is built around writing. [You can see my two earlier posts about the structure of the course here and here.] I thought those of you who are still bothering to read this blog might be interested in how it’s going.

Without getting into too many student-specific details: Not too bad. I am very fortunate to have a very small class. That gives me the freedom to make mistakes with a minimum of embarrassment. It also means that I’m not burdened with too much grading as I try to read everything I assigned (often for the first time) and continually reach out to all the students who are having trouble with either the technology or the history itself.

I have certainly made tons and tons of mistakes. Most of them have had to do with the syllabus. I spent most of last summer working on that thing, and (inevitably) there are plenty of sections in it where I could have explained what I want better. For example, I don’t think there’s a better tool out there than Hypothes.is if you want to comment on and discuss student writing. I had used it a bit last semester, but after I went to a Hypothes.is workshop in Denver a couple of months ago I was absolutely dying to use it more and to use it differently. Unfortunately, I hadn’t used it enough at that point to explain how I wanted it utilized particularly well. Now that I’m using it a lot, I can assure you that that explanation will be much clearer next time around.

Before that workshop, I was actually thinking about dropping Hypothes.is entirely because I’m using so many different programs or publications requiring separate sign-ins in this course. Five actually. With respect to an LMS, I’m using the free version of Canvas (at a BlackBoard campus). I’m also using Slack, Hypothes.is, Milestone Documents and an online textbook. Oh yeah, there’s also a class blog (but all the technical work there is mine). Yes, I knew this would confuse students — but I did it anyway, and this has become my favorite mistake, one that I plan to repeat next semester.

Why? I wanted to use the best tools available. Period. These tools are simply not available under one technical umbrella. Moreover, since all of these tools are outside the direct control of my university, I feel happily free of direct surveillance.

More importantly, I’ve come to believe that this kind of student confusion is in and of itself a tremendous learning opportunity. One of the things that my volunteer remote instructional design coach (the fabulous Debbie Morrison) told me while this course was still in the planning stages is that you have to give up some time at the beginning so that students feel comfortable with the technology. As a result, I planned two weeks of tech work and historical activities that didn’t count towards the final grade before the students had to start writing. Of course, some students got the tech instantly. For others, though, it was a longer struggle than I ever expected — perhaps in large part (but not entirely) due to my poor instructions.

Now that the essays are coming apace, I’ve decided that those first two weeks were in and of themselves valuable. While you really can now Google anything about history and get at least an O.K. explanation eventually, getting over a fear of technology is a lot harder to do. Digital natives my Aunt Fannie. [And I’ve known this for years, not just when I started teaching online.] While I never, ever expected to teach this kind of thing back when I was in graduate school it’s now pretty clear to me that this may be the most important behavior I’m modelling in this whole online survey class.

On the other hand, all those professors in whatever discipline who say things in public like “I don’t do computers” are not only sending the opposite message, they are preparing students for a world that no longer exists – and that goes for every White Collar job in America, as well as an awful lot of the rest of them. They’re also doing a terrible job preparing their students for the outside world – even the world of history graduate school these days (should they be so foolhardy as to actually choose that route).

No, I’m not telling you to turn all your courses into computer classes. I’ve basically turned my survey class into a kind of English composition seminar, so it’s not as if I’m abandoning the humanities or anything that heretical. All I’m telling you is that the world is changing all around you whether you like it or not, and all of my colleagues in academia really ought to at least make some effort to get with the program.

Posted by Jonathan Rees in Online Courses, Teaching, Technology, 4 comments