The means of educational production.

I’ve had two articles come out in the last two days, and I think both deserve at least a shout-out here. The first is a Chronicle Vitae “column” about teaching that has been well-received on Twitter. Give it a look if you’re interested in teaching… or trucks.

The second is a collaboration between my co-author Jonathan Poritz and me in the AAUP journal Academe. While it obviously shares some similarities with Education Is Not an App, I like it a lot because it’s such a good collaboration that I can’t tell where my ideas stop and JP’s begin. The one exception to that is the reference to the “shopfloor” in the title of the essay (as I’m the labor historian of the two of us) – and a few very stray references to Marxism/Leninism in the text.

This is the residue of what was the first conclusion to this piece, all of which ended up on the cutting room floor. However, I want to resurrect a bit of it here for the sake of added value. While JP and I were discussing shared governance during the planning process for that article, it suddenly struck me just how unique shared governance is. After all, what other workers besides college professors have even a fraction of the control over the conditions of production that we do? We work alone. As long as we don’t make the mistake of using the learning management system, there are few direct records of our work and our output is almost impossible to measure accurately.

I’m not saying that professors should have completely unfettered control over their workplace. That’s why it’s called shared governance, after all. However, our training and expertise have traditionally bought us far more autonomy than most other workers. Technology is a threat to that autonomy. If you want to see why, look at practically every other post on this blog going back five or six years.

But – and this is where my epiphany comes in – unlike skilled production workers, college professors don’t have to unite with anybody in order to control the means of production. By employing whatever educational technology best suits our needs, we can ride the wave of automation all by ourselves – like my Chronicle Vitae piece suggests, automating the tasks that should actually be automated and utilizing our skills to combat the edge cases that come up in teaching every day. Because we already control the means of educational production, we don’t have to give it up without a fight.

The problem comes when either the labor supply expands beyond what the market can absorb – see Marc Bousquet on grad students as a waste product – or technology enables our employers to try to redefine what learning is. Shared governance is our protection against both kinds of change. That’s why fighting for its continuation can be revolutionary all by itself.

Clayton Christensen hates you and other observations.

I know this article about our old friend Clayton Christensen is old news now, but I was caught up in the end of the semester when it came out and have only gotten to writing about it now:

In a speech Thursday at Salesforce.org’s Higher Education Summit here, Christensen spoke at length about disruption theory broadly and discussed its application to colleges and universities. Higher education, he explained, was among the industries that “for several centuries was not disrupted,” but “online learning has put a kink in that.”

Technology itself is never the disruptor, Christensen said; a new business model is. But “it is technology that enables the new business model to coalesce, and that’s what is happening in higher ed now….

“If you’re asking whether the providers get disrupted within a decade — I might bet that it takes nine years rather than 10,” he said, to a smattering of gasps among the nearly 1,500 attendees.

So there’s absolutely no evidence yet of disruptive innovation in higher ed, yet Christensen doubled down on his theory? What else did you expect from someone who runs “a nonprofit, nonpartisan think tank dedicated to improving the world through disruptive innovation”? Whose world is the Christensen Institute allegedly improving?

Our higher education research aims to find innovative solutions for a more affordable, sustainable postsecondary system that better serves both students and employers.

Faculty? Not so much.

Reading Christensen party like it’s 2012 again reminded me of the first online conversation I had with Stephen Downes way back in those days before I even knew who Stephen Downes was. This is me in the comments to that old post, after Downes criticized me for being more interested in my own job than in universal education:

I’m certainly not going to remain in the global one percent if you succeed in making my job obsolete. Yes, there will still be a Harvard and there will still be a Yale, but state regional comprehensive universities will dry up like dust when the government funding moves entirely online.

You seem to welcome that, Stephen. Do you expect the tens of thousands of people who depend on these kinds of universities and the communities that depend on those universities to welcome that too? [A]m I supposed to just sit quietly and take one for the team?

But forget about me for a moment. If half the colleges in America actually closed, as Christensen STILL predicts, not just faculty would suffer. Administrators, staff, cafeteria workers… all of them would become jobless while whole college towns would keel over and die without the economic engine that the local university currently provides. Billions of dollars that would have stayed circulating in those communities would be sucked up and distributed among investors and programmers in Silicon Valley. How exactly does this outcome serve those area employers? And what good is your online college degree if your hometown just died in the process of making it affordable?

Around the same time that Christensen Chicken-Littled himself onto the front page of IHE again, I bookmarked yet another keynote by Audrey Watters which I thought might be useful at some point in the future (and, of course, it was). She’s talking about a different subject here, but I think this principle remains applicable:

Our institutions do not care for students. They do not care for faculty. They have not rewarded those in it for their compassion, for their relationships, for their humanity.

Christensen claims his schtick is nonpartisan world improvement, but it’s really just warmed-over Social Darwinism from the late nineteenth century. You can dress up Herbert Spencer in the fig leaf of social science and technological philanthropy, but that doesn’t make his core philosophy any less cruel.

With or without edtech.

Earlier today, my Twitter friend Jon Becker @ed me a link to an EdSurge essay about edtech and refrigerators, suggesting that I was “the only person qualified to comment on this.” Indeed, I can say with some certainty that I am the only person in the world who has written two books on refrigerators who is also interested in education technology. So when the Clayton Christensen Institute for Cheerleading Disruption of All Kinds throws a slow, hanging softball made especially for me, how can I possibly resist?

For those of you who refuse to read anything produced by the Clayton Christensen Institute on principle (and I have some sympathy for that position these days), let me save you a click. The author, Julia Freeland Fisher, uses research on comparative appliance adoption rates by her colleague, Horace Dediu, to argue that:

[I]t’s becoming increasingly acknowledged that we need to pair investments in edtech tools with investments in professional development. But for the tools and models that least conform to traditional school structures, we’re also likely to need investments in fundamental reengineering—that is, not just developing teachers’ proficiency in using tools but rethinking processes like schedules, evaluations and staffing throughout an entire school building or district.

What do refrigerators have to do with restructuring schools? In order to use new refrigerators, consumers only had to plug them in. In order to use washing machines, on the other hand, consumers needed plumbers to help them and maybe a whole new set of pipes in their houses. That’s why refrigerators became much more popular, much faster than washing machines – and that’s why you need to change the way schools are structured so that they can best take advantage of all the wonderful new education technology that EdSurge must cover every day.

The first thing that jumped out at me about this article was Fisher’s basic dates in the history of the refrigerator. She says the refrigerator debuted in the 1930s. In fact, the first electric household refrigerators appeared during the 1910s, and they were already being mass-produced by the late 1920s. “Refrigerators quickly took hold,” she writes, “gaining over 90 percent adoption by the late 1950s.” I actually used the exact same statistic in my book Refrigeration Nation (p. 179, for all my fellow refrigerator aficionados who want to consult your own copies), but I used it to make the exact opposite point about refrigerators. In 1957, when over 90% of American households had refrigerators, only 12% of French households and less than 10% of English households did. If refrigerators were really that great, why didn’t they too just plug them in and enjoy the bounty?

As a historian, this is where I became really curious about where Fisher got her statistics. While she namechecks her colleague Dediu, there’s no link in the piece to any published study about refrigerators and washing machines. Indeed, the only link in the entire essay is to a general study about technological diffusion. There’s a chart in Fisher’s essay about comparative adoption curves, but there’s no source listed for that either. Other than completely leaving out the bottom left, the curve for refrigerators looks OK to me, but how can I trust her point about washing machines if I don’t know anything about the source? How can I be sure that this isn’t the edtech equivalent of fake news?

That’s why I opened a book. Ruth Schwartz Cowan’s More Work for Mother is a classic in the history of technology and pretty close to the only scholarly work that tackled the history of refrigerators at any length before I did. Since it is a general history of appliances, I figured it might have a little bit about washing machine adoption rates in one of the sections I had forgotten about. So I pulled it down off my shelf, turned to the index and quickly hit the jackpot: “washing machines…diffusion, 195-96.” Here’s the quote:

“[I]n 1941–roughly thirty years after they came on the market, and twenty years after the prices had fallen to more or less reasonable levels as a result of mass production–only 52 percent of the families in the United States owned or had “interior access” to a washing machine. Thus, just under half the families in the land were either still hand rubbing or hand cranking their laundry or using commercial services.”

If you’re wondering, the Fisher/Dediu number is about 10 percentage points lower than the one that Cowan used. Perhaps this can be explained by the difference between owning a washing machine and “accessing” a washing machine in the basement of your apartment building or taking your dirty laundry down the street to a laundromat. But for purposes of Fisher’s overall point about edtech, this distinction means everything.

Can you live without a refrigerator? Most Americans can’t. [Indeed, the refrigerator adoption rate in the modern US is actually 99.5%.] However, French or English people in 1957 still had easy access to fresh meat and produce at large markets. Many still choose to live that way today because fresh perishable food tastes better. Americans, on the other hand, tend to prefer convenience over taste. That’s why the refrigerator industry was one of only three in the whole United States to grow during the Great Depression. Anyone who had any money to spend at that time greatly valued the added convenience of electric refrigerators over ice. By 1960, the old ice industry had basically disappeared because it ran out of customers.

Can you live without a washing machine? Of course you can. That’s why there are still coin-operated washing machines and laundromats. Keeping your food in other people’s refrigerators isn’t an option in the United States, but you don’t need constant access to a washing machine in order to get your clothes washed by machine when needed. In other words, owning your own refrigerator is close to the only way to have access to refrigeration, but dragging your dirty clothes to the laundromat is a perfectly reasonable way to get access to a washing machine even if there is none in your home or apartment. In short, refrigerators are close to a necessity. Washing machines are just really, really convenient.

Can you live without edtech? [You just knew I had to get around to edtech here eventually, right?] Shockingly enough, there were actually good schools in the United States long before Bill Clinton and Al Gore decided to put a computer in every classroom. Plenty of teachers and professors offer great classes of all kinds with nothing more sophisticated than their voices and a chalkboard. Weirdly enough, just this morning, right after I read that article, I was pitching our dean on starting a digital humanities program in our college. “What about the professors who don’t want to use technology?,” he asked me. I said I would never in a million years force any teacher to use technology if they don’t want to, but it’s actually a good thing if students have a wide range of classes in which they can enroll, some of which use educational technology and some of which don’t.

Which brings me to the fundamental problem with the Clayton Christensen Institute for Cheerleading Disruption of All Kinds. The whole assumption behind that article is that one technology will always inevitably drive another technology to extinction: Refrigerators will completely replace ice, washing machines will completely replace washboards and edtech will completely replace conventional teaching. That is only true for the first of those examples (and even then, only really in the United States). Whether teachers want to teach with or without edtech is a cultural decision, not some hard and fast rule determined by the universal laws of technology.

Unless, of course, you have some other axe to grind…

BYOB (Be Your Own Boss).

You might not know this about me (as I don’t write about it much here), but I’m Co-President of the Colorado Conference of the American Association of University Professors (or AAUP).  In that capacity, I knew about this story long before it got reported (even though I didn’t participate in the investigation or contribute at all to the report):

A new report from the American Association of University Professors alleges that Colorado’s Community College of Aurora terminated an adjunct because he refused to lower his expectations for his introductory philosophy class. The report sets the stage for the AAUP to vote on censuring Aurora for alleged violations of academic freedom later this spring, but the college denies such charges. It blames Nathanial Bork’s termination on his own teaching “difficulties.”

I know Nate pretty well, so I’m more than a little biased when it comes to a case like this. Nevertheless, there are a couple of things about this incident that just made my head explode. First, as you can see from that IHE article, Nate still teaches at Arapahoe Community College, which is part of the same community college system as the Community College of Aurora. At CCA, Nate was allegedly such a bad teacher that the college fired him “virtually on the spot,” yet he’s still working productively down the road. If Nate was really such a menace, don’t you think CCA might have wanted to warn its sister school about him?

The second, even more mind-blowing part of this case goes back to Nate being fired “virtually on the spot.” Nate was apparently so awful that they fired him DURING the semester, leaving all of his students in the lurch with some patchwork of substitute teacher(s) until finals week ended. He’d have to have been pretty darn awful for the benefits of that maneuver to outweigh the considerable costs. Of course, it wasn’t really about Nate’s teaching.

Nate’s firing was about making a point. The authors of the AAUP report on Nate’s case cover this subject very deftly:

A cannier administration might have let Mr. Bork finish the semester and then have declined to renew his contract. Insofar as this could have been done for exactly the reasons that appear to have motivated the CCA administration’s summary mid-semester dismissal of Mr. Bork, it would have constituted just as severe a violation of academic freedom. But the administration would have enjoyed the plausible deniability afforded by policies and procedures that enshrine arbitrary nonrenewal of appointments for adjunct faculty members.

It is certainly no secret that adjunct faculty lack real academic freedom precisely because of their precarious employment. Yet the administration at CCA made no pretense that Nate and other adjuncts there have the same control over their classrooms that tenure-track faculty at most places (hopefully) have. They came up with this “Gateway to Success Initiative,” imposed it indiscriminately upon faculty of all kinds and fired Nate in response to his desire to be his own boss (at least as far as the way he chooses to run his classroom is concerned).

While the AAUP’s Committee A (which oversees investigations like the one at CCA) doesn’t take that many cases in any given year, it should be obvious why this one is really important. Here is an administration that won’t even make the usual happy noises about all faculty having academic freedom. They think they should have more power over curricular decisions than their own faculty do. While I ran a few courses on spec in grad school, I’ve never taught as an adjunct. Nevertheless, I have to imagine that one of the reasons that you’d put up with low pay, no benefits and zero job security is precisely that you can be your own boss in the classroom setting.

Yes, if you’re a terrible teacher, you might be subject to observation and discipline. But that discipline should be meted out by other faculty (like your department chair) and not by your administrators. You should also have the opportunity to change course if your teaching is somehow not up to snuff, and not get summarily dismissed in the middle of the semester.

Everything I’ve written about this case so far should be obvious to any informed faculty member who considers the issues at stake. But I want to make two more points that might not be so clear to everyone.

First, adjuncts are just the low-hanging fruit in a long-term administrative movement towards trying to control the way that faculty teach. You can discipline adjuncts, particularly CC adjuncts, because they have few expectations of academic freedom and (often) a dire need for continued employment. Once this becomes the norm, there is no reason to believe that administrators will let tenure-track and tenured faculty exercise their traditional prerogatives in their own classrooms. Running a university like a business means closely controlling exactly how work gets done. If faculty acquiesce to this kind of academic Taylorism, we’re all gonna end up working with stopwatches behind us no matter what our employment status happens to be.

Second, to get back to a subject more common on this blog, technology is already greatly enabling administrators in this quest to control the classroom. My old obsession, mandatory LMS usage, is just part of this phenomenon. But the destruction of faculty prerogatives goes beyond just administrators. Consider this observation from Jonathan Poritz and me in our book Education Is Not an App (p. 65):

While a typical face-to-face course, or even a regular fully-online course, does not have to cater to the recommendations of the nineteen or twenty people who may collaborate to produce a MOOC, the rise of online learning tools has meant that professors of all kinds have less say over their own classrooms than they did even twenty years ago. One reason that the power of [“teaching and learning specialists”] has increased is that the power of faculty has dwindled as technology has made it easier for faculty prerogatives to be divided when the work of teaching gets unbundled.

Now, we’re not saying that instructional designers stink and must all be destroyed. What we are saying is that the final decision about how the classroom will operate must belong to the professor, no matter what their employment status happens to be. If you build a better mousetrap, use the carrot, not the stick. Most faculty are smart and caring enough to join any technological bandwagon worth joining.

For all these reasons, by taking a stand on Nate’s behalf, the AAUP is actually taking a stand on behalf of us all. If you’re appreciative of this kind of work, you should consider joining us.

My adventures in digital history.

These are my remarks as written (if not exactly as delivered) in Paul Harvey’s history seminar at the University of Colorado – Colorado Springs this morning:

I recently wrote an essay for the Chronicle of Higher Education called “Confessions of an Ex-Lecturer.” Yet my appearance in this class (well, the first part of this class anyway) is going to be a lecture. Yes, I’m going to lecture about why and how I stopped lecturing. To get past this enormous contradiction, let me make a distinction between conveying historical content and making a pedagogical argument. You have no reason to memorize anything I say today. There will be no quiz later. Instead, I’m going to explain my thinking about teaching history and see if I can convince you I’m right. I’ve adopted a lecture format here because I have to tell the story of how my thinking has changed in order for you to follow along with my reasoning.

My opinions on this subject are not popular in historical circles. As one of my former graduate school acquaintances put it on Twitter the other day: “[T]hey will pry the lecture out of my cold, dead hands.” I sympathize. Old habits die hard. That’s the way I learned history when I was in college. Indeed, I never had a class of any kind in college that had fewer than thirty people in it and the vast majority of those class periods consisted of people lecturing at us. A lot of those professors were really good at what they did – although I did take a class from a political science professor who looked up at the ceiling as he talked, which drove me completely crazy… but that’s a story for another time. The reasons I’ve sworn off lecturing in my own classes are twofold.

First, there’s the advent of the cell phone. These small supercomputers have so permeated daily life that the average person – notice how I didn’t say average student – can’t go ten minutes without reaching for their phone at least once. Indeed, stick me in some meeting where someone starts lecturing about something that I’m not particularly interested in and I’ll reach for my phone far faster than that. I could be the most interesting lecturer in the world (which I most certainly am not), and a good number of you would still reach for your phones at some point during the presentation.

Please understand that I’m not blaming millennials here. I’m blaming everybody. For so many of us, the temptations of the Internet are just too hard to resist. “When people say they’re addicted to their phones, they are not only saying that they want what their phones provide,” writes the MIT psychologist Sherry Turkle, “they are also saying that they don’t want what their phones allow them to avoid.” If I’m talking at you in a classroom of any size, it is ridiculously easy for you to avoid me, and I’m not going to be able to change that. Therefore, if I’m going to talk at you, I’d better make darn sure that I have something interesting to say.

So what if I give you the opportunity to do something rather than passively absorb information? What the Internet taketh away, it also giveth. My interest in digital history comes from my interest in finding some alternative to lecturing about historical facts and then testing students on how many of those facts they’ve retained. I know this is sacrilege in most historical circles, but I’m gonna say it anyway: You really can Google anything.

The Internet is well-developed enough that most of the time a discerning consumer of information can get reasonably reliable factual information very quickly with limited effort. But, and this is the second reason I’ve basically given up lecturing, with limited technical knowledge it is now possible for ordinary college students to make useful contributions to the great pool of historical information available online. Not only that, by doing so, they can pick up practical computer skills that will increase their employability upon graduation. With that kind of upside, taking some of the attention in class off of me seemed like a small price to pay.

One of the most interesting things about digital history is that this field lets you make professional use of skills that you probably picked up just by being an active digital citizen. For example, I started blogging right after I got tenure in 2003 because I was a lot less worried about someone threatening my employment because of my political opinions. Oddly enough, I devoted my entire blogging life to one subject: Walmart. I learned WordPress from a guy named Jeff Hess in Cleveland, Ohio via e-mail. Jeff was the administrator of our group anti-Walmart blog.

In 2007, when my department wrote and was awarded a Teaching American History grant from the federal Department of Education, I used those skills in class for the first time. We were funded to take teachers to historic sites on the East Coast over the summer, and blogging was a way that they could write easily from the road and that we could still follow them. So could their relatives, friends and even students, which served as a nice side benefit – a benefit that applies to all sorts of history undertaken on the open web.

Another skill I already had which turns out to have enormous digital history ramifications is some proficiency in social media. Personally, I’m a stone-cold Facebook hater, but Twitter has been a godsend to me with respect to digital history – not so much in class but for keeping up with the field. Your professor, for example (if you didn’t already know), is a prolific tweeter, if more on American religious history than on digital history and things technological. More importantly, my students have used it to reach out to scholars in fields that they’re researching.

It’s also a great tool for publicizing the work you do online. I actually got a book contract thanks to Twitter (although not in history). If you’ve spent any time listening to the Canadian scholar Bon Stewart as I have, you’ll understand how social media in general and Twitter in particular are great tools for building communities of interest – and I mean that both in terms of what you enjoy and as a way to fight for what you believe.

With respect to digital history, the turning point for me was the summer of 2014, when I attended an NEH Institute at the Roy Rosenzweig Center for History and New Media at George Mason University in Virginia. A bunch of other folks who never studied this stuff in grad school and I got a very intensive tour of what’s on the web, web tools and how we might want to integrate them into our classes. Some of it was old hat for me. Unlike a lot of my fellow professors, I had already heard of two-factor authentication and password protection programs.

However, when it came to history-specific web tools, almost everything they touched on was brand new to me. One I was already using, but learned to use better, is Zotero, which actually began at the Roy Rosenzweig Center for History and New Media and really ought to be on every historian’s must-use list. Zotero is a notes program that lets you gain intellectual control of your research by allowing you to search it at the word level. That includes content inside digital copies of things that you’ve scanned and uploaded. As someone who wrote his dissertation on 4×6 notecards, I can tell you I am never, ever going backwards on this. That’s why I’m now requiring all my students doing research papers to use it. My students constantly tell me how grateful they are to know about Zotero, and how they wish they had known about it two or three years earlier.

A jaw-dropping research tool for digital historians that I first learned about in Virginia is CamScanner. CamScanner is an app that turns your cell phone camera into a document scanner. If I could show you the huge pile of Xerox copies I made for my dissertation at 25 cents, 50 cents… even a dollar a pop, you’d know why this is so amazing. Having access to free copies of documents from archives makes it easier to acquire information during what is often very limited research time. I had some experience with researching this way when the Manuscripts Division at the Library of Congress installed the greatest book scanners that I had ever seen in order to preserve the physical well-being of their collections (since bending things back for ordinary copying does so much damage). Now I’m swimming in information – information that’s searchable using Zotero. The same is true for my students, as I have them working with local archives in my digital history classes.

The program I settled on for them to use is Scalar, which comes out of the University of Southern California. It’s actually designed as a book publishing program, something that allows books to appear on the web with digital media embedded in them. I’ve been using it in class for web exhibits. Study after study has shown that putting resources up on the web drives traffic to physical archives and libraries rather than taking it away, so I’ve had my students create Scalar projects using local resources and put them up on the web. Here’s a recent example from the Steelworks Center of the West that I liked a lot. Here’s another about a place I think that everyone in this class ought to know well.

Why Scalar? You don’t have to know how to program in order to make it look good. Indeed, as the experience of countless students of mine has more than proven, you can learn how to use it within just an hour or two of starting to play with it. In fact, I have plenty of students who can use Scalar far better than I can because they’ve had far more reason to use more of its features than I have, since I simply use it to put up a few syllabi (although I have since trained myself to do more).

Another reason I like Scalar is that students and faculty who use it can host their own Scalar sites through Reclaim Hosting. This is not the place to argue why faculty and students should take back the web from university administrators and private companies (although I did co-author a book that fits in well with that argument), but one of the best things about the Reclaim-related “Domain of One’s Own” project is that it allows students to keep access to their digital work even after they’ve graduated. The Scalar sites students create through Reclaim can therefore serve as evidence to potential employers that they can do something other than just historicize things. Not that there’s anything wrong with the ability to historicize things, but in this manner digital history might actually be the answer to the age-old question, “What can you actually do with a history degree (besides teach)?”

On a personal level, my digital history experiments have proved much more interesting than standing up and lecturing to uninterested students about the same old things that I had always been lecturing about. In the future, I'm dying to get into digital mapping, as the Steelworks Center of the West has an absolutely astounding collection of mine maps that cover both towns and mines. I imagine a digital project that traces the physical impact of mining on Southern Colorado's landscape as soon as I have enough theoretical background to pitch it to some funding agency. What's really great is that thanks to my changes in pedagogy, I'll be able to get my students to pitch in.

When I was at the American Historical Association meeting in Denver a few weeks ago, I attended almost nothing but digital history sessions. I was really struck by how willing everyone at those sessions was to admit that they have no idea what they're doing – that the whole field of digital history is kind of a running experiment. To paraphrase one scholar I heard at the meeting, digital history blurs the line between research, teaching and service. In my case, I'm having students do historical research and put it on the web for the benefit of local historical non-profits. I think the benefits of doing this far outweigh whatever harm gets done to my ego if I'm no longer the center of attention in class.

MOOCs: A Postmortem

MOOCs are dead. "How can I possibly argue that MOOCs are dead?" you may ask. After all, to borrow the stats just from Coursera, they have: 1600 courses, 130+ specializations, 145+ university partners, 22 million learners and 600,000 course certificates earned. More importantly, it appears that Coursera has received $146.1 million in funding over the years. Even though it hasn't gotten any new funding since October 2015, unless Coursera tries to copy "Bachmanity Insanity" (Is Alcatraz still available for parties?) the company is going to be sticking around for quite a while.

What I mean when I say that MOOCs are dead is not that MOOCs no longer exist, but that MOOCs are no longer competing against universities for the same students. Continuing with the Coursera theme here, in August they became the last of the major MOOC providers to pivot to corporate training. While I did note the departure of Daphne Koller on this blog, I didn't even bother to mention that pivot at the time because it seemed so unremarkable. In retrospect, it was anything but.

Do you remember Daphne Koller's TED Talk? Do you remember how incredibly utopian it was? In truth, it made no bloody sense even then. For example, she suggested back at the height of MOOC Madness that:

[M]aybe we should spend less time at universities filling our students’ minds with content by lecturing at them, and more time igniting their creativity, their imagination and their problem-solving skills by actually talking with them.

I agree with that now. In fact, I agreed with that then too. The problem with that observation, as almost anyone who actually teaches for a living knows, is that talking with students is obviously impossible when you have ten thousand people in your class. More importantly, showing students tapes of lectures (even if they're broken up into five-minute chunks) is still lecturing.

That’s why MOOCs were never going to destroy universities everywhere. There will still be far more than ten universities fifty years from now. Or to put it another way, the tsunami missed landfall.

But just because this blow proved to be glancing doesn't mean that the punch didn't leave a mark. For example, a lot of rich schools threw a lot of money out the window investing in Coursera and its ilk. [Yeah, I'm looking at you, alma mater.] Others simply decided to spend tens of thousands of dollars on creating individual MOOCs that are now outdated almost by definition since they're not designed for corporate training. Yes, I know that MOOC producers claim that their MOOC experience improved teaching on campus, but think how much better teaching on campus would have been if they had just invested in improving teaching on campus.

At best, MOOCs were a distraction. At worst, MOOCs were a chronic condition designed to drain the patient of life-giving revenue. Instead, those schools could have used that revenue (as well as their initial investments) for other purposes, like paying all their faculty a living wage.

My inspiration for this observation (and this entire post) is the MOOC section of Chris Newfield’s terrific new book, The Great Mistake: How We Wrecked Public Universities and How We Can Fix Them.*  This is from page 227:

MOOCs were not going to leverage public colleges by buying them.  But they could acquire a share of their revenue streams–that combination of student tuition and enrollment-based public funding–whose capture is one of the five key elements of privatization…MOOCs could leverage their private capital with far greater sums flowing colleges and universities without buying anything up front.  This offered the attractive prospect to strapped public colleges of gradually replacing even more tenure-track faculty with technology that could be managed by private MOOC firms off campus, for a reasonable fee.

To make one of my favorite distinctions, this applies to schools that are MOOC producers (like Arizona State) even if those MOOCs are mainly for internal consumption, and especially to all those colleges and universities that were potential MOOC consumers – any place that considered replacing their humdrum, ordinary faculty with all the "world's best lecturers."

In order to capture part of that revenue stream, MOOC providers had to argue that their courses were better than the ones that students were taking already.  That explains all the hating on large lecture courses.  Except, MOOCs were nothing but large lecture courses dressed up with technological doodads.  As Newfield explains on pp. 242-43:

     In effect, MOOC advocates were encouraging students to leave their state universities to attend liberal arts colleges, where they could go back to the future of intensive learning in the seminars that typify “colleges that change lives.”  But of course they weren’t.  Advocates were actually claiming MOOC ed tech could create liberal arts colleges all for next-to-no-cost (Koller) or greatly lowering costs (Thrun).  In making this claim, they ignored existing historical knowledge about learning at high-quality institutions, which made the technology seem original, when it was not.

MOOCs may have been cheaper (and Newfield even disputes that), but they certainly weren’t better – even than large lecture classes.

Again, the vast majority of us faculty foresaw this particular Titanic hitting the iceberg (including me, even if it did take me a while). Nevertheless, university administrators who partnered with MOOC providers or (even worse) bought their products trusted Silicon Valley types more than their own faculty. This course of action was a reflection of the same self-loathing that Audrey Watters describes here:

There seems to be a real distaste for “liberal arts” among many Silicon Valley it seems – funny since that’s what many of tech execs studied in college, several of whom now prominently advocate computer science as utterly necessary while subjects like ethics or aesthetics or history are a waste of time, both intellectually and professionally.

Yet at least these Silicon Valley types had enough self awareness to go into a different field after they left college. What’s the excuse for a university administrator with an advanced degree in the humanities (or anything else for that matter) to hate their educations so much that they spend hundreds of thousands of dollars to deliberately undermine them?  There is none. They should have known better.

Next time Silicon Valley comes up with a new way to “disrupt” education, let’s see if we faculty can invest more time and effort in getting our bosses to listen to common sense.  Instead, as Newfield notes in his postmortem of Koller’s TED Talk on p. 241 of The Great Mistake:

The categorical discrediting of faculty pedagogy made this bypass of faculty expertise and authority seem reasonable and necessary for the sake of progress.

So in the meantime, let’s fight to improve shared governance everywhere so that we’re prepared to fight for quality education if our bosses refuse to accept the obvious.  Some of us becoming temporarily famous is not worth wasting so much money and effort on any technology that is obviously going to prove to be so fleeting.

* Full Disclosure: Newfield and I have the same publisher even though we publish in entirely different fields.

Get your side hustle off.

I’ve been streaming a lot of Simpsons with my son lately. Backwards. Since I quit watching the show regularly sometime in the late-90s, this was the best way that we could both enjoy all-new material. The quality of even the most recent stuff is obviously the good thing about streaming the Simpsons. The bad thing is being locked into watching all those Fox commercials, since my cable company (or maybe it’s Fox) won’t let us fast-forward. The above Uber commercial has been in heavy rotation for months. In fact, it sometimes plays twice an episode. I’ve been making so many “earnin’/chillin'” jokes that my son now leaves the room when it comes on.

I thought of that commercial twice while I was at the AHA in Denver last weekend. The first time was when I explained to four historians from Northern California (ironically, the first place that I ever took an Uber) how Uber works. [Key lesson: Always tip your driver!] The second time was when I went to my first digital humanities panel on Saturday morning. The commentator, Chad Gaffield from the University of Ottawa, was talking about how DH makes it possible to break down the false dichotomy between work and play. That spoke to me, because I’ve been having an awful lot of fun teaching all my classes lately. Indeed, I’m going to bring that up the next time I hear someone who teaches like it’s still 1995 start talking about “rigor.”

The other point Gaffield mentioned that I thought was really important was the way that DH blends the traditional roles of teaching, research and service. In my case, I teach students how to research using local resources that help the community once they appear online. However, I suspect there are a million variations to that. In any event, when we fill out our annual performance reviews, we can all include DH work in whichever category we don’t have enough material in already.

In the very early days of this blog, the role of tech critic was something of a side hustle for me. It wasn’t my day job, but my writing nonetheless found an audience. It’s through the conversations which that writing inspired that I stumbled into a large, multi-disciplinary pool of scholar/teachers who were trying to utilize the Internet to create unique educational experiences rather than cheap, dumb carbon copies of face-to-face courses. I started teaching online so that I could try to set a positive example for other people who might be reluctant to make the same jump because so much of what’s out there has a justifiably bad reputation. I still have a long way to go, but one of the most refreshing things I got out of all the DH panels I went to last weekend is that so does everybody else. Even historians who get their DH papers onto AHA panels readily admit that their learning curve remains steep.

By the time I left Denver on Sunday, I had decided I’m never going back. I don’t want my conventional courses to be entirely conventional anymore. In other words, I’ve been convinced that the digital needs to be present in every course I teach.

I am hardly the first person to draw such a conclusion. CU-Boulder’s Patty Limerick wrote in the September 2016 issue of AHA Perspectives that:

In innumerable settings, historians in Colorado are stepping up to this challenge. In the process, they are devising practices that transcend the conventional turf fights between “academic history” and “public history,” uniting in the strenuous and satisfying work of “applied history.”

I think you could make a pretty good case that food and refrigerators are relevant today, but it’s my classes which take students into the Steelworks Center of the West and the Pueblo Public Library that fit this definition of “applied history” the best.

While such activities have little to do with my current research, teaching is 50% of my job according to the annual performance review I’ll have to turn in a couple of weeks from now. In short, what was once my side hustle has now become my regular hustle. While there’s still a lot of tech criticism left to write, and I plan to write at least some of it when I have the time, this blog (why would I have redesigned it if I had intended never to use it again?) is going full pedagogy.

In the meantime, I have another actual history book I want to write…

More mistakes from my first semester teaching the US History survey online.

My first semester as an online instructor is almost over. Who knows where the time goes?

Curating a respectable online survey course experience comes with a lot of responsibility. In my humble opinion, too many online US history survey courses cling to the vestiges of the traditional lecture model. As I’ve explained here and here, mine is more like an English composition class. While I’ve enjoyed teaching it so far, the whole thing is far from perfect. So in the interests of transparency and helping anyone out there who might actually be interested in following my path, I’m going to try to explain more of the mistakes I’ve made (besides this one), as well as all the fixes that I’ll be implementing when I re-write the syllabus over Christmas break for next semester’s students.

1) Many years ago, when I first started at CSU-Pueblo, I asked an Associate Provost whether he thought I should have an attendance policy. “Do it for their own good,” he responded, and I have mostly stuck with that advice. Of course, an attendance policy makes no sense in the context of an asynchronous, entirely online course, but you still need your students to log in to do the work. I can’t tell you the number of times over the years that I have lamented the fact that when I remind students that there is an attendance policy, the people who need to hear it usually aren’t in the room. Telling students to log in and do the work when they never bother to log in is even more frustrating.

That’s why I’m moving to mandatory meetings (in person or via Skype) during the first two weeks of class, when everyone’s working on setting up the various accounts and programs I require. On a human level, I suspect it’s a little harder to abandon a course when the professor is a person rather than a screen presence. On the more practical level, my one question during those meetings is going to be, “How do I reach you if you suddenly disappear?” Yes, I realize that online courses have always had higher dropout rates than their face-to-face alternatives, but that doesn’t mean that I can’t try to make my course something of an exception.

2) Another typical problem I’ve had is with the discussion aspect of the course. For one thing, it proved next to impossible to get and keep a good discussion going with a very small number of students (although things have gotten better towards the end of our time together), even though Slack has worked beautifully for student/teacher communications. Besides getting a bigger class, I think my problem here was requiring too little. I’ve been asking for a question, an answer and a document summary for each two-week unit. In the future, I’m going to up that to once a week, and increase the percentage of the grade that goes to discussion. I also need to recommend the Slack mobile app a little more forcefully, as it has been great for keeping track of those discussions that went well.

3) As part of those unit assignments, I’ve been requiring students to bring in a source from the wider Internet in order to evaluate it. I LOVE the fact that I can conceivably do that well in this format, especially after reading the summaries of that bone-chilling Stanford study about students and fake news. The problem in my class has been that students don’t have any context for evaluating what makes something reliable. Indeed, the best answers I could get all revolved around the origin of the story: “It’s from the New York Times, of course it’s reliable.” Nobody cared where the NYT was getting its info.

My plan is to move the outside sources out of the week-to-week writing assignments and into the pre-exam section of the course, and try to ban them outright for the bi-weekly essays. Too many people were using Google to write their assignments anyway, and not looking at the credible assigned texts. If I move the wider web stuff to the end of the course sections, then they’ll have weeks of assignments and textbook reading that they can compare their outside sources to. If a reliable, already assigned primary source (or even the textbook) corroborates their outside source, then we can all gain a better understanding of what reliability really means.

4) I’m just gonna come out and say it: Online grade books are shit. Yes, the one in Canvas is better than the one in BlackBoard, but if your grading scheme includes something as simple as dropping the lowest grade of any kind of assignment (as mine does) it is impossible to get these systems to do what you want them to do and still have a reliable total at the end of the row of columns. And don’t even ask me about converting letter grades into points. It’ll just make me angry.

This whole problem reminds me of why I resented grade books for so long back when I was only teaching more conventional classes. Students would constantly ask me what their grade was and I’d say, “Do the math.” The math wasn’t that hard, but they were so used to getting their simple running final grade totals on a platter that my response made me look like an asshole. Yet there are advantages to not keeping a running total. For example, I can do crazy things like grade up for improvement over the course of the semester or even curve my results if I decide that my constant pedagogical experiments proved too much for that semester’s students.

So what am I going to do about this? First, I’m going to try to disable the final grade mechanism entirely so all that students can do is read their letter grades. I think that might work if I assign zero points for each assignment and use a separate spreadsheet at the end of the semester. If that doesn’t work, I’m going to stop using a grade book entirely. Sometimes old school is better than dumb school.
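For what it’s worth, the “separate spreadsheet” calculation I have in mind is simple enough to sketch in a few lines of Python. The category names, weights and letter-grade cutoffs below are made-up examples for illustration, not my actual grading scheme:

```python
def category_average(scores, drop_lowest=True):
    """Average a list of percentage scores, optionally dropping the lowest one."""
    kept = sorted(scores)[1:] if drop_lowest and len(scores) > 1 else scores
    return sum(kept) / len(kept)

def final_percentage(categories, weights):
    """Weighted final percentage across assignment categories."""
    return sum(category_average(scores) * weights[name]
               for name, scores in categories.items())

def letter(pct):
    """Convert a percentage to a letter grade on a simple 90/80/70/60 scale."""
    for cutoff, grade in [(90, "A"), (80, "B"), (70, "C"), (60, "D")]:
        if pct >= cutoff:
            return grade
    return "F"

# Hypothetical student: lowest essay (72) and lowest discussion score (60)
# get dropped before the weighted total is computed.
scores = {"essays": [85, 72, 90, 88], "discussion": [95, 60, 80]}
weights = {"essays": 0.6, "discussion": 0.4}
total = final_percentage(scores, weights)
print(round(total, 1), letter(total))
```

That’s the whole trick the commercial grade books can’t seem to manage: drop first, then weight, then total.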

5) Read the syllabus:

If you thought reading the syllabus was important for regular courses, it is probably five times as important for online courses because your students don’t get the benefit of listening to you repeat reminders at them all semester. As usual, some students clearly did read it, but others clearly didn’t.

What to do here? At first I was thinking about a syllabus quiz, but that’s so boring. My new idea is an online treasure hunt that will force students to go back through the other programs I’m forcing them to use. [What is in the crazy GIF that I embedded in the first response in the Slack #random channel? Send the answer to me as a Slack direct message.] Stick those commands in random places in the middle of the syllabus (and grade them), and maybe I can kill two birds with one stone.

Yes, there are a few more mistakes that I know I’ve made, but my troubles with Gravatars or with Hypothes.is groups were in no way pivotal to the success or failure of the class. The mistakes covered here are enough for public consumption. In the meantime, your thoughts and suggestions about what’s here would be much appreciated, both by me and by my fellow denizens of the CSU-Pueblo Center for Teaching and Learning who are teaching online for the first few times and trying to make their courses better too.

I’ve come out of MOOC retirement.

I remember exactly when I took my first MOOC. It was Fall 2012 (during the run-up to the last presidential election) and I was on sabbatical. If I hadn’t had all that extra time on my hands, I never would have finished it. I’ve signed up for a couple more since then (like the one with edX, just so I could see if their syllabi actually had required reading), but I never really did any work on those. Recently, I started classifying my Coursera e-mail as spam so I wouldn’t even have to think about MOOCs quite so much anymore.

Yet much to my shock, I’ve come out of MOOC retirement. While I’m not doing the work for the University of Houston’s Digital Storytelling MOOC, I have decided to watch the videos because I really want to be able to introduce digital storytelling as a possibility in my next digital history class. All I really need is some knowledge of the tools with which I can experiment. When I actually teach this stuff, we’ll all kind of fake it together.

That’s good, because if I had wanted to do the work in the MOOC and get it graded, I’d have had to pay Coursera for a certificate.  So much for open.

Not only is the grading now a privilege you have to pay for, Coursera is pushing the opportunity to get a certificate at the end of every video. Here’s an exact quote of their nagging ad (at least in week #1) in its entirety:

Purchase a Certificate today, and you’ll support Coursera’s mission to provide universal access to education.

Open access. We want everyone to have access to the world’s top courses. We provide financial aid to all learners with need.

New courses. Revenue from Certificates funds new course development.

Of course, the very existence of Coursera’s many investors is never acknowledged.

Going back through my blog archives, it wasn’t hard for me to find the post where I saw this coming. I wrote this in 2014:

[Coursera co-founder Daphne] Koller, and by extension the rest of the MOOC Messiah Squad, are performing a huge intellectual switcheroo by making arguments like this one. They’re replacing the promise of universal higher education with the promise of universal ACCESS to higher education. We’ll let you listen to our superprofessors for free, she is essentially saying, but you have to do the hard work of obtaining an actual education all by yourself.

If you think this change is why Daphne Koller left Coursera, remember that it took her two more years to actually leave. The investors’ desire to monetize Coursera overtook the promise of educating the world long before she actually departed.  At least when you give PBS $50, they’ll give you a free tote bag.

Actually, Coursera’s business plan now reminds me more of a company like Evernote than it does public broadcasting: provide a free service that people find useful, then constantly upsell your customers in the hopes that they might pay up for it. I still use Evernote even after they limited the free service to a total of two devices because it’s useful to me. I haven’t paid them a cent. Evernote is well on its way to going out of business.

I’m well past caring whether any particular Silicon Valley company, be it Evernote or Coursera, is actually making money. What I remain concerned about is the creeping corporatization of higher education.

To explain what I mean here, I’ll pick on my alma mater, the University of Pennsylvania. Here’s Penn President Amy Gutmann from way back in 2012 (when MOOCs were young):

“Penn is delighted to participate in this innovative collaboration that will make high-quality learning opportunities available to millions of people around the world,” Gutmann says. “Expanding access to higher education both nationally and globally remains one of our most critical responsibilities. This initiative provides an invaluable opportunity for anyone who has the motivation and preparation to partake of a world-class education.”

But Coursera isn’t helping Penn provide “high-quality learning opportunities” to “millions of people around the world” anymore. They’re helping Penn provide mostly static content to millions of people around the world, and access to low-quality learning opportunities for people with the willingness and resources to pay for it. Heck, they might as well just go back to the old MIT model of taping course lectures and putting them online. Why not (partially) cut out the middleman and just put your videos up on iTunes U? Because Penn is an investor in Coursera, that’s why.

MOOCs were never about universal higher education. They were always about making money.  Faculty and students at any university with a MOOC partner ought to recognize that by now, and pressure their schools to un-partner immediately. Then they can develop their own platforms and offer their own MOOCs on any terms they want. Hopefully, those terms will go back to really being open again.