Jonathan Rees

Professor of History, Colorado State University - Pueblo.

I have run out of interesting things to write about edtech.

Welcome to the new More or Less Bunk. I think this is version #4, if memory serves me well. I redesigned it again because I’ve started guesting in that computer science class I described in this post. Since I knew I was going to have to describe how to build actual web pages, I had to build one myself.  That would be my new landing page, and I had to redesign here at the same time because of the way I structured this site back in 2014.  I have more to do here, but this is yet another example of learning by doing on my part. I remain stunned that this sort of thing is now technically half my job description.

With more actual doing, I’ve become far less interested in pontificating.  It helps that I’ve been writing my next actual history book all summer. Lately, I’ve been doing a deep dive into the history of catsup!  So you’ll understand why I don’t much care about MOOCs or personalized learning or the coming faculty apocalypse (which, of course, JP and I already covered here).  Since I’m running a Faculty Learning Community (a term I picked up from the one and only Adam Croom) for our very incipient Domain of One’s Own project on campus starting in August, I still have to follow this stuff to some degree.  However, I’m pretty sure that I’ve run out of interesting things to write about edtech.*

But before I leave this subject for what may be a pretty long while, I thought I’d review where we’ve been over four versions of this blog.  In 2012, a bunch of people in Silicon Valley started claiming that MOOCs were going to disrupt education and make universities obsolete.  I spilled a ton of pixels worrying that they might be right.  It turns out they were wrong.  But the really interesting question from the history of technology standpoint is exactly why they were wrong.  The rather surprising popularity of this post about edtech and refrigerators made me want to review this because maybe it’s not quite as obvious to some people as it is to me.

Disruption theory is built on analogies.  If I remember right, Clayton Christensen invented disruption theory by looking at the computer storage industry, then applying those lessons elsewhere.  Eventually, he applied the same principles to higher education.  The same way that Silicon Valley shills like to pitch things as “Uber for ____,” there are useful and less useful versions of this kind of argument.  Frankly, I’m not sure that this is the correct chronological order, but “Uber for hotels” gets you Airbnb.  On the other hand, disrupting education the same way that Zip Disks disrupted the computer industry during the 1990s gets you a really shitty education – a.k.a. MOOCs.

The obvious reason for this is the degree to which the new thing replicates the old thing.  Storage is storage.  Someone’s house still gives you shelter, just like a hotel.  Someone’s car still gets you where you’re going.  And in all three of these cases, it gets you what you want much, much cheaper.  Reach back to refrigerators, and the new technology is actually a vast improvement over the old one, ice boxes.  But it turns out that there isn’t much of a paying market for watching professors lecture and answering a bunch of multiple choice questions, at least among potential college students.

But even if there were, complete disruption isn’t exactly inevitable.  Sometimes the hotel itself is the reason for your visit.  Whether it’s a conference or just the pool and the buffet downstairs, hotels will always have something Airbnb doesn’t.  To go back to that refrigerator post again, some people actually prefer going to the laundromat to owning their own washer/dryer – particularly if they don’t have their own house.  Sometimes even if the experience seems better, disruption may take time or might never happen at all because of strange cultural considerations that mere business professors will never bother to contemplate.

So what’s the deal with education technology?  MOOCs were and remain a mostly lousy experience, except for corporate training apparently – perhaps because corporations don’t much care about the quality of the student experience.  Various efforts to disrupt other aspects of the college experience with edtech have met varying receptions.  Sometimes the reception has been good (think textbook rental services, for instance).  Sometimes the reception has been bad (think e-textbooks, for instance).  If the savings are worth the inconveniences of an inferior experience, or if the new product can somehow provide a better experience, those companies will prosper.  If not, then we’ll have yet another fad on our hands.

What I’ve learned in my years of studying this topic is that there are actually a ton of really devoted people who are trying to develop and utilize various educational technologies to create useful and – at least in some cases – superior experiences to how colleges and classes operate now.  These efforts are, as you might expect, hugely labor intensive.  Therefore, they seldom appeal to private Silicon Valley companies trying to make a quick buck.  They do, however, appeal to all of us who are in higher education for the long run and are willing to try something new.

I got drafted to teach WordPress in a computer science class because I became one of those people.  What used to be peripheral to my job has moved to the center thanks to learning by doing.  While I may share a few of those experiments in this space moving forward, I’m afraid my days of long-winded pontificating about edtech are over.

Maybe it’s time to try history blogging again.  Anyone want to hear about the history of catsup?

* The one exception to that statement is an article that JP and I have in the hopper.  Actually, I drafted it from one of Poritz’s ideas and he’s been sitting on it for a few weeks now. It may see the light of day eventually, but if you’re reading this JP, I think you know what you have to do in order to make that happen.

Posted by Jonathan Rees in MOOCs, Teaching, Technology, 2 comments

Gophers.

I got exciting news yesterday: I’m becoming a computer science professor! I’m alright. Nobody worry ’bout me. It’s just for three days.

You see, my friend JP is teaching a CS class for pre-college Freshmen this summer and it’s going to start with getting them Reclaim Hosting sites, then teaching them how to control their own domains. Poritz, who codes his own pages like most people write prose, is so far ahead on this he actually needs help explaining this simplified process to ordinary people, so I’m coming in for the first three days to help talk the students through this process. Ironically, I’m hardly the greatest WordPress web designer in the world. [Indeed, THE Jason Jones owes me an e-mail or at least a post on improving one’s WordPress skills so I can redesign this site again as practice.] Nevertheless, over the last few years I’ve become quite good at modeling “Let’s all learn this together” behavior.

This is necessary because this whole concept of “Digital Natives” is complete rubbish. Yeah, I know that’s a rather common sentiment (at least in well-informed circles), but I’d actually go one step further: A lot of old people like me are a lot closer to being digital natives than college students are. After all, I was on a college campus for most of the Nineties. I actually learned (and have now completely forgotten) Gopher in an 80-part e-mail course. By which I mean, this gopher: [image: a Gopher protocol menu]

Not this one: [gopher GIF via GIPHY]

Or this one: [the Caddyshack gopher]

So I literally have decades of experience being uncomfortable on the Internet.

I’d argue that this is a good thing. One of the many things I learned writing a book with Poritz was the origins of the fake word “app.” Yes, I already knew it’s short for “application,” but what JP taught me is that the whole point of applications is to perform a particular function for you so that you don’t have to worry about it. By making things easier, apps make you more likely to hand over your cash, your data or perhaps both.

As a stereotypical liberal college professor, the whole “Fake News” thing from last year scared the Hell out of me, and would have done so regardless of the outcome of the election. Since the Internet is so important to everyday life and is already (for good or for evil) taking over the college classroom, I’m committed to helping students understand how to think critically about something that’s inevitably such an important part of their lives. With an epidemic of fake Founding Fathers quotes perverting our politics, the relationship between this project and history professing should be obvious.

Or we can all be gophers and climb back into our holes and wait for Bill Murray to blow up the golf course for us. Pardon me if I prefer to be more pro-active.

Posted by Jonathan Rees in Personal, Teaching, Technology, 0 comments

The means of educational production.

I’ve had two articles come out in the last two days, and I think both deserve at least a shout-out here. The first is a Chronicle Vitae “column” about teaching that has been well-received on Twitter. Give it a look if you’re interested in teaching…or trucks.

The second is a collaboration between my co-author Jonathan Poritz and me in the AAUP journal Academe. While it obviously shares some similarities with Education Is Not an App, I like it a lot because it’s such a good collaboration that I can’t tell where my ideas stop and JP’s begin. The one exception to that is the reference to the “shopfloor” in the title of the essay (as I’m the labor historian of the two of us) – and a few very stray references to Marxism/Leninism in the text.

This is the residue of what was the first conclusion to this piece, all of which ended up on the cutting room floor. However, I want to resurrect a bit of it here for the sake of added value. While JP and I were discussing shared governance during the planning process for that article, it suddenly struck me just how unique shared governance is. After all, what other workers besides college professors have even a fraction of the control over the conditions of production that we do? We work alone. As long as we don’t make the mistake of using the learning management system, there are few direct records of our work and our output is almost impossible to measure accurately.

I’m not saying that professors should have completely unfettered control over their workplace. That’s why it’s called shared governance, after all. However, our training and expertise have traditionally bought us far more autonomy than most other workers. Technology is a threat to that autonomy. If you want to see why, look at practically every other post on this blog going back five or six years.

But – and this is where my epiphany comes in – unlike skilled production workers, college professors don’t have to unite with anybody in order to control the means of production. By employing whatever educational technology best suits our needs, we can ride the wave of automation all by ourselves – like my Chronicle Vitae piece suggests, automating the tasks that should actually be automated, and utilizing our skills to handle the edge cases that come up in teaching every day. Because we already control the means of educational production, we don’t have to give it up without a fight.

The problem comes up either when the labor supply expands beyond what the market can absorb – see Marc Bousquet on grad students as a waste product – or when technology enables our employers to try to re-define what learning is. Shared governance is our protection against both these kinds of changes. That’s why fighting for its continuation can be revolutionary all by itself.

Posted by Jonathan Rees in Academic Labor, Shared Governance, Technology, Writing, 1 comment

Clayton Christensen hates you and other observations.

I know this article about our old friend Clayton Christensen is old news now, but I was caught up in the end of the semester when it came out and have only gotten around to writing about it now:

In a speech Thursday at Salesforce.org’s Higher Education Summit here, Christensen spoke at length about disruption theory broadly and discussed its application to colleges and universities. Higher education, he explained, was among the industries that “for several centuries was not disrupted,” but “online learning has put a kink in that.”

Technology itself is never the disruptor, Christensen said; a new business model is. But “it is technology that enables the new business model to coalesce, and that’s what is happening in higher ed now….

“If you’re asking whether the providers get disrupted within a decade — I might bet that it takes nine years rather than 10,” he said, to a smattering of gasps among the nearly 1,500 attendees.

So there’s absolutely no evidence yet of disruptive innovation in higher ed, yet Christensen doubled down on his theory? What else did you expect from someone who runs “a nonprofit, nonpartisan think tank dedicated to improving the world through disruptive innovation”? Whose world is the Christensen Institute allegedly improving?

Our higher education research aims to find innovative solutions for a more affordable, sustainable postsecondary system that better serves both students and employers.

Faculty? Not so much.

Reading about Christensen partying like it’s 2012 again reminded me of the first online conversation I had with Stephen Downes, way back in those days before I even knew who Stephen Downes was. This is me in the comments to that old post, after Downes criticized me for being more interested in my own job than in universal education:

I’m certainly not going to remain in the global one percent if you succeed in making my job obsolete. Yes, there will still be a Harvard and there will still be a Yale, but state regional comprehensive universities will dry up like dust when the government funding moves entirely online.

You seem to welcome that, Stephen. Do you expect the tens of thousands of people who depend on these kinds of universities and the communities that depend on those universities to welcome that too? [A]m I supposed to just sit quietly and take one for the team?

But forget about me for a moment. If half the colleges in America actually closed, as Christensen STILL predicts, not just faculty would suffer. Administrators, staff, cafeteria workers…all of them would become jobless while whole college towns would keel over and die without the economic engine that the local university currently provides. Billions of dollars that would have stayed circulating in those communities would be sucked up and distributed among investors and programmers in Silicon Valley. How exactly does this outcome serve those area employers? And what good is your online college degree if your hometown just died in the process of making it affordable?

Around the same time that Christensen Chicken-Littled himself onto the front page of IHE again, I bookmarked yet another keynote by Audrey Watters which I thought might be useful at some point in the future (and, of course, it was). She’s talking about a different subject here, but I think this principle remains applicable:

Our institutions do not care for students. They do not care for faculty. They have not rewarded those in it for their compassion, for their relationships, for their humanity.

Christensen claims his schtick is nonpartisan and improving the world, but it’s really just warmed-over Social Darwinism from the late nineteenth century.  You can dress up Herbert Spencer in the fig leaf of social science and technological philanthropy, but that doesn’t make his core philosophy any less cruel.

Posted by Jonathan Rees, 1 comment

With or without edtech.

Earlier today, my Twitter friend Jon Becker @ed me a link to an EdSurge essay about edtech and refrigerators, suggesting that I was “the only person qualified to comment on this.” Indeed, I can say with some certainty that I am the only person in the world who has written two books on refrigerators who is also interested in education technology. So when the Clayton Christensen Institute for Cheerleading Disruption of All Kinds throws a slow, hanging softball made especially for me, how can I possibly resist?

For those of you who refuse to read anything produced by the Clayton Christensen Institute on principle (and I have some sympathy for that position these days), let me save you a click. The author, Julia Freeland Fisher, uses research on comparative appliance adoption rates by her colleague, Horace Dediu, to argue that:

[I]t’s becoming increasingly acknowledged that we need to pair investments in edtech tools with investments in professional development. But for the tools and models that least conform to traditional school structures, we’re also likely to need investments in fundamental reengineering—that is, not just developing teachers’ proficiency in using tools but rethinking processes like schedules, evaluations and staffing throughout an entire school building or district.

What do refrigerators have to do with restructuring schools? In order to use a new refrigerator, consumers only had to plug it in. In order to use washing machines, on the other hand, consumers needed plumbers to help them and maybe a whole new set of pipes in their houses. That’s why refrigerators became much more popular, much faster than washing machines, and that’s why you need to change the way schools are structured so that they can best take advantage of all the wonderful new education technology that EdSurge must cover every day.

The first thing that jumped out at me about this article was Fisher’s basic dates in the history of the refrigerator. She says the refrigerator debuted in the 1930s. In fact, the first electric household refrigerators appeared during the 1910s. They were already being mass-produced by the late 1920s. “Refrigerators quickly took hold,” she writes, “gaining over 90 percent adoption by the late 1950s.” I actually used the exact same statistic in my book Refrigeration Nation (p. 179, for all my fellow refrigerator aficionados who want to consult your own copies), but I used it to make the exact opposite point about refrigerators. In 1957, when over 90% of American households had refrigerators, only 12% of French households had refrigerators and less than 10% of English households did. If refrigerators were really that great, why didn’t they, too, just plug them in and enjoy the bounty?

As a historian, this is where I became really curious about where Fisher got her statistics. While she namechecks her colleague Dediu, there’s no link in the piece to any published study about refrigerators and washing machines. Indeed, the only link in the entire essay is to a general study about technological diffusion. There’s a chart in Fisher’s essay about comparative adoption curves, but there’s no source listed for that either. Other than completely leaving out the bottom left, the curve for refrigerators looks OK to me, but how can I trust her point about washing machines if I don’t know anything about the source? How can I be sure that this isn’t the edtech equivalent of fake news?

That’s why I opened a book. Ruth Schwartz Cowan’s More Work for Mother is a classic in the history of technology and pretty close to the only scholarly work that tackled the history of refrigerators at any length before I did. Since it is a general history of appliances, I figured it might have a little bit about washing machine adoption rates in one of the sections I had forgotten about. So I pulled it down off my shelf, turned to the index and quickly hit the jackpot: “washing machines…diffusion, 195-96.” Here’s the quote:

“[I]n 1941–roughly thirty years after they came on the market, and twenty years after the prices had fallen to more or less reasonable levels as a result of mass production–only 52 percent of the families in the United States owned or had “interior access” to a washing machine. Thus, just under half the families in the land were either still hand rubbing or hand cranking their laundry or using commercial services.”

If you’re wondering, the Fisher/Dediu number is about 10 percentage points lower than the one that Cowan used. Perhaps this can be explained by the difference between owning a washing machine and “accessing” a washing machine in the basement of your apartment building or taking your dirty laundry down the street to a laundromat. But for purposes of Fisher’s overall point about edtech, this distinction means everything.

Can you live without a refrigerator? Most Americans can’t. [Indeed, the refrigerator adoption rate in the modern US is actually 99.5%.] However, French or English people in 1957 still had easy access to fresh meat and produce at large markets.  Many still choose to live that way today because fresh perishable food tastes better. Americans, on the other hand, tend to prefer convenience over taste. That’s why the refrigerator industry was one of only three in the whole United States to grow during the Great Depression.  Anyone who had any money to spend at that time greatly valued the added convenience of electric refrigerators over ice. By 1960, the old ice industry had basically disappeared because it ran out of customers.

Can you live without a washing machine? Of course you can. That’s why there are still coin-operated washing machines and laundromats. Keeping your food in other people’s refrigerators isn’t an option in the United States, but you don’t need constant access to a washing machine in order to get your clothes washed by machine when needed. In other words, owning your own refrigerator is close to the only way to have access to refrigeration, but dragging your dirty clothes to any laundromat is a reasonable way to get access to a washing machine even if there is none in your home or apartment.  There’s only one way to keep your perishable food fresh, but there are plenty of ways to get your clothes washed whether you own a washing machine or not. In short, refrigerators are close to a necessity. Washing machines are just really, really convenient.

Can you live without edtech? [You just knew I had to get around to edtech here eventually, right?] Shockingly enough, there were actually good schools in the United States long before Bill Clinton and Al Gore decided to put a computer in every classroom. Plenty of teachers and professors offer great classes of all kinds without anything more sophisticated than their voices and a chalkboard. Weirdly enough, just this morning, right after I read that article, I was pitching our dean on starting a digital humanities program in our college. “What about the professors who don’t want to use technology?” he asked me. I said I would never in a million years force any teacher to use technology if they don’t want to, but it’s actually a good thing if students have a wide range of classes in which they can enroll, some of which use educational technology and some of which don’t.

Which brings me to the fundamental problem with the Clayton Christensen Institute for Cheerleading Disruption of All Kinds. The whole assumption behind that article is that one technology will always inevitably drive another technology to extinction: Refrigerators will completely replace ice, washing machines will completely replace washboards and edtech will completely replace conventional teaching. That is only true for the first of those examples (and even then, only really in the United States). Whether teachers want to teach with or without edtech is a cultural decision, not some hard and fast rule determined by the universal laws of technology.

Unless, of course, you have some other axe to grind…

Posted by Jonathan Rees in Refrigeration Nation, Refrigerator, Teaching, Technology, 5 comments

BYOB (Be Your Own Boss).

You might not know this about me (as I don’t write about it much here), but I’m Co-President of the Colorado Conference of the American Association of University Professors (or AAUP).  In that capacity, I knew about this story long before it got reported (even though I didn’t participate in the investigation or contribute at all to the report):

A new report from the American Association of University Professors alleges that Colorado’s Community College of Aurora terminated an adjunct because he refused to lower his expectations for his introductory philosophy class. The report sets the stage for the AAUP to vote on censuring Aurora for alleged violations of academic freedom later this spring, but the college denies such charges. It blames Nathanial Bork’s termination on his own teaching “difficulties.”

I know Nate pretty well, so I’m more than a little biased when it comes to a case like this. Nevertheless, there are a couple of things about this incident that just made my head explode. First, as you can see from that IHE article, Nate still teaches at Arapahoe Community College, which is part of the same community college system as the Community College of Aurora. At CCA, Nate was allegedly such a bad teacher that the college fired him “virtually on the spot,” yet he’s still working productively down the road. If Nate was really such a menace, don’t you think CCA might have wanted to warn its sister school about him?

The second, even more mind-blowing part of this case goes back to Nate being fired “virtually on the spot.” Nate was apparently so awful that they fired him DURING the semester, leaving all of his students in the lurch with some patchwork of substitute teacher(s) until finals week ended. He’d have to have been pretty darn awful for the benefits of that maneuver to outweigh the considerable costs. Of course, it wasn’t really about Nate’s teaching.

Nate’s firing was about making a point. The authors of the AAUP report on Nate’s case cover this subject very deftly:

A cannier administration might have let Mr. Bork finish the semester and then have declined to renew his contract. Insofar as this could have been done for exactly the reasons that appear to have motivated the CCA administration’s summary mid-semester dismissal of Mr. Bork, it would have constituted just as severe a violation of academic freedom. But the administration would have enjoyed the plausible deniability afforded by policies and procedures that enshrine arbitrary nonrenewal of appointments for adjunct faculty members.

It is certainly no secret that adjunct faculty lack real academic freedom precisely because of their precarious employment. Yet the administration at CCA didn’t even pretend that Nate and other adjuncts there have the same control over their classrooms that tenure-track faculty at most places (hopefully) have. They came up with this “Gateway to Success Initiative,” imposed it indiscriminately upon faculty of all kinds and fired Nate in response to his desire to be his own boss (at least as far as the way that he chooses to run his classroom is concerned).

While the AAUP’s Committee A (which oversees investigations like the one at CCA) doesn’t take that many cases in any given year, it should be obvious why this one is really important. Here is an administration that won’t even make the usual happy noises about all faculty having academic freedom. They think they should have more power over curricular decisions than their own faculty do. While I ran a few courses on spec in grad school, I’ve never taught as an adjunct. Nevertheless, I have to imagine that one of the reasons that you’d put up with low pay, no benefits and zero job security is precisely that you can be your own boss in the classroom setting.

Yes, if you’re a terrible teacher, you might be subject to observation and discipline. But that discipline should be meted out by other faculty (like your department chair) and not by your administrators. You should also have the opportunity to change course if your teaching is somehow not up to snuff, and not get summarily dismissed in the middle of the semester.

Everything I’ve written about this case so far should be obvious to any informed faculty member who considers the issues at stake. But I want to make two more points that might not be so clear to everyone.

First, adjuncts are just the low-hanging fruit in a long-term administrative movement towards trying to control the way that faculty teach. You can discipline adjuncts, particularly CC adjuncts, because they have few expectations of academic freedom and (often) a dire need for continued employment. Once this becomes the norm, there is no reason to believe that administrators will let tenure-track and tenured faculty exercise their traditional prerogatives in their own classrooms. Running a university like a business means closely controlling exactly how work gets done. If faculty acquiesce to this kind of academic Taylorism, we’re all gonna end up working with stopwatches behind us no matter what our employment status happens to be.

Second, to get back to a subject more common on this blog, technology is already greatly enabling administrators in this quest to control the classroom. My old obsession, mandatory LMS usage, is just part of this phenomenon. But the destruction of faculty prerogatives goes beyond just administrators. Consider this observation from Jonathan Poritz and me in our book Education Is Not an App (p. 65):

While a typical face-to-face course, or even a regular fully-online course, does not have to cater to the recommendations of the nineteen or twenty people who may collaborate to produce a MOOC, the rise of online learning tools has meant that professors of all kinds have less say over their own classrooms than they did even twenty years ago. One reason that the power of [“teaching and learning specialists”] has increased is that the power of faculty has dwindled as technology has made it easier for faculty prerogatives to be divided when the work of teaching gets unbundled.

Now we’re not saying that instructional designers stink and they all must be destroyed. What we are saying is that the final decision about how the classroom will operate must belong to the professor, no matter what their status of employment happens to be. If you build a better mousetrap, use the carrot, not the stick. Most faculty are smart and caring enough to join any technological bandwagon worth joining.

For all these reasons, by taking a stand on Nate’s behalf, the AAUP is actually taking a stand on behalf of us all. If you’re appreciative of this kind of work, you should consider joining us.

Posted by Jonathan Rees in AAUP, Adjuncts, Shared Governance, Teaching, 0 comments

My adventures in digital history.

These are my remarks as written (if not exactly as delivered) in Paul Harvey’s history seminar at the University of Colorado – Colorado Springs this morning:

I recently wrote an essay for the Chronicle of Higher Education called “Confessions of an Ex-Lecturer.” Yet my appearance in this class (well, the first part of this class anyway) is going to be a lecture. Yes, I’m going to lecture about why and how I stopped lecturing. To get past this enormous contradiction, let me make a distinction between conveying historical content and making a pedagogical argument. You have no reason to memorize anything I say today. There will be no quiz later. Instead, this lecture explains my thinking about teaching history to you so that I can see whether I can convince you I’m right. I’ve adopted a lecture format here because I have to tell the story of how my thinking has changed in order for you to follow along with my reasoning.

My opinions on this subject are not popular in historical circles. As one of my former graduate school acquaintances put it on Twitter the other day: “[T]hey will pry the lecture out of my cold, dead hands.” I sympathize. Old habits die hard. That’s the way I learned history when I was in college. Indeed, I never had a class of any kind in college that had fewer than thirty people in it and the vast majority of those class periods consisted of people lecturing at us. A lot of those professors were really good at what they did – although I did take a class from a political science professor who looked up at the ceiling as he talked, which drove me completely crazy…but that’s a story for another time. The reasons I’ve sworn off lecturing in my own classes are twofold.

First, there’s the advent of the cell phone. These small supercomputers have so permeated daily life that the average person – notice how I didn’t say average student – can’t go ten minutes without reaching for their phone at least once. Indeed, stick me in some meeting where someone starts lecturing about something that I’m not particularly interested in and I’ll reach for my phone far faster than that. I could be the most interesting lecturer in the world (which I most certainly am not), and a good number of you would still reach for your phones at some point during the presentation.

Please understand that I’m not blaming millennials here. I’m blaming everybody. For so many of us, the temptations of the Internet are just too hard to resist. “When people say they’re addicted to their phones, they are not only saying that they want what their phones provide,” writes the MIT psychologist Sherry Turkle, “they are also saying that they don’t want what their phones allow them to avoid.” If I’m talking at you in a classroom of any size, it is ridiculously easy for you to avoid me and I’m not going to be able to change that. Therefore, if I have to talk at you, I had better make darn sure that I have something interesting to say.

So what if I give you the opportunity to do something rather than to passively absorb information? What the Internet taketh away, it also giveth. My interest in digital history comes from my interest in finding some alternative to lecturing about historical facts and then testing students on how many of those facts they’ve retained. I know this is sacrilege in most historical circles, but I’m gonna say it anyways: You really can Google anything.

The Internet is well-developed enough that most of the time a discerning consumer of information can get reasonably reliable factual information very quickly with limited effort. But, and this is the second reason I’ve basically given up lecturing, with limited technical knowledge it is now possible for ordinary college students to make useful contributions to the great pool of historical information available online. Not only that, by doing so, they can pick up practical computer skills that will increase their employability upon graduation. With that kind of upside, taking some of the attention in class off of me seemed like a small price to pay.

One of the most interesting things about digital history is that this field lets you make professional use of skills that you probably picked up just by being an active digital citizen. For example, I started blogging right after I got tenure in 2003 because I was a lot less worried about someone threatening my employment because of my political opinions. Oddly enough, I devoted my entire blogging life to one subject: Walmart. I learned WordPress from a guy named Jeff Hess in Cleveland, Ohio via e-mail. Jeff was the administrator of our group anti-Walmart blog.

In 2007, when my department wrote and was awarded a Teaching American History grant from the federal Department of Education, I used those skills in class for the first time. We were funded to take teachers to historic sites on the East Coast over the summer and this was a way that they could write easily from the road and that we could still follow them. So could their relatives, friends and even students, which served as a nice side benefit – a benefit that applies to all sorts of history undertaken on the open web.

Another skill I already had which turns out to have enormous digital history ramifications is some proficiency in social media. Personally, I’m a stone-cold Facebook hater, but Twitter has been a godsend to me with respect to digital history, not so much in class as for keeping up with the field. Your professor, for example (if you didn’t already know), is a prolific Tweeter, if more on American religious history than on digital history and things technological. More importantly, my students have used it to reach out to scholars in fields that they’re researching.

It’s also a great tool for publicizing the work you do online. I actually got a book contract thanks to Twitter (although not in history). If you’ve spent any time listening to the Canadian scholar Bonnie Stewart as I have, you’ll understand how social media in general and Twitter in particular are great tools for building communities of interest – and I mean that both in terms of what you enjoy and as a way to fight for what you believe.

With respect to digital history, the turning point for me in particular was the summer of 2014, when I attended an NEH Institute at the Roy Rosenzweig Center for History and New Media at George Mason University in Virginia. Me and a bunch of other folks who never studied this stuff in grad school got a very intensive tour of what’s on the web, web tools and how we might want to integrate them into our classes. Some of it was old hat for me. Unlike a lot of my fellow professors, I had already heard of two-factor authentication and password protection programs.

However, when it came to history-specific web tools almost everything they touched on was brand new to me. One I was already using, but learned to use better, is Zotero, which actually began at the Roy Rosenzweig Center for History and New Media and really ought to be on every historian’s must-use list. Zotero is a notes program that lets you gain intellectual control of your research by allowing you to search it at the word level. That includes content inside digital copies of things that you’ve scanned and uploaded. As someone who wrote his dissertation on 4×6 notecards, I can tell you I am never, ever going backwards on this. That’s why I’m now requiring all my students doing research papers to use it. My students constantly tell me how grateful they are to know about Zotero, and how they wish they had known about it two or three years earlier.
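[A nerdy aside for the blog version of these remarks: Zotero also exposes your library through a web API, which means “searching at the word level” can even be scripted. Below is a minimal, hypothetical sketch using the third-party pyzotero library – the library ID, API key and search term are all placeholders, so treat it as an illustration of the idea rather than a recipe.]

```python
# A minimal sketch of full-text search against a Zotero library,
# assuming the third-party pyzotero package (pip install pyzotero).
from pyzotero import zotero

LIBRARY_ID = "1234567"    # placeholder: your numeric Zotero user ID
API_KEY = "YOUR_API_KEY"  # placeholder: generated in Zotero's account settings

zot = zotero.Zotero(LIBRARY_ID, "user", API_KEY)

# qmode="everything" asks the Zotero API to search full-text content,
# including text extracted from PDFs you have uploaded and indexed,
# rather than just titles and other metadata fields.
hits = zot.items(q="refrigeration", qmode="everything", limit=10)

for item in hits:
    data = item["data"]
    print(data.get("itemType", "?"), "-", data.get("title", "(untitled)"))
```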

A jaw-dropping research tool for digital historians that I first learned about in Virginia is CamScanner. CamScanner is an app that turns your cell phone camera into a document scanner. If I could show you the huge pile of Xerox copies I made for my dissertation at 25 cents, 50 cents…even a dollar a pop, you’d know why this is so amazing. Having access to free copies of documents from archives makes it easier to acquire information during what is often very limited research time. I had some experience with researching this way when the Manuscripts Division at the Library of Congress installed the greatest book scanners that I had ever seen in order to preserve the physical well-being of their collections (since bending things back for ordinary copying does so much damage). Now I’m swimming in information – information that’s searchable using Zotero. The same is true for my students as I have them working with local archives in my digital history classes.

The program I settled on for them to use is Scalar, which comes out of the University of Southern California. It’s actually designed as a book publishing program, something that allows books to appear on the web with digital media embedded into them. I’ve been using it in class for web exhibits. Study after study has shown that putting resources up on the web drives traffic to physical archives and libraries rather than taking it away, so I’ve had my students create Scalar projects using local resources and putting them up on the web. Here’s a recent example from the Steelworks Center of the West that I liked a lot. Here’s another about a place I think that everyone in this class ought to know well.

Why Scalar? You don’t have to know how to program in order to make it look good. Indeed, as the experience of countless students of mine has more than proven, you can learn how to use it within just an hour or two of starting to play with it. In fact, I have plenty of students who can use Scalar far better than I can, because they’ve had far more reason to explore its features than I have since I simply use it to put up a few syllabi (although I’ve since trained myself to do more).

Another reason I like Scalar is that students and faculty who use it can host their own Scalars if they go through Reclaim Hosting. This is not the place to argue why faculty and students should take back the web from university administrators and private companies (although I did co-author a book that fits in well with that argument), but one of the best things about the Reclaim-related “Domain of One’s Own” project is that it allows students to keep access to their digital work even after they’ve graduated. The Scalars students create through Reclaim can therefore serve as evidence to potential employers that they can do something other than just historicize things. Not that there’s anything wrong with the ability to historicize things, but in this manner digital history might actually be the answer to the age-old question, “What can you actually do with a history degree (besides teach)?”

On a personal level, my digital history experiments have proved much more interesting than standing up and lecturing to uninterested students about the same old things that I had always been lecturing about. In the future, I’m dying to get into digital mapping, as the Steelworks Center of the West has an absolutely astounding collection of mine maps that cover both towns and mines. I imagine a digital project that traces the physical impact of mining on Southern Colorado’s landscape as soon as I have enough theoretical background to pitch it to some funding agency. What’s really great is that thanks to my changes in pedagogy I’ll be able to get my students to pitch in.

When I was at the American Historical Association meeting in Denver a few weeks ago, I attended almost nothing but digital history sessions. I was really struck at those sessions by how willing everyone was to admit that they have no idea what they’re doing – that the whole field of digital history is kind of a running experiment. To paraphrase one scholar I heard at the meeting, digital history blurs the line between research, teaching and service. In my case, I’m having students do historical research and putting it on the web for the benefit of local historical non-profits. I think the benefits of doing this far outweigh whatever harm gets done to my ego if I’m no longer the center of attention in class.

Posted by Jonathan Rees in Digital Humanities, Teaching, Technology, 0 comments

Two-fer Tuesday.

TFW two pieces you had a hand in get published on the same day.

Posted by Jonathan Rees in Education Is Not an App, Teaching, Writing, 0 comments

MOOCs: A Postmortem

MOOCs are dead. “How can I possibly argue that MOOCs are dead?” you may ask. After all, to borrow the stats just from Coursera, they have: 1600 courses, 130+ specializations, 145+ university partners, 22 million learners and 600,000 course certificates earned. More importantly, it appears that Coursera has received $146.1 million over the years. Even though it hasn’t gotten any new funding since October 2015, unless Coursera tries to copy “Bachmanity Insanity” (Is Alcatraz still available for parties?) the company is going to be sticking around for quite a while.

What I mean when I say that MOOCs are dead is not that MOOCs no longer exist, but that MOOCs are no longer competing against universities for the same students. Continuing with the Coursera theme here, in August they became the last of the major MOOC providers to pivot to corporate training. While I did note the departure of Daphne Koller on this blog, I didn’t even bother to mention that pivot at the time because it seemed so unremarkable, but it really is remarkable.

Do you remember Daphne Koller’s TED Talk? Do you remember how incredibly utopian it was?  In truth, it made no bloody sense even then. For example, she suggested back at the height of MOOC Madness that:

[M]aybe we should spend less time at universities filling our students’ minds with content by lecturing at them, and more time igniting their creativity, their imagination and their problem-solving skills by actually talking with them.

I agree with that now. In fact, I agreed with that then too. The problem with that observation, to almost anyone who actually teaches for a living, remains that talking with students is obviously impossible when you have ten thousand people in your class. More importantly, showing students tapes of lectures (even if they’re broken up into five-minute chunks) is still lecturing.

That’s why MOOCs were never going to destroy universities everywhere. There will still be far more than ten universities fifty years from now. Or to put it another way, the tsunami missed landfall.

But just because this blow proved to be glancing doesn’t mean that the punch didn’t leave a mark. For example, a lot of rich schools threw a lot of money out the window investing in Coursera and its ilk. [Yeah, I’m looking at you, alma mater.] Others simply decided to spend tens of thousands of dollars on creating individual MOOCs that are now outdated almost by definition since they’re not designed for corporate training.  Yes, I know that MOOC producers claim that their MOOC experience improved teaching on campus, but think how much better teaching on campus would have been if they had just invested in improving teaching on campus.

At best, MOOCs were a distraction. At worst, MOOCs were a chronic condition designed to drain the patient of life-giving revenue. Instead, those schools could have used that revenue (as well as their initial investments) for other purposes, like paying all their faculty a living wage.

My inspiration for this observation (and this entire post) is the MOOC section of Chris Newfield’s terrific new book, The Great Mistake: How We Wrecked Public Universities and How We Can Fix Them.*  This is from page 227:

MOOCs were not going to leverage public colleges by buying them.  But they could acquire a share of their revenue streams–that combination of student tuition and enrollment-based public funding–whose capture is one of the five key elements of privatization…MOOCs could leverage their private capital with far greater sums flowing [to] colleges and universities without buying anything up front.  This offered the attractive prospect to strapped public colleges of gradually replacing even more tenure-track faculty with technology that could be managed by private MOOC firms off campus, for a reasonable fee.

To make one of my favorite distinctions, this applies to schools that are MOOC-producers (like Arizona State) even if those MOOCs are mainly for internal consumption, and especially to all those colleges and universities that were potential MOOC consumers – any place that considered replacing their humdrum, ordinary faculty with all the “world’s best lecturers.”

In order to capture part of that revenue stream, MOOC providers had to argue that their courses were better than the ones that students were taking already.  That explains all the hating on large lecture courses.  Except, MOOCs were nothing but large lecture courses dressed up with technological doodads.  As Newfield explains on pp. 242-43:

In effect, MOOC advocates were encouraging students to leave their state universities to attend liberal arts colleges, where they could go back to the future of intensive learning in the seminars that typify “colleges that change lives.”  But of course they weren’t.  Advocates were actually claiming MOOC ed tech could create liberal arts colleges all for next-to-no-cost (Koller) or greatly lowering costs (Thrun).  In making this claim, they ignored existing historical knowledge about learning at high-quality institutions, which made the technology seem original, when it was not.

MOOCs may have been cheaper (and Newfield even disputes that), but they certainly weren’t better – even than large lecture classes.

Again, the vast majority of us faculty foresaw this particular Titanic hitting the iceberg (including me, even if it did take me a while). Nevertheless, university administrators that partnered with MOOC providers or (even worse) bought their products trusted Silicon Valley types more than their own faculty. This course of action was a reflection of the same self-loathing that Audrey Watters describes here:

There seems to be a real distaste for “liberal arts” among many Silicon Valley it seems – funny since that’s what many of tech execs studied in college, several of whom now prominently advocate computer science as utterly necessary while subjects like ethics or aesthetics or history are a waste of time, both intellectually and professionally.

Yet at least these Silicon Valley types had enough self-awareness to go into a different field after they left college. What’s the excuse for a university administrator with an advanced degree in the humanities (or anything else for that matter) to hate their educations so much that they spend hundreds of thousands of dollars to deliberately undermine them?  There is none. They should have known better.

Next time Silicon Valley comes up with a new way to “disrupt” education, let’s see if we faculty can invest more time and effort in getting our bosses to listen to common sense.  Instead, as Newfield notes in his postmortem of Koller’s TED Talk on p. 241 of The Great Mistake:

The categorical discrediting of faculty pedagogy made this bypass of faculty expertise and authority seem reasonable and necessary for the sake of progress.

So in the meantime, let’s fight to improve shared governance everywhere so that we’re prepared to fight for quality education if our bosses refuse to accept the obvious.  Some of us becoming temporarily famous is not worth wasting so much money and effort on any technology that is obviously going to prove to be so fleeting.

* Full Disclosure: Newfield and I have the same publisher even though we publish in entirely different fields.

Posted by Jonathan Rees in Academic Labor, MOOCs, 4 comments

Get your side hustle off.

I’ve been streaming a lot of Simpsons with my son lately. Backwards. Since I quit watching the show regularly sometime in the late-90s, this was the best way that we could both enjoy all-new material. The quality of even the most recent stuff is obviously the good thing about streaming the Simpsons. The bad thing is being locked into watching all those Fox commercials since my cable company (or maybe it’s Fox) won’t let us fast forward. The above Uber commercial has been on full rotation for months. In fact, it sometimes plays twice an episode. I’ve been making so many “earnin’/chillin'” jokes that my son now leaves the room when it comes on.

I thought of that commercial twice while I was at the AHA in Denver last weekend. The first time was when I explained to four historians from Northern California (ironically, the first place that I ever took an Uber) how Uber works. [Key lesson: Always tip your driver!] The second time was when I went to my first digital humanities panel on Saturday morning. The commentator, Chad Gaffield from the University of Ottawa, was talking about how DH makes it possible to break down the false dichotomy between work and play. That spoke to me, because I’ve been having an awful lot of fun teaching all my classes lately. Indeed, I’m going to bring that up the next time I hear someone who teaches like it’s still 1995 start talking about “rigor.”

The other point Gaffield mentioned that I thought was really important was the way that DH blends the traditional roles of teaching, research and service. In my case, I teach students how to research using local resources that help the community once they appear online. However, I suspect there are a million variations on that. In any event, when we fill out our annual performance reviews, we can all include DH work in whichever category we don’t have enough material in already.

In the very early days of this blog, the role of tech critic was something of a side hustle for me. It wasn’t my day job, but my writing nonetheless found an audience. It’s through the conversations that writing inspired that I stumbled into a large, multi-disciplinary pool of scholar/teachers who were trying to utilize the Internet to create unique educational experiences rather than cheap, dumb carbon copies of face-to-face courses. I started teaching online so that I could try to set a positive example for other people who might be reluctant to make the same jump because so much of what’s out there has a justifiably bad reputation. I still have a long way to go, but one of the most refreshing things I got out of all the DH panels I went to last weekend is that so does everybody else. Even historians who get their DH papers onto AHA panels readily admit that their learning curve remains steep.

By the time I left Denver on Sunday, I had decided I’m never going back. I don’t want my conventional courses to be entirely conventional anymore. In other words, I’ve been convinced that the digital needs to be present in every course I teach.

I am hardly the first person to draw such a conclusion. CU-Boulder’s Patty Limerick wrote in the September 2016 issue of AHA Perspectives that:

In innumerable settings, historians in Colorado are stepping up to this challenge. In the process, they are devising practices that transcend the conventional turf fights between “academic history” and “public history,” uniting in the strenuous and satisfying work of “applied history.”

I think you could make a pretty good case that food and refrigerators are relevant today, but it’s my classes which take students into the Steelworks Center of the West and the Pueblo Public Library that fit this definition of “applied history” the best.

While such activities have little to do with my current research, teaching is 50% of my job according to the annual performance review I’ll have to turn in a couple of weeks from now. In short, what was once my side hustle has now become my regular hustle. While there’s still a lot of tech criticism left to write, and I plan to write at least some of it when I have the time, this blog (and why would I have redesigned it if I had intended never to use it again?) is going full pedagogy.

In the meantime, I have another actual history book I want to write…

Posted by Jonathan Rees in Digital Humanities, Teaching, Technology, 0 comments