Is It Okay to Be a Luddite?


This piece was originally published on Instructure’s Keep Learning blog. When it posted, we received a message from Howard Rheingold (NetSmart) linking us to a post last revised in May 1998. In that piece, he considers many of the same questions we ask here. Most significantly, his questions and ours intersect where we say “To fear a technological future is to deny a technological past and present” and he states “It is possible to think critically about technology without running off to the woods”.

So we offer this article, and Howard’s, as a consideration of what it means to approach technology with mindfulness, skepticism, and also exuberance. Our intention is to let these two articles brush against one another to see what conversations they raise across decades.


I feel a pinch as I approach the screen once more. A twinge, just the littlest bite of remorse. Sometimes, it’s sizeable, the feeling I have that I want the digital to be more, the Internet to be tangible, the vacant gaping spaces between my colleagues and myself to be smaller, more a hands-breadth than the length of a whale. And sometimes it is this, a mosquito in the ear. Either way, I return to the screen wishing for relationships that are bigger than pixels, and words that are indelible.

I rail against technology at dinner parties. I curse it to my friends in Google Hangouts. And they call me a luddite.

The title of this post is inspired by an essay by Thomas Pynchon. He wrote presciently in 1984, “Since 1959, we have come to live among flows of data more vast than anything the world has seen.” According to Pynchon, “Luddites flourished in Britain from about 1811 to 1816. They were bands of men, organized, masked, anonymous, whose object was to destroy machinery used mostly in the textile industry.” The 21st Century has produced a whole new kind of altogether less revolutionary luddite. These are the folks who refuse to go on Facebook, who have tried Twitter but would never use it regularly. They keep pen and paper handy and nod with suspicion at the great green elephant of Evernote. For these people, the Internet has not brought on a new world of connectedness and community; it has reduced us to two dimensions, static portraits of faces meant to be lively with expression. The Internet hurts their eyes. And they secretly (and sometimes not so secretly) scorn its denizens, reducing their work to blips.

Nicholas Carr writes in “Is Google Making Us Stupid?,” “You should be skeptical of my skepticism. Perhaps those who dismiss critics of the Internet as Luddites or nostalgists will be proved correct, and from our hyperactive, data-stoked minds will spring a golden age of intellectual discovery and universal wisdom.” Words that drip with irony. Says today’s luddite, Google may not be making us stupid, but it is changing our minds (the ones in our heads and the ones we tap with vigilant thumbs) about what brilliance and idiocy are.

There are many who look at the Internet as the downfall of modern education. They decry online learning as necessarily sub-par, stating that the digital can never replace the face-to-face. These are teachers throwing sabots into the machine, hoping with words to stop the gross forward movement altogether, before all we do is reduced to microwaves.

Perhaps there is some middle ground: not skepticism or luddism, but what Sean calls digital agnosticism. So often in our discussions of online education and teaching with technology, we jump to a discussion of how or when to use technology without pausing to think about whether or why. While we wouldn’t advocate for a new era of luddism in higher education, we do think it’s important for us to at least ask ourselves these questions.

We use technology. It seduces us and students with its graphic interfaces, haptic touch-screens, and attention-diverting multimodality. But what are the drawbacks and political ramifications of educational technologies? Are there situations where tech shouldn’t be used or where its use should be made as invisible as possible? How does tech reconfigure the learning environment, both literally and figuratively? When a classroom (virtual or otherwise) revolves around tech, what shape does it take? How is this shape different from the configurations of classrooms that don’t revolve around tech?

Can we approach new digital technologies at once with wonder and also dismay — with a reflective curiosity that pushes buttons unabashed but not without first brazenly dissecting them?

And what do we make of analog pedagogies? Is the chalkboard an anachronism, or does it remain (as we believe) one of the most radical and advanced learning technologies? How do we remind ourselves that when we go online, our feet (or some other parts of us) are usually still on-ground?

The student-made viral video “A Vision of Students Today” shows how the boundaries of the classroom have changed in recent years. It ends with a very moving ode to the chalkboard. And this video from Futurama offers a critique (both silly and profound) of how we interact with technology in contemporary society. Both point out how important it is to remain critical — open-eyed both in awe and inspection — in our engagement with machines.

Technology may be homo sapiens’ superpower. It is everywhere all the time, whether digital, mechanical, or simply practical. We will do technology always, and so railing against it, or feeling a twinge at the loneliness of the pixel, must be tempered. To fear a technological future is to deny a technological past and present. And there is nothing new about sounding the alarm. Luddism has roots in a powerful kind of human agency, but to assume that technology necessarily removes agency is to misunderstand its use. Even the luddites knew when and how to throw sabots.


[Photo, SLOW Chadburns Ships Engine Order Telegraph Great Lakes Naval Museum April 24, 2010 by Steven Depolo licensed under CC BY 2.0]

About the Authors

Sean Michael Morris (@slamteacher) is the Editor of Hybrid Pedagogy. He considers himself a digital agnostic, and allies himself with adjuncts, students, and others who are contingent to the enterprise of higher education. His personal website can be found at seanmichaelmorris.com.

Jesse Stommel (@Jessifer) is Director of Hybrid Pedagogy and Assistant Professor of Digital Humanities at the University of Wisconsin-Madison. He is an advocate for lifelong learning and the public digital humanities. His personal site can be found at jessestommel.com.

3 Comments
Discussions from the Community.
  1. We’re getting a pretty good handle on how aspects of tech are impacting on learning. It’s hard, difficult work, and a fuller picture is going to take time, effort, and money, but in some cases, the news is in.

    We have pretty good reasons to think that some things won’t change. Looking back at the early evangelists for tech – I’m thinking of Prensky here – we know that a lot has not come to pass. And we know that a lot of the doom has failed to come to pass either.

    Take Carr’s idea – is Google making us stupid?

    We have always outsourced our memory, and we have always forgotten things once we have faith in the constancy of where they are stored. This is as true of wax tablets as it is of people and the internet. I remember a phone number if I need to – if I have no pen and paper, or it’s not on my phone, for example. As soon as I write it down I begin to forget it. We did this with accounts and clay cylinders in Sumeria. We do it with email addresses and smartphones in Islington. We do it with family members and birthdays.

    The processes Carr fears are exactly the ones we have always been subject to. When we trust the repository, we remember where, and less what.

    On the prophet side of things, we are not being significantly rewired by our online experiences. We are no better at task switching (which is what Prensky’s multitasking actually is) than we ever were. We don’t operate at some form of mythical twitch speed. And how we learn best has not switched to a graphical form (our best guess here is that what we need to learn determines the best form or style for learning it – if we need to learn about a colour, then visual forms work; if we need to learn about a sound, then auditory; if we need to learn how to do something, then doing it, and graphics that represent process, work well… this remains the same as it ever has).

    We are no better, or worse, at guiding our own learning (in general, we are not as good as we could be). Collaboration has the same weaknesses and strengths, online and off, as it has always had, even though the collaborative gene pool is larger. It presents the same problems and possibilities whether it’s a pond or an ocean, though the fertility may be greater and more varied.

    As others have said before me, human cognitive architecture has not changed much in the last ten thousand years.

    We remain, largely, who we are in the face of technology.

    We do know that bells-and-whistles technology probably detracts from learning. People retain less information in lectures, for example, when using Twitter than when not using it. This is nothing new, or specific to new technology. People retain more when there is no construction work outside the lecture hall than when there is.

    People connect to learn laterally from one another online. But we have always done this. As Wenger and Lave, partly unwitting founders of Connectivist theory, stated about communities of practice: we haven’t discovered something new. What we are describing has always happened, and when you read the theory, you will recognise it, and not be startled.

    We require digital literacy to discern truth from falsity, bias from evidence, and propaganda from argument. But we have always needed the cognitive literacies involved in judgement and assessment, and the technical literacies that underpin them. Knowing how to use Twitter, and having strategies in place to assess what you find, is no different from knowing how to use a library card catalogue and knowing how to assess what you subsequently read.

    Neither prophets nor doomsayers will be proven right. The world is constantly on the cusp of simultaneous apocalypse and utopia. None has ever crystallised, and we have remained largely the same as a species in the face of them.

    Prophet or doomsayer, luddite or technophile. It’s the working teaching stiff who will pick up technology from the trough of the Gartner hype cycle and carefully work it back to the plateau of reasonable usability it will probably offer. The extremists will have moved on to the next millenarian project by then.

    I’m forty. I’ve been through many more ends of the world than that. After the first three or four, you come to see the similarities, and categorise the prophets of various stripes as having much more in common with each other than they have with reality.

    Be it Sebastian Thrun, the Mayan calendar, or Carr…

  2. Torn Halves says:

    “To fear a technological future…”

    A brief objection to the assumption that skepticism is grounded in fear. The final section of Georg Simmel’s “The Philosophy of Money” contains some very nice skeptical comments about the fascination (in 1907) with high-tech gadgetry like the telegraph, the telephone and the electric light. There is no expression of fear, rather there is a perception that the enthusiasm raves about something that is inessential (e.g. the speed with which people can communicate) and forgets what is truly essential (the quality of what exactly is communicated): “People’s ecstasy concerning the triumphs of the telegraph and telephone often makes them overlook the fact that what really matters is the value of what one has to say, and that, compared with this, the speed or slowness of the means of communication is often a matter of little import…[And in this way] the peripheral in life, the things that lie outside its basic essence, have become masters of its centre and even of ourselves.” (p482)

    What is remarkable is the enthusiasm with which people assimilate themselves to an impersonal order that degrades the quality of human life. We enthuse about our unprecedented access to undreamt of quantities of information, and overlook the fact that little if any of it deepens our understanding of things and our appreciation of life, and the sheer quantity of bits and bytes makes it harder than ever for anything to really enrich in any lasting way the life of the mind. And, for Simmel, this has an impact on our freedom: “Just as freedom is not something negative but rather is the positive extension of the self into the objects that yield to it, so conversely our freedom is crippled if we deal with objects that our ego cannot assimilate…What is distressing is that we are basically indifferent to those numerous objects that swarm around us…an interconnected enclosed world that has increasingly fewer points at which the subjective soul can interpose its will and feelings.” (p460)

    Simmel’s mood is one of distress, not fear. The astonishing achievement of our pseudo-civilisation is its cultivation of enthusiasm in the place of that distress, so that people applaud the developments that make their subjective lives increasingly irrelevant.

  3. Torn Halves says:

    Comment after reading Pynchon’s article that you link to in the first paragraph: It would have been nice to read your response to Pynchon’s key argument in that article, i.e. that if we look at the concerns of the Luddites in the industrial age, we will see that Luddites in the digital age have every reason to praise rather than trash the computing technology that will realise at last the revolution they were hoping for. Today’s Luddites are the DigTech evangelists. The iPad is the new hammer.
