Play is making a comeback. There have been TED Talks, peer-reviewed articles in pediatrics journals, pieces in The Atlantic, and an entire industry now devoted to the “right” kind of play for our kids’ development. So why devote another 2000+ words to play and pedagogy, especially when it has already been done well by the creators of this very site?
I’ve learned a great deal from watching my two kids, currently aged just four and almost six. I’ve watched them engage in free imaginative play, interactive narrative play, and rules-driven play. The current conflict in my household is between the elder sister, who is obsessed with making sure everyone follows the rules, and her younger brother, who is still more interested in exploring and experimenting, happily making it up as he goes along.
We value the kinds of skills and attitudes my daughter is exhibiting: following and enforcing the rules, making sure that things are “fair.” And certainly, these kinds of games are important for impulse control (wait your turn!), for dealing with disappointment (you can’t always win), and for handling success (don’t rub it in your brother’s face!). But I admire what my son is bringing to the table, too: he is creative and experimental, finding out for himself what he enjoys and how he can learn and interact best with the materials at hand.
And then one day, both kids put their skills together and “hacked” two traditional board games, creating a hybrid version of Candyland and Snakes and Ladders. They negotiated their own rules, had their own interpretations of the pieces, and made the two games into a single interactive role-playing game with characters and a storyline. I had no idea how to “win” the game, and while I had no doubt that the rules would always favor my daughter, I recognized that the game needed structure in order to be playable, as much as it needed the chaotic energy in order to be a meaningful experience.
Open play like this leaves space to hack the game. My son is currently obsessed with Angry Birds, but there’s only one way to win Angry Birds. You can spend hours, trial-and-error style, learning from each mistake, but once you learn the “trick,” there’s nowhere else to go. Once my kids got bored with the board games, they were able to create something new. When my kids get bored with Angry Birds, there’s only another version they can try to convince me to buy. Exploration and rules need to be able to coexist productively for my kids, not for the software company.
ProfHacker blogger Anastasia Salter has been chronicling a number of useful tech tools that can be used to create games in the classroom. She is currently taking a MOOC offered by MITx called “Learning Creative Learning.” This past week, the topic was Making and Constructivism. Coding programs like Scratch were discussed, and, unsurprisingly for a MOOC offered by MIT, the focus seems to be largely on using technology to enhance learning.
Which is fine. But I still worry about the digital divide; my students resist using technology for anything beyond its stated purpose, in part because they fear “breaking” it, but also, I think, because they have been taught that tools have a limited and prescribed purpose. To me, their hesitation has less to do with technology and more to do with a missing ethos of playfulness and curiosity, a willingness to try, fail, and try again. The technology itself can often get in the way.
Audrey Watters recently wrote about how technology is not the panacea to the ills that are rampant in education today. Clearly, she is not the only one, but Watters has consistently been critically interrogating the dominant narrative of technology as a disruptive force for good in education. Her observations of TED echo the observations that Aaron Bady (and some others) have made about MOOCs: they encourage a passive consumption of knowledge and an almost religious devotion to the ethos of techno-disruption. This isn’t the world of free play and imagination; this is a world of blind faith in something most of us don’t understand well.
Watters also brings up an important point about access: not just how technology is distributed, but how it gets used. These are observations that many others have made, including Alondra Nelson, pointing to the economic inequities that exist here in the US (paywalled), but Watters hints as well at the racial, gendered, and class influences that shape who uses the technology and how. If we truly want student-centered or peer-driven learning, all we need is the ethos and the openness to try, not the bells and whistles Silicon Valley is increasingly telling us are necessary to the process.
But this brings up my other concern: what Ernesto Priego calls the rise of the super-humanist. I would say that teachers are increasingly expected to be super-pedagogues, and companies (publishing companies, software companies, start-ups) are increasingly cashing in on teachers’ fears that they are not and cannot keep up with the technological advances that are taking place. A recent survey of K-12 teachers reveals a very real anxiety about the proliferation and rapid evolution of classroom technology, and about their ability to understand it and integrate it into their practice.
With three-quarters of the classes in higher education being taught by faculty who are off the tenure track, often working at multiple institutions to try to make ends meet, the expectation that these instructors also learn how to program, master all the tools, and stay current in their field is untenable. This exacerbates the digital divide: the students who are most vulnerable — low-income or first-generation college students — are the most likely to encounter contingent faculty in their courses at community colleges or regional state institutions (to say nothing of the predatory practices of many private education companies). These teachers’ classes are more likely to lag in innovative uses of technology, and they may also be more pedagogically rigid, simply because the expectation to perform beyond one’s means can cause a teacherly rigor mortis, suffocating any otherwise inherent free play.
The point being, how can we encourage play and innovation in the classroom, even without — or especially without — technology? How do we counter the narrative that technology is necessary for innovative and disruptive pedagogical practices?
A group of us have a tradition now of having a “game night” about once a month. We play traditional “board games” and enjoy unwinding after a long period of intense intellectual labor that comes with being a university professor/professional. At one such gathering, someone brought out an original edition Trivial Pursuit game from 1982.
We may have all been professors, but most of us in the group were either too young or not even born yet (sigh) for many of the questions to be common (or even uncommon) knowledge. We banned the use of our iPhones and spent the evening playing on two teams, battling to fill our little pie pieces. My team won the game because I know about curling (the sport).
It’s fitting that the game is called “Trivial” Pursuit. So much of the information we were being asked about was just that: trivial, bordering on irrelevant, due to the 30 years of history that have taken place since 1982. No more USSR. Four new presidents. One planet that is no longer a planet. I would jokingly ask most of the questions with an “in 1982” tacked on. We guessed, used logic, and desperately searched our memories for those things we learned in grade four about volcanoes and cloud formations.
We had an original edition of the game at my house when I was growing up. And I seem to remember being just as stumped and flummoxed by most of the questions as I am now, 30 years later. But we kept playing. Collecting those pie pieces became an obsession. The strategy? Play enough times so that you knew the answers to every single question because you had heard them all before.
We love quiz games and quiz shows. But there is something absurd about dividing “all” of human knowledge into discrete categories, reduced to tiny plastic slices of a pie. Yet there is also something comforting about being able to measure how much we “know,” how much “knowledge” we’ve accumulated and can recall on cue. We want to win because winning makes us seem “smart,” or at least a certain kind of smart.
The first semester I did peer-driven learning, some of my students used a Jeopardy template in PowerPoint to “quiz” their classmates on the materials they were supposed to have read. Because of the game-show format, the questions were closed-ended and only involved knowing superficial information about the text and the author. The class was divided in two and pitted against each other. We all enjoyed trying to come up with the answers first, with the “easy” questions readily answered by those who had read the text, while the more difficult questions tested who could re-read the fastest.
But I was left unsatisfied by the end of these classes. Certainly, we had proven that we knew and had read the essays, but did we actually understand what the essays were trying to say? Did we agree or disagree? What did we think? We hadn’t learned much beyond simple, superficial information.
The next semester, students created a “real-life” Farmville/Monopoly board game. The game provoked students in the class to think about sustainable farming and how we get our food. The students researched the actual cost of running a farm and integrated that into play. “Chance” cards included hurricanes, drought, sickness, and rises or drops in commodity prices. Students were given the choice to go organic or to go industrial. This game stirred discussion around issues of sustainability, cost, and exactly how much work goes into food production.
I, like Ian Bogost, worry about the gamification of learning. Bogost isn’t opposed to games; he has written the indispensable How to Do Things with Videogames. He also wrote another book, Persuasive Games, and co-founded a software company of the same name. The games are for “persuasion, instruction, and activism.” These are games as a learning tool.
This, I think, is very different from gamification (check out this list of articles on Mashable to see what I mean). Gamification often over-simplifies learning. It takes what was seen as a negative (particularly of video games) – turning people into mindless button-pushing zombies obsessed with instant gratification – and recasts it as a positive: the right kind of mindless button-pushing zombies obsessed with instant gratification. Like rats in a maze, the faster or more efficiently we get through, the quicker we get our piece of candy as a reward. The maze itself is just the obstacle between us and the reward at the end.
Games can lead to insight and reflection. Play, however, opens space for discovery and creativity. My students’ Jeopardy game encouraged rote memorization and winning. The Monopoly/Farmville game play, on the other hand, initiated real discussion, investigation, and action.
How is gamification helping students learn, or increasing their desire to learn? And how is this play? When the result (be it a badge or a piece of candy, a “reward” my two young children are all too familiar with) becomes the primary goal, how is it any different from a grade? Plus, I have wondered previously, are badges simply preparing students for the low-wage economy, where real capital is replaced by symbolic social or psychic capital?
Gamification might be getting students through “learning” but is it instilling in them a set of skills that will allow them to continue learning once the game is over? Will a student be motivated to “learn” once there is no game structure, no reward, however symbolic, at the end? Making something into a game does not automatically make it a more enriching learning experience.
The question I think we face now as critical and digital pedagogues is this: how do we hack the new rhetoric of gamification in order to get back to the values of play?
[Photo by John-Morgan]