Life must be lived as play, Plato wrote, but don’t take his word for it. Just go to the park and look for an animal. Better yet: at least two animals.
A dog runs for a ball. Two dogs chase and tumble with each other. The way they bite looks different from a real bite. The difference reflects a power of abstraction. The difference is the style that communicates: This “bite” is a game.
As long as it lasts, playfulness extends an invitation to another reality. We are here, but there is also somewhere else we can go. Come with me? You can tell as soon as play ends. When two dogs stop “biting” and start biting, you separate them, fast.
They might not know why they switched. Then again, a philosopher might not know whether he is playing with his cat, or whether his cat is playing with a philosopher. The instinct to play may be a deeper instinct for abstraction than language—an older way to make new realities. Making new realities may be the first way that humans, and other animals, sort ourselves into teams.
In 1938, a few years before the Nazis locked him up, the Dutch scholar Johan Huizinga wrote a book about this. He called it Homo Ludens, “the playing man,” a joke that is also an argument. Huizinga argued that all of culture comes from the human instinct to make up rules and follow them, together, and against one another. Enter: athletics, music, courtship, war.
We tend to think of play as frivolous. In fact, nothing could be more serious. It is no accident that so many traditions depict the gods as playing to make the universe. No accident, the dead seriousness with which small children take on imaginary tasks. We hate the spoilsport more than the cheat, Huizinga said. The cheat at least honors the rules that sort us into our lives, while the spoilsport destroys the order that holds our team together.
In other words: In the beginning was the game.
Games aren’t just at the origins of our social order. They also lie at the origins of our digital order.
Modern computing is largely a creation of the US military. And one of the things that the US military has always liked to do with computers is play games. During the Cold War, the Pentagon’s favorite game was simulating war with the Soviet Union—something that computers were good at.
This is memorably illustrated in the 1983 cult classic WarGames. A teenager played by Matthew Broderick hacks into a NORAD supercomputer and sets off a simulation that almost triggers World War III. The problem is that the computer can’t tell the difference between playing a thermonuclear war and actually fighting one, between a “bite” and a bite.
But nuclear annihilation wasn’t the only game that computers could play. In 1962, a programmer at MIT came up with Spacewar!, a multiplayer space combat game that became a huge hit in computing centers around the country. Spacewar! turned the unwieldy, intimidating machines of mid-century computing into instruments of play and pleasure. It made computers fun.
It also helped inspire the personal computing revolution. In 1972, the famed cultural entrepreneur Stewart Brand watched a group of people play Spacewar! at the Stanford Artificial Intelligence Laboratory. What he saw was a revelation—“good news,” he called it, “maybe the best since psychedelics.” In Spacewar!, he saw the possibility for a new kind of computing—one that “bonded human and machine through a responsive broadband interface of live graphics display” and “served primarily as a communication device between humans.”
Spacewar! offered a vision of the digital age as interactive, social, personal. In the coming years and decades, engineers and entrepreneurs would implement this vision, and create the building blocks of our digital age.
Kids these days still fall in love with computers by playing with them.
The writers in this issue describe finding their way into computers through games. Some gamers are just in it for fun; others end up managing gamer communities, or even go pro. Some discover that the game isn’t what they expected—or that play is something different from what the game’s designers intended.
Sometimes, though, gaming gets serious. Games can embody a set of assumptions, even an ideology. Playing a game about cities, for example, you can absorb assumptions about how cities are supposed to be run.
A game-theoretic model of an auction can become the basis for an entire platform economy. New rules can restructure global financial markets. Does it work? Sometimes it seems that the game where humans are self-transparent rational actors, working with complete information, is the biggest make-believe of all.
Managers have promoted “teamwork” at work for decades. More recently, gamification has become a buzzword. In every job that must be done there is an element of fun, the song goes. You find the fun and—snap!—the job’s a game. But in reality, gamification may serve less to speed chores than to destroy solidarity, coaching people to compete constantly against their natural allies—and themselves.
In the attention economy, successful platforms find a way to turn more and more of our fun into an opportunity to extract profit. Ironically, the people who make computer games have themselves become highly exploited.
Platforms that claim to be public squares or even playing fields can in fact encourage actors to game them in bad faith. Trolls attacking the serious bite, not “bite,” then back off saying it was “just a joke.” Transgression repeated, in feedback loops, can transform into boredom. As data-driven porn converges on predictable categories, we search for our safeword: What do you say when your kink isn’t naughty anymore?
If the end game of gamification is playing yourself, what would it take to reclaim play? Writers in this issue explore how homo ludens has put, and might still put, our play instinct to better ends. How to build a better “bite”? As we gain new computational powers, we can use them to build and explore virtual alternatives to vanishing public space. We can tinker and jigger, attempting to build networks on entirely new principles.
A defining feature of play—a form of creativity that depends on restrictions, or rules—may be that it teaches us to adapt what we have to make what we want. This it shares with code, which encodes but can also be decoded, recoded. It is an education for an era of democratic deficits and dwindling resources.
Ours are times of anger and grief—but no time for despair. After the game is before the game, as legendary soccer player and coach Sepp Herberger once said. The task, now, may be to approach the limitations of the present with the intense seriousness of a child whom nothing makes happier than to start the same game—Again!