The internet began with the dream of a common language. The vision was a network of networks, bound together by a protocol that let a global community of computers speak to one another—an Esperanto, but for machines.
This dream expressed distinct and sometimes directly competing desires. It was built on wartime sciences of command and control, yet it also contained a communalizing impulse. On the early internet, everything was open source. Open standards prevailed over proprietary ones. Then, after the end of the Cold War, the US government handed the internet to the private sector. A paradox followed: as the internet became truly commonplace, widely popularized, it gradually lost its openness, becoming a set of walled gardens dominated by the so-called platforms. The mediation of billions of lives through infrastructures owned by a handful of companies has not made us more free, even when their services are.
Neither has it made us more alike. As the internet has become more massified, it has produced more differentiation: filter bubbles, epistemic crises, and ever finer demographic segmentation—not to mention amplified inequalities between those who own a piece of the platforms and those who scramble to put together a living on them; between those to whom the platforms give a God’s-eye view, and those whom they are used to oversee. Despite the expectations of early US cyberutopians—or, you might say, imperialists—the global spread of the internet has not carried American values everywhere. On the contrary, there are strong tendencies toward fragmentation, from the General Data Protection Regulation in the European Union and calls for data sovereignty in South Asia and Latin America, to the billion users behind the Great Firewall of China. Even as networked technologies have become increasingly universal, it is debatable whether they are “common,” in the deep sense of that word.
Commons is an unusual kind of noun: a plural that behaves as a singular, and one that takes the definite article—the commons. Historically, the term has referred to the various kinds of resources that members of a community might share. Within village settings, for example, the commons consisted of a central green where villagers grazed livestock, or a forest where a kind nobleman let them hunt, fish, and gather berries.
Elinor Ostrom, the political scientist who won the Nobel Prize in economics for her work refuting the popular thesis of a “tragedy of the commons”—that is, the idea that any freely shared resource would inevitably be abused and deteriorate—said that a commons consists, at a minimum, of:
Common goods — Those fruits of nature and society that everyone needs to survive and thrive, including our atmosphere, oceans and forests, biodiversity, all species of life, natural systems, and minerals; our food, water, energy, and art; culture, technology, healthcare, and spiritual resources; as well as news media and the trade and finance systems we use.
Commoners — Groups of people who share these resources.
Commoning — Inclusive, participatory, and transparent forms of decision-making and rules governing access to, and benefit from, these common resources.
Ostrom also specified that commons came with boundaries. A commoning process, to include some, had to exclude others. What is needed, who needs it, and how to claim it are hotly contested political questions in our moment—particularly in the midst of a global pandemic. Who will ensure that the answers to these questions are found fairly?
This issue examines the theme of “commons” from all three of Ostrom’s angles and more. Our authors investigate how large quantities of data have been collected and connected—not only by nation states, as the privacy advocates of the early internet feared, but by corporations, some of which sell their services back to the very government entities from which they were supposed to shield us. The fact that advertising, or attention capture, became the default business model of the internet is one reason for the situation in which we find ourselves. But, as this issue demonstrates, there are others. In the 2010s, advances in machine learning created powerful, and lucrative, incentives for companies to begin gathering as much data as they possibly could.
Alongside the companies that gather data, there are newly powerful companies that build the tools for organizing, processing, accessing, and visualizing it—companies that don’t take in the traces of our common life but set the terms on which it is sorted and seen. The scraping of publicly available photos, for instance, and their subsequent labeling by low-paid human workers, served to train computer vision algorithms that Palantir can now use to help police departments cast a digital dragnet across entire populations.
What might have once looked like a transgression of the public-private boundary starts to look more like its transformation. But data can also be put to different purposes. Across the country, anti-eviction activists are using digital tools to extract information once held exclusively by corporate landlords and police departments, and put it into the hands of the tenant organizers who need it.
Our authors and interviewees also investigate who is setting the terms of our world-system: the lingua franca that our machines use to speak to one another, and that we use to speak through them; the standards that govern infrastructures on which more and more of us depend. The past is a source of lessons, alternative visions, and practices that might help us bridge the gap between present conditions and a livable future. “Freedom quilting,” for instance, was a form of computation that was also a form of care work, among other things.
Pieces in this issue ask how to organize a fairer system of algorithmic distribution, and what a more public, less commodified internet might look like. They explore why diversity initiatives have failed, and might have been designed to—and why opposing racism might require a radical transformation of the business model, not just new inputs to it.
We are writing from strange times. The luckiest among us have spent months in digitally mediated isolation. For others, the closing year has been a time of hunger, illness, and intensifying hopelessness. All of our lives are increasingly managed by algorithms that target and personalize. Then again, we have all been compelled to think with new keenness about our points of proximity. How intimate, to worry about breathing in air that a stranger in the supermarket has breathed.
In Latin, the phrase locus communis literally means “common place.” Figuratively, it means a familiar topic of conversation. The “commonplace,” in this sense, is not pejorative, like a cliché. It is a set of shared assumptions, a ground on which a diverse set of actors can meet. We hope this issue contributes to the making of that ground, to the construction of a space for us to think together about how, in whatever comes next, connected machines might help us create new collectivities and possibilities.