The internet was supposed to save the world. What happened?
The time is out of joint. The president is unhinged. Misaligned, our civilization approaches its breaking point. Crises of all kinds—ecological, nuclear, social—threaten the final crack-up. And the internet, once seen as our savior, looks more and more like a destroyer, deranging the structures that keep our society intact.
Since the internet became mainstream in the 1990s, we’ve been told it would take us to utopia. The digital economy would transcend analog laws and limits, and grow forever on the fuel of pure thought. The digital polity would make us more engaged, and produce more transparent and responsive governments. As individuals, we could expect our digital devices and platforms to make us happier and healthier, more open and connected.
For decades, these promises seemed plausible. At least, most of the media thought so, as did most of our political class and the general public. In the past year, however, the consensus has shifted. Digital utopianism suddenly looks ridiculous. The old dot-com evangelists have begun to lose their flock. The mood has darkened. Nazis, bots, trolls, fake news, data mining—this is what we talk about when we talk about the internet now.
As the pendulum swings, it’s worth stopping to take a breath. Worshipping the internet was always absurd. Demonizing it is equally misguided. “Technology is neither good nor bad; nor is it neutral,” the historian Melvin Kranzberg once said. Therefore, as the sociologist Angèle Christin writes in this issue, “We shouldn’t merely invert the Silicon Valley mantra that technology provides the solution for every problem, to arrive at the argument that technology can’t solve any problem.”
Techno-utopianism isn’t the answer, in other words. Neither is techno-dystopianism. The internet once embodied our hopes for a harmonious future. Now it offers a convenient punching bag for our despair about the present. But technology doesn’t automatically generate justice or injustice. The outcomes it generates depend on who owns the machines, and how they’re engineered.
Utopia may never arrive. But technology can make the world more just—if we find the right ways to organize and operate it.
Language is a legacy system. And in the language we have inherited, justice is a technical concept.
A just world would be a world well made.
“Fair” means both just and beautiful. The word from which “justice” derives means “straight.” We still justify margins. If a shirt or skirt rides up, or a picture frame tilts down, we adjust them. To “make things right” means, literally, setting their edges at ninety-degree angles.
Jesus’s day job was carpentry—until he became a full-time joiner of men. Would he have been a programmer today?
This issue includes pieces by and about people who are trying to build new technologies, or use existing ones, to rectify our broken social systems. It also includes pieces about people who are being disciplined and punished by technologies—from robo-debt software to racist search engines. And it offers strategies for resistance, whether through little hacks or all-out mutiny.
Justice may have a technical component, but injustice has no purely technical solution. Making the world right isn’t merely a matter of making tweaks, or finding the one elegant algorithm that will refactor the spaghetti code of society. It might be comforting to imagine that we can fix our problems technocratically—especially if you have an engineering sensibility, or a lot to lose.
Any technologist wants to make things that work. But the key questions are: works for what? And, perhaps even more to the point, for whom?
Justice, like Love, is supposed to be blind. The statues in front of courthouses show a goddess holding out a set of scales, with a piece of cloth over her eyes. The point is that the law should apply to everyone equally. Justice can’t see who’s rich or powerful. Her blindness fosters a deeper kind of insight.
Many of the issues that our contributors explore in the following pages come down to visibility. One piece investigates how black faces are seen (and not) by police software used to lock them away; another, how indigenous communities are deploying drones to force governments to acknowledge their land claims.
Democracy depends on self-representation: our ability to oversee those in power and to make ourselves seen. Most people are invisible in our political system. But the forces that oppress them are becoming increasingly obvious. The way the internet organizes knowledge—not in silos but through hyperlinks and hashtags—helps us recognize how everything is connected. It reveals not a series of isolated wrongs but a pattern with deeper roots.
It is always tempting to look at injustice and call it natural. It is what it is. Boys will be boys. Nature is a comforting concept to those in power, because nature is what you get to take for free. Natural is what you call a situation you don’t want to change—either because you feel helpless to do so or because you are its beneficiary.
People in power love to tell us that there is no alternative. But there are, in fact, many alternatives. The obstacles to human flourishing aren’t inevitable. They’re not eternal facts of life—they’re produced by the specific ways we organize our society. And we can organize our society differently.
With the fires burning and flood tides rising and nuclear war one tweet away, more and more people seem to realize that we need to—and fast. But to reorganize our world the right way will require a new moral vision.
We have inherited a particular set of metrics that guide how we build and implement technologies: clicks, downloads, conversion—which all, in the end, roll up to profit. But what if we optimized for different outcomes: sustaining the earth, empowering all who live on it, enlarging the horizon of human possibility?
Close your eyes. What does Justice see?