Issue 11 / Care

August 31, 2020
[Image: white lace on a black background. Image by Celine Nguyen]

Where Deer Stroll Peacefully Past Computers

The Editors

This is the editor's note from our upcoming issue, Care.

Subscribe now to receive the issue as part of a subscription or pre-order the individual issue. Starting with this issue, we're excited to offer print subscriptions outside the US.


When the pandemic first struck the United States, it was not uncommon to hear people compare its virality to the other, online, kind. The numbers climbed impossibly—then inevitably—high. In a nightmare inversion of network effects, the more people who got it, the more deadly it became. In New York City ICUs, patients were dying in the hallways.

Then, as the weeks passed, a different technological metaphor started to seem more apt: the X-ray. The novel coronavirus itself remained, in many ways, inscrutable. But it revealed the brokenness of our systems for providing care.

Nursing homes and prisons became deadly zones of infection. Food production plants followed. Doctors and nurses were going to work in garbage bags, while governors, who had been told to compete with one another, bid up the price of PPE that never materialized. 

Cities could not dispose of their dead fast enough. They closed schools and daycares indefinitely, without explaining what exactly the working parents who were now also full-time caregivers were supposed to do—and the working parents were the lucky ones. Every Thursday, the Labor Department reported that millions of Americans had applied for unemployment the previous week. That did not count all of the people who could not apply for unemployment because the state websites were crashing and the hold times on their hotlines lasted all day.

Did it even make sense to call what you were feeling “anxiety” or “depression,” when there were so many real reasons to worry? Amid so much loss and suffering, what feeling person would not grieve? Nonetheless, social media revealed a steady stream of people who could not be persuaded to care enough for their fellow citizens, or themselves, to wear masks on the bus or to the grocery store. Armed protesters showed up at government buildings to demand a return to business as usual. 


The pandemic put technologists, and technology firms, in an ambiguous place. On the one hand, their products have become more essential than ever. Internet traffic is up; cloud services are in high demand. Countless Americans have turned to Amazon and Instacart to shop (or work), and Zoom to take classes or attend religious services (or work). Tech stocks are soaring, even as the real economy falls, and the fights that at least some prominent figures in the industry have picked with journalists and lawmakers reflect a new sense of invulnerability. 

At the same time, COVID-19 has exposed points of fragility in the system. If the biggest tech companies are essentially vast engines for making predictions, the pandemic was unforeseen: you could see the breakdown of the machine learning machine. Moreover, this was a crisis of care, and care is precisely what software cannot provide: it is designed to coordinate, and sometimes eliminate, human work. The tech firms could offer gig and warehouse jobs, but they could hardly make up for plummeting employment. They could apply their expertise in digital surveillance to contact tracing, but the populations most at risk—Black and Latinx communities—had the most reason to fear being tracked.  

Then the Minneapolis police murdered George Floyd, just two months after the Louisville police murdered Breonna Taylor. Cities across the country erupted. A government that couldn’t be bothered to do the bare minimum to contain the pandemic quickly moved to mobilize battalions of militarized cops. There was no money for PPE, but no shortage of resources available for repression.

Again, the role of technology was ambiguous. On the one hand, smartphones produced the videos of police violence that sparked and spread the mobilizations. Technologists soon found other strategic uses for digital tools—for instance, mapping police movements by listening to scanners. On the other hand, tech posed a constant danger. Drones developed for wars in the Middle East hovered above protests. Your own social media could be used against you—and so could other people’s, as a woman in Philadelphia learned the hard way, arrested after a stranger posted a photo of her with her face covered on Instagram. The networks used to organize and amplify collective action were all too easily weaponized by the state. If you go to a protest, leave your phone at home.


We made this issue in the midst of these intertwined and rapidly unfolding crises. The pieces were written amid the grief and rage of the past months, but also amid moments of possibility and hope, which have so often taken the form of people taking care of one another, from street medics washing out pepper-sprayed eyes to militant nurses and teachers organizing for better conditions for patients and students. There was a reason the so-called “Momtifa” captured the public eye.

Some of the pieces in these pages deal directly with the most current and urgent aspects of the crises. What they find is that technology is often part of the solution—but only a part. Smartphones can help, but only in tandem with functioning healthcare and state institutions. The clean lines of the tech-laden megahospital suggest a future of frictionless care. But in their shadow, essential workers are saving lives in tents. 

Technology alone can’t save us. Often, in fact, it can hurt us. The harms it inflicts aren’t new. The software might have been made recently, but the social relations that software embodies and enacts were made a long time ago. Contemporary digital surveillance emerged from older practices, and obeys old carceral logics. As Sarah T. Hamid explains in these pages, today’s facial-recognition and predictive-policing algorithms belong to a centuries-long lineage of tools for “the control, coercion, capture, and exile of entire categories of people.”

Taking care of one another will require dismantling these tools, whether in the form of a laser pointer that scrambles a facial-recognition camera or a legislative ban that outlaws facial-recognition technology. It will also mean constructing alternatives. Moments of social mobilization enliven and expand our political imagination. Among the things that sorely need reimagining is our technology. 


The poet Richard Brautigan once imagined “a cybernetic forest,”

    filled with pines and electronics
    where deer stroll peacefully
    past computers
    as if they were flowers
    with spinning blossoms.

This issue explores what a more habitable digital world might look like. There are recovered histories and preliminary experiments, sketches of past and possible schemes for organizing networks differently, and for redressing networked harms. 

The issue also asks, Who cares? In one sense, this means: Whose lives are touched by particular technologies, and who participates in their development and design and deployment? But it also means: Who performs the work of looking after and tending to people, and to the machines that are integral to the systems people need? The unglamorous work of maintenance and custodianship, of remembering abandoned knowledge—and programming languages—is what makes it possible for the millions of Americans who have lost their jobs in recent months to receive an unemployment check.

Computers cannot care for us as completely as venture capitalists might like. But we hold out the hope that, with some deep social and technical reconstruction, they can be put in the service of creating a more caring world.

This piece appears in Logic's issue 11, "Care". To order the issue, head on over to our store. To receive future issues, subscribe.