A grey to white parallelogram on a white to black gradient.

Image by Xiaowei Wang.

Meditations in an Emergency

The Editors


As we close this issue, COVID-19 case numbers are surging across the European Union, and if they are not yet as high in North America, it seems to be mostly for lack of tests. Oil prices are plunging, the Dow Jones is plunging, and Ted Cruz is in voluntary self-quarantine. New York State prisoners are making hand sanitizer for $0.65 per hour. Passengers are disembarking from the Diamond Princess into the Port of Oakland.

For months, prominent figures in the tech industry have been warning that it will get worse before it gets better.

In February, Recode reported that the venture capital firm Andreessen Horowitz was already on high alert, canceling employee travel to China and banning handshakes in the office. In early March, Sequoia Capital wrote a memo warning partners that coronavirus would be the “black swan” of 2020; they urged startups to prepare for leaner times.

On Twitter, one “founder coach and angel investor” surmised that “people in Silicon Valley were taking #COVID19 more seriously than other parts of the US including NYC” because “SV understands the power of network effects and exponential growth.” “Congrats Silicon Valley you took a math class,” a tech journalist turned venture capitalist swiped back.

Everyone was on edge.

The exponential growth in tweets about exponential growth was easy to make fun of, combining, as it did, several stereotypes. The prepper engineer stocking Soylent. The billionaire in his New Zealand bunker. The thought leader inclined to consider himself an expert and therefore obliged to opine on everything, since everything is, or can be made into, a source of data. And yet. There was something to what the Founder Coach and Angel Investor said.

To tell a good horror story, it is not enough to imagine some freak occurrence. You have to find what is frightening about what is ordinary—or, more precisely, ordinarily desired. A large home with a pool, backyard, and picture windows. A movie star husband. A beach vacation. A hot shower after a long drive. The very thing you wanted must turn deadly. Otherwise, you could just walk away. 

The pacing of the unfolding pandemic has more in common with a conspiracy theory than with a slasher flick. Its mode is twitchy simultaneity rather than the drumbeat of a killer closing in. Now our victim is in Wuhan. Now in Bergamo. Now New York. The villain is invisible and everywhere. 

But as the coronavirus spread all over the world, the tech industry watched the properties that it usually thinks of as strengths turn into vulnerabilities. Disruption was rampant. The same transportation networks and supply chains that had turned second-tier cities into “furnaces” fueling global just-in-time production suddenly became vectors for infection. The same networks that could be used to track cases and disseminate information were promoting dangerous lies. What we are watching, at time of writing, from home, is hockey-stick growth, but for sickness and death.


Viral, we call content that spreads quickly by means of preexisting bodies and behaviors. A virus takes our tendencies to make new cells, or touch one another, or share a laugh, and turns them to its own sole purpose: self-propagation. What makes something catch on is a subject of much speculation. What is clear is that some dangers cannot be confronted without considering the ways that the entire system they imperil works. 

This issue will explore forms of security and insecurity that arise as digital technologies enter new realms of existence—and how stubbornly these two terms intertwine.

In one sense, the dynamics that it explores are not new.

The rise of capitalism in the fifteenth and sixteenth centuries provided some insurance against the whims of, say, the weather. Labor markets promised to free workers from physical coercion: rather than toiling on the particular piece of land where you happened to have been born, or where some lord put a sword to your head and told you, Toil!, you could choose to work where you wanted, for a wage.

And yet, in order to get most people to work for wages at all, those in power had to produce widespread insecurity. By turning life’s necessities into commodities that had to be bought and sold on a market, they made it impossible for most people to survive any other way. Digital technologies have only intensified this dynamic. To access information—a matter of survival in a pandemic—we rely on the platforms. If you want to get information about what the pandemic means for your kid’s school, you will likely have to join Facebook. 

The current crisis has brought us back to these founding facts of capitalism as a world system. The market mediates our every move, and binds us, through countless threads of interdependence, to everyone else. Nothing like shortages of toilet paper to highlight how few of us would be capable of single-handedly reproducing the conditions of our lives.


At the same time, technologies designed to make some people and property more secure create new vulnerabilities. Machine learning algorithms used to plant, water, and harvest miles of crops at the optimal times can be hacked; a software bug or glitch can now cause a famine. Wireless technologies allow the lender that has loaned you the money to buy the car you need to drive to work to remotely disable that car if you should fall behind on monthly payments. 

Securing one man’s loan can endanger another man’s livelihood or life—or both, should the remote disabling, for instance, happen while you drive to work on the freeway. And so, when we talk about security, we always have to ask: Secure from what? Secure for whom?


You can use a baseball bat to play a game or break a window; you can set a table, or kill your hostess, in the parlor, with a candlestick. 


Any tool can be abused, and if we have learned anything in the years since Gamergate and the 2016 presidential election, it is that some systems are most dangerous when they are being used as they were designed to be. The same features that make it possible for social networks to produce “cognitive surplus” and new social movements also produce misinformation and harassment.

The authors in this issue explore how the very same products and design features that are supposed to make some secure make others less so. Code designed to let you prove that you are not a robot will not make you safer if you are vision-impaired and need your phone to play the code aloud every time you log in. Privately owned surveillance cameras, whose footage is constantly scanned by proprietary facial recognition software, are unlikely to make all the residents of a city safer. As one activist interviewed in these pages puts it: Feeling watched is not the same as feeling seen.


It is taking too long to write this. We can hardly keep up with our push notifications. 

As we close this issue, deaths in Italy have surpassed deaths in China; the number of confirmed cases in the United States has passed 30,000, still with a dearth of tests; California is sheltering in place; hundreds of thousands of Americans, at least, are losing their jobs and applying en masse to a safety net that is not ready for the load; and at least two senators have been found to have sold millions of dollars in stock after attending a closed-door coronavirus briefing, then lied to the public about the severity of the coming crisis. It is probably too much to hope that they do not sleep soundly. Their colleague Rand Paul became the first senator to test positive for coronavirus; he is not likely to be the last.

Meanwhile, powerful tech firms are finding ways to play the situation to their advantage. The alerts continue: Amazon has announced that it will hire 100,000 more workers to meet skyrocketing demand for “contactless delivery,” even as competing businesses stand empty or close down. Palantir and Clearview AI are bidding to sell the state software to surveil infected people and their networks; Facebook and Google are offering to use the location-tracking functions in their apps to do the same.

In China, the pandemic has intensified, and legitimated, expansive government control. The tools that are making it possible for millions of Americans to learn and work remotely are also creating new opportunities for public and private entities alike to watch them. School-issued computers come with spyware installed; Zoom video conferencing software allows the boss to track “attendee attention” or drop in at any time.


What would it take to turn this emergency inside out?

An emergency produces opportunities for solidarity as well as selfishness. A pandemic does this in particular. To the idiots buying five bottles of soap, one viral tweet admonished: it doesn’t matter how many times you wash your hands if your neighbors cannot wash theirs. Our present situation highlights, with particular starkness, that nobody can be secure alone.

Ancient philosophers argued that care or worry, cura, was the defining feature of human life. The state of being sine cura—“without” or free from it—was, at best, fleeting. On the other hand, care or worry is also the origin of curiosity.

This issue proposes that the concept of security can still hold the same ambivalence and cautious hopefulness now as it did then. We must care for our neighbors not in spite of but because of our common vulnerabilities. (In Toledo, Ohio, dozens of volunteers have begun sewing fabric masks for local doctors and nurses already running out of them.) As the social contagion of panic spreads, we hope this idea catches on, too: that the very same fears pushing millions of people into isolation can also draw us outward, binding us more firmly to the world and one another.

This piece appears in Logic's issue 10, "Security". To order the issue, head on over to our store. To receive future issues, subscribe.