Issue 15 / Beacons

December 25, 2021

Art by Neta Bomani.

A Body of Work That Cannot Be Ignored

J. Khadijah Abdurahman

with contributions from Ben Tarnoff and Alex Blasdel

1/

In June 1945, a committee chaired by the physicist James Franck raised the alarm about the Manhattan Project’s development of nuclear weapons. The document they produced, known as the Franck Report, urged President Truman not to use the atomic bomb against Japan. Instead, Truman should demonstrate the bomb’s destructive power by dropping it on a desert or a barren island—or he should try to keep the bomb’s existence secret for as long as possible. Otherwise, the scientists warned, a global nuclear arms race would ensue, with catastrophic consequences for the planet.

The authors of the Franck Report had worked on the Manhattan Project. But rather than siphon the scientific knowledge they had accrued in developing nuclear weapons out of the lab and into the commons in order to build a mass movement, they waited until the final hour to pen a letter, addressed to a government that would never heed their call. The scientists understood the stakes of nuclear weapons better than anyone. But in making a moral appeal to the American empire, they demonstrated a profound misunderstanding of the social and political context the technology was developed in service of. Two months after they wrote the report, atomic bombs were dropped on Hiroshima and Nagasaki.

I was reminded of those physicists in December 2020, in the wake of Google’s high-profile termination of AI ethics scholar Timnit Gebru. Her firing was the final step in management’s systematic silencing of her scholarship, which came to a head over a paper she coauthored called “On the Dangers of Stochastic Parrots: Can Language Models Be Too Big?” The paper offers a damning critique of the energy demands and ecological costs of the sorts of large language models that are core to Google’s business, as well as how those models reproduce white supremacy—and codify dominant languages as default while expediting the erasure of marginalized languages. Thousands of Google workers, as well as supporters throughout academia, industry, and civil society, rapidly mobilized in defense of Gebru, writing an open letter to Google executives that demanded both her reinstatement and an apology for the egregious treatment she received. 

Like the Franck Report before it, however, this open letter represented a grave misunderstanding of the politics of AI and was in no way commensurate with the threat we face. The technologies being developed at companies like Google present major stakes for all of humanity, just as the invention of nuclear weapons did in the previous century. They are strengthening concentrations of power, deepening existing hierarchies, and accelerating the ecological crisis. More specifically, big tech corporations are extracting labor, ideas, and branding from Black people, and then disposing of them with impunity—whether it's scholars like Gebru or Amazon workers in Bessemer, Alabama, organizing against plantation conditions.

Racial capitalism’s roadmap for innovation is predicated on profound extraction. AI is central to this process. The next flashpoint over AI is inevitable—but our failure to respond adequately is not. Will we continue to write letters appealing to the conscience of corporations or the state? Or will we build a mass movement? As Audre Lorde said, we cannot dismantle the master’s house with the master’s tools. We cannot stop Google from being evil by uncritically relying on the G Suite tools it developed. We cannot uncritically champion the most popular among us, as if social capital will resolve the colonial entanglements reproduced in much of what passes for research in the field of technology studies. An atlas ain’t no Green Book, and we cannot afford to pretend otherwise. What we can do is to build a better analysis of the context of racial capitalism in which extractive technologies are developed. We can share knowledge about the ways in which such technologies can be refused or have their harms mitigated. We can forge solidarities among workers, tenants, and technologists to help them organize for different futures. We can light alternate beacons. 

2/

Dismantling racial capitalism and displacing the carceral systems on which it relies requires an understanding of how technology produces “new modes of state surveillance and control,” Dorothy Roberts argues. Part of the challenge is that these new geographies of policing, regulation, and management are largely invisible. We experience the immediacy of our Amazon package being delivered without seeing the exploitative labor conditions decreasing the distance between order and arrival. This is not a function of insufficient effort—it’s an indication of how successful big tech corporations have been in concealing the sources of their power. In his essay in this issue, Julian Posada provides a detailed account of Venezuelans performing the tedious, low-paid labor of data labeling on which AI depends—labor that is hidden beneath Silicon Valley’s minimalist user interfaces and promises of automation. The circuits of racialized capital link us ever more closely together even as the pandemic has deepened our sense of alienation.

Understanding how tech has reorganized labor, and developing a strategy to break free, is not easy. It cannot be done with the narrow technical training that produces computer science PhDs—the recent appending of ethics courses notwithstanding. It requires an interdisciplinary analysis in partnership with impacted people who are at the forefront of digital experimentation. There is no way around doing this work.

In theory, Black study is the intellectual method and tradition that is best positioned to lead such an analysis. As SA Smythe clarifies in these pages, Black study does not mean Black Studies™—“a hegemonic and ethnonationalist interdisciplinary framework that was heavily funded by the government” during its founding in the 1960s. Instead, drawing on the work of Robin D. G. Kelley, Smythe defines Black study as “the deeply invested commitment to Black people, Black life, Black possibility, and freedom dreaming.”

The Bulletin of the Atomic Scientists was founded by former Manhattan Project scientists in 1945 after the atomic bombs were dropped on Hiroshima and Nagasaki. Its iconic doomsday clock is currently set at one hundred seconds to midnight. This reflects the risk posed to the world from nuclear weapons, climate change, and, notably, “disruptive technologies.” Black study would have us trouble this notion of catastrophe as a singular event or a state of exception. As Smythe explains, exclamations of “How is this still happening and it’s 2021?” show that we’ve been bamboozled and hoodwinked into thinking time has marched linearly forward towards modernity. Smythe insists that the reason we find ourselves in what scholar Bedour Alagraa calls “the changing same” is that we are in fact still in 1492, circling the drain of the ongoing catastrophe initiated by white contact with the “new world.” But this is not cause for despair—it’s an opportunity to ask better questions, like the one posed by Katherine McKittrick in “Mathematics Black Life”: “What if we... begin to count it all out differently?”

This desperately needed intervention is constrained by the fragmented character of knowledge production. Within computer science and information studies, race is treated primarily as a social consequence of technology rather than constitutive of technology. In the six years since Simone Browne published Dark Matters, a seminal work tracing the links from the proto-biometrics of the Middle Passage to the present-day use of facial-recognition technology, important scholarship has emerged—including from Gebru, Ruha Benjamin, and Safiya Noble—but not with the urgency or at the scale at which new technologies are violently renegotiating the social contract. And although Black folks have had no choice but to survive the tools and techniques of social control, technology with a capital T has not been a central object of Black study. Similarly, abolitionist organizers have rightly disavowed technical solutions to the prison-industrial complex as reformist reforms, but have not often recognized how central technology is to intensifying the carceral state.

3/

What does it mean to Get Out! in the twenty-first century? How do we build fugitive technologies?

This special issue of Logic does not seek to provide a totalizing narrative or singular solution. Rather, our goal is to “call in” thinkers and artists from different disciplines—for example, Black studies scholars who are engaged with notions of catastrophe but whose insights have not yet been taken up by people investigating how technology produces catastrophe, or integrated into the resistance strategies of communities being harmed by new forms of digital experimentation. (The approach of this issue is many times over indebted to Bedour Alagraa’s thinking on “the interminable catastrophe.”) Similarly, how can computer scientists and engineers more effectively communicate to the public not just about the harmful effects of technology but about how these systems actually work and what interventions on the level of software or hardware offer a more liberatory future?

We take the stakes we’re facing seriously while leaving room for our futures to not be overdetermined by white supremacy. As André Brock, Jr. discusses in these pages, our approach to technology does not need to be one of abjectness. “I’m not saying, ‘Oh, I’m on the other side of the digital divide and I’m trying to cross that bridge,’” says Brock. “No, I peeped that bridge and it doesn’t take me anywhere that’s really necessary for me to go.” While we may not offer one path forward, we hope to get in the way of techno-solutionism and corporate-funded initiatives that absorb the most radical elements of the discourse without actually supporting people to go do the most radical thing. Our hope for this issue is that it will be what Seeta Peña Gangadharan, coorganizer of Our Data Bodies, calls “a body of work that cannot be ignored.”

I am grateful to the Logic team for letting me hijack their operation, to give us some space where we be imagining, even as the work punctures the myth embedded in the magazine’s name. We offer no singular way of knowing, no hope for messianic deliverance. We be needing logics. This issue is an outlet in which we can explore these logics and meaningfully argue with each other. In a recent interview, Keeanga-Yamahtta Taylor lamented the fact that “debates that exist in the left have no space to be deliberated upon. People get on social media to either ignore or insult each other’s political ideas and opinions,” she continued, “but I’m saying if we want to be impactful in building a mass movement, to shape and direct politics in this country, then something radically different needs to happen.”

In this issue you’ll find Marxists, Wynterians, Black speculative fiction, poetry written inside a cage, a graphic story about internet shutdowns in Kashmir, abolitionists, and the unaffiliated. In this issue you’ll find many beacons because, like Neta Bomani’s tween zine insists, we need to move beyond The Way. As guest editor, I chose to curate love letters over a manifesto—because I know plans and leaders get captured or beheaded, but we can nourish an otherwise set of relations to each other while we strategize on getting free. 

Postscript by Ben Tarnoff

One December morning in 2020, I DM’d Khadijah on Twitter. We’d never spoken before, but I’d just read a recent essay of hers, “On the Moral Collapse of AI Ethics,” and loved it, and wanted her to contribute to Logic. She said she’d be in touch with some further thoughts.

A couple weeks later, she followed up by email. What she really wanted to do wasn’t write a piece, she said, but edit a whole issue:

I’ve been thinking about concrete next steps to move beyond calling out the failure of the status quo to providing an alternate beacon for people who are looking for space to build and think critically, take risks and specifically room to think about currently under resourced domains ie tech/data policy in the global south, grassroots response beyond the right to refuse surveillance, bringing in agroecology, the core of Black studies (ie not just citations for bias but the epistemic and historical challenges being raised at the forefront of the field) etc.

The aspiration for the issue would be to create “alternate beacons”—that is, to present new ways of thinking about and living with technology, drawn in particular from Black thinkers and practitioners, with the hope of moving beyond critique (as much as we love critique) and toward imagining new worlds. It felt perfect for us. I brought the idea back to the Logic group, who shared my enthusiasm. Soon after, I connected Khadijah to our managing editor Alex Blasdel, and the two of them embarked on the long and labor-intensive task of making this issue.

Why did Logic decide to undertake this collaboration? I don’t presume to speak for the magazine as a whole—Logic is very much a collective venture, of which I am only one part—but I think it’s because Khadijah was giving us a way to evolve, to find new pathways for our project, now in its fifth year.

A lot has changed since we launched Logic in early 2017. One of our main motivations was our contempt for popular writing about technology. In the manifesto that led our first issue, we announced that “most tech writing is shallow and pointless.” In the intervening years, however, this statement has become less defensible. As the “techlash” has bloomed, the discourse has become immeasurably more sophisticated. There is now very good reporting about the industry and, with some exceptions, tech criticism as a whole has become less idiotic, more tethered to fact.

But not everything has changed. Despite the greater sense of clarity and concern, a lawyerly liberalism continues to dominate, and domesticate, the political conversation about tech. Some years back I attended a conference at which a fairly prominent tech policy person said that the best way to solve the various problems underlined by the techlash would be to put all of the “smartest people” from industry, government, and academia into one room and have them figure it out. All that was needed was the right constellation of experts, in other words.

So Logic still has work to do. The techlash has altered the terrain, but wherever there is power there is a court, and every court has its courtiers. The new common sense is much like the old; techno-utopianism may have fallen out of fashion, but technocracy of one kind or another is harder to eradicate. The techlash has served as a mass credentialing event for a new class of experts, as “AI ethics,” “responsible innovation,” and similar pursuits attract significant funding and visibility. Many of these experts do interesting work, and everyone needs to eat, but the overall arrangement in which they participate can’t help but reiterate the logic of technocracy.

What’s missing from this arrangement is the people whose lives are being reordered by technology—or, more precisely, by a particular set of practices as structured and mediated by technology. What’s missing is a view of technology from below, as it is encountered and experienced by living and breathing human beings. There are both epistemological and political stakes here. The feminist philosopher Nancy Hartsock once argued that systems of domination can only be fully understood from the standpoint of those they dominate, an insight she drew from Marx (only proletarians can obtain a complete view of class society) and applied to gender (only women can obtain a complete view of patriarchy). We can extend her argument further, and say that today’s technological regimes are most accurately perceived from the standpoint of those they oppress, exploit, and exclude. And this perception is to be acquired not simply for its own sake, but rather in the service of a broader political project of liberation, as it was for Hartsock and Marx. To see technology from below is also to develop the knowledge needed to govern it from below. Every cook can govern, C. L. R. James reminds us, and the internet would undoubtedly be a better place if it were governed by more cooks (and fewer lawyers).

This is the spirit that animates the issue that Khadijah has curated. In these pages we see technology through the eyes of sex workers and click workers, of the incarcerated and the disabled. And while there is much injustice, there is also hope, creativity, and joy. There is the great imaginative power of the Black freedom struggle and the Black radical tradition. We are not led to any single set of conclusions and we never arrive at a final orthodoxy. Some circles on the left have long believed that orthodoxy is what makes revolutions. But revolutions are notoriously irregular affairs; their combustion derives from the diversity of their inputs, which interact in unpredictable ways. “The rise of a group of people is not a simultaneous shift of the whole mass,” W. E. B. Du Bois observed, “it is a continuous differentiation of individuals with inner strife and differences of opinion, so that individuals, groups and classes begin to appear seeking higher levels, groping for better ways, uniting with other likeminded bodies and movements.” This issue attempts to seek some of those higher levels and grope for some of those better ways, to do the right kinds of searching and struggle. Logic will do its best to keep lighting beacons in the years ahead.

Khadijah Abdurahman is the editor in chief of Logic(s).
