Portrait of Safiya Noble. Photo by Stella Kalinina.

Resetting How We Think of Policy: A Conversation with Safiya Noble

Dr. Safiya Noble is an internet studies scholar, the author of Algorithms of Oppression, and the Faculty Director of the Center on Race & Digital Justice at the University of California, Los Angeles. We spoke with Safiya about how to think about policy beyond reform, the state of tech criticism, and surfing as an antidote to predictive static models.

Khadijah: This morning, I was thinking about how the right has catalyzed this onslaught of attacks on the notion of being woke at a time when we, and by we I mean most Black people, are being confined into a state of hypervigilance. How do you think about this tension between being hypervigilant, in terms of being aware of attacks, aware of the people around us and our relationships, but also needing sleep and needing space to exert agency around sleep?

Safiya: So many of the Black women I know who work in and around academia and/or the tech industry are unwell. We need to be hypervigilant about the way the tech sector is completely remaking society. We are also exhausted. This tension is always on my mind because it’s part of my life: we can’t afford to rest because the consequences will be too steep.

Having said that, rest and living lives that are not overdetermined by everything external are really important. All of the things that tech people call externalities, meaning the factors you didn’t account for that are influencing your product, the things secondary or tertiary to the main point.

Khadijah: I feel like we are the externalities.

Safiya: [Laughs] We are. “Whoops, where did they come from? Oh boy, now we’ve got to deal with them.”

Khadijah: Consistently, we’ve been fucking their shit up, everything. We won’t let them be great.

Safiya: Because there also are externalities that are interfering with our ability to be great. The problem here is that they have a lot of power and a lot of money to make their visions come true.

Khadijah: I’m thinking about how JoAnne Epps, the interim president of Temple University, died during a service and they kept going, and how much outrage this generated. Then also a level of frustration that myself and other people have with this outrage, when we know that academia is chewing us up and spitting us out. There’s a degree to which the shock feels like we can’t believe this is happening to people who are credentialed, when it’s what happens to the majority of Black people who are not credentialed: people who are living in the slums, people who are living in the surveillance future of austerity measures and constant social and political control.

Concretely, what are counterproposals to academia, to the world that the tech sector’s envisioning? I remember you talked a bit about Black towns that are being built in places that have been abandoned.

Safiya: The level of burden that people feel around the world from increasing inequality is unfairly distributed. The problems that we talk about in the United States, especially among academics at elite universities, are real, even given the pressure in those situations: the pressures upon our lives and, of course, the death of Professor Epps at Temple. I’m really crushed to work in an industry that would let a Black woman drop dead and then continue on with the meeting. That’s a reminder for all of us in our work that everyone will step over our bodies, whether it’s in the projects or in the streets, and keep going; in some cases we keep trying to survive, but in other cases we witness that and make pivots to thrive.

That had me thinking about Black towns last time I spoke with you. I was watching this Black developer who has been developing these almost off-the-grid homesteading neighborhoods and towns, really trying to target and attract Black people to very beautifully built spaces and places where Black kids could just ride their bikes freely and no one is worried about them, because they can just walk out the door and live and play.

That resonates so much with my own spirit. I know you and I both have kids, and we’re always talking about the freedom we want for our children, for all children, to be able to live and laugh and play. Instead, we’re up against the makers of ShotSpotter buying PredPol and deploying more and more weapons against our communities.

To imagine something more than being ensnared in AI and surveillance systems and to dream about different ways of living and knowing and being is absolutely crucial right now, especially as we feel the totalizing, antidemocratic, racist effects of the surveillance state.

Khadijah: That resonates a lot. I’ve been going to the countryside in India and Ethiopia and Zanzibar and other parts of Eastern Africa, and I see so many places where, contrary to the Feed The Children infomercials I watched late at night on BET as a kid, people are actually not starving. They can pick fruit off the trees. They have sustainable ecosystems. People live outside of colonial binaries.

We had a piece in the last issue about earthen houses and how so many of these exist in the countryside. People are embarrassed by them, but they stay cool during the daytime and stay warm at night, unlike the concrete structures that people are basically selling themselves into slavery in the Middle East to buy for their families, for their mothers. Ironically, the EU has now given Germany several million dollars to build these earthen houses to fight the climate crisis.

We see our technologies being reappropriated by white supremacist institutions all the time. But for me, it’s not just a lesson in how we are harmed or what’s stolen from us; it’s also that we have a set of institutions that are valuable, institutions writ broadly. I think about double dutch as an institution, whisper networks as an institution.

Safiya: We do. I’m thinking of the community gardens in Inglewood and Compton and South Central LA. And I just joined the board of this incredible organization in LA called Bio Equity Ed. It’s run by Dr. Veda Ramsay, a Black woman, and one of the things I learned from her is how powerful it is to be in these places we have been systematically taken out of. How important grass and trees and water and the movement of the wind and the sun are to our healing, our health, our wellbeing.

Blackness is painted out of these modalities and ways of living and ways of knowing. I’ve been thinking about this so much because my son, who’s twelve, just got into surfing. I see him in the water and how free he is, so motivated to deal with the structures of white supremacy in his education and in the city because he has access to being in the water and feeling the freedoms that come with that. The dynamic nature of the ocean forces one never to be static or make a firm set of assumptions, because you don’t know how that wave is going to come at you and you have to actually be fluid. You have to be in sync with the ocean in order to survive it.

There are so many lessons we get to experience when we go and watch him ride a wave. The most simple dimensions of nature that are here for all of us on this planet are the things that we are constantly alienated from. For me, artificial intelligence, the subject of my work in this field of tech criticism, is the antithesis of what it feels like to be in the free, dynamic experience of our humanity, because these technologies would rather force us into predictive static models from the past that allow for no newness, no exploration, no complexity.

Khadijah: I feel like surfing is the antithesis of the world that is imagined by a spreadsheet. I think about André Brock’s work and how he talked about Black people having an excess of life, and then again, how the tech bros are so mad because an excess of life is hard to capture on a spreadsheet. But those are also Black geographies, and that’s what gives me hope.

This issue is focused on tech policy. People were surprised to hear the word policy come from my mouth, because policy is often associated with reformism, like giving cops better training so they stop killing people, instead of with abolitionist methodologies. But for me, policy is more like: what is a consensus that we can come to that can help us build life-affirming space? When I say space, I don’t mean it in a New Age good vibes sense, but as a specific place. How do we build those kinds of consensus, whether from the state or within movement spaces or as a parent? What do you think, at the meta level, about tech policy?

Safiya: I think of policy as the conditions that overdetermine what is possible, and I always think about this in terms of my own parents, who came of age in a pre-Civil Rights United States. Those conditions of segregation overdetermined what was possible. They created the climate. Both of them were really coming up against the policy of the time, the conditions of the time, trying to break through and create a new set of conditions for themselves and for their children.

There’s a policy in my house around how we do what we do, right? Most people think about policy as just being federal, state, or municipal policy, or the rules at school, or the rules of engagement. But I’ve always seen rules and laws as guidelines, policies as guidelines, as guardrails. Those guardrails might be deeply insufficient. Those guardrails might protect some and leave others vulnerable.

Black people are profound experts on policy because we are profound experts on the conditions that limit our movement, and we are always pushing up against those policies. I feel so deeply appreciative of the way that Logic(s) and this issue are upending the way that people traditionally think of policy, which is as what Congress or assembly members do. But really, we set the terms of engagement, and the people in positions of power are often forced to contend with the demands that we make; the policy comes from those impulses to dream, to be liberated, to feel free, to explore, to be more. We should not cede that word to the narrowest imaginary.

Khadijah: To me, there’s certain things that should just be abolished: policing, the carceral system, prison, family policing, all of these ways people are encaged and confined and not allowed to be their full free selves. But I don’t want to abolish the EPA [the Environmental Protection Agency]. I don’t want raw sewage in the water. Going into the countryside in a lot of the global South, you see raw sewage running in a lot of places, exposing people to dysentery and cholera. I also see how US regulations on medication are then imported into other countries, so they are reliant on the regulations here.

When you think about tech policy, how do you think through what needs to be completely abolished, maybe facial recognition technology, versus where we need more nuance? For example, search. We can abolish Google, we can decentralize or deprivatize search, but just saying to abolish it doesn’t really answer the question of how we discover and work through the information sciences.

Safiya: Yes. I think of these issues in a way that maybe feels too simplistic. I fundamentally start with the premise that everybody on Earth is uniquely important. It gets easy, then, to discern which technologies, which policies, which practices are about limiting the possibility of human potential, which are about harm and which aren’t. That gets very easy with regulations that protect the environment, or regulations that protect people, that enable people to not have to drink dirty water or take harmful derivatives of plant medicine that can kill them or hurt them. In that way, at the heart of thinking about policy is whatever one’s moral compass is.

The truth is a lot of people believe in the disposability of other people, and they regulate and make decisions out of that space. If we were to think about access to knowledge, we have this fundamental contradiction around whose knowledge is protected and made available and preserved, and whose isn’t. If we lived in a world that foregrounded indigenous knowledge practices, we would not be in the same conversation about climate change or climate injustice. 

Maybe it sounds too basic, but it is easy to see when you’re looking at search or social media: a set of business leaders and lobbyists who are lobbying to protect and support people who have very racist ideologies, who are antidemocratic, who have fascist and authoritarian leanings, and who make provision for people who want to obliterate most of the planet rather than protect and preserve it.

Khadijah: I would add the notion of place. I’m against computational thinking and computational logic because it always datafies demographics of people and then brings them to scale. Indigenous knowledge is locally rooted. Those closer to the North Pole are not necessarily custodians of knowledge about desert regions, and vice versa. Computational thinking doesn’t allow for that kind of localized, differentiated thinking, or for just wandering around. Everything is about efficiency and optimization.

Safiya: Yes. So many of the large-scale computational projects in the world right now, like ChatGPT, are about this constant desire for universalization. When you try to force that kind of biodiversity, that human diversity, into static data models that are really about total unification in the narrowest sense, I think you preclude other kinds of localized contexts.

I was doing research in Western Australia this summer. I met some of the indigenous people who live there, and it was so incredible to hear them talk about being caretakers not just of the land, but of different animals and people, and these ways of thinking about the plurality of the ecosystem and being responsible for many parts of it, and living in orientation to the world that way.

Khadijah: I feel like we talked about song trails a couple years ago. Basically, each part of the Aboriginal nation generationally transmitted songs that were part of trails corresponding to different geographic areas on the continent, and each group held just a part of the song, for its part of the territory. When they would come together as a group, they could share all of those songs, and together the songs would geographically guide them across the continent and across generations. That is such a different logic than large language models or natural language processing or trying to content moderate for hate speech, or even thinking about things as “content.”

Safiya: It really affected me to be part of a group of people, who I’ll call Black Americans, for whom the process of enslavement, the practice of enslavement, the system and institution of it, sought to destroy our ability to have those kinds of connections and that kind of knowledge preserved. When you are part of a people who have been brutally destroyed, with every effort made to ensure that any aspect of your own indigenous knowledge is unavailable to you, it’s extremely confrontational to then see people who still have access to that be under threat. Under no circumstances can their knowledge be lost too.

So many of these systems either seek to extrapolate and extract and decontextualize this knowledge, or they just completely ignore it, and I’m not sure which is worse. When I talk to librarians who work with indigenous knowledge, they talk about how one needs to earn the right to know certain kinds of things, which is a different logic than saying all knowledge should be accessible to all people at all times, or that all information should be decontextualized from the geographies, the people, the culture that make that information become knowledge. In many ways, while some people think that these systems create more clarity and access to knowledge, we could also argue that they create more confusion.

Khadijah: That’s building off of Algorithms of Oppression. Algorithms of Oppression is, on one hand, one of the most cited texts ever. You got the MacArthur. I see recognition, but I’m not always clear whether people deeply engage with your work. There’s that story that circulates every once in a while about how Googling Black people still brings up gorillas, but the engagement doesn’t move beyond that chapter.

I appreciate that you’re a regular person. We had originally scheduled this interview for a couple of days ago, but I had been very sick, and you said, let’s reschedule. People who are close to you know that you have that level of humanity, which unfortunately is rare in academia, especially as people get to the top. Everyone around you shares my feeling that you’re so generous with your time and with your financial support. So I really want to give you your flowers and acknowledge that. I appreciate both you and your work.

Safiya: Well, I’m not crying. You’re crying. [Laughter] I really want to say thank you for saying that, and I appreciate you so much, and your wellbeing and health are very important to me. We know a lot of people who are not feeling well. I have not always felt well myself over this past decade. But we’re getting up every day and thinking about the things that we see and know are coming, things that maybe our families aren’t thinking about, or people in our communities, or people at the grocery store, right? The teachers in our kids’ schools, they’re not thinking all the time about what the implications of LLMs will be for the future, or what it means that the Saudis are huge investors in Twitter, which I refuse to call X because X belongs to Malcolm X.

Khadijah: What is X? Let a white South African think that they could just rename shit, you know what I mean? That’s the most colonial thing ever.

Safiya: I know. It’s so hilarious to me because it’s so associated with the Nation of Islam, and I’m like, “Does he know that?” 

Khadijah: I was sick in the hospital on a Dilaudid drip when this happened, and then I went on my phone for the first time in days and was like, “What’s going on? Am I bugging right now?” 

Safiya: What is so exciting to me about what you’re doing with Logic(s) is you’re able to take things like the concepts of Algorithms of Oppression, and you’re right, a lot of people just cite it to say, “Algorithms are biased.” But I was trying to convey in that book—

Khadijah: And did. Let’s be clear, and did.

Safiya: Not only are these systems rigged against us fundamentally at the level of code, but also there are huge implications for what it means to have tech companies control society, and that they are fundamentally antidemocratic, and they are monopolies, and this is dangerous for the world. Maybe I put some of that too far in the back of that book [laughs] because I was saying that ten years ago.

Khadijah: People don’t be reading. They want to cover their bases, but they don’t actually want to engage in Black women’s intellectual scholarship. I do think that there is a large group of people who actually heard, received the message, engaged, delivered. But those are not the loudest. Those are not the people getting the most resources.

Safiya: Black scholars have contributed a lot of ideas that have now been normalized as everyday common sense. I’m grateful that people have seen my work. But it has not made the true impact that we need it to make because when the White House summons people to talk about AI, they don’t summon the Black scholars with powerful critique. 

But a summoning is happening, has happened, because of radical tech critics around the world. There are thousands and thousands of people now who are reoriented to the fact that these systems are harmful, and a decade ago, that was not the predominant orientation. I’m very proud to be one grain of sand on that beach. But we have a lot of sand to move, so to speak, to really apprehend what these systems could do and are doing.

It’s an honor to get to be interviewed by you. Listen, I think about you. I think about our sister Victoria Copeland, who is one of the most important voices right now around AI and the Black children and women who are in danger from these databases and statistical models and from the destruction of Black families and communities. We know she has not been feeling well, and she is also a person we need to hold up.

When I look around, the truth is we are all helping each other and we are in different fights, and some people are out here trying to figure out how they get another hundred million dollars for their research center, and we’re over here like, “Okay, how do we just keep a center going? And how do we make sure that nobody drops dead?” That’s a real, real different context within which we do our work, so I’m glad that you’re feeling better and I want to see you thriving.

Safiya Umoja Noble, PhD is a professor and author of the best-selling book, Algorithms of Oppression.

This piece appears in Logic(s) issue 20, “policy: seductions and silences.”