
Image by Xiaowei Wang.

Safe or Just Surveilled? Tawana Petty on the Fight Against Facial Recognition Surveillance

Organizers across the US have been building a movement to ban police use of facial recognition technology. To date, they have won municipal-level bans in San Francisco, Oakland, and Berkeley, as well as in Somerville, MA; however, the struggle against surveillance technology continues elsewhere.

We talked to Tawana “Honeycomb” Petty about what that fight looks like in Detroit, a majority-Black city where global capitalism and municipal disinvestment have left many Black residents below the poverty line. For decades she’s worked in the lineage of James and Grace Lee Boggs and other community anti-racist organizers to remind Detroit residents that Black abundance and safety existed before and are possible again.

Today, Petty organizes against surveillance technologies in Detroit as the director of the Data Justice Program for Detroit Community Technology Project. She also co-leads Our Data Bodies, a five-person team that raises consciousness in marginalized communities about how digital information is collected, stored, and shared by governments and corporations. She is a convening member of the Detroit Digital Justice Coalition, which organizes “Data DiscoTechs” and other initiatives to foster media and digital literacy.

The work is all interconnected. How do you equip residents with what they need to demand what’s best for them and their communities? Whether helping people to understand surveillance, advocating for inclusion in the first digital Census, or navigating government benefits, Tawana’s work with community members focuses on education, organizing, and agency in the pursuit of digital justice in Detroit and beyond.

We sat down with Tawana in January 2020 to talk about the fight against facial recognition, building data agency within systems meant to strip it, and what’s at stake in what she calls “the last Black mecca” in the United States.

To start us off, could you tell us about yourself, your background, and how you came to work with the Detroit Community Technology Project?

I'm Tawana Petty, also known as Honeycomb. I’m a poet, a mom, a social justice organizer, and a water rights advocate. I direct the Data Justice program for Detroit Community Technology Project (DCTP), and through DCTP I convene the Detroit Digital Justice Coalition.

Right now, one of the Data Justice Program's main initiatives is slowing down the rapid expansion of Project Green Light, which is a public-private partnership around facial recognition technology spearheaded by the Detroit Police Department (DPD). We're also creating educational resources about technology, surveillance, and equitable participation in the Census. During the time that Project Green Light was ramping up, DCTP was involved in a research project called Our Data Bodies, which I can tell you more about later.

When Project Green Light got started, we had no concept of the scope of surveillance it would take on. The project gets its name from flashing green lights that are connected to video surveillance cameras inside and outside of different businesses. The cameras are monitored twenty-four hours per day, seven days per week, at DPD’s real-time crime centers and on mobile devices.

It started off with cameras at eight gas stations—ones that stayed open during late night hours. The police wanted to use these cameras to signal to community members that the gas stations were now safe to enter at any time, because police were constantly going to be watching them. They partnered with two private companies on this project: Guardian Alarm and Comcast.

Fast forward a couple of years. We now have close to six hundred cameras all over Detroit, and the Mayor would like to push that number to four thousand. Project Green Light locations pay a monthly rate so that if something happens at that location, they get priority from police over non–Green Light locations. So they pay for policing. And then, of course, the DPD leadership signed a contract to use facial recognition with everything from drones and traffic lights to mobile devices—pretty much anything they could attach a surveillance camera to. They were using that facial recognition technology on footage from their Green Lights for about two years before social justice activists and technologists in the city really got wind of it, when DPD tried to push through a directive that would solidify their use of it. Before that proposal, there was no policy governing its use.

What did the rollout of Project Green Light look like within the community, and how did you start to organize folks against it?

The police really tapped into this negative narrative that has hovered over Detroit for decades. They were touting Project Green Light as the salvation of the city, like, “Hey, we have the answer. We know you are all afraid of each other, so we're going to give you all these security cameras, and you’re going to feel safe because we're always going to be watching you.”

The police department’s campaign targeted senior citizens mostly—the ones who are retired or sitting at home, inundated with media images. They didn't hide the idea of putting cameras everywhere like they later hid their adoption of facial recognition. In fact, they were trying to separate the two ideas, saying that Project Green Light was different from facial recognition. But the two systems needed each other; the city needed to overhaul itself with all these surveillance cameras in order to make facial recognition a viable system. The DPD just kept saying, “Facial recognition is not embedded within the cameras,” and, “We'll only use the facial recognition if we absolutely have to.” Meanwhile, they were proposing policy directives asking for real-time live tracking via mobile devices, drones, traffic lights, and more.

Initially, when we were showing up to meetings at the Board of Police Commissioners, there were a lot of residents, mostly senior citizens, who were really angry with us. They felt that we were getting in the way of a system that would make them safe, and that we were attacking police because we just wanted to cause trouble. In reality, we were just trying to educate residents, law enforcement, and the Board of Police Commissioners on the harms of mass surveillance and facial recognition. Over time, we were able to argue that facial recognition and Project Green Light needed each other to work the way that DPD leadership and city government wanted them to work, and you couldn’t have one without the other. We were also able to make the case that safety and surveillance were not synonymous.

Was your resistance mainly organized around police commission meetings? How else did you all create pressure?

We spoke at police commission meetings and town halls. We collaborated on a special surveillance issue of a magazine called Riverwise, where we brought together information about Project Green Light, the problems with facial recognition, and alternatives that would create actual safety. We also created a report, “A Critical Summary of Detroit’s Project Green Light and its Greater Context.”

“Green Chairs, Not Green Lights” is our counter-campaign. We’re encouraging community members to come back to their front porches and look out for each other. We’re raising consciousness and building real safety, and we’re saying that a system set up to prioritize profits and institutions over people is wrong.

I imagine you all are working in coalition with many different groups in this campaign. What are those different groups? Also, there’s a long history of activism and in particular Black-led activism in Detroit. How do you and the rest of the coalition situate yourselves within that history?

Yeah, absolutely. In addition to serving as director of the Data Justice Program here at DCTP, I'm also on the board of the James and Grace Lee Boggs Center to Nurture Community Leadership. James and Grace Lee Boggs are now ancestors, and a long legacy of countering these sorts of things was created under their expert tutelage. [Eds.: James and Grace Lee Boggs were revolutionaries who worked to dismantle capitalism and racist oppression nationally and through fostering community-based, grassroots activism within their adopted home city of Detroit.] Our former Boggs Center board member Ron Scott, who is also now an ancestor, co-founded the Detroit Coalition Against Police Brutality, which still exists. I'm part of the Black Out Green Light Coalition and the Green Light Black Futures coalition, which is organized by Black Youth Project 100 (BYP100). I also collaborate with the Detroit Justice Center, Media Justice, Color of Change, and Fight for the Future. Finally, I'm part of a lot of coalition work with the ACLU that collaborates with dozens of organizations that have been consistently resisting mass surveillance in the city.

Ultimately, we were unsuccessful in getting DPD to stop using facial recognition, but we did have some wins within the policy that the Board of Police Commissioners ultimately passed. We're definitely tuned in, which is why I think we were able to achieve any policy changes at all, even if what we wanted was a ban. Honestly, getting any of those changes in a city with this many Black people took a tremendous amount of work and was an uphill battle. All the other cities that were successful with bans were predominantly white. Folks aren't as eager to take the ever-watching eye off of a predominantly Black city.

Could you tell me about those changes and what the policy looks like now? 

Under the new policy, the police are prohibited from using facial recognition on any live feeds. Even though still photos are also very problematic, the thought that DPD would have been able to track people's faces as they walk the streets was very unnerving. So they're not able to do that. DPD can only use facial recognition to investigate violent crimes. They're barred from using it for immigration enforcement. They're barred from predictive analysis—using the technology for pre-crime, essentially. They're barred from mobile use, so they're not able to use their mobile devices to walk through crowds and track people using facial recognition, which is something that happened in Baltimore during the Freddie Gray protests. They're barred from violating the First, Fourth, and Fourteenth Amendments—freedom of religion, prohibition against unreasonable search and seizure, and due process under the law. Now, I'll tell you that I don't believe that's even possible—I think if you mass-surveil a city, there's no way you're not going to violate those amendments.

The Board of Police Commissioners receives a weekly report on how the police are using facial recognition. Unfortunately, the Board of Police Commissioners, which is supposed to be a civilian oversight body, hasn't been very transparent with those reports. None of them have been made accessible to the public. They don’t even question the contents of the reports when the reports are handed to them in meetings, at least not at any of the Board of Police Commissioners meetings I have been present at. It's discouraging.

Most significantly in the policy, Detroit police officers are supposed to be criminally investigated if they misuse the technology. Their misuse is supposed to be reported to the mayor, the Board of Police Commissioners, and the city council within twenty-four hours of discovery, which would lead to dismissal and further legal action.

But I would say that our greatest accomplishment was raising the consciousness of the city residents. Many, many residents were engaged in a process that, at first, they were just going along with.

How did you do that? 

For several months, we gave testimony at every Board of Police Commissioner meeting that involved the topic. Unfortunately, residents are only afforded two minutes to speak, even on topics as dire as this one. We leveraged our minutes to push forward a vision of people sitting on their front porches, the community having good schools, affordable water that doesn't contain lead, neighbors not getting evicted, neighborhood block clubs. We pushed for senior citizens to think about the times when they felt safe, and whether or not being holed up in their homes under surveillance cameras actually made them safe.

That understanding of safety was the scaffolding. And once we had that, we could build on it. We could say: what the police department is talking about is security; it’s not what you’re thinking of when you’re thinking about safety.

It really was a slow educational process, to be honest with you—constantly bringing materials and giving testimony. We’d bring examples of conversations we’d had with children, where we'd ask them a question like, “Say you lived on a street. On side A of that street, there were metal gates and cameras and drones and police. And on side B, there were a bunch of neighbors who knew each other, who looked out for each other, who knew that if John didn't come home at 7:00 p.m., then they should probably find out where John is. On which side of the street would you feel safest?” Kids almost always said, “How could I feel safe if I don't know the people who live next to me, you know?” That same type of teaching works with adults once they start to think it through.

We also began bringing in the data around how facial recognition technology disproportionately misidentifies people with darker skin tones, especially women and children, and how Detroit is 80 percent Black. We started reflecting back what DPD leadership themselves were actually saying. For example, DPD leadership indicated that of the 500 times over two years that they used facial recognition, they only moved forward with 150 cases, because the other 350 were misidentifications by the system.

We highlighted the fact that they were telling us the system was wrong at least 350 times and that they were relying on the naked eye of two analysts to catch the errors of the algorithm. This caused great concern about the potential of false arrests. So, we dissect what's being said and present that back to people consistently.

You’ve mentioned how Detroit is 80 percent Black, a stark contrast to other, primarily white cities that have banned facial recognition. How does your fight in Detroit look different from the fight in whiter cities like San Francisco or Boston?

We've built relationships with organizers in pretty much all the other places that have successfully banned facial recognition, including Portland, where folks are pushing for the most comprehensive ban. We've consistently told them: we need to synthesize what you all have done, but also apply a racial lens to it because it's going to be much harder to convince not just Detroiters but the world that a Black city doesn't need to be surveilled. It's a deeper, larger conversation and it requires a kind of organizing around anti-Black racism that often isn’t comfortable to do even in social justice or liberal spaces.

That’s why I value the consciousness-raising piece so much. I really want to win this, but I also have to find some value in the fact that we've been able to raise the consciousness of people all across the US who now consider themselves invested in our struggle. So I take pieces from all the places that I've been able to visit and all the people that we've been able to talk to. But I also understand that there is a very real dynamic here that does not exist in cities that only have a 5 percent Black population.

One of the things that I've consistently said is that Detroit is the last remaining Black mecca in the United States. If we don't succeed in resisting this violent system, there is no hope for Black residents across the rest of the country, period. Police departments have already tried to model this system in other cities—calling it Project Blue Light in Wisconsin, for example. They're just packaging up what city government is pushing through in Detroit. On any given day here, you can see drones flying through the sky like kites.

Wow, that's an apocalyptic vision. In past talks, you’ve talked about the rise of surveillance and how that ties in with the “resurgence” of the city, as money floods into gentrified areas like downtown and Corktown. Could you give a little bit of that historical context in Detroit? 

There are areas of Detroit like Black Bottom and Paradise Valley that had Black hospitals, Black grocery stores and schools, and African-centered education—all these viable institutions—that were leveled for freeways. One of our challenges is getting folks to think about a time in the city when there was an abundance of thriving neighborhoods that were predominantly Black-owned, -led, and -invested.

I’m always trying to point out the inaccuracy of the narrative that Detroiters just don't want to care for where they live, they don't want safe neighborhoods, they don't want viable institutions—that the system that we exist under is a choice. We're in a city with five hundred thousand Black people and the median household income here is $29,000 a year—which means that half the households in the city make less than that. Predominantly Black women-led households. Lots of children in extreme poverty. One hundred thousand people without water in their homes over the last ten years. Schools closed by the hundreds. I work to get people to see crystal clear that this isn't an accidental situation. It’s easy to convince anybody in the state of Michigan and the world that if something bad is happening to Detroit, we brought it on ourselves. There isn't a dissecting of how we got here, or the history of racism and the disinvestment that has happened for generations, you know?

Watched, Not Seen

At the beginning of our conversation, you mentioned the Our Data Bodies project. I understand the project is led by a five-person team that includes three organizers from different groups: the Center for Community Transitions in Charlotte, which focuses on folks who have recently reentered the community from incarceration; the Stop LAPD Spying Coalition, which organizes against police surveillance in Los Angeles; and you all at DCTP. How did those three organizations come together? 

Our Data Bodies was initially a two-year research project between Detroit, Charlotte, and Los Angeles. We were thinking about the digital streams we all produce and the impacts that our information has on our interactions with government and police, on how we get resources, and on how our cities are developed. Within those two years, we learned that we wanted to move beyond a research project and expand on our organizing within our communities to increase self-determination around these systems.

The three organizers—Mariella Saba, Tamika Lewis, and I—all originally applied for our positions on the project. The interview committee was very intentional in picking social justice organizers for these roles as researchers because the type of analysis that comes from organizing on the ground in the neighborhood was going to be important in making sure that we were prioritizing community members. That won’t happen if you just focus on having a good researcher. Research is something that can be taught; care and compassion for the people connected to the research is a longer process.

So we all play to our strengths. I consider Stop LAPD Spying a mentor in this work, especially in resisting police and surveillance systems. For this project, Stop LAPD Spying and Los Angeles Community Action Network (LA CAN) put a lot of emphasis on unhoused populations and how Skid Row is a heavily targeted community that the LAPD uses to innovate new surveillance systems. 

In Charlotte, Tamika does work around citizens returning from incarceration. Folks returning from incarceration constantly have to report back to the system. It's like the system is constantly waiting for you to fail. Tamika also brings an important perspective as a gender-nonconforming organizer: they understand how resistance to being defined under the dominant gender binary makes people targets for intensified surveillance.

How have you been able to make use of that experience and research?

We’ve collectively produced materials like our Digital Defense Playbook, which is an educational resource and activity guide about data, surveillance, and safety that we created last year based on our three years of community research. We’ve taken that work to different neighborhood institutions, as well as academia. We’ve worked with data scientists, who are often disconnected from the real people represented by the statistics they analyze.

We're all learning from each other and staying in constant communication about what's happening in our different cities, and hearing the same themes. Things like, “The one mistake that I've made in my life is now the thing that's limiting how I'm able to survive.” Or, “I'm feeling heavily surveilled because I'm trans,” or “I'm feeling super targeted because my credit isn't good,” or “I've just reentered society from incarceration and I feel like every move that I make is being scrutinized and monitored.” Across all the cities, community members wanted to be seen for something more than a mistake they made, or something more than their data. And they didn't want to be pursued and tracked and monitored.

What exactly is a “data body”? Recently, we’ve seen the rise of various tools that let you see the data that companies have about you. But your approach seems more holistic. It’s not just, does Facebook have this data about me, but what are all of the different pieces of data that the government or private companies might have—and what am I able to do about it? 

We try to get community members to think not just about the impact that their data has on them, but the impact that their data has on the decisions that affect their family, their neighborhood, and their city. 

One of our exercises is called “What’s in Your Wallet?” As part of that exercise, we often use the example of a person who uses their EBT card at the liquor store up the street and purchases foods that aren't considered healthy. Those activities create a data trail. Maybe a bank will look at those data trails and decide not to invest in a grocery store in that neighborhood. It's difficult to know exactly how some decisions are made, but what we do know is that data is leveraged to make most of those decisions.

That’s what we mean by data bodies. It’s not just the individual; it's the information that's been generated about this individual and the systems that interact to make decisions about this individual, this individual's family, this individual's neighborhood—all the data’s tentacles.

In the Digital Defense Playbook, you cite a conversation with a community member who says about interacting with government agencies, “For their benefit they do communicate. But for my benefit, no.” And I see this a lot in government work—if agencies want to communicate to surveil and penalize, then they can and will. But if you want them to share information to, say, verify that you’ve lost your job so that you can get food stamps, they often can’t or won’t.

It seems like so much of your work with Our Data Bodies is trying to build data agency within a system where power is skewed heavily towards other organizations or towards the state. How do you help people build data agency to overcome that power imbalance?

One of the things that we truly, clearly recognize within our work is that the power is going to come from the respiriting of community members. A lot of folks have been really dehumanized. There’s a quote that says, “Friendship begins at the moment when one person says to another, ‘What, you too? I thought I was the only one.’” And so it’s been this weaving together of stories to let community members know that they're not alone. 

It’s saying: did you know that there is an open data portal in your city and that you can push for that to become a useful source of information? Or: did you know that you can request to have some of your information deleted from a government database? It’s letting people know: here's how these systems connect and interact with one another. It’s an ongoing process of learning how different people are experiencing these systems, but also tying them together with the stories of others so that they understand they aren’t isolated, that there are many people across the world resisting these systems. 

We intentionally did not publish everything in the Digital Defense Playbook. There are stories community members shared with us, about how they're resisting these systems, that we chose not to share—which contradicts everything that a capitalist system would ask you to do. It's like, why not put it all out there? Well, because we actually want people to keep surviving.

The Digital Defense Playbook serves as a community organizing mechanism where we get to talk to people, we get to share stories, we get to respirit, we get to raise digital and media literacy. We don't think that it solves all the things. But we do think that when you build the confidence of community members, and you let them know that they're not alone, and that there are folks that are resisting these things all over, it does something for the spirit. So that's how we use it. 

In Person, In Print

Like Our Data Bodies, the Data DiscoTechs organized by the Detroit Digital Justice Coalition seem like they aim to raise consciousness and agency around data within communities. Could you paint a picture of one for me?

The “discovering technology” fairs, which we refer to as DiscoTechs, are kind of like a science fair at school—you know, with informational stations on different topics. To start one, we either respond to folks who have reached out to us or pick a neighborhood where we want to do one. We connect with an organizer or resident who lives in that neighborhood, who has the pulse of what's happening, and we support them in finding out what stations they think might most benefit their neighborhood.

We start with a few stations that we definitely want to have. Recently, we’ve had stations where we’re telling community members what to expect with the census. What kinds of things get funded based on census data? What are some of the concerns that have been raised about the census? How has it been used in the past? How long is census data kept? We did research to empower community members with that information.

But then the rest of the eight or nine stations are designed around what the community says they need. So they might say, well, in this neighborhood, it's a lot of senior citizens. They don't understand Twitter, they don't understand how to set up an email account, they just want to learn how to use a Mac—things like that. So then those are the stations that we will have. Community members are able to come in for free. We have music. We call it a DiscoTech, but it also has music and dance. At the last two, we gave away low-cost computers: the Raspberry Pi full kit, with the keyboard and everything. We gave those to residents who don't have that technology at home. 

Who shows up to those DiscoTechs? Mostly young folks? Is it a mix of ages, of races? 

It definitely depends where we have them. Some neighborhoods are predominantly senior citizens, so we'll have a lot of seniors show up. When we did one in southwest Detroit, it was a predominantly Spanish-speaking undocumented community, so we had interpreters onsite, including youth interpreters. It was great. Those stations had mostly young and middle-aged participants.

I’m curious about the print magazine that you help produce, Riverwise. How are you involved? How did it come to be?

I think it's the poet in me, but I don't remember a point in my life where I didn't feel obligated to push back against the dominant narrative of Detroit. I'm pretty sure there was a time, but I don't remember it.

There was a newspaper in the city, the Michigan Citizen, that was trying to lift up community stories of resilience and resistance, countering the dominant narrative in the city. It was around for three decades, but then it ended. The Boggs Center felt that we had a responsibility to fill that void, or at least try to fill that void. And so we convened community members over a one-year period to see whether there was interest in trying to create consistent literature that would amplify community stories of resilience and resistance that would counter the narratives that were coming out of the city. Through those meetings, Riverwise was born. I am now part of the Riverwise Magazine Collective, and I serve on the editorial board. Eric Campbell, a former contributor to the Michigan Citizen, is the managing editor. We put it out quarterly, and then we do some special editions, depending on what the situation is. That's how the special surveillance edition in collaboration with DCTP came about.

We convene weekly and go through stories we've heard ourselves, pieces from folks we've asked to write for us, and submissions people send in. Then we determine whether we want to theme the magazine based on something that’s happening in the city, or if this is one where we're just going to lift up various stories. And then we have these community discussions around what's written in the magazine. If a person wrote an article or something for the magazine, then we'll ask them if they want to host a community discussion, and we will support them in that. We also host writing workshops to support residents who want to improve their writing skills or learn how to contribute content to the magazine. It’s not our coalition that does most of the writing workshops. We tap people in the community—poets, writers, you know, organizers—who want to host those workshops, and then we just support them.

Calling All Co-liberators

How can people support the fight against facial recognition?

They can support organizations like Fight for the Future that are pushing for a federal ban on government use of facial recognition. They can call or email their Congressperson. They can help us amplify our struggles.

In Michigan there are a few current bills that could use some support. There’s a bill in the State Senate, 2019 SB 0342, that's likely going to be passed. It was initially a ban on all police use of facial recognition, but now it's a ban on police use of real-time facial recognition. There was House Bill 4810, a five-year moratorium on the police use of facial recognition on everything—drones, body cameras, any surveillance technology—that’s still kind of lingering out there. And then there is the Community Input Over Government Surveillance ordinance that's before the Detroit City Council, which creates guidelines for council oversight and public input for surveillance technology. That one is currently being passed back and forth for amendments. 

On the federal level, there's a Congressional bill called the No Biometric Barriers Housing Act of 2019 that bans the use of biometrics in public housing. There's also a bill out there called the Ethical Use of Facial Recognition Act that would prevent federal funding from being used for facial recognition technologies, including police and other government use. I’m not really sure what's going to stick beyond the Detroit city ordinance, though, and maybe the Michigan State Senate bill. 

But wherever you are, you can support the fight by continuing to try to politicize your community, push for alternatives to surveillance, and minimize the conflation between surveillance and safety. And in Detroit, we'll just continue the work by showing how invasive and insidious this technology is—hopefully before we see a slew of false arrests. We don't want to be living in a social credit system, where our every move is dictated by algorithms and surveillance cameras.

Or at least more than what we have already.

Exactly, yeah. It’s already a social credit system in Detroit, but we don't want to add more and more technologies to that and exacerbate the violence and marginalization that residents are already feeling. And we don't want this rolled out all across the globe as a way to contain and control Black and brown people and Indigenous people and poor white people. Surveillance is not safety.

The hardest part for us is that law enforcement, government institutions, and too many people don't seem to see an alternative, especially in areas that have been deemed dangerous. But everyone needs to understand that most crime in our neighborhoods is rooted in poverty and disinvestment and racial violence. We need co-liberators; we don't need any more allies. We need folks to really feel their liberation is tied up in our liberation, you know?

Tawana “Honeycomb” Petty is a mother, social justice organizer, youth advocate, poet, and author. She directs the Data Justice program for Detroit Community Technology Project and co-leads the Our Data Bodies project.

This piece appears in Logic's issue 10, "Security". To order the issue, head on over to our store. To receive future issues, subscribe.