Issue 3 / Justice

December 01, 2017

A screenshot from a promotional video by Face++, a facial recognition platform.

White Code, Black Faces

Ali Breland

Cops around the country increasingly rely on facial recognition software. But white engineers are the ones writing it—and they’re embedding their own biases, with catastrophic results.

“You good?” a man asked two narcotics detectives late in the summer of 2015.

The detectives had just finished an undercover drug deal in Brentwood, a predominantly black neighborhood in Jacksonville, Florida, that is among the poorest in the country, when the man unexpectedly approached them. One of the detectives responded that he was looking for $50 worth of “hard”—slang for crack cocaine. The man disappeared into a nearby apartment and came back out to fulfill the detective’s request, swapping the drugs for money.

“You see me around, my name is Midnight,” the dealer said as he left, not realizing who he had just met.

Before Midnight departed, one of the detectives was able to take several photos of him, discreetly snapping pictures with his phone held to his ear as though he were taking a call. At least, that’s what allegedly happened, according to case details from The Florida Times-Union and a court motion. Details get murkier from there.

Two weeks later, police wanted to make the arrest, but didn’t know who they had bought the crack from. All they had to go on were the smartphone pictures, the address where the exchange had taken place, and the nickname “Midnight.” Stumped, the Jacksonville Sheriff’s Office turned to a new tool to help them track down the dealer: facial recognition software.

The technology helped them pin down a suspect named Willie Lynch. Close observers of the case, like Georgetown University researcher Clare Garvie, have described Lynch as a “highly intelligent, highly motivated individual” despite his having only a high school education (he even filed his own case motions, which could be mistaken for ones written by an actual lawyer). He was eventually convicted and sentenced to eight years in prison, and is now appealing his conviction.

Whether or not Willie Lynch is “Midnight” remains to be seen. At the very least, his conviction deserves closer scrutiny. Many experts see the facial recognition technology used against him as flawed, especially when applied to black individuals. Moreover, the way the Jacksonville Sheriff’s Office used the technology—as the basis for identifying and arresting Lynch, rather than as one component of a case supported by firmer evidence—makes his conviction even more questionable.

The methods used to convict Lynch weren’t made clear during his court case. The Jacksonville Sheriff’s Office initially didn’t even disclose that it had used facial recognition software. Instead, the office claimed to have used a mugshot database to identify Lynch on the basis of a single photo that the detectives had taken the night of the exchange. Even after the office’s use of the technology became apparent, it gave few answers about how the software had been used or how it had trained Celbrica Tenah, the analyst in the Sheriff’s Office Crime Analysis Unit who identified Lynch with it.

An Imperfect Biometric

The lack of answers the Jacksonville Sheriff’s Office has provided in Lynch’s case reflects the broader lack of answers to the questions facial recognition poses across the country. “It’s considered an imperfect biometric,” said Garvie, who led a Georgetown study of facial recognition software called The Perpetual Line-Up. “There’s no consensus in the scientific community that it provides a positive identification of somebody.”

The software, which has taken on an expanding role among law enforcement agencies in the U.S. over the last several years, has been mired in controversy because of its effect on people of color. Experts fear that the new technology may actually be hurting the communities that police claim they are trying to protect.

“If you’re black, you’re more likely to be subjected to this technology and the technology is more likely to be wrong,” House Oversight Committee ranking member Elijah Cummings said in a Congressional hearing on law enforcement’s use of facial recognition software in March 2017. “That’s a hell of a combination.”

Cummings was referring to studies like the one published in 2016 by the Center on Privacy and Technology at Georgetown Law. The report found that, as with so many aspects of the justice system, black individuals were the most likely to be subjected to facial recognition searches. It also suggested that the software was most likely to be incorrect when used on black individuals—a finding corroborated by the FBI’s own research. This combination, which is making Lynch’s and other black Americans’ lives excruciatingly difficult, is born of another race issue that has become a subject of national discourse: the lack of diversity in the technology sector.

Racialized Code

Experts like Joy Buolamwini, a researcher at the MIT Media Lab, think that facial recognition software has problems recognizing black faces because its algorithms are usually written by white engineers who dominate the technology sector. These engineers build on pre-existing code libraries, typically written by other white engineers.

As the coder constructs the algorithms, they focus on facial features that may be more visible in one race than in another. These considerations can stem from previous research on facial recognition techniques and practices, which may carry its own biases, or from the engineer’s own experiences and understanding. The code that results is geared toward white faces, and mostly tested on white subjects.

And even though the software is built to get smarter and more accurate with machine learning techniques, the training data sets it uses are often composed of white faces. The code “learns” by looking at more white people—which doesn’t help it perform better on a diverse range of faces.
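To see why the makeup of the training data matters, consider a toy illustration. The sketch below is not how any real face recognition system is built; it simply trains two simple classifiers on invented data, one that only ever "sees" examples from one group (the skew is exaggerated to make the effect obvious) and one that sees both, then compares their accuracy on each group.

```python
# A minimal, invented sketch -- not any vendor's actual pipeline -- of how the
# composition of a training set shapes where a model works and where it fails.
# Two synthetic "demographic groups" carry their match/no-match signal along
# different feature directions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

def make_group(n, axis):
    """Synthetic match (1) / no-match (0) examples for one demographic group.
    The feature direction that carries the signal differs by group."""
    y = rng.integers(0, 2, size=n)
    X = rng.normal(size=(n, 8))
    X[:, axis] += np.where(y == 1, 1.0, -1.0)
    return X, y

# Group A's signal lives on feature 0, group B's on feature 1.
Xa, ya = make_group(2000, axis=0)
Xb, yb = make_group(2000, axis=1)

skewed = LogisticRegression(max_iter=1000).fit(Xa, ya)  # trained only on group A
balanced = LogisticRegression(max_iter=1000).fit(
    np.vstack([Xa, Xb]), np.concatenate([ya, yb]))      # trained on both groups

# Fresh test data for each group.
Xa_t, ya_t = make_group(5000, axis=0)
Xb_t, yb_t = make_group(5000, axis=1)

for name, model in [("skewed", skewed), ("balanced", balanced)]:
    acc_a = accuracy_score(ya_t, model.predict(Xa_t))
    acc_b = accuracy_score(yb_t, model.predict(Xb_t))
    print(f"{name} model -- accuracy on group A: {acc_a:.2f}, on group B: {acc_b:.2f}")
```

In this contrived setup, the model trained only on group A performs well on group A and at roughly chance level on group B, while the balanced model performs comparably on both. Real systems are far more complex, but the underlying dynamic is the same: a model can only get good at what it has been shown.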

Technology spaces aren’t exclusively white, however. Asians and South Asians tend to be well represented. But this may not widen the pool of diversity enough to fix the problem. Research in the field certainly suggests that the status quo simply isn’t working for all people of color—especially for groups that remain underrepresented in technology. According to a 2011 study by the National Institute of Standards and Technology (NIST), facial recognition software is actually more accurate on Asian faces when it’s created by firms in Asian countries, suggesting that who makes the software strongly affects how it works.

“These libraries are used in many of the products that you have, and if you’re an African-American person and you get in front of it, it won’t recognize your face,” said MIT Media Lab director Joichi Ito at the World Economic Forum in Davos at the beginning of 2017.

As Ito points out, being invisible to a technology that can be used against you is extremely dangerous. It’s also a sad metaphor for how black individuals go unseen in the criminal justice system. In a TEDx lecture, Buolamwini, who works with Ito and is black, recalled several moments throughout her career when facial recognition software didn’t notice her. “The demo worked on everybody until it got to me, and you can probably guess it. It couldn’t detect my face,” she said. “Given the wide range of skin-tone and facial features that can be considered African-American, more precise terminology and analysis is needed to determine the performance of existing facial detection systems,” Buolamwini told Recode in January.

Unregulated Algorithms

Even as the use of facial recognition software increases in law enforcement agencies across the country, the deeper analysis that experts are demanding isn’t happening.

Law enforcement agencies often don’t review their software to check for baked-in racial bias—and there aren’t laws or regulations forcing them to. In some cases, like Lynch’s, law enforcement agencies are even obscuring the fact that they’re using such software. To take another example, in their Perpetual Line-Up study, Georgetown researchers found that the Pinellas County Sheriff’s Office in Florida runs 8,000 facial recognition searches a month, yet the county public defender’s office said that police have not disclosed the use of the technology as part of Brady disclosures—evidence that, if favorable to the defense, prosecutors must turn over to the accused.

Garvie said she is confident that police are using facial recognition software more than they let on, a practice she referred to as “evidence laundering.” This is problematic because it obscures just how much of a role facial recognition software plays in law enforcement. Both legal advocates and facial recognition software companies themselves say that the technology should supply only one part of a case, not the evidence that justifies an arrest.

“Upon review, all facial recognition matches should be treated no differently than someone calling in a possible lead from a dedicated tip line,” writes Roger Rodriguez, an employee at facial recognition vendor Vigilant Solutions, in a post defending the software. “The onus still falls on the investigator in an agency to independently establish probable cause to effect an arrest,” he continues—probable cause that “must be met by other investigatory means.” It doesn’t always work out that way, of course.

Even if facial recognition software is used correctly, however, the technology has significant underlying flaws. The firms creating the software are not held to specific requirements for racial bias, and in many cases, they don’t even test for them. “There is no independent testing regime for racially biased error rates,” Georgetown researchers wrote in their Perpetual Line-Up report. “In interviews, two major face recognition companies admitted that they did not run these tests internally, either.”

A third I spoke to, CyberExtruder, a facial recognition technology company that markets itself to law enforcement, also said that they had not performed testing or research on bias in their software. Another, Vigilant Solutions, declined to say whether or not they tested for it. CyberExtruder did note that certain skin colors are simply harder for the software to handle given current limitations of the technology. “Just as individuals with very dark skin are hard to identify with high significance via facial recognition, individuals with very pale skin are the same,” said Blake Senftner, a senior software engineer at CyberExtruder.

“[Race is] just very hard to control for in their testing,” Garvie explained. “There haven’t been enough public studies, but the limited research that has been done does suggest that the algorithms may have different accuracy rates depending on the race of the subject.” Garvie and her colleagues believe that NIST is well positioned to help with such studies. The agency, part of the U.S. Department of Commerce, conducts voluntary tests with facial recognition companies every four years and is testing for variances in results by country of origin—which Garvie notes “can be a good proxy for race.”
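As a rough illustration of what such testing might look like, the sketch below disaggregates error rates by group at a single match threshold. The score distributions are invented for the example; a real evaluation would use labeled image pairs from each demographic group, but the basic bookkeeping (false match and false non-match rates computed separately per group) is the kind of breakdown researchers like Garvie are asking vendors and NIST to report.

```python
# A hypothetical sketch of a disaggregated error-rate test: compute false match
# and false non-match rates separately by group at one operating threshold.
# The similarity-score distributions below are invented for illustration only.
import numpy as np

rng = np.random.default_rng(1)

def scores(n, genuine_mean, impostor_mean):
    """Simulated similarity scores for genuine (same-person) and impostor pairs."""
    genuine = rng.normal(genuine_mean, 0.1, n)
    impostor = rng.normal(impostor_mean, 0.1, n)
    return genuine, impostor

# Assume (for illustration) the system scores group B's genuine pairs lower.
gen_a, imp_a = scores(10000, genuine_mean=0.80, impostor_mean=0.40)
gen_b, imp_b = scores(10000, genuine_mean=0.65, impostor_mean=0.45)

# A deployed system typically applies one global match threshold.
threshold = 0.60

def error_rates(genuine, impostor, t):
    fnmr = np.mean(genuine < t)    # false non-match rate: true matches rejected
    fmr = np.mean(impostor >= t)   # false match rate: wrong people accepted
    return fnmr, fmr

for name, (gen, imp) in [("group A", (gen_a, imp_a)), ("group B", (gen_b, imp_b))]:
    fnmr, fmr = error_rates(gen, imp, threshold)
    print(f"{name}: false non-match rate {fnmr:.3f}, false match rate {fmr:.3f}")
```

Under these made-up numbers, group B suffers far more false non-matches and false matches than group A at the same threshold, the kind of disparity that only becomes visible when error rates are broken out by group rather than averaged across everyone.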

NIST’s tests won’t come soon enough for Lynch, however. His case is currently playing out in the Florida First District Court of Appeal. Nor can the clock be turned back for others like him, who may have been unfairly tried as a result of imperfect software used without transparent standards for its use.

Facial recognition software raises many questions that need clear answers. Obtaining those answers will take more than commissioning studies, as vital as they are. It’s also essential that laws catch up with the technology, in order to provide people like Lynch with the opportunity to know the tools that are being used against them. Most importantly, we need to take a closer look at who’s making these algorithms—and how they’re doing it.

Ali Breland is a staff writer at Mother Jones, where he reports on technology, the internet, and misinformation. His writing has appeared in the Guardian, Vice, Logic, and elsewhere.
