To capture racial bias in policing, we need to fill critical data gaps

From 2012 to 2015, a team of researchers collected 2.9 million police patrol records in Chicago. The team’s analysis of this data, from nearly 7,000 police officers, showed that black police officers were less likely to arrest civilians than white police officers patrolling the same neighborhood (NS: 2/11/21). Officers arrested an average of eight people per shift, with black officers making 24 percent fewer arrests than white officers. But another analysis, which excluded shifts where no arrests had taken place, reversed the results. That gave the impression that black officers had made 12 percent more arrests than white officers.

Disregarding events that don’t happen – an officer letting a jaywalker pass, choosing not to make an arrest (often for minor offenses such as possession of a small amount of drugs) or holstering a drawn gun without firing – is problematic, says policing expert Dean Knox of the University of Pennsylvania. “Instead of drawing the conclusion that minority officers engage in less enforcement,” he says of the Chicago study, “you might mistakenly conclude that they engage in more enforcement.” The reversal happened because, compared with white officers, black officers more often went on patrol without making any arrests.
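A small sketch can make this reversal concrete. The numbers below are hypothetical, not the study’s actual data: they simply show that if one group has more zero-arrest shifts but busier active shifts, dropping the zero-arrest shifts flips the direction of the comparison.

```python
# Hypothetical arrests-per-shift records for two officer groups.
# Group B has more zero-arrest shifts but more arrests on active shifts.
group_a_shifts = [8] * 90 + [0] * 10    # 100 shifts
group_b_shifts = [10] * 60 + [0] * 40   # 100 shifts

def mean(values):
    return sum(values) / len(values)

# Comparison 1: include every shift (the non-events count).
all_a = mean(group_a_shifts)   # 7.2 arrests per shift
all_b = mean(group_b_shifts)   # 6.0 arrests per shift

# Comparison 2: drop shifts with zero arrests, as the flawed analysis did.
active_a = mean([x for x in group_a_shifts if x > 0])   # 8.0
active_b = mean([x for x in group_b_shifts if x > 0])   # 10.0

print(all_b < all_a)         # True: group B makes fewer arrests overall
print(active_b > active_a)   # True: group B looks "more aggressive" once
                             # zero-arrest shifts are discarded
```

The same data supports opposite conclusions depending on whether the non-events stay in the denominator, which is exactly the trap Knox describes.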

Non-events of this kind are generally missing from police data. Although a large body of evidence suggests that police in the United States discriminate against black people, Knox says, many police departments collect data on only a handful of interactions between their officers and civilians. Cell phone videos, like those of Eric Garner in a chokehold and George Floyd struggling to breathe, tend to surface only after an encounter has gone terribly wrong. That makes it difficult to measure racial bias in policing, or to find targeted solutions for reducing it.

How, then, can researchers studying policing account for non-events? The laborious Chicago data collection by Knox and his team is not always feasible. And even that rigorous study, reported in Science earlier this year, had gaps: The team had data on when police stopped, arrested or used force against civilians, but not on minor interactions that didn’t meet the department’s reporting requirements.

When research teams take these problematic datasets at face value, Knox writes in a November 4 essay in Science, they often reach conflicting conclusions. Such disagreements in the literature allow public officials and the media to cherry-pick studies that support their point of view, whether arguing for or against implicit bias training to overcome unconscious stereotypes, or for prioritizing the recruitment of minority officers.

A long chain of events

Knox wrote the essay following the publication of a controversial, and now retracted, study published in 2019 in the Proceedings of the National Academy of Sciences. “White officers are no more likely to shoot minority civilians than non-white officers,” the study’s authors wrote. They concluded that policies aimed at increasing police diversity would do little to stem racial disparities in police killings.

The study grew popular, especially among conservative media and politicians, Knox says. “It was one of the centerpieces people used to deny the existence of policing bias.”

But the authors’ findings were mathematically baseless, says Knox, who, along with Jonathan Mummolo, a policing expert at Princeton University, wrote an article debunking the study on Medium. Some 800 academics and researchers signed the piece. The study’s authors failed to first count the total number of encounters with police and then measure what fraction of those encounters resulted in lethal violence, Knox said.

But this narrow focus on fatal police shootings, a rare event that usually comes at the end of a long chain of events, ignores any potential bias earlier in the chain, Knox says. The first potential bias in that chain arises with an officer’s decision to approach a civilian or let them pass. Knox acknowledges that a separate line of research is needed to account for societal disparities, such as the heavier police presence in black, often impoverished, neighborhoods, and long-standing discriminatory practices that reduce the quality of education and other services in those neighborhoods.

“Even if you can’t see all of the things that have happened before, it’s imperative to just recognize that they exist,” Knox says.

Consider a concrete example. On July 10, 2015, Brian Encinia, a Texas state trooper, pulled over Sandra Bland, a black woman, for failing to signal a lane change. The exchange escalated and culminated in Encinia arresting Bland for failing to follow orders. Bland’s subsequent death in a county jail sparked a public outcry.

Focusing only on Bland’s arrest, and not on anything that happened before it, would reveal little about how Bland ended up in jail for such a minor offense, or how to prevent such an outcome in the future. But because Encinia’s body camera recorded the entire exchange, policing researchers, in this case interested in tone and language, were able to identify key steps leading up to her arrest. For example, the researchers reported in Law and Society Review in 2017 that Encinia’s language begins politely but grows increasingly agitated as Bland refuses to comply with his orders. His once-formal commands, such as “get out of the car,” become informal and unprofessional: “I’ll get you out of here.”

This abrupt shift in language indicates that Encinia is losing control of the situation, says Belén Lowrey-Kinberg, a criminologist at St. Francis College in New York. Previous research has shown that when officers move from formal to informal language, violence can follow.

Although this is a case study of a single event, it provides “a great example of how situations can escalate,” says criminologist Justin Nix of the University of Nebraska Omaha.

Correcting flawed data

Bad police data doesn’t need to be thrown away, Knox says. His team developed an algorithm that accounts for data gaps at every point in an interaction between police and civilians. The algorithm weighs the different degrees of discrimination possible at each point in a chain of events – perhaps race played no role in Encinia’s decision to stop Bland because he couldn’t see her face, for example, or perhaps race played a large role because most drivers in that area are white. Summing across these events yields a range of values indicating how much discrimination could plausibly be present in any given scenario, Knox says.
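The bounding idea can be sketched in a few lines. This is a toy illustration, not Knox’s actual algorithm: the stage names and numbers are invented, and the point is only that when each stage’s discrimination is known just to lie in an interval, combining the stages yields a range rather than a single answer.

```python
# Toy sketch of interval bounds on discrimination across a chain of events.
# Hypothetical per-stage bounds: 0.0 means race played no role at that
# stage, larger values mean race played a bigger role. Where data is
# missing (e.g., race unobservable before a stop), the interval is wide.
stage_bounds = {
    "decision to stop": (0.0, 0.6),   # wide: driver's race often unseen
    "escalation":       (0.1, 0.3),
    "use of force":     (0.0, 0.2),
}

# Combine the stage-level intervals into an overall range.
low = sum(lo for lo, hi in stage_bounds.values())
high = sum(hi for lo, hi in stage_bounds.values())

print(f"overall discrimination bounded between {low:.1f} and {high:.1f}")
# -> overall discrimination bounded between 0.1 and 1.1
```

The output is a range, not a point estimate, which mirrors the article’s description: the algorithm reports the possible amounts of discrimination consistent with the data gaps, rather than pretending the gaps don’t exist.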

The program operates on a very general principle, Knox says: “What data are you seeing? And what data are you not seeing?”

Thinking of the whole chain of events also indicates how to collect better statistics.

Consider a study on police shootings conducted by Nix and policing expert John Shjarback of Rowan University in Glassboro, N.J., which appeared November 10 in PLOS ONE. The researchers looked at racial disparities in officers’ use of force against black and white civilians. National databases include only shootings that resulted in a civilian’s death. But whether someone lives or dies after being shot depends on several factors, such as proximity to a trauma center, the location of the gunshot wound and access to first aid. The researchers therefore sought to examine all police shootings, including those that resulted in injury but not death. To do this, they relied on records from four states – California, Colorado, Florida and Texas – that have been collecting this information for years.

Bad police data doesn’t need to be thrown away. An algorithm that accounts for data gaps operates on a very general principle: “What data are you seeing? And what data are you not seeing?”

Dean Knox, University of Pennsylvania

The data revealed that some 45 percent of victims suffered nonfatal injuries. Accounting for the relative populations of black and white civilians showed that, in all four states, racial disparities in injuries were greater than racial disparities in deaths. For example, from 2009 to 2014 in Florida, black people were about three times as likely as white people to be fatally shot by police, but more than five times as likely to be shot and injured. Across all four states, and for reasons that are not entirely clear, black victims were 7 percent less likely to die from their injuries than white victims.
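The per-capita comparison behind these disparity figures is simple rate-ratio arithmetic. The counts and populations below are made up for illustration (they are not the Florida figures); they show how a disparity in injuries can exceed the disparity in deaths even when both come from the same set of shootings.

```python
# Hypothetical populations and shooting counts (not the study's data).
black_pop, white_pop = 3_200_000, 12_000_000
deaths = {"black": 96, "white": 120}     # fatal shootings
injuries = {"black": 240, "white": 180}  # nonfatal shootings

def rate_ratio(counts):
    """Per-capita rate for black civilians divided by the white rate."""
    black_rate = counts["black"] / black_pop
    white_rate = counts["white"] / white_pop
    return black_rate / white_rate

print(round(rate_ratio(deaths), 1))    # 3.0: disparity in fatal shootings
print(round(rate_ratio(injuries), 1))  # 5.0: larger disparity in injuries
```

A database limited to deaths would report only the smaller ratio, which is why Nix argues that death-only records understate the disparity in officers’ use of lethal force.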

National databases that include only records of civilians who died at the hands of police underestimate officers’ use of lethal force against black civilians, Nix says. Death “is the end of a very long sequence of events. In our paper, we backed up one link in the chain.” That is, the researchers looked at all cases where officers used lethal force, not just those that ended in death.

Knox is now working with two police departments to break down encounters between police and civilians in greater detail. These departments require officers to turn on their body cameras whenever they believe an interaction with a civilian will rise to the level of an official interaction. (Officers have discretion at this point in the process, Knox acknowledges, so, as with the Chicago study, this first link in the chain remains elusive.) Knox and his team will analyze transcripts of each encounter for language and tone, such as a normal voice or a shout – a quantitative version of the approach Lowrey-Kinberg used to unpack the Encinia–Bland encounter. Computer vision techniques will analyze gestures, such as a drawn weapon. Knox says he hopes the data will help his team move closer to reconstructing entire interactions, including identifying the non-events in a given chain.

“You don’t just want the side of the story as written by an officer,” Knox says. “You want the whole interaction.”
