U.S. Courts Are Using Algorithms Riddled With Racism to Hand Out Sentences
- By Jack Smith IV | Mic
- May 23, 2016
- 1 min read
For years, the criminal justice community has been worried. Courts across the country are assigning bond amounts and sentencing the accused based on algorithms, and both lawyers and data scientists warn that these algorithms could be poisoned by the very prejudices the systems were designed to escape.
Until now, that concern was pure speculation. Now, we know the truth.
An investigation published Monday morning by ProPublica analyzed thousands of the risk scores these algorithms produce and found that the formulas are easier on white defendants, even when race is isolated as a factor.
"The formula was particularly likely to falsely flag black defendants as future criminals, wrongly labeling them this way at almost twice the rate as white defendants," the investigative team wrote.
The algorithms don't take race into account directly; instead, they rely on data that correlates with race and can act as a proxy for it. The Florida algorithm evaluated in the report is based on 137 questions, such as "Was one of your parents ever sent to jail or prison?" and "How many of your friends/acquaintances are taking drugs illegally?"
Those two questions, for example, appear to measure someone's empirical risk of criminality, but in practice they single out people already living under institutionalized poverty and over-policing, who are predominantly people of color.
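To see how a question can work as a proxy even when race is never an input, consider a toy sketch (this is not the COMPAS formula; the weights and answers below are invented for illustration):

```python
# Toy illustration: a score that never sees race can still diverge by race
# when its inputs reflect over-policing and poverty rather than behavior.
# All weights and answers here are invented for the example.

def risk_score(parent_incarcerated: bool, friends_using_drugs: int) -> float:
    """Hypothetical weighted score built only from questionnaire answers."""
    return 2.0 * parent_incarcerated + 0.5 * friends_using_drugs

# Two defendants with identical conduct and identical charges.
# The only difference is exposure to heavily policed, high-poverty
# conditions, which shifts the questionnaire answers themselves.
defendant_a = {"parent_incarcerated": False, "friends_using_drugs": 1}
defendant_b = {"parent_incarcerated": True, "friends_using_drugs": 4}

print(risk_score(**defendant_a))  # 0.5 -> labeled lower risk
print(risk_score(**defendant_b))  # 4.0 -> labeled higher risk
```

In this sketch the score never asks about race, yet it penalizes answers that track where and how heavily someone's community is policed, which is the dynamic the report describes.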
"[Punishment] profiling sends the toxic message that the state considers certain groups of people dangerous based on their identity," University of Michigan law professor Sonja Starr wrote in the New York Times in 2014. "It also confirms the widespread impression that the criminal justice system is rigged against the poor."