
Computerized Criminal Behavior Predictions Are No More Effective Than Untrained Humans: Report

Without effective scrutiny, algorithm-based software could hurt those who are already the most vulnerable.


The effectiveness of the criminal justice system has been debated since its creation. It is difficult to build a uniform system when criminal defendants’ circumstances vary so widely. Thanks to coverage of police shootings, sexual assault cases and self-defense trials over the last few years, the criminal justice system has become interwoven with our daily news of politics, government and pop culture. It doesn’t take long to see that the system operates in favor of those with power and influence while disadvantaging those with a history of systemic vulnerability. It is inescapable, and it is becoming increasingly apparent that the system is flawed.

We had hoped that in the age of technology, we could eradicate bias by putting computer programs in place of our old systems. With algorithm-based systems, we can make faster, less variable predictions about the likelihood of people ending up in the criminal justice system again, known as recidivism. But it has become increasingly apparent that automating the process made things worse, because we have taken old biases and embedded them by teaching them to computers. We hoped machines could provide the fair treatment humans have failed to give criminal defendants and past offenders—but they haven’t. And it turns out, machines may not be any more effective than humans at predicting recidivism.

“People hear words like ‘big data’ and ‘machine learning’ and often assume that these methods are both accurate and unbiased simply because of the amount of data used to build them,” said Julia Dressel, whose senior undergraduate honors thesis in computer science at Dartmouth College is gaining national attention.


Earlier this year, Dressel released a report with Dartmouth computer science professor Hany Farid, titled “The accuracy, fairness, and limits of predicting recidivism,” in the journal Science Advances. The two evaluated the risk assessment software COMPAS—Correctional Offender Management Profiling for Alternative Sanctions—and the results were shocking. Participants were given only seven details about each offender, compared to the 137 features available to COMPAS, yet their predictions were accurate 67 percent of the time, roughly matching the software.

The conclusion: people with no training in criminal justice are making recidivism predictions as accurate as COMPAS’s, with considerably less information.

“We have shown that a commercial software that is widely used to predict recidivism is no more accurate than the predictions of people with no criminal justice expertise who responded to an online survey. We have also shown that the COMPAS prediction algorithm is equivalent to a very simple classifier,” says Dressel.
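For readers curious what a “very simple classifier” looks like in practice, here is a minimal sketch in Python. The file and column names (defendants.csv, age, priors_count, reoffended) are hypothetical stand-ins, not the study’s actual data or code; the point is only that a two-feature logistic regression, the kind of model the paper found comparable to COMPAS, can be built and scored in a few lines.

    # Minimal sketch: a two-feature logistic-regression classifier for recidivism,
    # of the kind Dressel and Farid compared against COMPAS. The input file and
    # column names are hypothetical placeholders, not the study's actual data.
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    df = pd.read_csv("defendants.csv")       # hypothetical dataset
    X = df[["age", "priors_count"]]          # the two most predictive features
    y = df["reoffended"]                     # 1 = rearrested within two years

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=0
    )

    clf = LogisticRegression().fit(X_train, y_train)
    print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))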

Predicting criminal behavior is serious business. It’s critical that we use this and similar research to evaluate ways to improve the programs we are using to make determinations on people’s futures. Without effective scrutiny, algorithm-based software could hurt those who are already the most vulnerable.

Dressel’s research isn’t the first time COMPAS has come under scrutiny. In 2016, ProPublica published an in-depth analysis of the software that found high levels of racial bias. Its findings included black defendants being falsely flagged as likely to reoffend at higher rates, white defendants being wrongly labeled as low risk more often, and black defendants being misclassified as at higher risk for violent offenses.
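To make the nature of those findings concrete, the sketch below shows one common way such disparities are measured: comparing false-positive and false-negative rates across racial groups. The dataframe columns (race, high_risk, reoffended) and the input file are hypothetical placeholders; this is not ProPublica’s actual analysis code.

    # Sketch of a group-wise error-rate audit (hypothetical column names).
    # A "false positive" is a defendant labeled high risk who did not reoffend;
    # a "false negative" is a defendant labeled low risk who did reoffend.
    import pandas as pd

    df = pd.read_csv("scored_defendants.csv")   # hypothetical scored dataset

    for race, group in df.groupby("race"):
        did_not_reoffend = group[group["reoffended"] == 0]
        did_reoffend = group[group["reoffended"] == 1]
        fpr = did_not_reoffend["high_risk"].mean()   # share wrongly flagged
        fnr = 1 - did_reoffend["high_risk"].mean()   # share wrongly cleared
        print(f"{race}: false positive rate={fpr:.2f}, false negative rate={fnr:.2f}")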

“Algorithmic tools sound impressive, so people are quick to assume that a tool’s predictions are inherently superior to human predictions. It’s dangerous if judges assume COMPAS produces accurate predictions, when in reality its accuracy is around 65 percent. Therefore, it’s important to expose when software like COMPAS isn’t performing as we expect. We cannot take any algorithm’s accuracy for granted, especially when the algorithm is being used to make decisions that can have serious consequences in someone’s life,” Dressel continued.

Along with being just as accurate in their predictions, participants were just as biased, particularly when it came to race.

“Our research suggests that the two most predictive criteria of recidivism are age and total number of previous convictions. On a national scale, black people are more likely to have prior crimes on their record than white people are (black people in America are incarcerated in state prisons at a rate that is 5.1 times that of white Americans, for example). Within the dataset used in our study, white defendants had an average of 2.59 prior crimes, whereas black defendants had an average of 4.95 prior crimes. The racial bias that appears in both the algorithmic and human predictions is a result of this discrepancy,” Dressel explained.
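Dressel’s point, that bias can flow into a “race-blind” model through a correlated feature, can be illustrated with a toy simulation. The numbers below are invented for illustration (loosely echoing the 2.59 versus 4.95 average priors she cites) and are not drawn from the study’s data.

    # Toy illustration: a rule that never sees race can still flag one group
    # more often if a feature it relies on (prior convictions) differs by group.
    # All numbers are invented for illustration only.
    import numpy as np

    rng = np.random.default_rng(0)
    priors_group_a = rng.poisson(2.6, 10_000)   # group with fewer average priors
    priors_group_b = rng.poisson(5.0, 10_000)   # group with more average priors

    threshold = 4                               # "high risk" if priors exceed this
    flagged_a = (priors_group_a > threshold).mean()
    flagged_b = (priors_group_b > threshold).mean()

    print(f"share flagged high risk, group A: {flagged_a:.2f}")
    print(f"share flagged high risk, group B: {flagged_b:.2f}")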

False predictions about criminal defendants and recidivism affect people’s lives. The predictions made by COMPAS and similar software are used to determine bail amounts, prison sentences and eligibility for parole. Unfortunately, more often than not, individuals of color, black men in particular, are the ones whose lives are destroyed by these mistakes.

Over the last 30 years, the Sentencing Project has been at the forefront of the discussion around criminal justice reform. Its research spreads awareness of how race shapes experiences within the criminal justice system and of the long-term isolation of people with felony convictions, and it offers suggestions for improving sentencing.

“More than 60 percent of the people in prison today are people of color. Black men are nearly six times as likely to be incarcerated as white men, and Hispanic men are 2.3 times as likely. For black men in their 30s, 1 in every 10 is in prison or jail on any given day,” says Kara Gotsch, the director of strategic initiatives at the Sentencing Project.

Given our nation's history, this information is troubling, yet not surprising. The real conflict comes when we consider the struggles associated with life after prison and its continuing stigma.

“People with criminal records face significant collateral consequences of their conviction, including barriers to voting, employment, housing, and financial public assistance. These barriers complicate the reintegration process after incarceration and likely increase the odds a person will recidivate,” Gotsch explained.

The biases people of color are exposed to when interacting with the criminal justice system are paired with the everyday experiences of personal and systemic racism. Think of it this way—a black man with a college degree has fewer employment opportunities than a white man with a high school diploma. If you add a criminal record, finding gainful employment can be nearly impossible. Our criminal justice system takes the usual oppression and multiplies it, leaving additional barriers to self-sufficiency for black and brown people.

According to research conducted at the Sentencing Project, the problems with our criminal justice system go much deeper than furthering racism. Incarceration is often used as a substitute for addressing addiction and mental illness, issues that would be better handled with personalized treatment programs than with removal from society.

“Public education about the United States’ exceptionalism in its use of incarceration is critical to demonstrate that there is another way to address public safety concerns. The U.S. relies on incarceration and severe punishments to address circumstances that really result from our broken social safety net, like inadequate access to medical and mental health treatment, including drug treatment. Incarceration will not solve these problems, but community investment in services can,” says Gotsch.

If we want to do more to assist individuals who are at risk for committing crimes instead of hiding them away, we must spend more time evaluating how we interact with criminal defendants. Dressel’s honors project, which sheds light on both the flaws of algorithm-based risk assessments and ever-present human implicit bias, is a step in the right direction.

Since the study’s publication, Northpointe, Inc. (now Equivant), which owns the COMPAS software, has posted an “Official Response” on its website alleging that Dressel’s research “actually adds to a growing number of independent studies that have confirmed that COMPAS achieves good predictability and matches the increasingly accepted AUC standard.” However, the company makes no mention of the racial bias and says it will review the materials from the study for accuracy.

Our criminal justice system needs fixing, and following technological shortcuts isn’t the answer. We need to do more as a nation to make sure we aren’t derailing someone’s life for things we could have fixed through early intervention.

The takeaway? According to Dressel, it’s being aware that algorithms are imperfect and that reducing bias will require intentional effort—but there is hope. “Algorithms can have racially biased predictions even if race isn’t a feature of the algorithm. It could be possible to develop algorithms void of racial bias. However, in this field, as research continues it’s important that we assess the algorithms for bias at every step of the way to ensure the tools are performing as we expect,” she concluded.

If companies won’t take responsibility for the racial bias their tools entrench, it’s up to us as a community to bring attention to racial disparities. Both the Sentencing Project’s work and Dressel’s research are steps in the right direction.

Rochaun Meadows-Fernandez's work has appeared in Healthline, Yes! Magazine, HuffPost, Allure, and other publications. Follow her on Twitter or check out her website.