Risk Assessments: Junk Science

Automated risk assessments have been proposed across the country, including as a remedy to individualized magistration in the State of Texas. It was promised that these tools would take the guesswork out of magistration, and proponents of bail reform tout them as the panacea for the ills of the criminal justice system. The reality is that these so-called "evidence-based tools" are turning out to be nothing more than pure junk science.

Over the past several months, there has been a growing consensus that risk assessments are "junk science." Numerous scholarly articles and whitepapers have been released on this very topic, and each of them comes to the same conclusion: risk assessments have not been proven to work in the ways promised. Here are just a few quotes from some of these articles.

“In sum, there is a sore lack of research on the impacts of risk assessment in practice. There is next to no evidence that the adoption of risk assessment has led to dramatic improvements in either incarceration rates or crime without adversely affecting the other margin.”

Stevenson, Professor of Law, George Mason University

“An algorithm cannot take into account factors such as human emotion and need. It is stupid to allow a mathematical equation, the value of which is only as useful as the data it utilizes to operate, to determine the fate of a being that possesses free will. That’s outright absurd, stupid, and dangerous.”

Wang, Procedural Justice and Risk-Assessment Algorithms, Yale University

“In theory, risk assessments had good motivations behind them to reduce biases, but all they do is reproduce those biases in policing and prosecution. And more problematically, they give these instruments the gloss, the promise of not being biased, and they hide the bias that already constituted them.”

Werth, “Theorizing the Performative Effects of Penal Risk Technologies: (Re)producing the Subject Who Must Be Dangerous”, Rice University

In addition to scholars, social justice advocates are now coming out against these algorithms for promoting racial injustice. Just recently, over 100 civil rights groups banded together to oppose the use of risk assessments in the criminal justice system. In their public statement, these groups said the following:

“Pretrial risk assessment instruments are not a panacea for racial bias or inequality. Nor are they race-neutral, because all predictive tools and algorithms operate within the framework of institutions, structures, and a society infected by bias. Those facts weigh heavily against their use.”

Press Release, More than 100 Civil Rights, Digital Justice, and Community-Based Organizations Raise Concerns About Pretrial Risk Assessment.

Other advocacy groups have also publicly stated their distrust of computer algorithms in the criminal justice system.

“Pretrial risk assessment instruments, as they are currently used, cannot safely be assumed to advance reformist goals of reducing incarceration and enhancing the bail system’s fairness.”

Koepke & Robinson, Danger Ahead: Risk Assessment and the Future of Bail Reform, Washington Law Review (Forthcoming)

Most importantly, when it comes to risk assessments, the public is NOT in favor of their use. According to a national survey on the topic, "The public strongly disfavors algorithms as a matter of fairness, policy, and legitimacy."