Risk assessment tools are starting to take root in the criminal justice system. They’re used to make decisions about pretrial release, sentencing, and the level of supervision or custody to which a defendant will be subject. Some of the results are encouraging. For example, Mecklenburg County uses a risk assessment developed by the Laura and John Arnold Foundation to help make pretrial release decisions. The pretrial services office there reports that the risk assessment has contributed to “transformational change” in how pretrial justice is administered, with fewer secured bonds being imposed and the jail population falling, with no harm to public safety. Based in part on Mecklenburg’s success, the North Carolina Commission on the Administration of Law and Justice encouraged the creation of a pilot project that would “implement and assess more broadly . . . an empirically derived pretrial risk assessment tool.”
Risk assessment tools are not without controversy, however. For example, in a 2014 speech, then-Attorney General Eric Holder warned that the use of risk assessments at sentencing “may inadvertently undermine our efforts to ensure individualized and equal justice. By basing sentencing decisions on static factors and immutable characteristics – like the defendant’s education level, socioeconomic background, or neighborhood – they may exacerbate unwarranted and unjust disparities that are already far too common in our criminal justice system and in our society.”
Readers interested in risk assessments may be interested in two recent developments:
- The Supreme Court of New Mexico recently affirmed a trial judge’s order detaining a murder defendant without bail. In the course of its opinion, the court noted that the judge’s decision was supported by the results of “the Arnold Public Safety Assessment (PSA) . . . a nationally recognized scientifically validated risk assessment instrument that courts in an increasing number of jurisdictions use as an aid, though never as the only factor, in making detention and release decisions.” State v. Groves, __ P.3d __, 2018 WL 359473 (N.M. Jan. 11, 2018). The opinion is noteworthy because it reflects courts’ increasing comfort with and reliance on risk assessment tools.
- A new article in the journal Science Advances evaluated the accuracy of COMPAS, a widely used risk assessment tool that uses “137 features about an individual” to assess the likelihood of recidivism. The tool’s overall accuracy in predicting recidivism is 65.2%. This is somewhat better than chance, but the authors found the risk assessment to be “no more accurate or fair than predictions made by people with little or no criminal justice expertise” who were provided with some basic information about the defendants for whom they were making predictions. The researchers also found that “a simple linear predictor provided with only two features,” age and criminal history, was as accurate as the more complex tool.
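For readers curious what a “simple linear predictor provided with only two features” looks like in practice, here is a minimal illustrative sketch. This is not the researchers’ actual code or data: the training examples below are entirely synthetic and invented only to make the example runnable, and the implementation is just a plain logistic regression fit by gradient descent on age and number of prior convictions.

```python
# Illustrative sketch only -- NOT the COMPAS model or the Science
# Advances authors' code. A two-feature linear (logistic) predictor
# of the kind described in the article: age plus prior convictions.
import math

def sigmoid(z):
    # Clip to avoid math.exp overflow on extreme scores.
    z = max(min(z, 30.0), -30.0)
    return 1.0 / (1.0 + math.exp(-z))

def feats(age, priors):
    # Roughly center/scale age so gradient descent behaves.
    return ((age - 35) / 20.0, float(priors))

def train_logistic(data, lr=0.1, epochs=2000):
    """Fit w0 + w1*age_scaled + w2*priors by stochastic gradient descent."""
    w = [0.0, 0.0, 0.0]
    for _ in range(epochs):
        for age, priors, label in data:
            x1, x2 = feats(age, priors)
            p = sigmoid(w[0] + w[1] * x1 + w[2] * x2)
            err = label - p
            w[0] += lr * err
            w[1] += lr * err * x1
            w[2] += lr * err * x2
    return w

def predict(w, age, priors):
    x1, x2 = feats(age, priors)
    return 1 if sigmoid(w[0] + w[1] * x1 + w[2] * x2) >= 0.5 else 0

# Synthetic, hypothetical records: (age, prior convictions, reoffended?)
train = [(19, 4, 1), (22, 3, 1), (24, 5, 1), (45, 0, 0),
         (52, 1, 0), (38, 0, 0), (21, 2, 1), (60, 0, 0)]
w = train_logistic(train)
```

The point of the sketch is not the particular numbers but the structure: the entire “model” is three learned weights over two inputs, which is what makes the finding that such a predictor matched a 137-feature commercial tool so striking.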
Obviously, this new research doesn’t establish that all risk assessments are inaccurate, much less biased in the way that Attorney General Holder suggested. But it appears that some risk assessment tools work better than others, and that determining whether a particular tool works well requires serious scientific scrutiny, which can be facilitated by transparency about how the tool works. I remain cautiously optimistic about risk assessments as a source of information to support judicial decision-making. But I’m interested in others’ thoughts and experiences, especially regarding how things are working in Mecklenburg County or in other jurisdictions where risk assessments are in use.