
AI sentencing tools need to be closely scrutinized

Date:
June 9, 2020
Source:
University of Surrey
Summary:
Judges should closely vet the AI tools they use to help them predict whether a defendant is likely to reoffend, urges a new study.

In a paper published by the Behavioral Sciences & Law journal, experts from the University of Surrey take a critical look at the growing use of algorithmic risk assessment tools, which act as a form of expert scientific evidence in a growing number of criminal cases.

The review argues that because of several issues, such as the biases of the developers and weak statistical evidence of the AI's predictive performance, judges should act as gatekeepers and closely scrutinise whether such tools should be used at all.

The paper outlines three steps that judges should consider:

  • Fitness: whether using the AI tool is relevant to the case at all
  • Accuracy: whether the tool can truly distinguish between reoffenders and non-reoffenders
  • Reliability: whether the tool's outcomes are trustworthy in practice. This step would not be required if the judge found the AI lacking in either of the first two steps.

Dr Melissa Hamilton, author of the paper and Reader in Law and Criminal Justice at the University of Surrey's School of Law, said: "These emerging AI tools have the potential to offer benefits to judges in sentencing, but close attention needs to be paid to whether they are trustworthy. If used carelessly these tools will do a disservice to the defendants on the receiving end."


Story Source:

Materials provided by University of Surrey. Note: Content may be edited for style and length.


Journal Reference:

  1. Melissa Hamilton. Judicial gatekeeping on scientific validity with risk assessment tools. Behavioral Sciences & the Law, 2020; 38 (3): 226 DOI: 10.1002/bsl.2456

Cite This Page:

University of Surrey. "AI sentencing tools need to be closely scrutinized." ScienceDaily. ScienceDaily, 9 June 2020. <www.sciencedaily.com/releases/2020/06/200609130012.htm>.
