Sentenced by Algorithm

Submitted by Style Pass, 2021-05-25

When Machines Can Be Judge, Jury, and Executioner: Justice in the Age of Artificial Intelligence

Is it fair for a judge to increase a defendant’s prison time on the basis of an algorithmic score that predicts the likelihood that he will commit future crimes? Many states now say yes, even when the algorithms they use for this purpose have a high error rate, a secret design, and a demonstrable racial bias. The former federal judge Katherine Forrest, in her short but incisive When Machines Can Be Judge, Jury, and Executioner, says this is both unfair and irrational.1

One might think that the very notion of a defendant having his prison time determined not just by the crime of which he was convicted, but also by a prediction that he will commit other crimes in the future, would be troubling on its face. Such “incapacitation”—depriving the defendant of the capacity to commit future crimes—is usually defended on the grounds that it protects the public and is justifiable as long as the sentence is still within the limits set by the legislature for the crime. But the reality is that the defendant is receiving enhanced punishment for crimes he hasn’t committed, and that seems wrong.

Nonetheless, Congress and state legislatures have long treated incapacitation as a legitimate goal of sentencing. For example, the primary federal statute setting forth the “factors to be considered in imposing a sentence” (18 U.S.C. sec. 3553, enacted in 1984) provides, among other things, that “the court, in determining the particular sentence to be imposed, shall consider…the need for the sentence imposed…to protect the public from further crimes of the defendant.”
