Can algorithms help judges make fair decisions?


WHYY on 02/20/20 by Alan Yu

When judges impose sentences, they consider the crime and how likely the offender is to offend again.

Lori Dumas has been a judge in family and criminal courts in Philadelphia for more than a decade.

She said she knows people end up in front of her because they did something. But she also considers that a criminal record resulting from her ruling will likely affect the rest of their lives: whether they can get a place to live, what kind of job they can get.

“Sometimes, people think it’s easy what we do. It really isn’t,” she said. “Because we are responsible for lives.”

She thinks about someone’s childhood, work history, mental health, whether they show remorse. And she also looks ahead: What is going to happen to this person six months, a year from now, because of what she decides?

Dumas follows the law first, but within that, she has quite a bit of discretion.

If she thinks she needs more information about a case, she can ask for it: school records, child welfare records, or a psychological evaluation.

“People think that all judges should be able to look at the law and look at a fact situation and rule the same way,” Dumas said. “The reality is, is that you can have one particular set of facts heard by five different judges and get five different results. And that’s based on the fact that we are people before we come to the bench.”

For example, she is a parent. Her point of view in juvenile court will probably be different from that of a judge who does not have children.

“Our own background, I would say, is probably 95% of the reason why what you may hear in one courtroom is totally different than what you may hear in another.”

Ten years ago, Pennsylvania’s legislature decided there was a problem with how all this worked.

“It was totally arbitrary,” said Todd Stephens, a member of the state House of Representatives who chairs the Pennsylvania Commission on Sentencing.

“Judges were just on their own deciding, ‘Well, jeez, this defendant in front of me, it seems like I might need to take a closer look and get more information…’ But there was no evidence-based or data-driven objective standard for requesting additional information: It was done on a whim, every judge had their own internal set of criteria that they might use on whether or not to get more information about a particular offender.”

So in 2010, the state panel worked on an algorithm, a formula, that would allow a computer to predict how likely a person was to commit another crime and recommend when judges should get more information about a case. The goal was to make sentencing more consistent, reduce prison populations, and lead to less crime.

Mark Bergstrom, executive director of the Commission on Sentencing, said compared to judges, an algorithm can process lots of data. “When we started our project, we didn’t look at a handful of cases, we looked at over 200,000 cases to try to see what factors sort of related to positive and negative outcomes. And that’s information that judges didn’t have or didn’t have in a … structured … way.”

The formula will look for patterns based on age, gender, what crime someone is being convicted of, prior convictions and for which crimes, and whether the offender has a juvenile record. It cannot take race into account, or county, which is seen as a proxy for race.
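The article describes the formula only at this level of detail. As a rough illustration of how such a factor-based score might work, here is a minimal sketch in Python; the factor names, weights, and threshold below are invented for illustration and are not the Commission's actual model:

```python
# Hypothetical sketch of a factor-based risk score with barred factors.
# All names, weights, and the threshold are invented for illustration;
# the Pennsylvania Commission on Sentencing's actual formula is not public here.

ALLOWED_FACTORS = {"age", "gender", "offense_gravity",
                   "prior_convictions", "juvenile_record"}
EXCLUDED_FACTORS = {"race", "county"}  # barred outright, or as a proxy for race

# Invented weights: each allowed factor contributes linearly to the score.
WEIGHTS = {
    "age": -0.03,            # older offenders score slightly lower
    "gender": 0.5,           # e.g. 1 for male, 0 otherwise
    "offense_gravity": 0.4,  # severity of the current offense
    "prior_convictions": 0.6,
    "juvenile_record": 0.3,  # 1 if a juvenile record exists
}

def risk_score(case: dict) -> float:
    """Weighted sum over allowed factors; barred factors are rejected."""
    for factor in case:
        if factor in EXCLUDED_FACTORS:
            raise ValueError(f"factor {factor!r} may not be used")
        if factor not in ALLOWED_FACTORS:
            raise ValueError(f"unknown factor {factor!r}")
    return sum(WEIGHTS[f] * value for f, value in case.items())

def needs_more_information(case: dict, threshold: float = 1.5) -> bool:
    """Recommend that the judge request additional pre-sentence information."""
    return risk_score(case) >= threshold
```

The key design point the article highlights is the exclusion list: the function refuses to score a case at all if a barred factor such as race or county appears in the input, rather than silently ignoring it.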

The judge will still make the ultimate decision on sentencing. The algorithm will be rolled out this year, and evaluated after 12 months.
