When Government Rules by Software, Citizens Are Left in the Dark

In July, San Francisco Superior Court Judge Sharon Reardon considered whether to hold Lamonte Mims, a 19-year-old accused of violating his probation, in jail. One piece of evidence before her: the output of algorithms known as PSA that scored the risk that Mims, who had previously been convicted of burglary, would commit a violent crime or skip court. Based on that result, another algorithm recommended that Mims could safely be released, and Reardon let him go. Five days later, police say, he robbed and murdered a 71-year-old man.

On Monday, the San Francisco District Attorney’s Office said staffers using the tool had erroneously failed to enter Mims’ prior jail term. Had they done so, PSA would have recommended he be held, not released.

Mims’ case highlights how governments increasingly rely on mathematical formulas to inform decisions about criminal justice, child welfare, education and other arenas. Yet it’s often hard or impossible for citizens to see how these algorithms work and are being used.

San Francisco Superior Court began using PSA in 2016, after getting the tool for free from the Laura and John Arnold Foundation, a Texas nonprofit that works on criminal-justice reform. The initiative was intended to prevent poor people unable to afford bail from needlessly lingering in jail. But a memorandum of understanding with the foundation bars the court from disclosing “any information about the Tool, including any information about the development, operation and presentation of the Tool.”

The agreement was unearthed in December by two law professors, who, in a paper released this month, document a widespread transparency problem with state and municipal use of predictive algorithms. Robert Brauneis, of George Washington University, and Ellen Goodman, of Rutgers University, filed 42 open-records requests in 23 states seeking information about PSA and five other tools used by governments. They didn’t get much of what they asked for.

Many governments said they had no relevant records about the programs. Taken at face value, that would mean those agencies did not document how they chose, or how they use, the tools. Others said contracts prevented them from releasing some or all information. Goodman says this shows governments are neglecting to stand up for their own, and citizens’, interests. “You can really see who held the pen in the contracting process,” she says.

The Arnold Foundation says it no longer requires confidentiality from municipal officials, and is happy to amend existing agreements to allow officials to disclose information about PSA and how they use it. But a representative of San Francisco Superior Court said its contract with the foundation has not been updated to remove the gag clause.

Goodman and Brauneis ran their records-request marathon to add empirical fuel to a debate about the widening use of predictive algorithms in government decision-making. In 2016, an investigation by ProPublica found that a system used in sentencing and bail decisions was biased against black people. Scholars have warned for years that public policy could become hidden behind the shroud of trade secrets, or delegated to technical processes divorced from the usual policy-making process.

The scant results from nearly a year of filing and following up on requests suggest those fears are well-grounded. But Goodman says the study has also helped convince her that governments could be more open about their use of algorithms, which she says have clear potential to make government more efficient and equitable.

Some scholars and activists want governments to reveal the code behind their algorithms, a tough ask because they are often commercial products. Goodman thinks it’s more urgent that the public knows how an algorithm was chosen, developed, and tested—for example, how often it produces false positives and false negatives. That’s no break from the past, she argues, because citizens have always been able to ask for information about how new policy was devised and implemented. “Governments have not made the shift to understanding this is policy making,” she says. “The concern is that public policy is being pushed into a realm where it’s not accessible.”

For Goodman’s hopes to be met, governments will have to stand up to the developers of predictive algorithms and software. Goodman and Brauneis sought information from 16 local courts that use PSA. They received at least some documents from five; four of those, including San Francisco, said their agreement with the Arnold Foundation prevented them from discussing the tool and its use.

Some things are known about PSA. The Arnold Foundation has made public the formulas at the heart of its tool, and the factors it considers, including a person’s age, criminal history and whether they have failed to appear for prior court hearings. It says researchers used data from nearly 750,000 cases to design the tool. After PSA was adopted in Lucas County, Ohio, the Arnold Foundation says, crimes committed by people awaiting trial fell, even as more defendants were released without having to post bail.
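To give a rough sense of how a points-based pretrial tool of this kind can work, here is a minimal sketch in Python. The factor names, weights, and thresholds below are illustrative assumptions only; they are not the Arnold Foundation’s published PSA formula or score scale.

```python
# Hypothetical sketch of a points-based pretrial risk score.
# Factor names, weights, and thresholds are illustrative assumptions,
# NOT the Arnold Foundation's published PSA formula.

from dataclasses import dataclass


@dataclass
class Defendant:
    age: int
    prior_convictions: int          # assumed count of prior convictions
    prior_violent_convictions: int  # assumed count of prior violent convictions
    prior_failures_to_appear: int   # assumed count of missed court dates


def risk_points(d: Defendant) -> int:
    """Sum weighted points across factors; a higher total means higher assessed risk."""
    points = 0
    if d.age < 23:                                  # younger age weighted higher (assumption)
        points += 2
    points += min(d.prior_convictions, 3)           # cap each factor's contribution
    points += 2 * min(d.prior_violent_convictions, 2)
    points += 2 * min(d.prior_failures_to_appear, 2)
    return points


def release_recommendation(points: int) -> str:
    """Map the raw score to a coarse recommendation (thresholds are invented)."""
    if points <= 2:
        return "release"
    if points <= 5:
        return "release with supervision"
    return "detain pending hearing"


if __name__ == "__main__":
    d = Defendant(age=19, prior_convictions=1,
                  prior_violent_convictions=0, prior_failures_to_appear=1)
    score = risk_points(d)
    print(score, release_recommendation(score))
```

Even in a toy model like this, a missing input, such as an unrecorded prior jail term, simply lowers the score, which is why a data-entry error like the one reported in the Mims case can flip the recommendation from detain to release.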

Goodman argues the foundation should disclose more information about its dataset and how it was analyzed to design PSA, as well as the results of any validation tests performed to tune the risk scores it assigns people. That information would help governments and citizens understand PSA’s strengths and weaknesses, and compare it with competing pretrial risk-assessment software. The foundation didn’t answer a direct request for that information from the researchers this March. Moreover, some governments now using PSA have agreed not to disclose details about how they use it.

An Arnold Foundation spokeswoman says it is assembling a dataset for release that will allow outside researchers to evaluate its tool. She says the foundation initially required confidentiality from jurisdictions to inhibit governments or rivals from using or copying the tool without permission.

Goodman and Brauneis also queried 11 police departments that use PredPol, commercial software that predicts where crime is likely to occur and can be used to plan patrols. Only three responded. None revealed the algorithm PredPol uses to make predictions, or anything about the process used to create and validate it. PredPol is marketed by a company of the same name, and originated in a collaboration between the Los Angeles Police Department and the University of California, Los Angeles. The company did not respond to a request for comment.

Some municipalities were more forthcoming. Allegheny County, Pennsylvania, for example, produced a report describing the development and testing of an algorithm that helps child-welfare workers decide whether to formally investigate new reports of child maltreatment. The county’s Department of Human Services had commissioned the tool from the Auckland University of Technology, in New Zealand. Illinois specifies that information about its contracts for a tool that tries to predict when children may be injured or killed will be public unless prohibited by law.

Most governments the professors queried didn’t appear to have the expertise to properly consider or answer questions about the predictive algorithms they use. “I was left feeling quite sympathetic to municipalities,” Goodman says. “We’re expecting them to do a whole lot they don’t have the wherewithal to do.”

Danielle Citron, a law professor at the University of Maryland, says that pressure from state attorneys general, court cases, and even legislation will be necessary to change how local governments think about, and use, such algorithms. “Part of it has to come from law,” she says. “Ethics and best practices never gets us over the line because the incentives just aren’t there.”

Researchers believe predictive algorithms are growing more prevalent—and more complex. “I think that probably makes things harder,” says Goodman.

This article was published by Wired.

