When Government Rules by Software, Citizens Are Left in the Dark

Tom Simonite | Wired | August 17, 2017

In July, San Francisco Superior Court Judge Sharon Reardon considered whether to hold Lamonte Mims, a 19-year-old accused of violating his probation, in jail. One piece of evidence before her: the output of an algorithm known as PSA that scored the risk that Mims, who had previously been convicted of burglary, would commit a violent crime or skip court. Based on that result, another algorithm recommended that Mims could safely be released, and Reardon let him go. Five days later, police say, he robbed and murdered a 71-year-old man.

On Monday, the San Francisco District Attorney’s Office said staffers using the tool had erroneously failed to enter Mims’ prior jail term. Had they done so, PSA would have recommended he be held, not released. Mims’ case highlights how governments increasingly rely on mathematical formulas to inform decisions about criminal justice, child welfare, education and other arenas. Yet it’s often hard or impossible for citizens to see how these algorithms work and are being used.

San Francisco Superior Court began using PSA in 2016, after getting the tool for free from the Laura and John Arnold Foundation, a Texas nonprofit that works on criminal-justice reform. The initiative was intended to prevent poor people unable to afford bail from needlessly lingering in jail. But a memorandum of understanding with the foundation bars the court from disclosing “any information about the Tool, including any information about the development, operation and presentation of the Tool.”...