In criminal proceedings, courts are increasingly relying on automated decisionmaking tools that purport to measure the likelihood that a defendant will reoffend. But these technologies carry considerable risk: when trained on datasets or features that incorporate bias, criminal legal algorithms threaten to replicate discriminatory outcomes and produce overly punitive bail, sentencing, and incarceration decisions. Because regulators have failed...