Cash bail has turned American jails into debtors' prisons, where the wealthy go about their business while awaiting trial and the poor languish for months or even years because they can't make bail (in practice, they generally plead guilty, regardless of whether they're innocent).
There has been a lot of energy and research devoted to replacing cash bail with algorithmic risk assessments. These could, in principle, administer justice impartially to rich and poor alike, but so far they've been a racist hot mess: the inequities of the existing system are used as training data to produce machine learning models that enshrine that systemic racism behind an unassailable, empirical veneer. "You aren't in jail because of your skin color, you're in jail because of math."
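To make the "bias in, bias out" point concrete, here's a minimal, purely hypothetical sketch in Python (not any vendor's actual model): a toy classifier trained on arrest records from a world where one group is policed twice as heavily ends up assigning that group a much higher "risk score," even though both groups reoffend at exactly the same rate.

```python
# Toy illustration of "bias in, bias out" (invented numbers, not a real tool):
# a model trained on biased *records* learns the policing, not the behavior.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Two equally sized groups with the *same* true reoffense rate (20%).
group = rng.integers(0, 2, n)          # 0 or 1; in practice a proxy like zip code
true_reoffense = rng.random(n) < 0.20

# Recorded outcomes are skewed: group 1 is over-policed (extra recorded events),
# group 0 is under-policed (half its real events never make it into the data).
recorded = np.where(
    group == 1,
    true_reoffense | (rng.random(n) < 0.10),   # over-recorded
    true_reoffense & (rng.random(n) < 0.50),   # under-recorded
)

# Train on the recorded outcomes -- the only data the system ever sees.
X = group.reshape(-1, 1).astype(float)
model = LogisticRegression().fit(X, recorded)

risk = model.predict_proba(X)[:, 1]
print(f"mean predicted risk, group 0: {risk[group == 0].mean():.2f}")
print(f"mean predicted risk, group 1: {risk[group == 1].mean():.2f}")
# Despite identical true rates, group 1 scores far higher: the model has
# faithfully reproduced the bias baked into the records.
```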
Starting in October 2019, California courts will substitute algorithmic risk assessments for cash bail. The system builds in some auditing and other safeguards, but stops short of requiring full transparency in training data and model design, meaning we probably won't know whether it's fair until 2023, when the first full audit is scheduled. In the meantime, you could spend four years in jail waiting to find out.
Rashida Richardson, policy director for AI Now, a nonprofit think tank dedicated to studying the societal impact of AI, tells Quartz that she’s skeptical that this system will be less biased than its predecessors. “A lot of these criminal justice algorithmic-based systems are relying on data collected through the criminal justice system. Police officers, probation officers, judges, none of the actors in the criminal justice system are data scientists,” Richardson said. “So you have data collection that’s flawed with a lot of the same biases as the criminal justice system.”
But there has been some movement towards openness in the data and factors used: Jayadev says that activist pushback has forced some companies to start disclosing which factors are considered when calculating risk scores—like a suspect’s previous criminal history and the severity of the crimes alleged—but not how much each weighs into the final risk score. When some risk assessment algorithms weigh more than 100 factors (pdf), a lack of transparency makes it difficult to tell whether the factors an algorithm considers influential correlate with race, gender, or any other protected demographic.
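To see why disclosing the factors without the weights tells you so little, here's a toy example (the factor names and weights are invented for illustration and aren't drawn from any real assessment tool): in a simple linear score, one heavily weighted, race-correlated proxy like prior arrests can quietly dominate the result.

```python
# Hypothetical weighted risk score: the factors are "disclosed," the weights aren't.
FACTORS = {
    "prior_convictions": 2,
    "prior_arrests":     5,   # undisclosed heavy weight on an over-policing proxy
    "charge_severity":   3,
    "age_under_25":      1,
}

def risk_score(person: dict) -> int:
    """Sum each factor's value times its (undisclosed) weight."""
    return sum(weight * person.get(factor, 0) for factor, weight in FACTORS.items())

# Two defendants with identical conduct; one lives in an over-policed neighborhood
# and so has more *recorded* arrests for the same behavior.
a = {"prior_convictions": 1, "prior_arrests": 1, "charge_severity": 2}
b = {"prior_convictions": 1, "prior_arrests": 4, "charge_severity": 2}
print(risk_score(a), risk_score(b))   # 13 vs 28: the proxy, not the conduct, decides
```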
California just replaced cash bail with algorithms [Dave Gershgorn/Quartz]
(Thanks, Zenkat!)
(Image: Cryteria, CC-BY)