Second-wave algorithmic accountability: from "What should algorithms do?" to "Should we use an algorithm?"


For a decade, activists and theorists have been developing a critique of "algorithms" (which have gone by numerous names over the same period, e.g. "filter bubbles"). The early critiques focused on the ways these systems can misfire, with dreadful (or sometimes humorous) consequences: discrimination in which employment and financial ads get served, or "dark patterns" that "maximized engagement" with services that occupied your attention but didn't bring you pleasure.

Today, a new wave of critiques is emerging, one that doesn't merely ask "What are the problems with how this algorithm does its job?" but also asks, "Should an algorithm do this job?"


The canonical example of this is bias in facial recognition: it's well understood that facial recognition tools perform worse when asked to identify women and people with darker skin, a failure plausibly attributed to the tools' mostly white, male developers, who trained them on people who looked like themselves.


The first-order critique of this is "Garbage In, Garbage Out": the lack of representation and the bias in tech hiring ripple out beyond the workplace and into the products, reproducing discrimination everywhere those products land.
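The "Garbage In, Garbage Out" point is easy to demonstrate. Here's a minimal, hypothetical sketch (mine, not from the article, and not any real facial-recognition system): a classifier trained on a sample in which one group is badly underrepresented ends up with a much higher error rate for that group, even though both groups are equally easy to model in isolation.

```python
# Toy simulation of "Garbage In, Garbage Out": a model trained on
# skewed data performs worse on the underrepresented group.
# Groups, features, and numbers here are all hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, shift):
    # Each group's features cluster in a different region of feature
    # space, but the label rule is equally learnable for either group
    # on its own.
    X = rng.normal(loc=shift, scale=1.0, size=(n, 5))
    y = (X.sum(axis=1) - 5 * shift + rng.normal(size=n) > 0).astype(int)
    return X, y

# "Garbage in": 95% of the training examples come from group A.
Xa, ya = make_group(1900, shift=0.0)
Xb, yb = make_group(100, shift=1.5)
model = LogisticRegression(max_iter=1000).fit(
    np.vstack([Xa, Xb]), np.hstack([ya, yb])
)

# "Garbage out": on fresh, balanced test samples, the error rate for
# the underrepresented group is far higher.
for name, shift in [("group A", 0.0), ("group B", 1.5)]:
    X_test, y_test = make_group(5000, shift)
    print(f"{name} error rate: {1 - model.score(X_test, y_test):.2f}")
```

Note that the failure isn't in the learning algorithm itself; it is inherited from the skewed sample, which is why "fix the dataset" became the canonical first-wave remedy.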


But the second-order critique is more nuanced: "Given that a major application for facial recognition is totalitarian surveillance and control, maybe we should be thinking about limiting facial recognition altogether, rather than ensuring that it is equally good at destroying the lives of women and brown people."

Cindy Cohn articulated this point especially well at last year's launch party for the EFF/McSweeney's book on privacy. She went on to point out that the Chinese state overcame its problems with algorithmic bias in facial recognition for people of African descent by convincing an African nation to share its driver's license database with Chinese researchers, who used it as training data. The result: black people in China can be identified and rounded up just as well as Chinese people can. This is hardly a victory for human rights!


The second wave of algorithmic accountability is well summarized in The Seductive Diversion of ‘Solving’ Bias in Artificial Intelligence, an essay by Julia Powles (previously) and Helen Nissenbaum, who ask: "Which systems really deserve to be built? Which problems most need to be tackled? Who is best placed to build them? And who decides?"

Frank Pasquale (previously) predicts that this second-wave critique will be especially pointed when it comes to finance ("fintech"): "we should also ask larger questions about when 'financial inclusion' can be predatory, creepy (as in 24/7 surveillance), or subordinating (as in at least one Indian fintech app, which reduces the scores of those who are engaged in political activity). What happens when fintech enables a form of 'perpetual debt'?"


At present, the first and second waves of algorithmic accountability are largely complementary. First wavers have identified and corrected clear problems in AI, and have raised public awareness of its biases and limits. Second wavers have helped slow down AI and robotics deployment enough so that first wavers have more time and space to deploy constructive reforms. There may well be clashes in the future among those who want to mend, and those at least open to ending or limiting, the computational evaluation of persons. For example: those committed to reducing the error rates of facial recognition systems for minorities may want to add more minorities’ faces to such databases, while those who find facial recognition oppressive will resist that “reform” as yet another form of predatory inclusion. But for now, both waves of algorithmic accountability appear to me to share a common aim: making sociotechnical systems more responsive to marginalized communities.

The Second Wave of Algorithmic Accountability [Frank Pasquale/Law and Political Economy]


(via Naked Capitalism)