"Affordances": a new science fiction story that climbs the terrible technology adoption curve

“Affordances” is my new science fiction story for Slate/ASU’s Future Tense project; it’s a tale exploring my theory of “the shitty technology adoption curve,” in which terrible technological ideas are first imposed on poor and powerless people, and then refined and normalized until they are spread over all the rest of us.

The story makes the point by exploring all the people in a facial recognition ecosystem, from low-waged climate refugees who are paid to monitor facial recognition errors in an overseas boiler room, to cops whose facial recognition systems and risk-assessment scoring institutionalize algorithmic racism, to activists whose videos of human rights abuses on the US border are disappeared by copyright enforcement bots deployed by shadowy astroturf organizations, to the executives at the companies that make the facial recognition tools, whose decisions are constrained by automated high-speed trading bots.

It also explores methods of technological resistance, solidarity, and activism, and how the flip-side of automated systems’ inaccuracy is their fragility.

The story is accompanied by a response essay by Nettrice Gaskins (previously), “an artist-educator who collaborates with AI,” who discusses it in the context of the “Afrocentric counter-surveillance aesthetic,” which is my new all-time favorite phrase.

There were different kinds of anxiety: the anxiety she’d felt when she was recording the people massing for their rush, clammy under the thermal blanket with its layer of retroreflective paint that would confound drones and cameras; she walked among the people, their faces shining, their few things on their backs, their children in their arms, the smell of too many bodies and too much fear. Then there was the anxiety she felt as she retreated to her perch, where she had her long lenses, each attached to its own phone, all recording as the rush formed up, the blankets rustling and rippling and then ripping as bodies burst forth, right into the gas, into the rubber bullets, into the armored bodies that raised their truncheons and swung and swung and swung, while the klaxons blared and the drones took to the sky.

There was the anxiety she felt when the skirmish ended and she trained her lenses on the bodies sprawled on the concrete, the toys and bags that had been dropped, the child holding her mother’s limp hand and wailing.

But now came a different kind of anxiety as she edited her footage down, mixing it and captioning it, being careful to blur the faces, but being even more careful to avoid any of the anti-extremism trigger-words: migration, violence, race, racism—words that white nationalists used, but also words that were impossible to avoid when discussing the victims of white nationalism. Advertisers hated them, and algorithms couldn’t tell the difference between people being racist and people complaining about racism. There were new euphemisms every week, and new blacklists, too. In theory, she could just hit publish and when the filter blocked her, she could get in line behind tens of millions of other people whose videos had been misclassified by the bots. But she didn’t want to wait 10 months for her video to be sprung from content jail; she wanted it to go viral now.

Affordances [Cory Doctorow/Slate/ASU/Future Tense]
