
Moral economy and software development: software without politics is a recipe for totalitarianism

Maciej Cegłowski (previously) keynoted the Society for the Advancement of Socio-Economics conference with a characteristically brilliant speech about the “moral economy of tech” — that is, the way that treating social problems like software problems allows techies to absolve themselves of the moral consequences of their actions and the harms that result.


Software development involves “working with deterministic systems that have been designed by other human beings” to find the “clever twist” that gives us the “feeling of competence, control and delight.” That feeling leads us to believe that we can solve social problems in the same way, a feeling that’s exacerbated by the programmer’s urge to “seek maximal and global solutions” that “fix the general problem for everybody, and for all time.”

But a quick glance around the ground zero of technological solutions for social problems — the Bay Area — shows “a residential theme park for tech workers, surrounded by areas of poverty and misery that have seen no benefit and ample harm from our presence.” What’s more, the programmer’s excuse that we can’t be held responsible for the way our tools are used absolves us of any responsibility for this situation, while inviting us to double down on the tactics that created it.

The crowning achievement of this insanity is surveillance capitalism, which rewards those who “try flashy things that … capture the speculators’ imagination” and their money, in order to collect more and more data from the general population, though the “actual value of the data collected is not clear.” Thus tech is locked in an arms race with humans, who “develop an immune response to new forms of tracking and manipulation,” prompting tech to find ever-more insidious ways of collecting data from us.

Far from being a harmless folly or an irrational economic bubble, surveillance capitalism is creating the conditions for some genuinely terrifying outcomes, as states draw on the massive troves of data to control their populations — thus containing the social instability that comes from otherwise unsustainable levels of wealth inequality and immiseration.

I’ve often heard people on the government side — cops, spooks, bureaucrats, politicians — say that NSA surveillance doesn’t scare them as much as Google and Facebook surveillance. But the NSA is only able to spy on us cost effectively because of surveillance capitalism: the data-collection activities of the private sector are the intake feed for the state. It’s naive to expect that the state will place effective curbs on corporate surveillance at the same time as it is relying on that data as the front-line of its border patrols, criminal justice system and other core functions.

Any data we collect will probably leak. Any data we retain will definitely leak. The private data breaches we’ve experienced to date — and the mass-scale criminal attacks on their subjects — were the tremors, not the quake.

One of the candidates running for President this year has promised to deport eleven million undocumented immigrants living in the United States, as well as block Muslims from entering the country altogether. Try to imagine this policy enacted using the tools of modern technology. The FBI would subpoena Facebook for information on every user born abroad. Email and phone conversations would be monitored to check for the use of Arabic or Spanish, and sentiment analysis applied to see if the participants sounded “nervous”. Social networks, phone metadata, and cell phone tracking would lead police to nests of hiding immigrants.

We could do a really good job deporting people if we put our minds to it.

Or consider the other candidate running for President, the one we consider the sane alternative, who has been a longtime promoter of a system of extrajudicial murder that uses blanket surveillance of cell phone traffic, email, and social media to create lists of people to be tracked and killed with autonomous aircraft. The system presumably includes points of human control (we don’t know because it’s secret), but there’s no reason in principle it could not be automated. Get into the wrong person’s car in Yemen, and you lose your life.

That this toolchain for eliminating enemies of the state is only allowed to operate in poor, remote places is a comfort to those of us who live elsewhere, but you can imagine scenarios where a mass panic would broaden its scope.

Or imagine what the British surveillance state, already the worst in Europe, is going to look like in two years, when it’s no longer bound by the protections of European law, and economic crisis has driven the country further into xenophobia.

Or take an example from my home country, Poland. Abortion has been illegal in Poland for some time, but the governing party wants to tighten restrictions on abortion by investigating every miscarriage as a potential crime. Women will basically be murder suspects if they lose their baby. Imagine government agents combing your Twitter account, fitness tracker logs, credit card receipts and private communications for signs of potential pregnancy, with the results reported to the police to proactively protect your unborn baby.

The Moral Economy of Tech [Maciej Cegłowski/Idlewords]

(via Four Short Links)

(Image: Banksy, One Nation Under CCTV, Oogiboig, CC-BY-SA)
