
Trump is an object lesson in the problems of machine learning

Trump’s algorithm is to say semi-random things until his crowd roars its approval, then iteratively modify those statements, seeking more and more approval, until he maxes out and tries a new tack.

This is one of the core strategies of machine learning: random-walk to find a promising path, hill-climb to optimize it, then de-optimize to make sure you haven’t plateaued at a local maximum. (Think of how an ant tries various directions to find a food source, then lays down a chemical trail that other ants reinforce as they follow it, while some diverge randomly so that other, richer or closer food sources aren’t bypassed.)
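
Here’s a minimal sketch of that loop in Python. The `noisy_applause` scoring function and every number in it are invented for the example: greedy hill-climbing on a noisy score, with random restarts playing the role of the ants that wander off the trail.

```python
import random

def noisy_applause(x):
    # Toy "crowd approval" score with two peaks: a local maximum
    # near x = 2 (height ~10) and a global maximum near x = 8 (height ~14).
    return max(10 - (x - 2) ** 2, 14 - (x - 8) ** 2) + random.gauss(0, 0.1)

def hill_climb(start, steps=200, step_size=0.5):
    # Greedy local search: keep any small random move the crowd rewards.
    x, best = start, noisy_applause(start)
    for _ in range(steps):
        candidate = x + random.uniform(-step_size, step_size)
        score = noisy_applause(candidate)
        if score > best:
            x, best = candidate, score
    return x, best

# Random restarts are the de-optimizing step: they keep the search
# from settling permanently on the first peak it happens to find.
best_x, best_score = max(
    (hill_climb(random.uniform(0, 10)) for _ in range(10)),
    key=lambda result: result[1],
)
print(f"best position {best_x:.2f}, approval {best_score:.2f}")
```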

It also betrays one of the core problems with machine learning: bias in the sample set. The people Trump relies on for his success feedback are the people who show up at Trump rallies, the most extreme, least representative slice of potential Trump voters. The more Trump optimizes for this limited group, the more he de-optimizes for the rest of the world.
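
A toy illustration of that sampling problem (the `approval` function, the one-dimensional “positions,” and all the numbers here are made up): tune a message against a self-selected rally crowd and it scores well with that crowd while tanking with the wider electorate.

```python
import random

random.seed(0)

# Each voter prefers a point on a one-dimensional "position" axis.
# The full electorate is centred near 0; the rally crowd is a
# self-selected tail centred near 3.
electorate = [random.gauss(0, 1) for _ in range(10_000)]
rally_crowd = [random.gauss(3, 0.5) for _ in range(500)]

def approval(position, audience):
    # Mean approval falls off with squared distance from each
    # listener's preferred position.
    return sum(-(position - p) ** 2 for p in audience) / len(audience)

# Optimize the message against the biased sample only.
positions = [x / 10 for x in range(-50, 51)]
tuned = max(positions, key=lambda pos: approval(pos, rally_crowd))

print(f"position tuned to the rally: {tuned:.1f}")
print(f"rally approval:      {approval(tuned, rally_crowd):.2f}")
print(f"electorate approval: {approval(tuned, electorate):.2f}")
```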


Biased training data is a huge (ahem, yuge) problem for machine learning. Cities that use data from racist stop-and-frisk practices to determine whom the police should stop end up producing algorithmic racism; court systems that use racist sentencing records to train a model that makes sentencing recommendations get algorithmic racism, too.
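
A toy simulation of that feedback loop (the neighborhoods, rates, and starting counts are all hypothetical): two neighborhoods with identical true rates, but patrols allocated in proportion to a skewed historical record, so the record keeps “confirming” itself.

```python
import random

random.seed(1)

# Same true rate in both neighborhoods, but the historical record
# of recorded "hits" starts out skewed toward neighborhood A.
true_rate = {"A": 0.05, "B": 0.05}
recorded_hits = {"A": 40, "B": 10}

for year in range(10):
    total = sum(recorded_hits.values())
    for hood in ("A", "B"):
        # Stops are allocated in proportion to past recorded hits,
        # so the biased history decides where police look next.
        stops = int(1000 * recorded_hits[hood] / total)
        hits = sum(random.random() < true_rate[hood] for _ in range(stops))
        recorded_hits[hood] += hits

# A's recorded lead keeps growing even though the true rates are equal.
print(recorded_hits)
```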

Trump is a salesman — he says whatever he thinks his audience wants to hear. That’s why he’s contradicted every position he’s ever espoused. He random-walked into a strategy of saying things that an underserved, vocal group of bigots wanted to hear; they fed back their approval, and he’s been death-spiraling ever since.

Cathy “Weapons of Math Destruction” O’Neil explains what this means for machine learning in our political and policy spheres.


The reason I bring this up: first of all, it’s a great way of understanding how machine learning algorithms can give us stuff we absolutely don’t want, even though they fundamentally lack prior agendas. Happens all the time, in ways similar to the Donald.

Second, some people actually think there will soon be algorithms that control us, operating “through sound decisions of pure rationality” and that we will no longer have use for politicians at all.

And look, I can understand why people are sick of politicians, and would love them to be replaced with rational decision-making robots. But that scenario means one of three things:

1. Controlling robots simply get trained by the people’s will and do whatever people want at the moment. Maybe that looks like people voting with their phones or via the chips in their heads. This is akin to direct democracy, and the problems are varied – I was in Occupy after all – but chief among them is that people end up constantly weighing in on things they don’t actually understand. That leaves them vulnerable to misinformation and propaganda.

2. Controlling robots ignore people’s will and just follow their inner agendas. Then the question becomes, who sets that agenda? And how does it change as the world and its culture change? Imagine if we were controlled by someone from 1000 years ago with the social mores of that time. Someone’s gonna be in charge of “fixing” things.

3. Finally, it’s possible that the controlling robot would act within a political framework, somewhat but not completely influenced by a democratic process. Something like our current president. But then getting a robot in charge would be a lot like voting for a president. Some people would agree with it, some wouldn’t. Maybe every four years we’d have another vote, and the candidates would be both people and robots, and sometimes a robot would win, sometimes a person. I’m not saying it’s impossible, but it’s not utopian. There’s no such thing as pure rationality in politics; it’s much more about picking sides and appealing to some people’s desires while ignoring others’.

Donald Trump is like a biased machine learning algorithm
[Cathy O’Neil/Mathbabe]


(Image: Sholim)
