
Three kinds of propaganda, and what to do about them

Jonathan Stray summarizes three different strains of propaganda, analyzing why they work and suggesting counter-tactics: in Russia, it’s about flooding the channel with a mix of lies and truth, crowding out other stories; in China, it’s about suffocating arguments with happy-talk distractions; and for trolls like Milo Yiannopoulos, it’s about weaponizing hate, outraging people so that they spread your message to the small, diffuse minority of broken people who welcome it and would otherwise be uneconomical to reach.

Stray cites some of the same sources I’ve written about here: Tucker Max’s analysis of Yiannopoulos’s weaponized hate, and the Harvard Institute for Quantitative Social Science team’s first-of-its-kind analysis of leaked messages directing the activities of the “50-cent army,” which overwhelms online Chinese conversation with upbeat cheerleading (think of Animal Farm’s sheep-bleating, or Nineteen Eighty-Four’s quackspeak).

But I’d never encountered the work he references on Russian propaganda, by RAND scholar Christopher Paul, who calls Russian disinformation a “firehose of falsehood.” The tactic requires huge numbers of channels at your disposal: fake and real social media accounts, tactical leaks to journalists, and state media outlets like RT. Together, these can convey a narrative at a higher volume than any counternarrative, and that narrative becomes compelling just by dint of being everywhere (“quantity does indeed have a quality all its own”).

Mixing outright lies with a large dollop of truth is key here, as it surrounds the lies with a penumbra of truthfulness. This is a time-honored tactic, of course: think of the Christian Science Monitor’s history of outstanding international coverage, accompanied by editorials about God’s ability to heal through prayer; or Voice of America’s mixture of excellent reporting on (again) international politics and glaring silence on US crises (see also: Al Jazeera as a reliable source on everything except corruption in Qatar; the BBC World Service’s top-notch journalism on everything except UK complicity in disasters like the Gulf War, etc.).

In addition to this excellent taxonomy of propaganda, Stray proposes countermeasures for each strain: for Russia-style “firehoses of falsehood,” you have to reach the audience first with an alternative narrative; once the firehose is on, it’s too late. For Chinese quackspeak floods, you need “organized, visible resistance” in the streets. For pathetic attention-whores like Yiannopoulos, Stray says Tucker Max is right: you have to ignore him.

As I’ve written before, we’re not living through a crisis about what is true; we’re living through a crisis about how we know whether something is true. We’re not disagreeing about facts; we’re disagreeing about epistemology. The “establishment” version of epistemology is, “We use evidence to arrive at the truth, vetted by independent verification (but trust us when we tell you that it’s all been independently verified by people who were properly skeptical and not the bosom buddies of the people they were supposed to be fact-checking).”

The “alternative facts” epistemological method goes like this: “The ‘independent’ experts who were supposed to be verifying the ‘evidence-based’ truth were actually in bed with the people they were supposed to be fact-checking. In the end, then, it’s all a matter of faith: you either have faith that ‘their’ experts are being truthful, or you have faith that we are. Ask your gut: which version feels more truthful?”

One of the most insightful things I’ve heard about the epistemological crisis came from a recent episode of the hilarious News Quiz on BBC Radio 4: the people who support Trump do so tribally, like supporters of a sports team. If you hear that your team had three players thrown out of the game for breaking the rules and still won, you don’t rail against their cheating; you celebrate their victory in the face of the odds. The drumbeat of news about Trump losing pawns (see, e.g., Michael Flynn) doesn’t matter to Trumpists, so long as they believe that Trump is still achieving his policy goals, which are so nebulous that he can credibly claim to be achieving them through symbolic, meaningless stunts like forcing people to show papers before disembarking from domestic flights, or tearing apart families with cruel deportation orders.

Debunking doesn’t work: provide an alternative narrative

Telling people that something they’ve heard is wrong may be one of the most pointless things you can do: a long series of experiments shows that it rarely changes belief. Brendan Nyhan is one of the main scholars here, with a series of papers on political misinformation. This is about human psychology; we simply don’t process information rationally, but instead employ a variety of heuristics and cognitive shortcuts (not necessarily maladaptive overall) that can be exploited. The classic experiment goes like this:

Participants in a study within this paradigm are told that there was a fire in a warehouse and that there were flammable chemicals in the warehouse that were improperly stored. When hearing these pieces of information in succession, people typically make a causal link between the two facts and infer that the fire was caused in some way by the flammable chemicals. Some subjects are then told that there were no flammable chemicals in the warehouse. Subjects who have received this corrective information may correctly answer that there were no flammable chemicals in the warehouse and separately incorrectly answer that flammable chemicals caused the fire. This seeming contradiction can be explained by the fact that people update the factual information about the presence of flammable chemicals without also updating the causal inferences that followed from the incorrect information they initially received.

Worse, repeating a lie in the process of refuting it may actually reinforce it! The counter-strategy is to replace one narrative with another. Affirm, don’t deny:

Which of these headlines strikes you as more persuasive:

“I am not a Muslim, Obama says.”

“I am a Christian, Obama says.”

Defense Against the Dark Arts: Networked Propaganda and Counter-Propaganda [Jonathan Stray]

(via The Grugq)
