Facebook announced today that it is officially banning white nationalist and white separatist content from the platform. But it has made similar announcements before about other forms of harmful and bogus content, such as illicit gun sales and misinformation about vaccines, and the net result was not much. Big announcements like this are easy to make. Following up with action, not so much.
Facebook also committed to trying to deprogram people who attempt to post white separatist and white nationalist content to the platform by presenting them with pop-up redirects.
This is the industry-standard response among social media platforms to terrorist content from Al Qaeda, ISIS, and the like.
This is great news.
It's a long-overdue step by Facebook.
It's a start.
And it wouldn't have happened without the determined and dogged reporting by journalists including Joseph Cox and Jason Koebler and team at Motherboard.
Here are a few tweets from the Motherboard reporters with highlights:
this comes after months of reporting by @josephfcox and me about Facebook's internal content moderation policies and a backlash from civil rights groups after we learned that Facebook banned "white supremacy" but not "white nationalism" or "white separatism"
— Jason Koebler (@jason_koebler) March 27, 2019
Facebook now says there is no difference between the three, bringing it in line with the dominant perspective of civil rights experts.
— Jason Koebler (@jason_koebler) March 27, 2019
Facebook will also begin trying to deprogram people who try to post white supremacist/nationalist content with pop up redirects: pic.twitter.com/RRSUjfUgnA
— Jason Koebler (@jason_koebler) March 27, 2019
The company is also going to begin using the same behind-the-scenes tools it uses to surface and remove ISIS, Al-Qaeda and other terrorist-related content for white supremacist-related content
— Jason Koebler (@jason_koebler) March 27, 2019
Here's an excerpt:
The new policy, which will be officially implemented next week, highlights the malleable nature of Facebook’s policies, which govern the speech of more than 2 billion users worldwide. And Facebook still has to effectively enforce the policies if it is really going to diminish hate speech on its platform. The policy will apply to both Facebook and Instagram.
Last year, a Motherboard investigation found that, though Facebook banned “white supremacy” on its platform, it explicitly allowed “white nationalism” and “white separatism.” After backlash from civil rights groups and historians who say there is no difference between the ideologies, Facebook has decided to ban all three, two members of Facebook’s content policy team said.
“We’ve had conversations with more than 20 members of civil society, academics, in some cases these were civil rights organizations, experts in race relations from around the world,” Brian Fishman, policy director of counterterrorism at Facebook, told us in a phone call. “We decided that the overlap between white nationalism, [white] separatism, and white supremacy is so extensive we really can’t make a meaningful distinction between them. And that’s because the language and the rhetoric that is used and the ideology that it represents overlaps to a degree that it is not a meaningful distinction.”
Specifically, Facebook will now ban content that includes explicit praise, support, or representation of white nationalism or separatism. Phrases such as “I am a proud white nationalist” and “Immigration is tearing this country apart; white separatism is the only answer” will now be banned, according to the company. Implicit and coded white nationalism and white separatism will not be banned immediately, in part because the company said it’s harder to detect and remove.
The decision was formally made at Facebook’s Content Standards Forum on Tuesday, a meeting that includes representatives from a range of different Facebook departments in which content moderation policies are discussed and ultimately adopted. Fishman told Motherboard that Facebook COO Sheryl Sandberg was involved in the formulation of the new policy, though roughly three dozen Facebook employees worked on it.
Reuters also has reporting out today on the story.
An important note from @beccalew: keep an eye on whether Facebook actually maintains these new policies limiting the contagion of violent ideology.
After Facebook said it'd ban vaccine misinformation earlier this month, anti-vaxx content is still recommended everywhere. https://t.co/LqPUowvlYs pic.twitter.com/e5HtGClJIU
— Ben Collins (@oneunderscore__) March 27, 2019
A. This is great.
B. Facebook could've done this years ago.
C. The human toll of that delay is incalculable. https://t.co/nsz7ZHgJm6
— Noah Shachtman (@NoahShachtman) March 27, 2019