In 2016, a Facebook user called the Austrian Green Party politician Eva Glawischnig-Piesczek “a corrupt oaf,” a “traitor” and a member of a “fascist party.” Glawischnig-Piesczek secured an Austrian court verdict that held these remarks to be libellous, and Facebook took them down for Austrian users.
But Glawischnig-Piesczek was upset that these remarks were still visible outside of Austria, so she sought an order requiring Facebook to block the post throughout the world and to use some kind of technology to prevent other Facebook users from making “equivalent comments,” a demand that ultimately landed before the EU’s highest court, the CJEU.
This represented a new peak in extraterritorial theories of internet regulation. For years, we’ve had laws and rulings in which one jurisdiction orders the removal of content that is unlawful within its borders and insists that the material be removed everywhere in the world, even in places where the speech would be lawful (in the USA, the remarks directed at Glawischnig-Piesczek would be protected speech under the First Amendment).
These efforts have had disturbing potential for kicking off a race to the bottom, in which each territory’s speech rules are combined into a set of global prohibitions: the world’s longest copyright terms would obtain in every country (so Mexico’s life-plus-100 rule would trump the EU’s life-plus-70 rule), the world’s most hair-trigger libel rules would govern globally (no insulting the King of Thailand, or the Saudi royals, or Atatürk, or Putin), and so on.
But Glawischnig-Piesczek went further: she didn’t just want Facebook to remove a libellous remark. She wanted Facebook to somehow screen every other remark, made by every other Facebook user, in every language, in every territory, determine whether it was “equivalent” to the remarks that offended her, and block any such remarks from being published.
It’s unlikely that there is a human being with sufficiently good judgment to determine whether any sizable set of remarks is “equivalent” to the ones that offended Glawischnig-Piesczek, but even if there were, that human being would need to be taught every language used by every Facebook user, and would then have to be cloned millions of times so they could screen every remark every Facebook user made to ensure that it complies with the order.
Obviously, there is no way to automate this. There is no thesaurus so complete that it could block every potential “equivalent” statement, and any attempt to try would end up catching all kinds of non-equivalent remarks that were innocent of any wrongdoing but that a black-box algorithm judged to be somehow similar to the offending statements.
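To make the problem concrete, here is a minimal sketch of the kind of crude similarity filter a platform might reach for, using Python’s standard-library difflib as a deliberately naive stand-in for whatever black-box matcher a real system would use. The banned statement, the 0.6 threshold, and the sample posts are all hypothetical, not a description of any real system.

```python
from difflib import SequenceMatcher

# The statement a court has found unlawful (hypothetical wording).
BANNED = "eva glawischnig-piesczek is a corrupt oaf and a traitor"

def looks_equivalent(post: str, threshold: float = 0.6) -> bool:
    """Flag a post as 'equivalent' if its wording is similar enough to
    the banned statement, per a crude character-level similarity ratio."""
    return SequenceMatcher(None, BANNED, post.lower()).ratio() >= threshold

posts = [
    # A near-verbatim repetition: the kind of post the order targets.
    "Glawischnig-Piesczek is a corrupt oaf and a traitor",
    # A news report about the ruling: same words, opposite purpose.
    "A court ruled it was unlawful to call Eva Glawischnig-Piesczek a corrupt oaf and a traitor",
    # Criticism of the insult itself.
    "It is wrong to say Glawischnig-Piesczek is a corrupt oaf or a traitor",
    # A genuine paraphrase in different words, which slips through.
    "The Green Party politician is crooked and disloyal",
]

for post in posts:
    print(looks_equivalent(post), "->", post)
```

Run it and the filter flags the news report and the criticism of the insult right alongside the near-verbatim repetition, while the genuine paraphrase in different words sails through. A fancier machine-learning classifier would shuffle the scores around, but not the underlying trade-off between overblocking and underblocking.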
This is part of an incredibly disturbing and dangerous thread in EU policy and law: first there was last March’s passage of a copyright rule that requires platforms to filter all users’ speech, photos, videos, and audio against a crowdsourced, open database of allegedly copyrighted works, with no penalties for falsely claiming copyrights in order to block speech you don’t like.
Then came the Terror Regulation, which gives platforms one hour to remove speech identified as terroristic without any judicial proceedings, a requirement that will necessitate grossly expanded filtering.
Now comes the CJEU ruling, which, incredibly, took Glawischnig-Piesczek’s side and gave EU courts the power to order platforms not only to globally remove users’ posts that violate EU member states’ laws, but also to somehow find and delete “equivalent” posts, however they may appear.
The best (and most depressing) commentary on this comes from Daphne Keller, a sometime Boing Boing contributor and the Intermediary Liability Director at the Stanford Center for Internet & Society. Her Twitter thread on the ruling points out just how deficient it is and how foundationally unjust the proceeding was, for many reasons, but especially because the court was never asked to consider how the ruling might affect Facebook’s users; it only really pondered how it affects Facebook and people who want to libel Austrian politicians.
As Keller writes: “TL;DR: It’s open season for courts to mandate made-up technology without knowing what that technology will do. And those mandates can apply to the whole world.”
The CJEU's ruling in Glawischnig-Piesczek is out, and it is basically the worst case scenario. https://t.co/YgEHv20pJd 1/
— Daphne Keller (@daphnehk) October 3, 2019