
UK Internet censorship plan no less stupid than it was last year

UK Prime Minister David Cameron has promised to make pornography filters standard on British Internet connections. This is a remarkably stupid policy, and despite that, it is a recurring silliness in British (and global) politics. Back in 2012, the House of Lords was considering the same question, and I wrote a long, comprehensive article for the Guardian explaining why this won’t work and why it will be worse than doing nothing. Nothing I asserted in that essay has changed in the interim.

Consider a hypothetical internet of a mere 20bn documents, one half “adult” content and one half “child-safe” content. A 1% misclassification rate applied to 20bn documents means 200m documents will be misclassified: 100m legitimate documents blocked by the government because of human error, and 100m adult documents that the filter misses and that any schoolkid can find.
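To make that arithmetic concrete, here is a minimal sketch of the calculation. All of the figures are the hypothetical ones above, not measurements of any real filter:

```python
# Back-of-the-envelope arithmetic for the hypothetical filter above.
# Every figure here is an illustrative assumption, not a measurement.

TOTAL_DOCS = 20_000_000_000   # hypothetical internet: 20bn documents
ADULT_SHARE = 0.5             # half "adult", half "child-safe"
ERROR_RATE = 0.01             # 1% misclassification, applied both ways

adult_docs = TOTAL_DOCS * ADULT_SHARE
safe_docs = TOTAL_DOCS * (1 - ADULT_SHARE)

overblocked = safe_docs * ERROR_RATE    # legitimate pages wrongly blocked
underblocked = adult_docs * ERROR_RATE  # adult pages the filter misses

print(f"Wrongly blocked:     {overblocked:,.0f}")    # 100,000,000
print(f"Wrongly let through: {underblocked:,.0f}")   # 100,000,000
```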

In practice, the misclassification rate is much, much worse. It’s hard to get a sense of the total scale of misclassification by censorware because these companies treat their blacklists as trade secrets, so it’s impossible to scrutinise their work and discover whether they’re exercising due care.


There’s no way to stop children viewing porn in Starbucks

In 2003, the Electronic Frontier Foundation tested the censorware used by US schools, checking how many of the most highly-ranked documents on concepts from the school curriculum were blocked by that same censorware. They discovered that 75-85% of these sites were incorrectly classified. That percentage went way, way up when it came to sensitive subjects such as sexuality, reproductive health, and breast cancer.

That study is a decade old, and dates from a time when the web was comparatively minuscule. Today’s web is thousands of times larger than the web of 2003. But labour isn’t thousands of times cheaper, and good content has not gotten thousands of times easier to distinguish from bad content in the interim.

So when Lady Benjamin spoke of wanting to protect “students doing research for a project”, she should have also been thinking of protecting students’ access to the documents necessary to do their homework.

But it’s not just “good” material that gets misclassified. There is unquestionably a lot of material on the internet kids shouldn’t be seeing, and it multiplies at a staggering rate. Here, you have the inverse of the overblocking problem. Miss 1% – or 10%, or a quarter – of the adult stuff, and you allow a titanic amount of racy material through. Students who are actively seeking this material – the modern equivalent of looking up curse words in the school dictionary – will surely find it. Students who are innocently clicking from one place to another will, if they click long enough, land on one of these sites. The only way to prevent this is to block the internet.
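The underblocking side of the arithmetic scales just as badly. A short sketch, reusing the hypothetical 10bn adult documents from the example above:

```python
# Underblocking: how much adult material slips through at various miss
# rates, using the same hypothetical 10bn adult documents as above.

ADULT_DOCS = 10_000_000_000

for miss_rate in (0.01, 0.10, 0.25):  # 1%, 10%, a quarter
    leaked = ADULT_DOCS * miss_rate
    print(f"Miss {miss_rate:>4.0%} -> {leaked:>14,.0f} adult documents reachable")
```

Even the filter's best case leaves a hundred million adult documents a click away.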

So when Lady Massey complains that Starbucks has failed its civic duty by allowing unfiltered internet connections in its cafes, she’s rather missing the point. Even a filtered connection would pose little challenge to someone who wanted to look at pornography in a public place. Meanwhile, any filter deployed by Starbucks will block all manner of legitimate material that its customers have every right to look at.

So far, I’ve been writing as though “adult” content can be trivially distinguished from “family friendly” content. The reality is a lot muddier. There’s plenty of material in a day’s newspaper that I wouldn’t want my four-year-old daughter to hear about, from massacres and rapes in Mali to the tawdry personal stories on the agony aunt page. Sorting the “adult” internet from the “child” internet is a complex problem with no right answer.

When Boing Boing, the website I co-own, was blocked by one censorware vendor, it was on the grounds that we were a “nudity” site. That was because, among the tens of thousands of posts we’d made, a few were accompanied by small (non-prurient, strictly anatomical) images showing some nudity.

When we argued our case to the vendor’s representative, he was categorical: any nudity, anywhere on the site, makes it into a “nudity site” for the purposes of blocking. The vendor went so far as to state that a single image of Michelangelo’s David, on one page among hundreds of thousands on a site, would be sufficient grounds for a nudity classification.
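As described, the vendor’s rule is site-level: one flagged page anywhere on a domain taints every page on it. Here is a minimal sketch of that logic (the names and data are hypothetical illustrations, not the vendor’s actual code):

```python
# Hypothetical illustration of site-level ("homeopathic") classification:
# a single flagged page anywhere on a domain marks the whole domain.

def classify_domain(pages_flagged: dict[str, bool]) -> str:
    """Return 'nudity' if ANY page on the domain is flagged, else 'clean'."""
    return "nudity" if any(pages_flagged.values()) else "clean"

# A site with a hundred thousand pages and one image of Michelangelo's
# David is blocked in its entirety.
site = {f"/post/{i}": False for i in range(100_000)}
site["/post/42"] = True  # the lone flagged page

print(classify_domain(site))  # -> 'nudity'
```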

I suspect that none of the censorship advocates in the Lords understand that the offshore commercial operators they’re proposing to put in charge of the nation’s information access apply this kind of homeopathic standard to objectionable material.
