He was talking to law-makers, policy-makers and power-brokers, people who were, at best, half-smart about technology — just smart enough to understand that in a connected world, every problem society has involves computers, and just stupid enough to demand that computers be altered to solve those problems.
Paging Theresa May.
Theresa May says that last night’s London terror attacks mean that the internet cannot be allowed to provide a “safe space” for terrorists and therefore working cryptography must be banned in the UK.
This is a golden oldie, a classic piece of foolish political grandstanding. May’s predecessor, David Cameron, repeatedly campaigned on this one, and every time he did, I wrote a long piece rebutting him. Rather than writing a new one for May, I thought I’d just dust off a pair of my Cameron-era pieces (1, 2), since every single word still applies.
Theresa May says there should be no “means of communication” which “we cannot read” — and no doubt many in her party will agree with her, politically. But if they understood the technology, they would be shocked to their boots.
It’s impossible to overstate how bonkers the idea of sabotaging cryptography is to people who understand information security. If you want to secure your sensitive data either at rest – on your hard drive, in the cloud, on that phone you left on the train last week and never saw again – or on the wire, when you’re sending it to your doctor or your bank or to your work colleagues, you have to use good cryptography. Use deliberately compromised cryptography that has a back door that only the “good guys” are supposed to have the keys to, and you have effectively no security. You might as well skywrite it as encrypt it with pre-broken, sabotaged encryption.
There are two reasons why this is so. First, there is the question of whether encryption can be made secure while still maintaining a “master key” for the authorities’ use. As lawyer/computer scientist Jonathan Mayer explained, adding the complexity of master keys to our technology will “introduce unquantifiable security risks”. It’s hard enough getting the security systems that protect our homes, finances, health and privacy to be airtight – making them airtight except when the authorities don’t want them to be is impossible.
What Theresa May thinks she’s saying is, “We will command all the software creators we can reach to introduce back-doors into their tools for us.” There are enormous problems with this: there’s no back door that only lets good guys go through it. If your WhatsApp or Google Hangouts has a deliberately introduced flaw in it, then foreign spies, criminals and crooked police (like those who fed sensitive information to the tabloids implicated in the phone-hacking scandal — and like the high-level police who secretly worked for organised crime for years) will eventually discover this vulnerability. They — and not just the security services — will be able to use it to intercept all of our communications: everything from the pictures of your kids in the bath that you send to your parents to the trade secrets you send to your co-workers.
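To make that structural weakness concrete, here is a minimal sketch, in Python with the `cryptography` library, of what a mandated “master key” scheme amounts to; every name in it is hypothetical. Each message’s session key gets wrapped twice: once for the recipient, and once for the escrow authority. Whoever obtains that single escrow private key, whether by leak, bribery or breach, can read every message ever sent under the scheme.

```python
# A toy sketch of key escrow, NOT a real protocol; all names are illustrative.
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

def encrypt_with_escrow(message: bytes, recipient_pub, escrow_pub) -> dict:
    """Hybrid encryption with a mandated back door bolted on."""
    session_key = Fernet.generate_key()              # fresh key per message
    return {
        "ciphertext": Fernet(session_key).encrypt(message),
        "key_for_recipient": recipient_pub.encrypt(session_key, OAEP),
        # The "golden key" copy: compromise the escrow private key once,
        # and every message ever sent under this scheme becomes readable.
        "key_for_escrow": escrow_pub.encrypt(session_key, OAEP),
    }
```

The design choice is the whole problem: the confidentiality of every message now depends on a secret that, by mandate, someone other than the sender and the recipient holds.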
But this is just for starters. Theresa May doesn’t understand technology very well, so she doesn’t actually know what she’s asking for.
For Theresa May’s proposal to work, she will need to stop Britons from installing software that comes from software creators who are out of her jurisdiction. The very best in secure communications are already free/open source projects, maintained by thousands of independent programmers around the world. They are widely available, and thanks to things like cryptographic signing, it is possible to download these packages from any server in the world (not just big ones like GitHub) and verify, with a very high degree of confidence, that the software you’ve downloaded hasn’t been tampered with.
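As a rough illustration of why that matters, here is a minimal sketch in Python, using the `cryptography` library’s Ed25519 support, of the verification step; the filenames are hypothetical, and real projects typically use tools like GPG, but the principle is identical: you never have to trust the server you downloaded from, only the maintainer’s public key.

```python
# A minimal sketch of detached-signature verification; filenames hypothetical.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.serialization import load_pem_public_key

def verify_download(package: str, sig_file: str, maintainer_key: str) -> bool:
    """True only if the package exactly matches the maintainer's signature."""
    with open(maintainer_key, "rb") as f:
        public_key = load_pem_public_key(f.read())   # e.g. an Ed25519 key
    with open(package, "rb") as f:
        data = f.read()
    with open(sig_file, "rb") as f:
        signature = f.read()
    try:
        public_key.verify(signature, data)   # raises if even one bit changed
        return True
    except InvalidSignature:
        return False
```

Block GitHub and the same file, with the same signature, simply gets fetched from somewhere else; the maths doesn’t care which server it came from.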
May is not alone here. The regime she proposes is already in place in countries like Syria, Russia and Iran (for the record, none of these countries has had much luck with it). There are two means by which authoritarian governments have attempted to restrict the use of secure technology: network filtering and technology mandates.
Theresa May has already shown that she believes she can order the nation’s ISPs to block access to certain websites (again, for the record, this hasn’t worked very well). The next step is to order Chinese-style filtering using deep packet inspection, to try to identify and block traffic from forbidden programs. This is a formidable technical challenge. Intrinsic to core Internet protocols like IPv4/6, TCP and UDP is the potential to “tunnel” one protocol inside another. This makes the project of figuring out whether a given packet is on the white-list or the black-list transcendentally hard, especially if you want to minimise the number of “good” sessions you accidentally blackhole.
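To see why, consider how trivially one protocol can be dressed up as another. The toy sketch below is hypothetical and far cruder than real tools such as VPNs or Tor’s pluggable transports, but it shows the basic move: to a packet filter, the wrapped payload looks like an ordinary web upload.

```python
# A toy illustration of protocol tunnelling; nothing here is a real tool.
import base64

def wrap_as_http(payload: bytes, host: str = "example.com") -> bytes:
    """Encode arbitrary bytes as what looks like a routine HTTP POST."""
    body = base64.b64encode(payload)
    headers = (
        f"POST /upload HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        f"Content-Type: application/octet-stream\r\n"
        f"Content-Length: {len(body)}\r\n\r\n"
    ).encode()
    return headers + body

def unwrap(http_message: bytes) -> bytes:
    """Recover the original payload on the far side of the tunnel."""
    _, body = http_message.split(b"\r\n\r\n", 1)
    return base64.b64decode(body)

# The round trip works for any payload a censor might want to block.
assert unwrap(wrap_as_http(b"forbidden traffic")) == b"forbidden traffic"
```

Run the tunnel inside TLS, as every real tool does, and the filter can’t even see those headers; it is left guessing from traffic patterns alone.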
More ambitious is a mandate governing which code operating systems in the UK are allowed to execute. This is very hard. We do have, in Apple’s iOS platform and various games consoles, a regime where a single company uses countermeasures to ensure that only software it has blessed can run on the devices it sells to us. These companies could, indeed, be compelled (by an act of Parliament) to block secure software. Even then, you’d have to contend with the fact that other EU states and countries like the USA are unlikely to follow suit, which means that anyone who bought their iPhone in Paris or New York could come to the UK with all their secure software intact and send messages “we cannot read.”
But there is the problem of more open platforms, like GNU/Linux variants, BSD and other unixes, Mac OS X, and all the non-mobile versions of Windows. All of these operating systems are already designed to allow users to execute any code they want to run. The commercial operators — Apple and Microsoft — might conceivably be compelled by Parliament to change their operating systems to block secure software in the future, but that doesn’t do anything to stop people from using all the PCs now in existence to run code that the PM wants to ban.
More difficult is the world of free/open operating systems like GNU/Linux and BSD. These operating systems are the gold standard for servers, and widely used on desktop computers (especially by the engineers and administrators who run the nation’s IT). There is no legal or technical mechanism by which code that is designed to be modified by its users can co-exist with a rule that says that code must treat its users as adversaries and seek to prevent them from running prohibited code.
This, then, is what Theresa May is proposing:
* All Britons’ communications must be easy for criminals, voyeurs and foreign spies to intercept
* Any firms within reach of the UK government must be banned from producing secure software
* All major code repositories, such as GitHub and SourceForge, must be blocked
* Search engines must not answer queries about web-pages that carry secure software
* Virtually all academic security work in the UK must cease — security research must only take place in proprietary research environments where there is no onus to publish one’s findings, such as industry R&D and the security services
* All packets in and out of the country, and within the country, must be subject to Chinese-style deep-packet inspection and any packets that appear to originate from secure software must be dropped
* Existing walled gardens (like iOS and games consoles) must be ordered to ban their users from installing secure software
* Anyone visiting the country from abroad must have their smartphones held at the border until they leave
* Proprietary operating system vendors (Microsoft and Apple) must be ordered to redesign their operating systems as walled gardens that only allow users to run software from an app store, which will not sell or give secure software to Britons
* Free/open source operating systems — that power the energy, banking, ecommerce, and infrastructure sectors — must be banned outright
Theresa May will say that she doesn’t want to do any of this. She’ll say that she can implement weaker versions of it — say, only blocking some “notorious” sites that carry secure software. But anything less than the programme above will have no material effect on the ability of criminals to carry on perfectly secret conversations that “we cannot read”. If any commodity PC or jailbroken phone can run any of the world’s most popular communications applications, then “bad guys” will just use them. Jailbreaking an OS isn’t hard. Downloading an app isn’t hard. Stopping people from running code they want to run is — and what’s more, it puts the whole nation — individuals and industry — in terrible jeopardy.
That’s a technical argument, and it’s a good one, but you don’t have to be a cryptographer to understand the second problem with back doors: the security services are really bad at overseeing their own behaviour.
Once these same people have a back door that gives them access to everything that encryption protects, from the digital locks on your home or office to the information needed to clean out your bank account or read all your email, there will be lots more people who’ll want to subvert the vast cohort that is authorised to use the back door, and the incentives for betraying our trust will be much more lavish than anything a tabloid reporter could afford.
If you want a preview of what a back door looks like, just look at the US Transportation Security Administration’s “master keys” for the locks on our luggage. Since 2003, the TSA has required all locked baggage travelling within, or transiting through, the USA to be equipped with Travelsentry locks, which have been designed to allow anyone with a widely held master key to open them.
What happened after Travelsentry went into effect? Stuff started going missing from bags. Lots and lots of stuff. A CNN investigation into thefts from bags checked in US airports found thousands of incidents of theft committed by TSA workers and baggage handlers. And though “aggressive investigation work” has cut back on theft at some airports, insider thieves are still operating with impunity throughout the country, even managing to smuggle stolen goods off the airfield in airports where all employees are searched on their way in and out of their work areas.
The US system is rigged to create a halo of buck-passing unaccountability. When my family picked up our bags from our Easter holiday in the US, we discovered that the TSA had smashed the locks off my nearly new, unlocked, Travelsentry-approved bag, taping it shut after confirming it had nothing dangerous in it, and leaving it “completely destroyed” in the words of the official BA damage report. British Airways has sensibly declared the damage to be not their problem, as they had nothing to do with destroying the bag. The TSA directed me to a form that generated an illiterate reply from a government subcontractor, sent from a do-not-reply email address, advising that “TSA is not liable for any damage to locks or bags that are required to be opened by force for security purposes” (the same note had an appendix warning me that I should treat this communication as confidential). I’ve yet to have any other communications from the TSA.
Making it possible for the state to open your locks in secret means that anyone who works for the state, or anyone who can bribe or coerce anyone who works for the state, can have the run of your life. Cryptographic locks don’t just protect our mundane communications: cryptography is the reason why thieves can’t impersonate your fob to your car’s keyless ignition system; it’s the reason you can bank online; and it’s the basis for all trust and security in the 21st century.
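That keyless-ignition example is worth unpacking: what stops the thief is challenge-response authentication. Here is a deliberately simplified Python sketch; real immobilisers use dedicated hardware and different ciphers, so treat this as the shape of the idea, not an implementation.

```python
# A toy challenge-response exchange, heavily simplified; not a real immobiliser.
import hashlib
import hmac
import os

SHARED_SECRET = os.urandom(32)    # provisioned into both car and fob

def car_challenge() -> bytes:
    return os.urandom(16)         # fresh random challenge, never reused

def fob_response(challenge: bytes) -> bytes:
    return hmac.new(SHARED_SECRET, challenge, hashlib.sha256).digest()

def car_verify(challenge: bytes, response: bytes) -> bool:
    expected = hmac.new(SHARED_SECRET, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

# A thief who records old exchanges still can't answer a fresh challenge
# without the secret; a mandated back door is, in effect, a leaked secret.
challenge = car_challenge()
assert car_verify(challenge, fob_response(challenge))
```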
In her Dimbleby lecture, Martha Lane Fox recalled Aaron Swartz’s words: “It’s not OK not to understand the internet anymore.” That goes double for cryptography: any politician caught spouting off about back doors is unfit for office anywhere but Hogwarts, which is also the only educational institution whose computer science department believes in “golden keys” that only let the right sort of people break your encryption.
(Image: Facepalm, Brandon Grasley, CC-BY)