My latest Locus column is "Cold Equations and Moral Hazard", an essay about how our narratives about the future can pave the way for bad people to create, and benefit from, disasters. "If being in a lifeboat gives you the power to make everyone else shut the hell up and listen (or else), then wouldn’t it be awfully convenient if our ship were to go down?"
Apparently, editor John W. Campbell sent back three rewrites in which author Tom Godwin had the pilot figure out how to save the girl. Campbell was adamant: the universe must punish her.
The universe wasn’t punishing the girl, though. Godwin was – and so was Barton (albeit reluctantly).
The parameters of "The Cold Equations" are not the inescapable laws of physics. Zoom out beyond the page’s edges and you’ll find the author’s hands carefully arranging the scenery so that the plague, the world, the fuel, the girl and the pilot are all poised to inevitably lead to her execution. The author, not the girl, decided that there was no autopilot that could land the ship without the pilot. The author decided that the plague was fatal to all concerned, and that the vaccine needed to be delivered within a timeframe that could only be attained through the execution of the stowaway.
It is, then, a contrivance. A circumstance engineered for a justifiable murder. An elaborate shell game that makes the poor pilot – and the company he serves – into victims every bit as much as the dead girl is a victim, forced by circumstance and girlish naïveté to stain their souls with murder.
Moral hazard is the economist’s term for a rule that encourages people to behave badly. For example, a rule that says you’re not liable for your factory’s pollution if you don’t know about it encourages factory owners to totally ignore their effluent pipes – it turns willful ignorance into a profitable strategy.