Xeni brought this to my attention yesterday. Over at Scientific American, Michael Moyer takes a critical look at an Al Jazeera story about a recent study purporting to show that infant deaths on the American West Coast increased by 35% as a result of fallout from the Fukushima Daiichi nuclear power plant meltdown.
At first glance, the story looks credible. And scary. The information comes from a physician, Janette Sherman, MD, and an epidemiologist, Joseph Mangano, who got their data from the Centers for Disease Control and Prevention's Morbidity and Mortality Weekly Report, a newsletter that frequently helps public health officials spot trends in death and illness.
Look closer, though, and the credibility vanishes. For one thing, this isn't a formal scientific study, and Sherman and Mangano didn't publish their findings in a peer-reviewed journal, or even on a science blog. Instead, all of this comes from an essay the two wrote for CounterPunch, a political newsletter. And when Sci Am's Moyer holds that essay up to the standards of scientific research, its scary conclusions fall apart. Moyer writes:
Sherman and Mangano tally up all the deaths of babies under one year old in eight West Coast cities: Seattle; Portland, Ore.; Boise, Idaho; and the California cities of San Francisco, Sacramento, San Jose, Santa Cruz and Berkeley. They then compare the average number of deaths per week for the four weeks preceding the disaster with the 10 weeks following. The jump, from 9.25 to 12.5 deaths per week, is "statistically significant," the authors report.
Let's first consider the data that the authors left out of their analysis. It's hard to understand why the authors stopped at these eight cities. Why include Boise but not Tacoma? Or Spokane? Both have about the same population as Boise, they're closer to Japan, and the CDC includes data from Tacoma and Spokane in the weekly reports.
More important, why did the authors choose to use only the four weeks preceding the Fukushima disaster? Here is where we begin to pick up a whiff of data fixing. … While it certainly is true that there were fewer deaths in the four weeks leading up to Fukushima than there have been in the 10 weeks following, the entire year has seen no overall trend. When I plotted a best-fit line to the data, Excel calculated a very slight decrease in the infant mortality rate. Only by explicitly excluding data from January and February were Sherman and Mangano able to froth up their specious statistical scaremongering.
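If you want to poke at the numbers yourself, here's a rough back-of-the-envelope sketch in Python. It uses only the two weekly averages quoted above; the pooled rate, the Poisson model, and the simulated year are my own stand-ins for illustration, not Moyer's actual method or the real CDC series.

```python
# A sketch of why a cherry-picked baseline window can manufacture a "jump,"
# built from the only two numbers the essay reports: an average of 9.25
# deaths/week over the 4 weeks before Fukushima, and 12.5 over the 10 after.
from scipy import stats
import numpy as np

before_weeks, before_mean = 4, 9.25
after_weeks, after_mean = 10, 12.5

before_total = before_weeks * before_mean   # 37 deaths in the 4 "before" weeks
after_total = after_weeks * after_mean      # 125 deaths in the 10 "after" weeks

# If the underlying death rate never changed, the best single estimate of it
# is the pooled average across all 14 weeks (~11.6 deaths/week).
pooled_rate = (before_total + after_total) / (before_weeks + after_weeks)

# Weekly death counts are classic Poisson data. Under that constant rate, how
# surprising is a 4-week stretch with only 37 deaths?
expected_before = pooled_rate * before_weeks  # ~46.3 expected deaths
p_low = stats.poisson.cdf(before_total, expected_before)
print(f"P(4-week total <= {before_total:.0f} under a constant rate) = {p_low:.2f}")
# Comes out around 0.1: a baseline dip this size turns up by chance roughly one
# window in ten, so choosing the quietest 4 weeks as "before" practically
# guarantees an apparent jump afterward.

# Moyer's other point: over a full year of data there is no upward trend.
# Simulated stand-in for the real CDC series -- 50 weeks at the same flat rate.
rng = np.random.default_rng(0)
year = rng.poisson(pooled_rate, size=50)
slope = np.polyfit(np.arange(50), year, 1)[0]
print(f"Best-fit slope over the simulated year: {slope:+.3f} deaths/week per week")
# Typically lands near zero -- the same flat (even slightly declining) trend
# Moyer saw when he plotted the actual numbers.
```

The point isn't the exact p-value. It's that once you let yourself shop around for a baseline window, "statistically significant" stops meaning much of anything.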
You can see that data all plotted out nicely by Moyer over at Sci Am. How did these numbers get so heinously distorted? It's hard to say. But there should be some important lessons here. In particular, this is a good reminder that human beings do not always behave the way some economists think we do. We're not totally rational creatures. And profit motive is not the only factor driving our choices.
When you think about what information to be skeptical of, that decision can't begin and end with "corporate interests." Yes, those sources often give you bad information. But bad information comes from other places, too. The Fukushima accident was worse than TEPCO wanted people to believe when it first happened. Radiation isn't healthy for you, and there are people (plant workers, emergency crews, people who lived nearby) who will be dealing with the effects of Fukushima for years to come. But the fact that all of that is true does not mean that we should uncritically accept it when somebody says that radiation from Fukushima is killing babies in the United States. Just because the corporate interests are in the wrong doesn't mean that every claim against them is true.
Reality leaves us in a confusing place. As Slacktivist, one of my favorite bloggers, likes to say, "It's more complicated than that." With "it," in this case, meaning "everything." The best solution is to always ask questions, even when the people we're questioning look like the good guys.
Image: Baby toes, a Creative Commons Attribution (2.0) image from 40765798@N00's photostream