When I first heard that Eric Schlosser, the investigative journalist, was writing a book on nuclear weapons accidents, I have to admit that I was pretty suspicious. I really enjoyed Fast Food Nation when it came out a decade ago. It was one of those books that never quite leaves you. The fact that the smell of McDonald’s French fries was deliberately engineered by food chemists to be maximally appealing, something I learned from Schlosser’s book, comes to mind whenever I smell any French fries. But nuclear weapons are not French fries. When writing about them, it is extremely easy to fall into either an exaggerated alarmism or a naïve acceptance of reassuring official accounts. In my own work, I’m always trying to sort out the truth of the matter, which is usually somewhere in between these two extremes.
This is especially the case when talking about nuclear weapons accidents — the many times during the Cold War when nuclear weapons were subjected to potentially dangerous circumstances, such as being set on fire, being accidentally dropped from a bomber, crashing with a bomber, having the missile they were attached to explode, and so on. The alarmist accounts generally inflate the danger of the accidents achieving a nuclear yield; the official accounts usually dismiss such danger entirely. There are also often contradictory official accounts — sometimes even the people with clearances can’t agree on whether the weapons in question were “armed” (that is, had intact fissile pits in them), whether the chance of detonation was low or high, and so on. I’ve always been pretty wary about the topic myself for this reason. Sorting out the truth seemed like it would require a lot of work that I wasn’t interested in doing.
Well, I’m happy to report that in his new book, Command and Control: Nuclear Weapons, the Damascus Accident, and the Illusion of Safety, Schlosser has done that work. I reviewed the book recently for Physics Today. You can read my PT review here, but the long and short of it is that I was really, really impressed with the book. And I’m not easily impressed by most works of nuclear weapons history, popular or academic. I’m not surprised it was a finalist for the Pulitzer Prize, either.
What I ask of a new book is that it teach me something new — either novel facts or novel spins on things I already knew about. Schlosser’s book does both. He clearly did his homework, and it’s not really surprising it took him about a dissertation’s worth of time to write it. It’s not just a document dump of FOIA’d material, though. He really shines when contextualizing his new information, writing a very rich, synthetic history of nuclear weapons in the Cold War. So the new and the old are woven together in a really spectacular, unusually compelling fashion.
The book has two main threads. One is a very specific, moment-by-moment account of one accident. This is the so-called Damascus Accident, in which a Titan II missile in Damascus, Arkansas, exploded in its silo in 1980, resulting in one fatality. It’s not one of the “standard” accidents one hears about, like the 1961 Goldsboro bomb, the 1958 Tybee bomb, the 1968 Thule crash, or the 1966 Palomares accident. But Schlosser’s journalistic chops really came through here, as he tracked down a huge number of the people involved in the accident and used their memories, along with documentary records, to reconstruct exactly how one dropped spanner — itself just an apparently innocuous, everyday sort of mistake — could lead to such explosive outcomes.
The other thread is a more historical one, looking at the history of nuclear weapons and in particular how the problem of command and control runs through it from the beginning. “Command and control” is one of those areas whose vastness I didn’t really appreciate until reading this book. Nominally it is just about making sure that you can use the weapons when you want to, but that also includes making sure that nobody is going to use the weapons when you don’t want them to, and that the weapons themselves aren’t going to do anything terrible accidentally. And this makes it mind-bogglingly complex. It gets into details about communication systems, weapons designs, delivery system designs, nuclear strategy, screening procedures, security procedures, accident avoidance, and so much more.
Schlosser weaves this all together wonderfully. I found very few statements, technical or otherwise, that struck me as genuine outright errors.1 Of course, there are places where there can be differences of interpretation, but there always are. This is pretty good for any book of this length and scope — there are many academic books that I’ve read that had more technical errors than this one.
What I found really wonderful, though, is that Schlosser also managed to give a compelling explanation for the contradictory official accident accounts that I mentioned before. It’s so simple that I don’t know why it never occurred to me before: the people concerned with nuclear weapon safety were not the same people who were in charge of the weapons. That is, the engineers at Sandia who were charged with nuclear safety and surety were institutionally quite remote from the Air Force people who handled the weapons. The Air Force brass believed the weapons were safe and that to suggest otherwise was just civilian hogwash. The engineers who got into the guts of the weapons knew that it was a more complicated story. And they didn’t communicate well — sometimes by design. After a while the Air Force stopped telling the Sandia engineers about all of the accidents, and so misinformation became rampant even within the classified system.
We usually talk about nuclear weapons safety as a question of whether they are “one-point safe.” That is, would the weapon have a chance of a nuclear yield if one point on the chemical explosives surrounding the fission pit were detonated inadvertently? Most of the time the answer is no, of course not. Implosion requires a very high degree of detonation symmetry — that’s why it’s hard to make work. So a one-point detonation of the explosive lenses will produce a fizzle, spreading plutonium or uranium like a “dirty bomb” but not producing a supercritical chain reaction.
But some of the time, the answer is, “well, maybe.” We usually think of implosions as complex affairs, but some weapons only require two-point implosion to begin with. So now you’re no longer talking about the possibility that one out of 36 explosive lenses will go off; you’re talking about one out of two. This isn’t to say that such weapons aren’t one-point safe, just to point out that weapons design isn’t limited to the sorts of things present in the first implosion weapons.
But even this doesn’t really get at the real problem here. “One-point safe” is indeed an important part of the safety question, but not the only one. Consider, for example, what would happen if the firing signal were nothing more than a simple DC electrical current. Now imagine that during a fire, the solder on the firing circuit board melts and a short circuit forms between the batteries and the firing switch. Now the bomb is actually trying to set itself off, just as if it had been deliberately dropped — and full implosion, with nuclear yield, is totally possible.
How likely is this kind of electrically-activated nuke scenario? What the Sandia engineers discovered was that in some weapons it was really not implausible at all. Under the “abnormal environment” of a weapons accident (such as a crashing or burning B-52), all sorts of crazy things could happen with electronic circuits. And unless they were really carefully designed for the possibility of this kind of accident, they could arm themselves and fire themselves. Which is the kind of thing you’d expect an engineer who is deeply connected with the electrical technology of the bomb to conclude.
And of course, as Schlosser (and his engineer sources) point out — this kind of thing is only one small detail in the broad, broad question of nuclear safety. These systems are big, complex, and non-linear. And so much hinges on them working correctly.
The sociologist of science Donald MacKenzie has proposed (in a slightly different context — nuclear weapons accuracy, not safety) that a “certainty trough” exists with regards to complex questions of technological uncertainty. He draws it somewhat like this:2
So this divides people into three groups. On the left are the people who actually build the technology and the knowledge. These people have reasonably high levels of uncertainty about the technology in question — they know the nuts and bolts of how it works and how it could go wrong. (I’ve added “confidence” as a label because I find it more straightforward than “uncertainty” at times.) They also know what kinds of failure situations are unlikely. In the middle, you have people who are completely committed to the technology in question. These people aren’t completely divorced from solid knowledge about it, but they are just consumers of knowledge. They look at the final data, but they don’t really know how the data was made (or all of the uncertainty that gets screened out to produce the final version of the data). They have very low uncertainty, and so very high confidence in the technology. At the far right you have the people who are either total outsiders, or people who are totally committed to another approach. They have the highest levels of uncertainty and the lowest levels of confidence.
So if we were mapping Schlosser’s actors onto these categories, we’d have the Sandia engineers and other weapons scientists on the far left. They know what can go wrong, and they know the limits of their knowledge. They also know which accident situations are outlandish. In the middle we have the military brass and even the military handlers of the weapons. They are committed to the weapons. They have data saying the weapons are safe — but they don’t know how the data was made, or how it was filtered. They think the weapons are totally safe and that anyone who suggests otherwise is just ignorant or foolish. And lastly, at the far right, we have total outsiders (the activists, perhaps, or sometimes even politicians), or people who are looking to amplify the uncertainty for their own purposes.
The disconnect between the far left group and the middle group is the one that disturbs me the most in Schlosser’s account. It also reflects what I’ve seen in online discussions of weapons accidents. People with a little bit of knowledge — e.g. they know about one-point safety, or they once handled nukes in the military — have very high confidence in the safety issues. But they don’t know enough to realize that under the hood, things are more complicated and have been, in the past at least, much more dangerous. Not, perhaps, as dangerous as some of the more alarmist, outsider, activist accounts have stressed. But dangerous enough to seriously concern the people whose job it is to design the weapons — people who know about the nuts and bolts of them.
Anyway. Schlosser’s book is a great read, as well. Which it needs to be, because it is long. But it’s also segregable. Don’t care much about the details of the Damascus accident? You can skip those sections and still get a lot out of the book (even though the Damascus accident is really a perfect account of all of the little things that can go wrong with complex, non-linear systems). But part of that length is a copious amount of endnotes, which I applaud him and his publisher for including. For a book like this, you can’t skimp on the documentation, and Schlosser doesn’t. The only thing he did skimp on was illustration, which I — as a pretty visual guy — thought was too bad. So much of the Damascus story takes place inside of a Titan II silo, and while the inner flap of the cover did have a simplified illustration of one, I still felt like I didn’t really know what was happening where at times. (I wonder if this was a trade-off with the publisher in having so many notes and pages.)
Fortunately, there is a solution for this. If it were up to me, every copy of Schlosser’s book would be accompanied by a copy of Chuck Penson’s Titan II Handbook: A civilian’s guide to the most powerful ICBM America ever built. Penson’s book is a richly illustrated history of this particular missile, and contains lots of detailed photographs and accounts of daily life on a Titan II base (such as those seen above). It’s utterly fascinating and it gives so much visual life to what Schlosser describes. It also includes giant fold-out diagrams of the missiles themselves — the printing quality is really impressive all around. It includes fascinating technical details as well. For example, in the early days of the Titan II silos they had large motor-generators that ran constantly in case they were needed to convert DC power into AC in the event of a failure of commercial power. Penson then notes that:
The motor-generator ran with a loud, monotonous high-pitched whine… noise in the [Launch Control Center] turned into a serious issue. Crew members complained of temporary hearing loss due not only to the incessant buzz of the motor-generator, but also to the constant drone of the air conditioners, fans, and blowers in the equipment. Eventually the Air Force covered the tile floor with carpeting, and acoustic batting was hung in the area of the stairway leading up to level 1 and down to level 3. … These changes made a tremendous improvement, but one that came too late for many of the crew, a significant number of whom now need hearing aids.
This kind of detail fits in perfectly with Schlosser’s approach to the facility, which itself seems strongly influenced by the sociologist Charles Perrow’s notion of “Normal Accidents.” That the devices in the facility would affect the hearing of the crew was certainly not something that anybody thought of ahead of time; it’s one of those little details that gets lost in the overall planning, but (at least for those who suffered the hearing loss) had real consequences. Ultimately this is the thesis of Schlosser’s book: that the infrastructure of nuclear command and control is much larger, much more complex, much more problematic than most people realize, and is one of those high-complexity, high-risk systems that human beings are notoriously pretty bad at managing.
If you’re the kind of person who geeks out on nuke history, both Schlosser’s and Penson’s books are must-reads, must-buys.
- The two biggest mistakes I noted, which I’ve told Schlosser about and may be fixed in the paperback, are that he misstates the size of the neutron initiator in the Fat Man bomb — he confuses the diameter for the radius — and he got the story of Szilard’s 1933 chain reaction work wrong, which lots of people do. Szilard’s patent is such a common source of misunderstanding even amongst scholars that I will be writing a blog post about it soon. Neither of these are terribly important to his argument or narrative. [↩]
- Adapted from Donald MacKenzie, Inventing Accuracy: A Historical Sociology of Nuclear Missile Guidance (Cambridge, Mass.: MIT Press, 1990), figure 7.2. [↩]