Posts Tagged ‘Accidents’


Here be dragons

Friday, November 20th, 2015

The most famous experiment conducted by Los Alamos during the Manhattan Project, after the Trinity test itself, is the one with the most evocative name. “Tickling the Dragon’s Tail,” also known internally as just “Dragon,” is straightforward about its meaning, compared to the enigma of “Trinity.” Dragons don’t like to have their tails tickled — so watch out for the fire.

On the latest episode of Manhattan (204), protagonist Frank Winter encounters the “dragon” — and pushes it a little further than he ought to have.

The “dragon” moniker was coined by Richard Feynman (who else?) after he heard about it from fellow scientist Otto Frisch. It was one of a category of criticality experiments that Frisch (nephew of Lise Meitner, co-author of the famous Frisch-Peierls report) was working on at Los Alamos. Criticality experiments were dangerous by design: they were attempts to experimentally determine the critical condition of different quantities, types, and geometries of fissile material. Because of the unknowns involved, all of these experiments involved pushing very close to the boundary of an uncontrolled fission chain reaction, an embryonic atomic bomb (or reactor) that, while probably not very explosive (it would likely destroy itself before too much energy was released), would create enough radioactivity to cause serious hazard to those working around the site.1

The experiment Feynman dubbed “dragon” was what Frisch had called the “guillotine,” and was one of the more ambitious and dangerous of Frisch’s many criticality experiments. It involved dropping a slug of enriched uranium hydride through an almost-critical assembly of the same substance. Gravity alone would cause the two pieces to briefly form a critical mass — and then to briefly un-form, before too many fission reactions had occurred. If all worked as planned, the slug would release a burst of neutrons and then stop reacting. But if the slug got stuck in the critical configuration, it would release impressive amounts of radioactivity and potentially cause a (very small) explosion.2

Otto Frisch’s original “dragon” reactor — the uranium “guillotine.” Source: R.E. Malenfant, “Experiments with the Dragon Machine” (LA-14241-H, August 2005).

The experiments could produce upwards of 20 million watts of power, increasing the temperature of the fuel by 2 degrees C per millisecond. At their most daring, one burst of the experiment released 10^15 neutrons. These experiments were, as the official, secret Manhattan District History notes, “of historical importance,” as they constituted “the first controlled nuclear reaction which was supercritical with prompt neutrons alone.” As far as I can tell, this particular “guillotine” was the original experiment that earned the nickname “dragon,” but the name has been applied to other, similarly close-to-critical experiments as well.3

Criticality experiments were inherently dangerous. They didn’t have to kill you immediately to be a threat: it had been known since the days of the “Radium Girls” that radiation exposure could be cumulatively crippling. By the 1940s, experimental physicists had lost a bit of the “devil may care” air they had shown in the early years of radioactivity, when you could spot an X-ray operator by his mangled hands. The Health Group at Los Alamos attempted to keep external radiation exposures within the national radiation standards of the time (0.1 roentgens per day), and optimistically hoped to keep internal exposures at zero. For the time, this was considered conservative, though by the late 1950s the standards for exposure had dropped by a factor of seven.4

Los Alamos scientists keep their distance from a 1,000-curie radiation source used in the RaLa experiments.

The first criticality accident at Los Alamos wasn’t a fatal one, but it did cause some trouble. The experiment was (ironically, or appropriately?) conducted in the name of safety: it was a question of what would happen if certain geometries and enrichments of uranium were submerged in water. For a weapon that was going to be deployed to the Pacific Ocean, this was not an idle danger — sink Little Boy in the ocean and it becomes a nuclear reactor, because, for enriched materials, regular “light” water acts as a neutron moderator, lowering the effective critical mass. The Manhattan District History outlines the experiment and its outcome:

A large amount of enriched uranium, surrounded by polythene, had been placed in a container in which water was being slowly admitted. The critical condition was reached sooner than expected, and before the water level could be sufficiently lowered the reaction became quite intense. No ill effects were felt by the men involved, although one lost a little of the hair on his head. The material was so radioactive for several days that experiments planned for those days had to be postponed. [emphasis added]5

“Although one lost a little of the hair on his head” — one of those sentences one rarely runs across, especially without any further elaboration, that really sounds disturbing to the modern ear. There were other “minor” exposures too, noted briefly (and anonymously) in the Manhattan District History. Not all were related to criticality; some were related to other experiments, such as the “water boiler” and “power boiler” reactors (more on those in a second), and the RaLa (Radiolanthanum) implosion experiments:

Operation of the power boiler resulted in several instances of mild overexposure to radiation caused by leaks in the exhaust gas line and one serious exposure of several chemists during decontamination of active material. The implosion studies of the RaLa Group which used large amounts of radioactive barium and lanthanum brought a serious situation which the health group monitored closely. A series of accidents and equipment failures caused considerable overexposure of chemists in this group. This condition persisted about six months until the system of remote control operation was finally perfected.6

Interestingly, the Health Group had “no responsibility” over the criticality experiments, “except that of being sure that the men were aware of the dangers involved.” The Manhattan District History notes that the criticality experiments were “especially dangerous” because “there is no absolute way of anticipating the dangers of any particular experiment, and the experiments seem so safe when properly carried out that they lead to a feeling of overconfidence on the part of the experimenter.” The author of this section of the History attributes to this overconfidence the death of Harry Daghlian, who died after accidentally creating a critical mass with a plutonium core. It also notes another accident in which “four individuals” received an “acute exposure… to a large amount of radiation” during a similar experiment. The same core would lead to the death of another scientist, Louis Slotin (known for his nonchalance regarding the hazards), less than a year later.7

Harry K. Daghlian’s blistered and burnt hand after he received his fatal radiation dose from his own dragon-tickling experiment gone wrong.

Reading through the various exposures and radiation hazards in the Manhattan District History can be a bit spine-tingling, even if one tries to have a measured view of the threats of radiation. Radiation risks, of course, are more exciting to most of us than the dozens of other ways to die at Los Alamos during the war. Radiation is relatively exotic and mysterious — invisible to our unaided senses, yet very easy to track and follow with the right instruments. You can’t see it until you start looking for it, and then you can find it everywhere.

But even with that caveat, some of these reports are still pretty eyebrow raising. One example: The “water boiler” reactor was a small assembly of enriched uranium used as a neutron source at the laboratory. The scientists knew it presented radiation risks: the fuel inside the reactor would get fiendishly radioactive during and after operation, and if there was a small, inadvertent explosion, it could be a real contamination problem. So they (sensibly) isolated it from the rest of the laboratory, along with the criticality experiments.8

But later study showed that they hadn’t quite solved the problem. Gaseous materials, including fission products, were being discharged “near the ground level at the tip of the mesa just to the south of Los Alamos Canyon.” This, the Manhattan District History notes, was “most unsatisfactory and represented a potential and serious health hazard.” They had warning signs, but they were “inadequate and the area was accessible to any casual visitor.” Radiation intensities “in excess of 50 r/hr were repeatedly measured near the discharge point when the boiler was in operation.” Just to put that into perspective: even by the relatively lax standards of the Manhattan Project, you would hit your yearly limit of acceptable radiation exposure if you spent about 45 minutes near the discharge point when the reactor was running. By the standards from the late 1950s onward, you would hit your yearly limit after only six minutes. (The committee recommended putting a fence around the area and looking into building a large smoke stack. Later work determined that the larger smoke stack improved things a bit, but did not ultimately solve the problem.)9
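
If you want to check that arithmetic yourself, here is a quick back-of-the-envelope sketch (my own arithmetic, not anything from the History), assuming the wartime limit of 0.1 r/day (roughly 37 r per year — see the footnotes), the later 5 r/year standard, and the 50 r/hr dose rate quoted above:

    # Rough time-to-yearly-limit at the water boiler discharge point,
    # assuming a dose rate of 50 r/hr (the figure quoted in the
    # Manhattan District History).
    DOSE_RATE_R_PER_HR = 50.0

    limits = {
        "wartime standard (0.1 r/day, ~37 r/yr)": 0.1 * 365,
        "late-1950s standard (5 r/yr)": 5.0,
    }

    for label, yearly_limit_r in limits.items():
        minutes = yearly_limit_r / DOSE_RATE_R_PER_HR * 60
        print(f"{label}: about {minutes:.0f} minutes to reach the yearly limit")

Running this gives roughly 44 minutes and 6 minutes, respectively — in line with the figures above.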

The "Water Boiler" reactor at Los Alamos — a neat scientific experiment, but watch where you put the exhaust port. Source: Los Alamos Archives (12784), via Galison 1998.

The “Water Boiler” reactor at Los Alamos — a neat scientific experiment, but watch where you put the exhaust port. Source: Los Alamos Archives (12784), via Galison 1998.

Did these cavalier radiation exposures have long-term consequences for the scientists? (Other, of course, than the two who actually died, or the few people whose acute radiation exposures were so high that they produced obvious physical damage.) Remarkably, very little follow-up seems to have been done. It takes work to know whether there are hazards, and it takes even more work (longitudinal studies, epidemiological work, etc.) to see whether there have been health effects. Radiation-induced cancers are probabilistic; exposure to radiation just increases the chance of a cancer, it doesn’t guarantee it. Epidemiological studies, like the ones done on the Japanese who survived the attacks on Hiroshima and Nagasaki, look for the statistical excesses, the cancers beyond what you would expect to naturally occur in a given population. This apparently was never done for Manhattan Project employees. There are many anecdotes about exposed employees developing debilitating health effects, but little hard science — not because the exposures or consequences didn’t happen, but because apparently nobody did the studies necessary to establish their existence.10

Why wouldn’t the Manhattan Project or Atomic Energy Commission officials follow up on this question? Two interrelated and non-exclusive hypotheses immediately spring to mind. One is that they were genuinely rather sanguine about the effects of radiation at low exposures. Their standards for “low exposures” were considerably higher than ours are today, and the requirements of war didn’t encourage them to adopt the precautionary principle, to say the least. The second is that there were legal stakes involved. They were eager, especially in the postwar years, to avoid claims of radiation damage from former employees. Partly one can see in this the attitude of the bureaucrat who believes they are protecting the government’s interests (at the expense of labor’s); partly this is another reflection of the aforementioned sanguinity regarding radiation exposure (they legitimately believed the claims were probably false, or at least not provable). Following the community of scientists, technicians, and laborers after they had left the laboratory would have been difficult. And what if they had found higher-than-normal rates of injury and death? Better not to look at all, from that standpoint.11

  1. One of the key factors in designing an actual atomic bomb is holding together the reacting mass as long as possible. Without that, once enough energy has been released to separate the reacting material, the reaction will stop. So a chain-reacting critical assembly ought not release more than a few pounds of TNT worth of explosive power — but it would release an awful lot of radiation in the immediate area. []
  2. On Feynman and Frisch, and Frisch’s earlier experiments, see Richard Rhodes, The Making of the Atomic Bomb (Simon and Schuster, 1986): 610-611. The description of “dragon” and its dangers in this paragraph comes from Manhattan District History, Book VIII (Los Alamos Project), Volume 2 (Technical), 15.7. For an example of the size of the explosion, consider the effect of the accidental criticality excursion on another such device, “Godiva.” []
  3. Manhattan District History, Book VIII (Los Alamos Project), Volume 2 (Technical), 15.8. The “dragon” experiment had one criticality “excursion” of note, when towards the end of a series of experiments of increasing power, a burst of 6 × 10^15 fission reactions occurred, blistering and swelling the cubes that composed the assembly. No one was exposed and there was no contamination, but it got put into a criticality accident report. United States Atomic Energy Commission, Operational accidents and radiation exposure experience within the United States Atomic Energy Commission (Washington, DC: Atomic Energy Commission, Division of Operational Safety, 1975), 38. []
  4. The 0.1 roentgens per day (so around 37 r per year) standard for whole-body exposure was adopted by the United States in 1934. By 1946, the US had dropped the standard to half that amount. By the late 1950s, the standard for permissible radiation exposure had dropped to around 5 r per year, where it remains for people who work in nuclear settings (the standard for the general public is lower). Note that in the 1940s the roentgen unit gave way to the rem, and exposure is now measured in sieverts, but they are pretty easy to convert (~1 r = 1 rem = 10 mSv). See George T. Mazuzan and J. Samuel Walker, Controlling the Atom: The Beginnings of Nuclear Regulation 1946-1962 (Washington, DC: Nuclear Regulatory Commission, 1997), 35, 39, and 54. On Manhattan Project standards, see Vincent C. Jones, Manhattan: The Army and the Atomic Bomb (Washington, DC: Center of Military History, United States Army, 1985), 419, and Barton C. Hacker, The Dragon’s Tail: Radiation Safety in the Manhattan Project, 1942-1946 (Berkeley: University of California Press, 1987). Separately, it is of interest that the “Radium Girls” connection was not just an oblique one: scientists from Los Alamos, Chicago, and Oak Ridge visited a luminous (radium) paint company in Boston to learn how they dealt with radiation hazards in industry, and adapted their techniques to the problems of dealing with plutonium. Manhattan District History, Book VIII (Los Alamos Project), Volume 2 (Technical), 3.95. []
  5. Manhattan District History, Book VIII (Los Alamos Project), Volume 2 (Technical), 15.10-15.11. The accident in question took place in June 1945 and involved 35.4 kg of 83% enriched uranium cubes. United States Atomic Energy Commission, Operational accidents and radiation exposure experience within the United States Atomic Energy Commission (Washington, DC: Atomic Energy Commission, Division of Operational Safety, 1975), 37-38. []
  6. Manhattan District History, Book VIII (Los Alamos Project), Volume 2 (Technical), 9.34. []
  7. Manhattan District History, Book VIII (Los Alamos Project), Volume 2 (Technical), 9.34. []
  8. Manhattan District History, Book VIII (Los Alamos Project), Volume 2 (Technical), 6.60. []
  9. Manhattan District History, Book VIII (Los Alamos Project), Volume 2 (Technical), Supplement, 2.85. []
  10. There have been some very small-sample studies of very specific cohorts from this period, but nothing of the sort one might imagine would exist. []
  11. Gabrielle Hecht’s Being Nuclear: Africans and the Global Uranium Trade (Cambridge, Mass.: MIT Press, 2012) emphasizes, in the case of exposures from uranium mining in Africa, that the easiest way to avoid worrying about radiation exposures is not to measure them, not to do the work that makes them “exist” as observable scientific facts. []

How to die at Los Alamos

Friday, February 13th, 2015

The people who ran the Manhattan Project worried about a lot of different things. Usually when we talk about this, it’s a story about the Germans, or the Japanese, or the physics, or other very specific things of that nature. But they also worried about banal things, like occupational safety: reducing the number of people injured, or killed, as part of doing their job.

Around half of the 500,000 or so people employed by the Manhattan Project worked in construction. As a result, most of the injuries and fatalities associated with making the bomb were of a banal, construction-related variety. Heavy machinery, ditches, collapsing buildings — these were the most dangerous parts of the project for those who made it. Occasionally there were more exotic threats. Criticality accidents took the lives of two scientists in the immediate postwar, as is well known. Criticality excursions at the plants used to enrich uranium were a non-trivial concern. And there were other, more unusual ways to die, as you would expect from any body of people that large, working over so great an area, especially when they were concentrated in places that were, for much of this period, constant construction sites, as were Los Alamos, Oak Ridge, and Hanford.

“Exhibit 14: FATAL ACCIDENTS: Since the inception of the Project in the Spring of 1943, until September 1946, twenty-four (24) fatal accidents have occurred. The following history of these incidents was taken from hospital records, reports of investigation boards, and the safety division files.”

Some time ago I happened upon a list of all of the fatal accidents that occurred at Los Alamos between its inception in 1943 through September 1946. There were exactly twenty-four, an even two-dozen ways to die while working at an isolated nuclear weapons laboratory. I reprint them here, not only because there is a morbid fascination with this sort of thing, but because I’ve found that this list gives a really remarkable summary of the people of Los Alamos, the hazards of Los Alamos, and the work that goes into making a bomb, which requires much more than star physicists to pull off successfully. Each death was followed by an inquiry.

My summaries are below; the original document (linked to at the end of this post) contains more details on some of them. The copy of the document I have is very hard to read, so I may have gotten a few of the names wrong.

  1. Estevan Roches, bulldozer operator. Crushed by a rock in his tractor while trying to build an access road to Los Alamos, at night. Died February 11, 1943.
  2. George H. Holtary, diesel motor mechanic. Was working on the power plant at Los Alamos, got crushed between a crankshaft and the housing. Died March 1, 1943.
  3. George J. Edwards, a soldier. Fell into a drainage ditch at night after drinking, injuring his back and puncturing his kidneys. Died July 19, 1943.
  4. Jose Montoya, construction laborer. Was digging an acid sewer ditch between “C” and “D” buildings. The 8-foot ditch was not reinforced and it collapsed on him. Died November 2, 1943. Investigation board recommended reinforcing ditches in the future.
  5. Pfc. Frederick Galbraith, military police. Was accidentally shot by another serviceman while sleeping. Another private was cleaning the gun and did not realize there was a live round in the chamber. The bullet caused a severe wound in Galbraith’s thigh; he died of shock on November 4, 1943.
  6. Efren Lovato, construction laborer. Lovato was in the back of a dump truck being used to transport laborers to lunch. The truck’s accelerator got stuck and it crashed into a car at the pass gate and overturned, killing Lovato and another laborer, on November 20, 1943. Investigation board recommended increasing the size of the motor pool so the vehicles could be inspected more regularly.
  7. Fridon Virgil, construction laborer. Killed in the same accident as previous.
  8. Fred Wolcott, contractor engaged to clear woods near the site. Attached a bulldozer to a tree and tried to pull it out. The tree snapped and fell on him. Witnesses say he appeared to be “frozen” to the seat of his tractor. Died May 9, 1944.
  9. Elmer R. Bowen, Jr., age 10 and a half. With a friend, was using a canoe from the former Los Alamos Ranch School in the main pond. His canoe capsized; neither he nor his friend could swim, and he drowned on July 1, 1944. He was the son of a maintenance mechanic, one who remained at Los Alamos for several decades after the war, until his retirement. Canoeing was prohibited after his death.
  10. Ernesto Freques, truck driver. He was standing next to a pile of reinforcing steel, unaware that workers on top were trying to move pieces and having difficulty because the steel was bent. The pile of steel collapsed on him; he was pinned against the truck, his heart lacerated. Died on July 6, 1944.
  11. Horace Russell, Jr., a research chemist, age 26. Fell from a horse while riding it in a canyon near the project. Suffered a serious head injury. Died August 5, 1944. The first of only four scientists on this list.
  12. Pfc. Hugo B. Kivsto, a member of the Provisional Engineer Detachment. Was fatally injured while driving an Army vehicle on the poorly graded surface of a dirt road near Santa Cruz, New Mexico. Lost control of the vehicle while rounding a hazardous curve. Tried to jump clear of the truck as it went over the embankment and was pinned under it. Died on December 3, 1944.
  13. Pvt. Grover C. Atwell, member of Special Engineer Detachment. Assigned to hospital ward duty, died of an overdose of barbiturates taken from the hospital pharmacy. He died on July 21, 1945, but his body was not found until August 22, 1945. The report does not elaborate on why there was such a delay in finding his body. The investigation concluded he was “depressed over his assignment,” with no indication of financial or family difficulties. Declared mentally irresponsible for his death, and thus his “death was in the line of duty and not a result of his own misconduct.”
  14. James W. Popplewell, civilian carpenter. Was working inside a building on August 7, 1945, at the same time a caterpillar tractor was pushing dirt over the roof. The roof collapsed and both tractor and dirt crushed Popplewell. Investigation blamed the foreman for not checking whether the building could support the load of the dirt and the tractor; the foreman was recommended for termination. This is a rare case of any liability being found.
  15. Harry Daghlian, physicist, age 24. Criticality accident with the so-called “demon core.” Report notes he “was exposed to too great radiation” on August 21, and died on September 15, 1945. The report carries no further information on him and says that Health Physics is still investigating the matter. Second of the four scientists.
  16. Asa Houghton, civilian carpenter. Was driving his truck down the hill from the project towards Santa Fe when the front wheels locked, causing the vehicle to run off the left side of the road and roll five or six times. Died of internal injuries on September 27, 1945.
  17. Manuel Salazar, janitor. With three friends (also janitors), got extremely drunk on muscatel wine mixed with ethylene glycol (antifreeze). Died from ethylene glycol poisoning on January 29, 1946. Because the deaths were not a result of duty, descendants received no compensation benefits.
  18. Alberto Roybal, janitor. Same event as above, same death date.
  19. Pedro Baca, janitor. Same event as above, same death date.
  20. Levi W. Cain, civilian blacksmith. Struck by a car driven by a military sergeant on site. The sergeant was absolved of blame; the visibility was low, but the car was not being driven at an excessive speed. Cain died on February 6, 1946.
  21. Louis Slotin, physicist, age 35. Criticality accident with the same core that killed Daghlian. While making measurements, “was exposed to radiation from radioactive materials” to a fatal degree. Third of the four scientists. Died on May 21, 1946. After Slotin’s death, criticality experiments were effectively put on hold until new safety guidelines could be devised.
  22. Livie R. Aguilar, truck driver for Zia Company. For reasons that were unknown (there were no witnesses or obvious evidence), his truck left the road and turned over into a trench, pinning Aguilar beneath it. He died on July 1, 1946.
  23. Joshua I. Schwartz, a scientist, age 21. With two other scientists (Robert A. Huffhines and William E. Bibbs), he was engaged in an experiment to trace air currents in Omega Canyon. They were instructed to use balloons or other non-flammable equipment for this. Instead, they tried to use smudge pots (smoke bombs). One of the smudge pots exploded, fatally injuring Schwartz and critically injuring his companions, both of whom were permanently blinded. Schwartz died on August 2, 1946. The investigation faulted their supervisors for inadequate supervision. This resulted in at least one lawsuit over compensation. The fourth of four scientists.
  24. Herbert Schwaner, construction laborer. He was driving a bulldozer up a ramp when one of the treads locked, causing it to topple. He was pinned underneath and found dead five minutes later by his brother. He died on August 7, 1946.

It’s quite a list. Here is a copy of the original report, if you want more details on any of the above.1

Los Alamos population estimates, 1943-1946. For a more detailed breakdown of civilian duties, see this payroll census. The big dip in 1943 seems to be something about reshuffling how construction labor was accounted for when the University of California took over.

Construction dominates, but automobiles, recreational mishaps, and scientific experiments make their appearance. As does suicide — one wonders what the report means by “depressed over his assignment” for the soldier at the hospital. The presence of a child reminds us that families lived at this secret laboratory — by the end of the war there were some 1,500 “dependents,” many of them children, living at the project site.

The Hispanic and/or Indian names point towards Los Alamos’ location. On the list of properties near the site that were seized by the Army (via condemnation), there are many Roybals, Montoyas, and Gomezes. In the list of Los Alamos badges, there are many Bacas, Virgils, Montoyas, and a Salazar.2  These are the people who lived there first, often written out of the more popular narratives of scientific triumph.

Even on the question of scientists, I was surprised to find two names I had not seen before: Russell and Schwartz. Both were young. Russell’s death casts a grim pall over all of that footage of scientists riding around in the woods on horses. Schwartz’s death is also a reminder of how much responsibility was thrust onto the young scientists — though frankly, it is maybe surprising that more people did not die this way, given the haste with which they worked and the toxicity, flammability, and radioactivity of the substances they were using.

Excerpt from a guide produced by the Oak Ridge Safety program.

Both Oak Ridge and Hanford had major industrial and public safety programs during the war. This was not just a matter of responsibility (though there was that), but also because industrial accidents caused lost-time problems. The more accidents, the longer it would be until they had an atomic bomb ready to use. At Oak Ridge and Hanford, they claimed an exceptional occupational safety record — their injury rates were (they claimed) 62% below those of private industry. That still translated into 62 fatalities between 1943 and 1945 at the two sites, and 3,879 disabling injuries. Given that those sites employed some 500,000 people between them, that means your chance of dying there was about one in ten thousand, while your chance of getting a disabling injury was more like one in a hundred.
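
As a rough sanity check on those odds, here is a quick sketch of the arithmetic, using only the figures quoted above (62 fatalities, 3,879 disabling injuries, roughly 500,000 workers):

    # Rough odds implied by the wartime safety figures quoted above:
    # 62 fatalities and 3,879 disabling injuries among ~500,000 workers
    # at Oak Ridge and Hanford, 1943-1945.
    workers = 500_000
    fatalities = 62
    disabling_injuries = 3_879

    print(f"Chance of dying:              about 1 in {workers // fatalities:,}")
    print(f"Chance of a disabling injury: about 1 in {workers // disabling_injuries:,}")

That works out to roughly 1 in 8,000 and 1 in 130 — on the order of the “one in ten thousand” and “one in a hundred” figures above.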

Sometimes it takes a raw document like this, something a little off the beaten path, to get you out of the well-worn narratives of this history. One knows of the criticality accidents, because they are unusual, and they are famous. But who knew of the child drowning? The janitor’s night out gone wrong? The carpenter crushed by a bulldozer? The accidental shooting of a bunkmate? From these little details, grim as they are, a whole social ecosystem emerges. It doesn’t have to supplant the traditional scientific story, which is still an important one. But it augments it, and makes it more human.

  1. Exhibit 14, “Fatal Accidents,” (ca. late 1946) in Los Alamos Project Y, Book II: Army Organization, Administration, and Operation, copy in Manhattan Project: Official history and documents [microform] (Washington, DC: University Publications of America, 1977), reel 12. []
  2. Interestingly, I have found no badges in the list that obviously correspond to the people who died, with the exception of Elmer Bowen, Sr., the father of the little boy, and a few people who may be wives or relatives. There is a “Joe Montoya” but this seems like a common name. I wonder if this is because part of the procedure upon death would be to destroy their security passes? Obviously not everyone would have a security pass, but it is a little unusual to have exactly zero hits, including Daghlian, Slotin, Schwartz, and Russell, the scientists. []

Accidents and the bomb

Friday, April 18th, 2014

When I first heard that Eric Schlosser, the investigative journalist, was writing a book on nuclear weapons accidents, I have to admit that I was pretty suspicious. I really enjoyed Fast Food Nation when it came out a decade ago. It was one of those books that never quite leaves you. The fact that the smell of McDonald’s French fries was deliberately engineered by food chemists to be maximally appealing, something I learned from Schlosser’s book, comes to mind whenever I smell any French fries. But nuclear weapons are not French fries. When writing about them, it is extremely easy to fall into either an exaggerated alarmism or a naïve acceptance of reassuring official accounts. In my own work, I’m always trying to sort out the truth of the matter, which is usually somewhere in between these two extremes.

Schlosser - Command and Control book

This is especially the case when talking about nuclear weapons accidents — the many times during the Cold War when nuclear weapons were subjected to potentially dangerous circumstances, such as being set on fire, being accidentally dropped from a bomber, crashing with a bomber, having the missile they were attached to explode, and so on. The alarmist accounts generally inflate the danger of the accidents achieving a nuclear yield; the official accounts usually dismiss such danger entirely. There are also often contradictory official accounts — sometimes even the people with clearances can’t agree on whether the weapons in question were “armed” (that is, had intact fissile pits in them), whether the chance of detonation was low or high, and so on. I’ve always been pretty wary about the topic myself for this reason. Sorting out the truth seemed like it would require a lot of work that I wasn’t interested in doing.

Well, I’m happy to report that in his new book, Command and Control: Nuclear Weapons, the Damascus Accident, and the Illusion of Safety, Schlosser has done that work. I reviewed the book recently for Physics Today. You can read my PT review here, but the long and short of it is that I was really, really impressed with the book. And I’m not easily impressed by most works of nuclear weapons history, popular or academic. I’m not surprised it was a finalist for the Pulitzer Prize, either.

Titan II silo complex. There’s a lot going on in one of these. This, and all of the other Titan II images in this post, are from Chuck Penson’s wonderful, beautiful Titan II Handbook.

What I ask out of a new book is that it teach me something new — either novel facts or novel spins on things I already knew about. Schlosser’s book does both. He clearly did his homework, and it’s not really surprising it took him about a dissertation’s worth of time to write it. It’s not just a document dump of FOIA’d material, though. He really shines when contextualizing his new information, writing a very rich, synthetic history of nuclear weapons in the Cold War. So the new and the old are woven together in a really spectacular, unusually compelling fashion.

The book has two main threads. One is a very specific, moment-by-moment account of one accident. This is the so-called Damascus Accident, in which a Titan II missile near Damascus, Arkansas, exploded in its silo in 1980, resulting in one fatality. It’s not one of the “standard” accidents one hears about, like the 1961 Goldsboro bomb, the 1958 Tybee bomb, the 1968 Thule crash, or the 1966 Palomares accident. But Schlosser’s journalistic chops really came through here, as he tracked down a huge number of the people involved in the accident and used their memories, along with documentary records, to reconstruct exactly how one dropped socket — itself just an apparently innocuous, everyday sort of mistake — could lead to such explosive outcomes.

The other thread is a more historical one, looking at the history of nuclear weapons and in particular how the problem of command and control runs through it from the beginning. “Command and control” is one of those areas whose vastness I didn’t really appreciate until reading this book. Nominally it is just about making sure that you can use the weapons when you want to, but it also includes making sure that nobody is going to use the weapons when you don’t want them to, and that the weapons themselves aren’t going to do anything terrible accidentally. And this makes it mind-bogglingly complex. It gets into details about communication systems, weapons designs, delivery system designs, nuclear strategy, screening procedures, security procedures, accident avoidance, and so much more.

How do you service a Titan II? Very carefully. This is a RFHCO suit, required for being around the toxic fuel and oxidizer. Not the most comfortable of outfits. From Penson’s Titan II Handbook.

Schlosser weaves this all together wonderfully. I found very few statements, technical or otherwise, that struck me as genuine outright errors.1 Of course, there are places where there can be differences of interpretation, but there always are. This is pretty good for any book of this length and scope — there are many academic books that I’ve read that had more technical errors than this one.

What I found really wonderful, though, is that Schlosser also managed to give a compelling explanation for the contradictory official accident accounts that I mentioned before. It’s so simple that I don’t know why it never occurred to me before: the people concerned with nuclear weapon safety were not the same people who were in charge of the weapons. That is, the engineers at Sandia who were charged with nuclear safety and surety were institutionally quite remote from the Air Force people who handled the weapons. The Air Force brass believed the weapons were safe and that to suggest otherwise was just civilian hogwash. The engineers who got into the guts of the weapons knew that it was a more complicated story. And they didn’t communicate well — sometimes by design. After a while the Air Force stopped telling the Sandia engineers about all of the accidents, and so misinformation became rampant even within the classified system.

The fate of the world in a few punched holes. Penson: “Targeting information was stored on Mylar-backed punched paper tape. Though primitive by today’s standards, punched paper tape will retain data decades longer than magnetic tapes or CDs. This tape is somewhat worse for wear from 20 years of museum use, but probably would still work.”

We usually talk about nuclear weapons safety as a question of whether they are “one-point safe.” That is, would the weapon have a chance of producing a nuclear yield if one point on the chemical explosives surrounding the fission pit were detonated inadvertently? Most of the time the answer is no, of course not. Implosion requires a very high degree of detonation symmetry — that’s why it’s hard to make work. So a one-point detonation of the explosive lenses will produce a fizzle, spreading plutonium or uranium like a “dirty bomb” but not producing a supercritical chain reaction.

But some of the time, the answer is, “well, maybe.” We usually think of implosions as complex affairs, but some weapons only require two-point implosion to begin with. So now you’re no longer talking about the possibility that one out of 36 explosive lenses will go off; you’re talking about one out of two. This isn’t to say that such weapons aren’t one-point safe, just to point out that weapons design isn’t limited to the sorts of things present in the first implosion weapons.

But even this doesn’t really get at the real problem here. “One-point safe” is indeed an important part of the safety question, but not the only one. Consider, for example, what would happen if the firing signal were nothing more than a certain amount of DC electrical current. Now imagine that during a fire, the firing circuit board’s soldering melts and a short circuit forms between the batteries and the firing switch. Now the bomb is actually trying to set itself off as if it had been deliberately dropped — and full implosion, with nuclear yield, is totally possible.

The injector plate of a Titan II. I thought the somewhat abstract pattern of holes and corrosion on the recovered plate made for a beautiful image. The diagram at left shows you what you are looking at — this is where fuel and oxidizer would come together, propelling the missile.

How likely is this kind of electrically-activated nuke scenario? What the Sandia engineers discovered was that in some weapons it was really not implausible at all. Under the “abnormal environment” of a weapons accident (such as a crashing or burning B-52), all sorts of crazy things could happen with electronic circuits. And unless they were really carefully designed for the possibility of this kind of accident, they could arm themselves and fire themselves. Which is the kind of thing you’d expect an engineer who is deeply connected with the electrical technology of the bomb to conclude.

And of course, as Schlosser (and his engineer sources) point out — this kind of thing is only one small detail in the broad, broad question of nuclear safety. These systems are big, complex, and non-linear. And so much hinges on them working correctly.

The sociologist of science Donald MacKenzie has proposed (in a slightly different context — nuclear weapons accuracy, not safety) that a “certainty trough” exists with regard to complex questions of technological uncertainty. He draws it somewhat like this:2

MacKenzie's Certainty Trough

So this divides people into three groups. On the left are the people who actually build the technology and the knowledge. These people have reasonably high levels of uncertainty about the technology in question — they know the nuts and bolts of how it works and how it could go wrong. (I’ve added “confidence” as a label because I find it more straightforward than “uncertainty” at times.) They also know what kinds of failure situations are not likely. In the middle, you have people who are completely committed to the technology in question. These people aren’t completely divorced from solid knowledge about it, but they are just consumers of knowledge. They look at the final data, but they don’t really know how the data was made (and all of the uncertainty that gets screened out to make the final version of the data). They have very low uncertainty, and so very high confidence in the technology. At far right you have the people who are either total outsiders, or people who are totally committed to another approach. These have the highest levels of uncertainty and the lowest levels of confidence.

So if we were mapping Schlosser’s actors onto these categories, we’d have the Sandia engineers and other weapons scientists on the far left. They know what can go wrong, they know the limits of their knowledge. They also know which accident situations are outlandish. In the middle we have the military brass and even the military handlers of the weapons. They are committed to the weapons. They have data saying the weapons are safe — but they don’t know how the data was made, or how it was filtered. They think the weapons are totally safe and that anyone who suggests otherwise is just ignorant or foolish. And lastly, at far right, we have total outsiders (the activists, perhaps, or sometimes even politicians), or people who really are looking to amplify the uncertainty for their own purposes.

Titan II Launch Control Center, with the facilities console at center. From Penson.

The disconnect between the far-left group and the middle group is the one that disturbs me the most in Schlosser’s account. It also reflects what I’ve seen in online discussions of weapons accidents. People with a little bit of knowledge — e.g., they know about one-point safety, or they once handled nukes in the military — have very high confidence in the safety issues. But they don’t know enough to realize that under the hood, things are more complicated and have been, in the past at least, much more dangerous. Not, perhaps, as dangerous as some of the more alarmist, outsider, activist accounts have stressed. But dangerous enough to seriously concern the people whose job it is to design the weapons — people who know the nuts and bolts of them.

Anyway. Schlosser’s book is a great read, as well. Which it needs to be, because it is long. But it’s also segregable. Don’t care much about the details of the Damascus accident? You can skip those sections and still get a lot out of the book (even though the Damascus accident is really a perfect account of all of the little things that can go wrong with complex, non-linear systems). But part of that length is a copious amount of endnotes, which I applaud him and his publisher for including. For a book like this, you can’t skimp on the documentation, and Schlosser doesn’t. The only thing he did skimp on was illustration, which I — as a pretty visual guy — thought was too bad. So much of the Damascus story takes place inside of a Titan II silo, and while the inner flap of the cover did have a simplified illustration of one, I still felt like I didn’t really know what was happening where at times. (I wonder if this was a trade-off with the publisher in having so many notes and pages.)

Chuck Penson’s Titan II Handbook, and one of its several amazing fold-out diagrams. Adorable pupper (Lyndon) included for scale.

Fortunately, there is a solution for this. If it were up to me, every copy of Schlosser’s book would be accompanied by a copy of Chuck Penson’s Titan II Handbook: A civilian’s guide to the most powerful ICBM America ever built. Penson’s book is a richly illustrated history of this particular missile, and contains lots of detailed photographs and accounts of daily life on a Titan II base (such as those seen above). It’s utterly fascinating and it gives so much visual life to what Schlosser describes. It also includes giant fold-out diagrams of the missiles themselves — the printing quality is really impressive all around. It includes fascinating technical details as well. For example, in the early days of the Titan II silos they had large motor-generators that ran constantly in case they needed to convert DC power into AC in the event of a failure of commercial power. Penson then notes that:

The motor-generator ran with a loud, monotonous high-pitched whine… noise in the [Launch Control Center] turned into a serious issue. Crew members complained of temporary hearing loss due not only to the incessant buzz of the motor-generator, but also to the constant drone of the air conditioners, fans and blowers in equipment. Eventually the Air Force covered the tile floor with carpeting, and acoustic batting was hung in the area of the stairway leading up to level 1 and down to level 3. … These changes made a tremendous improvement, but one that came too late for many of the crew, a significant number of whom now need hearing aids.

This kind of detail fits in perfectly with Schlosser’s approach to the facility, which itself seems strongly influenced by the sociologist Charles Perrow’s notion of “Normal Accidents.” That the devices in the facility would affect the hearing of the crew was certainly not something that anybody thought of ahead of time; it’s one of those little details that gets lost in the overall planning, but (at least for those who suffered the hearing loss) had real consequences. Ultimately this is the thesis of Schlosser’s book: that the infrastructure of nuclear command and control is much larger, much more complex, much more problematic than most people realize, and is one of those high-complexity, high-risk systems that human beings are notoriously pretty bad at managing.

If you’re the kind of person who geeks out on nuke history, both Schlosser’s and Penson’s books are must-reads, must-buys.

  1. The two biggest mistakes I noted, which I’ve told Schlosser about and may be fixed in the paperback, are that he misstates the size of the neutron initiator in the Fat Man bomb — he confuses the diameter for the radius — and he got the story of Szilard’s 1933 chain reaction work wrong, which lots of people do. Szilard’s patent is such a common source of misunderstanding even amongst scholars that I will be writing a blog post about it soon. Neither of these are terribly important to his argument or narrative. []
  2. Adapted from Donald MacKenzie, Inventing Accuracy: A Historical Sociology of Nuclear Missile Guidance (Cambridge, Mass.: MIT Press, 1990), figure 7.2. []

Castle Bravo at 60

Friday, February 28th, 2014

Tomorrow, March 1, 2014, is the 60th anniversary of the Castle Bravo nuclear test. I’ve written about it several times before, but I figured a discussion of why Bravo matters was always welcome. Bravo was the first test of a deliverable hydrogen bomb by the United States, proving that you could not only make nuclear weapons that had explosive yields a thousand times more powerful than the Hiroshima bomb, but that you could make them in small-enough packages that they could fit onto airplanes. It was what truly inaugurated the megaton age (more so than the first H-bomb test, Ivy Mike, which was explosively large but still in a bulky, experimental form). As a technical demonstration it would be historically important even if nothing else had happened.

One of the early Castle Bravo fallout contours showing accumulated doses. Source.

But nobody says something like that unless other things — terrible things — did happen. Two things went wrong. The first is that the bomb was even more explosive than the scientists thought it was going to be. Instead of 6 megatons of yield, it produced 15 megatons — two and a half times the prediction, an error that matters a great deal when you are talking about millions of tons of TNT. The technical error, in retrospect, reveals how incomplete their knowledge still was: the bomb contained two isotopes of lithium in the fusion component of the design, and the designers assumed only one of them would be reactive, but they were wrong. The second problem is that the wind changed. Instead of carrying the copious radioactive fallout that such a weapon would produce over the open ocean, where it would be relatively harmless, it carried it over inhabited atolls in the Marshall Islands. This necessitated evacuation and long-term health monitoring, and produced terrible long-term health outcomes for many of the people on those islands.

If it had just been natives who were exposed, the Atomic Energy Commission might have been able to keep things hushed up for a while — but it wasn’t. A Japanese fishing boat, ironically named the Fortunate Dragon, drifted into the fallout plume as well and returned home sick and with a cargo of radioactive tuna. One of the fishermen later died (whether that was because of the fallout exposure or because of the treatment regime is apparently still a controversial point). It became a major diplomatic incident with Japan, which resented once again having the distinction of having been irradiated by the United States, and it meant that Bravo became extremely public. Suddenly the United States was, for the first time, admitting it had the capability to make multi-megaton weapons. Suddenly it was having to release information about long-distance, long-term contamination. Suddenly fallout was in the public mind — and its popular culture manifestations (Godzilla, On the Beach) soon followed.

Map showing points (X) where contaminated fish were caught or where the sea was found to be unusually radioactive, following the Castle Bravo nuclear test. This sort of thing gets public attention.

But it’s not just the public who started thinking about fallout differently. The Atomic Energy Commission wasn’t new to the idea of fallout — they had measured the plume from the Trinity test in 1945, and knew that ground bursts produced radioactive debris.

So you’d think that they’d have made lots of fallout studies prior to Castle. I had thought about producing some kind of map with all of the various fallout plumes through the 1950s superimposed on it, but it turned out to be harder than I thought — there are just a lot fewer mapped fallout plumes prior to Bravo than you might expect. Why? Because prior to Bravo, they generally did not map downwind fallout plumes for shots in the Marshall Islands — they only mapped upwind plumes. So you get results like this for Ivy Mike, a very “dirty” 10.4 megaton explosion that did produce copious fallout, but you’d never know it from this map:

Fallout from the 1952 "Ivy Mike" shot of the first hydrogen bomb. Note that this is actually the "back" of the fallout plume (the wind was blowing it north over open sea), and they didn't have any kind of radiological monitoring set up to see how far it went. As a result, this makes it look far more local than it was in reality. This is from a report I had originally found in the Marshall Islands database.

To make it even more clear what you’re looking at here: the wind in this shot was blowing north — so most of the fallout went north. But they only mapped the fallout that went south, a tiny amount of the total fallout. So it looks much, much more contained than it was in reality. You want to shake these guys, retrospectively.

It’s not that they didn’t know that fallout went further downwind. They had mapped the Trinity test’s long-range fallout in some detail, and beginning with Operation Buster (1951) they had started mapping downwind plumes for lots of tests that took place at the Nevada Test Site. But for ocean shots, they didn’t get their logistics together, because, you know, the ocean is big. Such is one of the terrible ironies of Bravo: we know its downwind fallout plume well because it went over (inhabited) land, and otherwise they probably wouldn’t have bothered measuring it.

The publicity given to Bravo meant that its fallout plume got wide, wide dissemination — unlike the Trinity test’s plume, unlike the other ones they were creating. In fact, as I mentioned before, there were a few “competing” drawings of the fallout cloud circulating internally, because fallout extrapolation is non-trivially difficult:

BRAVO fallout contours produced by the AFSWP, NRDL, and RAND Corp. Source.

But once these sorts of things were part of the public discourse, it was easy to start imposing them onto other contexts beyond islands in the Pacific Ocean. They were superimposed on the Eastern Seaboard, of course. They became a stock trope for talking about what nuclear war was going to do to the country if it happened. The term “fallout,” which was not used even by the government scientists as a noun until around 1948,1 suddenly took off in popular usage:

Google Ngram chart of the usage of the word “fallout” in English language books and periodicals. Source.

The significance of fallout is that it threatens and contaminates vast areas — far more vast than the areas immediately affected by the bombs themselves. It means that even a large-scale nuclear attack that tries to only threaten military sites is also going to do both short-term and long-term damage to civilian populations. (As if anyone really considered just attacking military sites, though; everything I have read suggests that this kind of counter-force strategy was never implemented by the US government even if it was talked about.)

It meant that there was little escaping the consequences of a large nuclear exchange. Sure, there are a few blank areas on maps like this one, but think of all the people, all the cities, all the industries that are within the blackened areas of the map:

Oak Ridge National Laboratory estimate of “accumulated 14-day fallout dose patterns from a hypothetical attack on the United States,” 1986. I would note that these are very high exposures and I’m a little skeptical of them, but in any case, it represents the kind of messages that were being given on this issue. Source.

Bravo inaugurated a new awareness of nuclear danger, and arguably, a new era of actual danger itself, when the weapons got big, radiologically “dirty,” and contaminating. Today they are much smaller, though still dirty and contaminating.

I can’t help but feel, though, that while transposing Bravo-like fallout patterns onto other countries is a good way to get a sense of their size and importance, it still misses something. I recently saw this video, which Scott Carson posted to his Twitter account, of a young Marshallese woman eloquently expressing her rage about the contamination of her homeland, at the fact that people were more concerned about the exposure of goats and pigs to nuclear effects than they were about the islanders:

I’ve spent a lot of time looking at the reports of the long-term health effects on the Marshallese people. It is always presented as a cold, hard science — sometimes even as a “benefit” to the people exposed (hey, they got free health care for life). Here’s how the accident was initially discussed in a closed session of the Congressional Joint Committee on Atomic Energy, for example:

Chairman Cole: “I understand even after they [the natives of Rongelap] are taken back you plan to have medical people in attendance.”

Dr. Bugher: “I think we will have to have a continuing study program for an indefinite time.”

Rep. James Van Zandt: “The natives ought to benefit — they got a couple of good baths.”

Which is a pretty sick way to talk about an accident like this, even if all of the facts weren’t in yet. Even for a classified hearing.

What’s the legacy of Bravo, then? For most of us, it was a portent of dangers to come, a peek into the dark dealings that the arms race was developing. But for the people on those islands, it meant that “the Marshall Islands” would always be followed by “where the United States tested 67 nuclear weapons” and a terrible story about technical hubris, radioactive contamination, and long-term health problems. I imagine that people from these islands and people who grew up near Chernobyl probably have similar, terrible conversations.

A medical inspection of a Marshallese woman by an American doctor. “Project 4,” the biomedical effects program of Operation Castle was initially planned to be concerned with “mainly neutron dosimetry with mice” but after the accident an additional group, Project 4.1, was added to study the long-term exposure effects in human beings — the Marshallese. Image source.

I get why the people who made and tested the bombs did what they did, what their priorities were, what they thought hung in the balance. But I also get why people would find their actions a terrible thing. I have seen people say, in a flip way, that there were “necessary sacrifices” for the security that the bomb is supposed to have brought the world. That may be so — though I think one should consult the “sacrifices” in question before passing that judgment. But however one thinks of it, one must acknowledge that the costs were high.

  1. William R. Kennedy, Jr., “Fallout Forecasting—1945 through 1962,” LA-10605-MS (March 1986), on 5. []

The final switch: Goldsboro, 1961

Friday, September 27th, 2013

The threat of nuclear weapons accidents isn’t a new one. Even in 1945, Los Alamos physicists sweated when contemplating all that could possibly go wrong with their bombs, if they went off at the wrong place or the wrong time. Or didn’t go off at all. That’s the bind, really: a nuclear state wants a weapon that always goes off exactly when you tell it to, and never goes off any other time. That’s a hard thing to guarantee, especially when the stakes are so high in both directions, and especially since these two requirements can be directly in tension.

Schlosser - Command and Control book

I recently heard Eric Schlosser give that elegant formulation at a talk he gave last week in support of the release of his new book, Command and Control: Nuclear Weapons, the Damascus Accident, and the Illusion of Safety. I haven’t had a chance to read the book yet (it’s currently en route), but I’m looking forward to it. I read Schlosser’s Fast Food Nation a decade (!) ago and found it completely eye-opening. But I went to his talk last week not sure what to expect. From McDonald’s to nuclear weapons accidents? Stranger things have happened, but I worried that maybe he would take the “easy” route with regard to the accidents, not bothering to learn the nitty-gritty technical details that let one talk about such things sensibly, or, at the very least, sensationalizing the findings. So I was pretty pleased to find that neither seemed to be the case. Schlosser has seriously done his homework, spending six years digging through records, FOIAing documents, and interviewing weapons designers. His discussion of the risks seemed right on the mark so far as I could tell — they don’t need to be exaggerated one bit to be perfectly horrifying. He answered questions expertly, even a tough, devil’s-advocate one from Hugh Gusterson. So I’ve been looking forward to reading the full book.

Last week, the Guardian released a new document, obtained by Schlosser through a FOIA request, regarding one particular accident, the 1961 crash of a B-52 near Goldsboro, North Carolina, which resulted in the jettisoning of two Mark-39 hydrogen bombs. The document in question is a government nuclear expert’s evaluation of a popular account of the Goldsboro accident, in which he finds some major errors (like overstating the yield of the bomb), but ultimately concludes that at least one of the bombs was, in fact, pretty damned close to accidental detonation: “one simple, dynamo-technology, low voltage switch stood between the United States and a major catastrophe … It would have been bad news – in spades.”

The bomb in question, stuck in the mud.

I’ve been watching how the above document has been discussed by people on the web. The most interesting response has been people saying, “I thought that bomb lacked a nuclear core?” You know that there have been too many nuclear weapons accidents when people start getting them confused with one another. The missing-bomb-that-maybe-lacked-a-core is the 1958 Tybee bomb, where a Mark-15 hydrogen bomb was lost near Savannah, Georgia. Different bomb, different day.

The other response I commonly saw was one that assumed that any such fears of a bomb going off accidentally were exaggerated. Now this is kind of an interesting response. For one thing, they’re discounting a contemporary, internal, once-classified evaluation made by a relevant expert. In exchange, they’re parroting either general skepticism at the idea that a nuclear weapon could technically be unsafe, or a standard line about how hard it is to set off an implosion bomb accidentally, because all of the lenses need to detonate at exactly the same time. Which is sometimes the right approach (though not all American bomb designs were “one-point safe” — that is, there were designs that ran a real risk of producing a nuclear yield even if just one of the explosive lenses accidentally fired), but in this case, it’s entirely irrelevant, for reasons I’ll explain below.

I’ve been in touch with Schlosser since the talk, and he shared with me a video he had (somehow) gotten his hands on, produced by Sandia National Laboratories (the weapons lab that specializes in making bombs go off at just the right moment), about the Goldsboro accident. He’s put it up on YouTube for me to share with you. It is only a few minutes long and worth the watch.

I love the CGI — “all the sudden, now that weapon system is free.” The bomb looks so… liberated. And the part at the end, where they talk about how they had plenty of opportunities for future data, because there were so many accidents, is wonderfully understated. But the stuff that really hits you in your gut is the description of exactly what happened:

“All of the sudden now that weapon system [the Mk-39] is free. As the weapon dropped, power was now coming on, and the arming rods were pulled, the baroswitches began to operate.1 The next thing on the timing sequence was for the parachute to deploy. When it hit the ground, it tried to fire.” “There was still one safety device that had not operated. And that one safety device was the pre-arming switch which is operated by a 28 volt signal.” “Some people could say, hey, the bomb worked exactly like designed. Others can say, all but one switch operated, and that one switch prevented the nuclear detonation.” “Unfortunately there had been some 30-some incidents where the ready-safe switch was operated inadvertently. We’re fortunate that the weapons involved at Goldsboro were not suffering from that same malady.”

What’s amazing about the above, in part, is that everything in quotation marks is coming from Sandia nuclear weapons safety engineers, not anti-nuclear activists on the Internet. This isn’t a movie made for public consumption (and I’ve been assured that it is not classified, in case you were wondering). It’s a film for internal consumption by a nuclear weapons laboratory. So it’s hard to not take this as authoritative, along with the other aforementioned document. Anyone who brushes aside such concerns as “hysterical” is going to have to contend with the fact that this is what the nuclear weapons designers tell themselves about this accident. Which is pretty disconcerting.

There are further details in another document sent to me by Schlosser, a previously classified review of nuclear weapons accidents from 1987, which clarifies that one of the reasons the Goldsboro bomb in particular came so close to detonating was the way it was tossed from the aircraft, which removed a horizontally-positioned arming pin. That is, the arming pin was supposed to be oriented so that it couldn’t be removed accidentally, but the violence of the aircraft’s breakup as it crashed was what armed the bomb in question. The other bomb, the one whose parachute didn’t deploy, just had its HE detonate on impact. From the 1987 review:

Before the accident, the manual arming pin in each of the bombs was in place. Although the pins required horizontal movement for extraction, they were both on a lanyard to allow the crew to pull them from the cockpit. During the breakup, the aircraft experienced structural distortion and torsion in the weapons bay sufficient to pull the pin from one of the bombs, thus arming the Bisch generator.2 The Bisch generator then provided internal power to the bomb when the pullout cable was extracted by the bomb falling from the weapons bay. The operation of the baroswitch arming system,3 parachute deployment, timer operation,4 low and high voltage thermal batteries activation, and delivery of the fire signal at the impact by the crush switch all followed as a natural consequence of the bomb falling free with an armed Bisch generator. The nonoperation of the cockpit-controlled ready-safe switch prevented nuclear detonation of the bomb. The other bomb, which free-fell, experienced HE detonation upon impact. One of the secondary subassemblies was not recovered.5

The secondary subassembly is the fusion component of the hydrogen bomb. Normally I would not be too concerned with a lost secondary in and of itself, because bad folks can’t do a whole lot with them, except that in this particular bomb, the secondary contained a significant amount of high-enriched uranium, and lost HEU is never a good thing. The government’s approach to this loss was to get an easement on the land in question that would stop anyone from digging there. Great…

Mk-39 ready-safe switch

From the video, I was also struck by the picture of the ready-safe switch then employed. I’d never seen one of these before. Presumably “S” means “safe” and “A” means “armed.” It looks ridiculously crude by modern standards, one little twirl away from being armed. This little electronic gizmo was all that stood between us and a four megaton detonation? That’s a wonderful thing to contemplate first thing in the morning. Even the later switches which they show look more crude than I’d prefer — but then again, probably all 1950s and 1960s technology is going to look crude to a modern denizen. And again, just to reiterate, we’re not talking about “merely” accidentally igniting the explosives on the primary bomb — we’re talking about the bomb actually sending a little electrical charge through the firing circuit saying “Fire!” and starting the regular, full-yield firing sequence, stopped only by this little gizmo. A little gizmo prone to accidentally firing, in some of the bombs.
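To make the structure of that chain concrete, here is a purely illustrative toy model in Python. The stage names follow the 1987 review quoted above, but the all-stages-plus-one-switch logic is my own simplification for illustration, not anything resembling the actual weapon electronics:

```python
# Toy illustration of the Goldsboro arming chain described in the 1987 review.
# Stage names follow that document; the logic is a deliberate simplification,
# not a model of real weapon circuitry.

STAGES_THAT_OPERATED = [
    "arming pin pulled (by the violence of the airframe breakup)",
    "Bisch generator armed",
    "pullout cable extracted as the bomb fell free",
    "baroswitch arming system operated",
    "parachute deployed",
    "timer operated",
    "low and high voltage thermal batteries activated",
    "crush switch delivered the fire signal on impact",
]

def full_yield_detonation(all_stages_operated: bool, ready_safe_armed: bool) -> bool:
    """Detonation required every stage in the chain AND the ready-safe switch."""
    return all_stages_operated and ready_safe_armed

for stage in STAGES_THAT_OPERATED:
    print("operated:", stage)

# At Goldsboro every stage operated; only the cockpit-controlled ready-safe
# switch (still set to "S") interrupted the sequence.
print("detonation:", full_yield_detonation(all_stages_operated=True,
                                           ready_safe_armed=False))  # -> False
```

The point of the toy is just the shape of the failure: a long series of safeties that all defeated themselves, leaving a single switch, of a type with a known history of inadvertent operation, as the last interlock.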

Lest you think that perhaps Sandia overstates it (which seems rather unlikely), take also the testimony of Secretary of Defense Robert McNamara into account. In January of 1963, McNamara explained at a meeting between the Defense and State Departments that he was opposed to Presidential pre-delegation of nuclear weapons in part because of the danger of accidental detonation — either ours or the Soviets’. In the meeting notes, posted some time back by the National Security Archive and forwarded to me by Schlosser, McNamara’s participation is listed as follows:

Mr. McNamara went on to describe the possibilities which existed for an accidental launch of a missile against the USSR. He pointed out that we were spending millions of dollars to reduce this problem to a minimum, but that we could not assure ourselves completely against such a contingency. Moreover he suggested that it was unlikely that the Soviets were spending as much as we were in attempting to narrow the limits of possible accidental launch. He went on to describe crashes of US aircraft[,] one in North Carolina and one in Texas, where, by the slightest margin of chance, literally the failure of two wires to cross, a nuclear explosion was averted.

This one’s interesting because it embeds these accidents in a context as well — the possibility of either us, or the Soviets, accidentally launching a nuke and wondering if a full-scale nuclear exchange has to follow. It’s not quite Strangelovian, since that would require a rogue commander, but it is very Fail-Safe.

As to what the Goldsboro blast would look like, the only time we tested this warhead at full yield was the shot “Cherokee” at Operation Redwing, in 1956. It was a pretty big boom, far more impressive than some of the Hiroshima shots that have been posted along with the Goldsboro story:


And, of course, you can use the NUKEMAP to chart the damage. I’ve added the W-39 warhead to the list of presets in NUKEMAP2, using 4 megatons as the yield (the tested yield was 3.8 megatons, though the W-39 is often stated as an even 4. I rounded up, just because quibbling over 200 kilotons seemed pointless), and a fission fraction of 55%.6 It’s a pretty big explosion, with a fallout plume capable of covering tens of thousands of square miles with hazardous levels of contamination (and nearly a thousand square miles with fatal levels). Note that the Cherokee test was a true airburst (the fireball didn’t touch the ground), and so didn’t generate any significant fallout. The Goldsboro bomb, however, was meant to operate on impact, as a surface burst, and would have created significant fallout.
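To put rough numbers on those choices, here is a back-of-the-envelope sketch. It is not how NUKEMAP itself computes anything; the cube-root scaling of blast distances is just the standard rule of thumb, and the fission fraction is Hansen’s estimate cited in the footnote:

```python
# Rough arithmetic behind the NUKEMAP preset discussed above (illustrative only).

total_yield_mt = 4.0        # yield used for the preset (rounded up)
tested_yield_mt = 3.8       # yield of the Redwing Cherokee test
fission_fraction = 0.55     # Hansen's estimate for Cherokee/Apache

# Fallout is driven by the fission yield, not the total yield:
fission_yield_mt = total_yield_mt * fission_fraction
print(f"Fission yield: ~{fission_yield_mt:.1f} Mt")  # ~2.2 Mt of fission products

# Blast distances scale roughly with the cube root of yield, so rounding
# 3.8 Mt up to 4.0 Mt changes blast radii by under 2 percent:
radius_ratio = (total_yield_mt / tested_yield_mt) ** (1 / 3)
print(f"Blast radius change from rounding up: ~{(radius_ratio - 1) * 100:.1f}%")
```

Which is the sense in which quibbling over 200 kilotons really is pointless: at these yields it barely moves the damage rings, while more than half the total energy would have come from fission, which is what makes the fallout plume so large.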

Again, one doesn’t have to exaggerate the risks to find it unsettling. The bomb didn’t go off; that final switch thankfully did work as intended. But that’s cold comfort, the more you learn about the accident. Our current nuclear weapons are much safer than the Mk-39 was, back in 1961, though Schlosser thinks (following the testimony of experts) that there are still some unsettling aspects of several of our weapons systems. If we are going to have nukes, he reasons, we should be willing to spend whatever it costs to make sure that they’ll be safe. That seems to me like an argument guaranteed to appeal to nobody in today’s political climate, with the left-sorts wanting no nukes and no modernization, and the right-sorts not really wanting to talk about safety issues. But I’ll get to that more another day, once I’ve read the book.

If that bomb had gone off, we’d speak of “Goldsboro” as a grim mnemonic, in the same way that we do “Chernobyl” today. One wonders how that would have changed our approach to nuclear weapons, had the final switch not held strong.

  1. The “arming rods” were pull-out switches that would activate when the weapon left the bomb bay. The baro(meter) switches were pressure sensitive switches that would make sure the bomb was nearing the appropriate height before starting the next sequence of arming. In the World War II bombs, the next stage in the sequence would be to consult radar altimeters to check the precise distance from the ground. The Goldsboro bombs were set to go off on ground impact. []
  2. A Bisch generator, as the context implies, is an electrical pulse generator. []
  3. Again, a pressure-sensitive switch that tried to guarantee that the bomb was roughly where it was supposed to be. []
  4. Timer switches were often used to make sure that the bomb cleared the aircraft before seriously starting to arm. []
  5. R.N. Brodie, “A Review of the US Nuclear Weapon Safety Program – 1945 to 1986,” SAND86-2955 [Extract] (February 1987). []
  6. Chuck Hansen, in his Swords of Armageddon, estimates that shots Cherokee and Apache of Operation Redwing had an average fission fraction of 55%, but isn’t able to get it any more precise than that. Given what I’ve read about the bomb — that it used an HEU secondary, for example — I would expect it to be at least 55%, if not more. It seems like a pretty “dirty” weapon, emphasizing a big yield in a relatively small package over any other features. See Chuck Hansen, Swords of Armageddon, V-224 (footnote 325). []