The most famous experiment conducted by Los Alamos during the Manhattan Project, after the Trinity test itself, is the one with the most evocative name. “Tickling the Dragon’s Tail,” also known internally as just “Dragon,” is straightforward about its meaning, compared to the enigma of “Trinity.” Dragons don’t like to have their tails tickled — so watch out for the fire.
The “dragon” moniker was coined by Richard Feynman (who else?) after he heard about it from fellow scientist Otto Frisch. It was one of a category of criticality experiments that Frisch (nephew of Lise Meitner, co-author of the famous Frisch-Peierls memorandum) was working on at Los Alamos. Criticality experiments were dangerous by design: they were attempts to experimentally determine the critical condition of different quantities, types, and geometries of fissile material. Because of the unknowns involved, all of these experiments involved pushing very close to the boundary of an uncontrolled fission chain reaction, an embryonic atomic bomb (or reactor) that, while probably not very explosive (it would likely destroy itself before too much energy was released), would create enough radioactivity to cause serious hazard to those working around the site.
The experiment Feynman dubbed “dragon” was what Frisch had called the “guillotine,” and was one of the more ambitious and dangerous of Frisch’s many criticality experiments. It involved dropping a slug of enriched uranium hydride through an almost-critical assembly of the same substance. Gravity alone would cause the two pieces to briefly form a critical mass — and then to briefly un-form, before too many fission reactions had occurred. If all worked as planned, the slug would release a burst of neutrons and then stop reacting. But if the slug got stuck in the critical configuration, it would release impressive amounts of radioactivity and potentially cause a (very small) explosion.
The experiments could produce upwards of 20 million watts of power, increasing the temperature of the fuel by 2 degrees C per millisecond. At their most daring, one burst of the experiment released 10¹⁵ neutrons. These experiments were, as the official, secret Manhattan District History notes, “of historical importance,” as they constituted “the first controlled nuclear reaction which was supercritical with prompt neutrons alone.” As far as I can tell, this particular “guillotine” was the original experiment that earned the nickname “dragon,” but the name has been applied to other, similarly close-to-critical experiments as well.
Criticality experiments were inherently dangerous. They didn’t have to kill you immediately to be a threat: it had been known since the days of the “Radium Girls” that radiation exposure could be cumulatively crippling. By the 1940s, experimental physicists had lost a bit of the “devil may care” air they had possessed in the early years of radioactivity, when you could spot an X-ray operator by his mangled hands. The Health Group at Los Alamos attempted to keep external radiation exposures within the national radiation standards of the time (0.1 roentgens per day), and optimistically hoped to keep internal exposures at zero. For the time, this was considered conservative, though by the late 1950s the standards for exposure had dropped by a factor of seven.
The first criticality accident at Los Alamos wasn’t a fatal one, but it did cause some trouble. The experiment was (ironically, or appropriately?) made in the name of safety: it was a question of what would happen if certain geometries and enrichments of uranium were submerged in water. For a weapon that was going to be deployed to the Pacific Ocean, this was not an idle danger — sink Little Boy in the ocean and it becomes a nuclear reactor, because, for enriched materials, regular “light” water acts as a neutron moderator, lowering the effective critical mass. The Manhattan District History outlines the experiment and its outcome:
A large amount of enriched uranium, surrounded by polythene, had been placed in a container in which water was being slowly admitted. The critical condition was reached sooner than expected, and before the water level could be sufficiently lowered the reaction became quite intense. No ill effects were felt by the men involved, although one lost a little of the hair on his head. The material was so radioactive for several days that experiments planned for those days had to be postponed. [emphasis added]
“Although one lost a little of the hair on his head” — one of those sentences one rarely runs across, especially without any further elaboration, that really sounds disturbing to the modern ear. There were other “minor” exposures too, noted briefly (and anonymously) in the Manhattan District History. Not all were related to criticality; some were related to other experiments, such as the “water boiler” and “power boiler” reactors (more on those in a second), and the RaLa (Radiolanthanum) implosion experiments:
Operation of the power boiler resulted in several instances of mild overexposure to radiation caused by leaks in the exhaust gas line and one serious exposure of several chemists during decontamination of active material. The implosion studies of the RaLa Group which used large amounts of radioactive barium and lanthanum brought a serious situation which the health group monitored closely. A series of accidents and equipment failures caused considerable overexposure of chemists in this group. This condition persisted about six months until the system of remote control operation was finally perfected.
Interestingly, the Health Group had “no responsibility” over the criticality experiments, “except that of being sure that the men were aware of the dangers involved.” The Manhattan District History notes that the criticality experiments were “especially dangerous” because “there is no absolute way of anticipating the dangers of any particular experiment, and the experiments seem so safe when properly carried out that they lead to a feeling of overconfidence on the part of the experimenter.” The author of this section of the History attributes to this overconfidence the death of Harry Daghlian, who died after accidentally creating a critical mass with a plutonium core. It also notes another accident where “four individuals” received an “acute exposure… to a large amount of radiation” during a similar experiment. The same core would lead to the death of another scientist, Louis Slotin (known for his nonchalance regarding the hazards), less than a year later.
Reading through the various exposures and radiation hazards in the Manhattan District History can be a bit spine-tingling, even if one tries to have a measured view of the threats of radiation. Radiation risks, of course, are more exciting to most of us than the dozens of other ways to die at Los Alamos during the war. Radiation is relatively exotic and mysterious — simultaneously invisible to our basic senses while very easy to track and follow with the right instruments. You can’t see it until you start looking for it, and then you can find it everywhere.
But even with that caveat, some of these reports are still pretty eyebrow raising. One example: The “water boiler” reactor was a small assembly of enriched uranium used as a neutron source at the laboratory. The scientists knew it presented radiation risks: the fuel inside the reactor would get fiendishly radioactive during and after operation, and if there was a small, inadvertent explosion, it could be a real contamination problem. So they (sensibly) isolated it from the rest of the laboratory, along with the criticality experiments.
But later study showed that they hadn’t quite solved the problem. Gaseous materials, including fission products, were being discharged “near the ground level at the tip of the mesa just to the south of Los Alamos Canyon.” This, the Manhattan District History notes, was “most unsatisfactory and represented a potential and serious health hazard.” They had warning signs, but they were “inadequate and the area was accessible to any casual visitor.” Radiation intensities “in excess of 50 r/hr were repeatedly measured near the discharge point when the boiler was in operation.” Just to put that into perspective, even by the relatively lax standards of the Manhattan Project, you would hit your yearly limit of acceptable radiation exposure if you spent about 45 minutes near the discharge point when the reactor was running. By the standards from the late 1950s onward, you would hit your yearly limit after only six minutes. (The committee recommended putting a fence around the area, and looking into building a large smokestack. Later work determined that the larger smokestack improved things a bit, but did not ultimately solve the problem.)
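The arithmetic behind those “45 minutes” and “six minutes” figures is easy to check. A minimal sketch, assuming the wartime tolerance of 0.1 roentgens per day (roughly 36.5 r per year) and postwar standards about seven times stricter, as described above:

```python
# Back-of-the-envelope dose arithmetic for the water boiler discharge point.
# Figures taken from the text: a 50 r/hr field at the discharge point,
# a wartime tolerance of 0.1 r/day, and later standards ~7x stricter.

FIELD_R_PER_HR = 50.0            # measured radiation intensity at the discharge point
WARTIME_LIMIT_R_PER_DAY = 0.1    # Manhattan Project-era tolerance dose

# Yearly limit under the wartime standard, and time to reach it in that field.
yearly_limit_r = WARTIME_LIMIT_R_PER_DAY * 365            # ~36.5 r/year
minutes_to_limit = yearly_limit_r / FIELD_R_PER_HR * 60   # ~44 minutes

# Late-1950s standard: roughly a factor of seven lower.
stricter_limit_r = yearly_limit_r / 7                     # ~5.2 r/year
minutes_to_stricter = stricter_limit_r / FIELD_R_PER_HR * 60  # ~6 minutes

print(f"Wartime yearly limit: {yearly_limit_r:.1f} r, "
      f"reached in about {minutes_to_limit:.0f} minutes")
print(f"Late-1950s yearly limit: {stricter_limit_r:.1f} r, "
      f"reached in about {minutes_to_stricter:.0f} minutes")
```

The computed values (about 44 minutes and about 6 minutes) line up with the rounded figures in the text.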
Did these cavalier radiation exposures have long-term consequences for the scientists? (Other, of course, than the two who actually died, or the few people whose acute radiation exposures were so high that they produced obvious physical damage.) Remarkably, very little follow-up seems to have been made. It takes work to know whether there are hazards, and it takes even more work (longitudinal studies, epidemiological work, etc.) to see whether there have been health effects. Radiation-induced cancers are probabilistic; exposure to radiation merely increases the chance of a cancer, it doesn’t guarantee one. Epidemiological studies, like the ones done on the Japanese who survived the attacks on Hiroshima and Nagasaki, look for the statistical excesses, the cancers beyond what you would expect to naturally occur in a given population. This apparently was never done for Manhattan Project employees. There are many anecdotes about exposed employees developing debilitating health effects, but little hard science — not because the exposures or consequences didn’t happen, but because apparently nobody did the studies necessary to establish their existence.
Why wouldn’t the Manhattan Project or Atomic Energy Commission officials follow up on this question? Two interrelated and non-exclusive hypotheses immediately spring to mind. One is that they were genuinely rather sanguine about the effects of radiation in low exposures. Their standards for “low exposures” were considerably higher than ours are today, and the requirements of war didn’t encourage them to adopt the precautionary principle, to say the least. The second is that there were legal stakes involved. They were eager, especially in the postwar, to avoid claims of radiation damage from former employees. Partially one can see in this the attitude of the bureaucrat who believes they are protecting the government’s interests (at the expense of labor’s), partially this is another reflection of the aforementioned sanguinity regarding radiation exposure (they legitimately believed the claims were probably false, or at least not provable). Following the community of scientists, technicians, and laborers after they had left the laboratory would have been difficult. And what if they had found higher-than-normal rates of injury and death? Better not to look at all, from that standpoint.