Redactions

Conant’s war: Inside the Mouse-Trap

by Alex Wellerstein, published January 17th, 2014

I’ve started teaching my “Science and the Cold War” course again, this time in the History Department at Georgetown University.1 The course starts with World War I and goes all the way through the early 1990s — quite a whirlwind tour of how science, technology, and the state got to be so seriously intermingled. On Tuesday I gave a lecture that forced me to go over some material I hadn’t thought about for awhile: what James B. Conant did during the war. No, not the war you’re probably thinking about.

James B. Conant (fourth from left) at a meeting with Uranium Committee principals, March 1940. Left to right: Ernest O. Lawrence, Arthur H. Compton, Vannevar Bush, Conant, Karl Compton, Alfred L. Loomis.

James B. Conant’s wartime work is usually thought of as being part of the Second World War, but what I’m interested in here is what he did during the First. During World War II, Conant was part of the scientist-administrator cabal that launched the National Defense Research Committee, the Office of Scientific Research and Development, and the Manhattan Project. He was Vannevar Bush’s right-hand man, an engaged, similarly-minded scientist who tried to take the long view of things. And as President of Harvard since 1933, he commanded a lot of academic clout. He was at the Trinity test. He and Bush bent Roosevelt’s ear about making the bomb, and later about trying to control it.

But Conant’s work during World War I is in some ways even more interesting, especially in that it gives an eerie prelude of things to come. I only learned about it while preparing for this class the first time around, reading James G. Hershberg’s authoritative biography, James B. Conant: Harvard to Hiroshima and the Making of the Nuclear Age (Knopf, 1993). Everything I know about Conant comes from Hershberg; if you’re interested in more, check out the book.

Conant longed to be a Harvard man. He got his B.A. there in 1914, and his Ph.D. in 1917, both in chemistry. He longed to stay. (He ended up marrying the daughter of one of the more senior professors there, potentially for careerist reasons, Hershberg hints.) But unlike many in the Yard, when war broke out in Europe, he tried to stay neutral — he brooked no anti-German sentiment, even after reports of German “atrocities” in Belgium, even after the use of chemical gas at Ypres in 1915, even after the Lusitania. Harvard itself became very politicized, mostly against the Germans.

Revenge of the Nerds: James Conant, 1921. Don’t let the “innocent geek” look fool you — the guy could cook up some nasty brews. Source: Harvard University Archives.

What Conant did realize, though, was that there might be money to be made. With the war came shortages of organic chemicals. With shortages came the possibility of profiteering for a chemist like Conant. So Conant and two of his college friends tried to create their own little “start-up” to manufacture several key, in-demand chemicals. They bought a “shack” in Queens, and set it up to produce benzoic acid (a food preservative). It promptly burned down. Undeterred, they rented at a new location in Newark — an abandoned slaughterhouse.

Conant then received a sudden offer to teach back at Harvard. He promptly raced back to Cambridge — this was what he really wanted more than anything else. His company in Newark (“Aromatic Chemical”) got set up without him. And on the first production day, in November 1916… the building exploded. The blast killed one of Conant’s college buddies and two of the staff they had hired. (The other college buddy was merely “blown off of a ladder” and had his face and eyes scorched by corrosive chemicals, leading to only temporary blindness.)

"WAS REALLY GREAT PLAYER."

Poor Stan Pennock — “WAS REALLY GREAT PLAYER,” but was not so great chemist. Boston Daily Globe, November 29, 1916.

The 23-year-old Conant felt terrible. He blamed himself for not helping set up the plant better. Conant the social-climber managed to have his name kept out of newspaper accounts, but his dabbling in war profiteering was over. At the same time, his dabbling in war was now beginning.

By 1917, Conant’s initial skepticism of the war had faded. Unrestricted submarine warfare, the Zimmermann telegram revelation, and no doubt the fact that US entry seemed unavoidable seem to have swayed his feelings. In late March 1917 he looked for a foothold into the war, even though he thought of himself as a pacifist. (His one major regret at the time was that it was threatening to derail his perfect Harvard career, right when he had gotten his foot in the door.) He ended up doing something he knew well — making chemicals. Nasty chemicals.

Fritz Haber at Ypres, 1915. Haber is the one pointing; chlorine gas vials sit before him.

Chlorine gas had been used first by the Germans at Ypres in 1915. Fritz Haber, one of the great chemists of the 20th century, personally oversaw the first use. It killed a lot of Frenchmen, but didn’t gain the Germans any ground, since the German troops were not exactly eager to march into trenches where gas still lingered. Still, the propaganda effect was huge — and the outcry even huger. The French and the British went from protesting the German use to developing gas masks and offensive chemicals of their own. The number of agents rapidly grew, from chlorine to phosgene, and from that to mustard gas. The gas didn’t end up giving anyone a major tactical advantage, though — it just became another way to make war hell.

The US was late to the chemical game, just as it was late to the war. Even though gas warfare had become a major component of the war after 1915, the US government made only feeble efforts to reach out to chemists on the issue. By the time they entered the war in 1917, they still had no gas masks, no offensive gases of their own, and no training of troops in gas procedures. They sent out an emergency plea to chemists, and to the American Chemical Society, to get them up to speed.

Mustard gas, the most noxious of the German gases, is what pushed Conant towards chemical warfare more than anything else. He talked to a colleague at MIT, who set him up at American University, in Washington, DC, as a group leader for the sprawling American chemical weapons effort. At American University, there were some 60 campus buildings dedicated to chemical weapons work, employing some 1,700 chemists and testing some 1,600 compounds on animals. In September 1917, Conant became the head of Organic Research Unit #1. His job was to make the US capable of mustard gas production — within a year it was producing 30 tons a day. Conant was hardly alone in this — it seems that practically the entire Harvard chemistry department got involved in the effort. Conant himself received a lieutenant’s commission for the job, though he later remarked: “We were not soldiers. We were chemists dressed as officers.”

British football/soccer team in gas masks, 1916.

Conant drove his team hard, and was noticed for it. He moved from mustard gas to a new assignment — a nasty chemical called Lewisite, an arsenic-based compound that Harper’s Monthly advertised as some 72 times more deadly than any other gas developed during the war (modern classifications seem to put it at only about three times deadlier than mustard gas2). Unlike mustard gas, though, its effects were acute and it dissipated quickly, which allowed it to be considered for offensive maneuvers.

An article in Harper’s Monthly from 19193 has one of the more florid descriptions of Lewisite that I’ve come across:

Lewisite is described as “an oily liquid of an amber color and the odor of geranium blossoms.” It is highly explosive, and on contact with water it bursts into flame. Let loose in the open air, it diffuses into a gas which kills instantly on the inhalation of the smallest amount that can by any means be measured. A single drop of the liquid on the hand causes death in a few hours, the victim dying in fearful agony. The pain on contact is acute and almost unendurable. It acts by penetrating through the skin or, in the gaseous form, through the lung tissue, poisoning the blood, affecting in turn the kidneys, the lung tissue, and the heart.

Lewisite identification poster from World War II. Are geraniums one of those common smells that everyone knows?

The plant to make Lewisite was located in Willoughby, Ohio, a suburb of Cleveland. It was apparently referred to by the people who worked there as “the mouse-trap.” Harper’s explained the name:

Men who went in never came out until the war was over; each of the eight hundred workers signed an agreement of voluntary imprisonment before going to work. They could write letters, but could give no address but that of a locked box in the Cleveland post-office… The hours were long, the work hard, the risk tremendous. But in spite of the frightfully poisonous nature of the stuff they were making, not a man was poisoned; the only death in the plant was from influenza. To protect the men while at work there was devised a mask and overall suit that rendered them absolutely immune. Masks that gave full protection against the most powerful German gases were useless against Lewisite.

Conant at Mouse-Trap, 1918. Source: Daily Boston Globe, May 27, 1933.

Conant’s group at American University helped devise the process by which Lewisite would be manufactured. He was promoted to major and sent to Cleveland to supervise the production of the gas, officially code-named G-34, at the “Mouse-Trap” facility. The facility practiced strict compartmentalization. Conant was one of the few who knew the whole story of what they were making, and he was the top technical man at the plant. He worked around the clock and gained a reputation for easy leadership — a must for people working under those conditions. He wanted to make Lewisite because he hoped it would be “the great American gas which would win the war.”

The facility was a commandeered automobile factory, and was under strict guard. Conant’s only address was Lock Drawer 426, Cleveland. I don’t know if it was really a “voluntary imprisonment” situation — that sounds possibly exaggerated, though perhaps not — but it was high security. By the end of the war the plant was producing 10 tons of Lewisite a day, ready to be shipped to Europe to be packed into artillery shells. Harper’s claimed that “half a dozen 300-pound bombs of Lewisite, exploded windward of the city of Berlin, would have killed the entire population of the German capital.” Furthermore, they reported that the preferred method for this kind of delivery was via an “automatic airplane” — a drone.

But Lewisite was never used in battle. The war ended too soon. The US stockpile of Lewisite, save for a few small samples kept for future research, was loaded onto a boat in barrels at Baltimore, taken 50 miles offshore, and sunk into the deep.

Time Magazine - James Conant

It’s hard not to see the many interesting parallels here with the atomic bomb. The eventual call of the scientists to war. The race towards a new weapon that will “win the war” — no matter how destructive. The transformation of university campuses into laboratories for weapons of mass destruction. The creation of new, top-secret facilities where compartmentalization, isolation, and secrecy rule the day. And the fact that it’s Conant resonates too. Conant was one of the earliest scientists in the uranium work to call for compartmentalization, and one of the first to call for creating an isolated laboratory (Los Alamos). It’s hard not to see Conant’s lessons of World War I affecting his approach to the bomb situation in World War II. It wasn’t his first rodeo.

In 1927, Conant took his first trip to Germany. He held no ill-will towards the Germans for the First World War. While there, he met none other than Fritz Haber, who was then 60 years old. No one knows exactly what they talked about, but apparently it included both politics and, well, oxidation. Conant’s only note on Haber was that “he paid me the greatest compliment an older man can pay a younger; he listened when I spoke.”

Haber’s story ended up much more sadly than Conant’s. Haber died while being exiled from his country, a hero turned into a martyr by a government that could not tolerate the fact that he had been born a Jew. Conant went on to be President of Harvard for 20 years, to help reform the American academy, to help make the atomic bomb, and, much later, to be the US Ambassador to West Germany. It’s fascinating that these two chemical weapons pioneers — one of whom became a nuclear weapon pioneer — managed to intersect, if only briefly.

James Conant, President of Harvard, 1933. Source: Harvard University Archives.

Conant apparently had no moral scruples about working on toxic gas — which perhaps isn’t that surprising. The Germans used it first, after all, and it had quickly become “the norm” in the First World War. His most toxic work, in any case, was never used against anybody. The fact that his “government work” came after a shameful failure probably made it feel redemptive, as well. More generally, he wrote in the late 1960s that:

I did not see in 1917, and I do not see in 1968, why tearing a man’s guts out by a high-explosive shell is to be preferred to maiming him by attacking his lungs or his skin. All war is immoral. Logically, the 100 percent pacifist has the only impregnable position. Once that is abandoned, as it is when a nation becomes a belligerent, one can talk sensibly only in terms of the violation of agreements about the way war is conducted, or the consequences of a certain tactic or weapon.

It’s a legitimate stance, and one taken by a lot of scientists who have worked on WMDs. But it seems like kind of a cop-out to me. There are better and worse ways to wage war: ethically, from the point of view of who gets killed and how they get killed, but also practically, from the standpoint of achieving ends that you can live with in peacetime. If one declares that the only options are pacifism or “anything goes,” one slides down a pretty nasty slope awfully quickly. One gets what Conant is trying to indicate — that war itself is the problem, not the means — but saying that the means are just details of immorality seems a bit too dismissive to me. Nations that decide that the methods of war are just practical details become the stuff of nightmares.

  1. It is not a permanent gig, before anyone congratulates me on landing a new job! Just a temporary thing. []
  2. See, e.g., the LD50 doses for Sulfur Mustard (mustard gas) and Lewisite. []
  3. Frank Parker Stockbridge, “War Inventions That Came Too Late,” Harper’s Monthly (November 1919), 828-835. []
Redactions

Nuclear history bibliography, 2013

by Alex Wellerstein, published January 6th, 2014

It’s that time again. With the New Year comes new lists, and like I did last year, I’ve tried to put together a bibliography of nuclear history scholarship that was published over the course of the year. All of the same caveats about completeness and inclusion apply — it has to be something primarily about the past, it has to be more or less a work of “history” relating to nuclear technology (I’ve left out a lot of quantitative political science because while it can be quite interesting, I’m not sure it is history), and it had to have been published in 2013. I haven’t tried to track down chapters in books (sorry) or most web-only content (which means I’ve omitted the great stuff on Able Archer 83 that the National Security Archive published, but such is life).

"Any books on atomic power?" From the New York Times Book Review, November 18, 1945.

“Any books on atomic power?” New York Times Book Review, November 18, 1945.

Looking at the list, I don’t see any obvious trends from the titles alone. Last year was the anniversary of the Cuban Missile Crisis, so that was the one obvious trend there. This year, I don’t see anything that stands out (other than sampling issues like the fact that the Bulletin of the Atomic Scientists ran an issue on nuclear culture).

I’m sure there is much missing — so please leave me a note below in the comments section, or send me an e-mail, if you know of something that might belong here, and if I think it meets my (somewhat loose) criteria I’ll add it to the list.

As an aside, it would be great if other scholars out there would produce similar lists for their own sub-fields! It takes a lot less time than one might imagine (hooray for academic search engines), and is a great way to get a quick survey of all of those things that you didn’t know you had missed.

Redactions

The year of the disappearing websites

by Alex Wellerstein, published December 27th, 2013

I’m a big fan of digital historical research. Which is to say, I’ve benefited a lot from the fact that there are a lot of great online resources for primary source work in nuclear history. These aren’t overly-curated, no-surprises resources. The paper I gave at the last History of Science Society meeting, on US interest in 50-100 megaton weapons, was surprising to pretty much everyone I told about it, yet was based almost exclusively on documents I found in online databases. You can do serious research with these, above and beyond merely “augmenting” traditional archival practices.1

One of the most interesting documents I found in an online database — an estimate for the ease of developing a 100 megaton weapon in a letter from Glenn Seaborg to Robert McNamara. Knowing the estimated yield and weight of the bombs in question allows one to divine a lot of information about their comparative sophistication.

Like all things, digital history comes with its pitfalls. The completely obvious one is that not everything is digitized. No surprise there. That doesn’t really change the digital archival experience from the physical one, of course, since even physical archives are always missing huge chunks of the documentation. As with “regular” archives, the researcher compensates for this by looking at many such databases, and by looking closely at the materials for references to missing documents (e.g. “In response to your letter of March 5” indicates there ought to be a letter from March 5th somewhere). This doesn’t make digital archives less useful; it just means their role cannot usually be absolute. Being able to quickly search said databases usually more than compensates for this problem, of course, since the volume of material that can be looked at quickly is so much higher than with physical paper. And I might note that one of the best parts about many of the digital archives for nuclear sources is that the documents often indicate their originating archive — which can point you to sources you might not have considered (like off-the-beaten-trail National Archives facilities).

Perhaps the biggest problem with digital sources, though, is that like so many things in the digital world, they somehow have the ability to vanish completely when you really want or need them. (As opposed to the normal online trend of things sticking around forever when you wish they would go away.) The fall of 2013 was, among other things, the season of the disappearing websites. At least three major web databases of nuclear history resources that I used on a regular basis silently disappeared.

Fallout from the 1952 “Ivy Mike” shot of the first hydrogen bomb. Note that this is actually the “back” of the fallout plume (the wind was blowing it north over open sea), and they didn’t have any kind of radiological monitoring set up to see how far it went. As a result, this makes it look far more local than it was in reality. This is from a report I had originally found in the Marshall Islands database.

The first of these, I believe (it is hard to know exactly when things vanished as opposed to when I became aware of them — in this case, September 2013) was the DOE’s Marshall Islands Document Collection. This was an impressive collection of military and civilian reports and correspondence relating primarily to US nuclear testing in the Pacific. Its provenance isn’t completely clear, but it probably came out of the work done in the mid-1990s to compensate victims of US atmospheric testing.

I found this database incredibly useful for creating NUKEMAP’s fallout coding. It also had lots of information on high-yield testing in general, and lots of miscellaneous documents that touched on all manner of US nuclear developments through the 1960s. It used to be at this URL, which now re-directs you to a generic DOE page. I e-mailed the webmaster and was told that it isn’t really gone per se, it’s just that “Access to the HSS website has been disabled for individuals trying to access our website from the public facing side of the internet. We are working to put mitigation in place that will allow us to enable public access to our web site.” That was several months ago, right before the government shutdown. What I fear, here, is that a temporary technical disabling of the site — because they are re-shuffling things around on their web domains, as government agencies often do — will lead to nobody ever getting it back up again.

A photograph of an early Hanford reactor that used to be in the Hanford DDRS — one of my favorites, both because of its impressive communication of activity and scale.

Next was the Hanford Declassified Document Retrieval System which in November 2013 (or so) went offline. It used to be here, which now gives a generic “not found” message. It used to have thousands of documents and photographs relating to the Hanford Site spanning the entire history of its operation. In my research, I used it extensively for its collection of Manhattan Project security records, as well as its amazing photographs. Again, I suspect it was a creation of the mid-to-late 1990s, when “Openness” was still a thing at DOE.

I’d be the first to admit that its technical setup seemed a little shaky. It required a clunky Java applet to view the files, and its search capabilities left a little to be desired. Still, it worked, and could be actively used for research. I got in contact with someone over there, who said it had to be taken down because it had security vulnerabilities, and that eventually they planned to get it back up again, but that “we don’t have a timeline for accomplishing that right now.” They offered to search the database for me, through queries sent via e-mail, but obviously that doesn’t quite cut it in terms of accessibility (especially since my database process involves many, many queries and glancing at many, many documents, most of which are irrelevant to what I’m looking for).

Will it get back up? The guy I talked to at Hanford said they were trying to resurrect it. But I have to admit, I’m a little skeptical. It’s not at the top of their agenda, and clearly hasn’t been for over a decade. If they do get it up, I’ll be thrilled.

Exploratory tunnel dug by a 25-foot-diameter tunnel boring machine at the proposed Yucca Mountain, Nevada, repository for spent nuclear fuel. From the DOE Digital Photo Archive.

Lastly, there is the DOE Digital Photo Archive, which was a publicly-accessible database of DOE photographs, from the Manhattan Project through the present. Some of these were quite stunning, and quite rare. One of my all-time favorite photographs of the nuclear age came from this database. The archive used to be here; it now redirects to a generic page about e-mailing the DOE for photographs. Not the same thing. I got in touch with someone who worked there, who said that the database site “has been closed down,” and that instead I could trawl through their Flickr feed. They, too, offered to help me find anything I couldn’t — but that doesn’t actually help me too much, given how large a role serendipity and judgment play in archival practice.

As an extra “bonus” lost website, Los Alamos’ pretty-good-but-not-perfect history website was also taken down very recently, and replaced with a single, corporate-ish page that skips from World War II to the present in one impressive leap and gives nothing but a feel-good account of the first atomic bombs. The site it replaced was more nuanced, had a reasonably good collection of documents and photographs, and covered Los Alamos’ history through the Cold War pretty well. It had its issues, to be sure, including some technical bugs. But even a buggy site is better than a dead one, in my opinion. A new site is supposedly in the works, but it seems to not be a high priority and no short-term changes are expected.

None of these sites were taken down because of anything objectionable about their content, so far as I know. The issues cited have been a mixture of technical and financial (which are, of course, intertwined). Websites require maintenance. They require upkeep. They require keeping technically-inclined people on staff, with part of their day devoted to putting out the little fires that inevitably come up over the years with a long-lasting website. Databases and interactive sites in particular require considerable effort to put together, and a lot of time over the years to keep up to date in terms of security practices.

I work on web development, so I get all that. Still, it’s a terrible thing when these things just vanish. Aside from the fact that some people (I imagine more than just myself) find them useful, there is the sheer amount of resources essentially wasted when such a long-term investment (think of the man-hours that went into populating those databases!) is simply turned off.

What should scholars do about it? We can complain, and sometimes that works. A better solution, perhaps, is to keep better mirrors of the sites in question. This is particularly true of sites with any potential “national security implications.” When Los Alamos took their declassified reports offline after 9/11, the Federation of American Scientists managed to cobble together a fairly complete mirror. (The Los Alamos reports have since been quietly reinstated for public access through the Los Alamos library site.)

Los Alamos Technical Reports

I wish, in retrospect, that I had considered the possibility that the Hanford and Marshall Islands databases might go down. Making a mirror of a database is harder than making a mirror of a static website, but it’s not impossible. (Archive.org does not do it, before you offer that possibility up.) For the specific reports, documents, and photographs that I actually use in my work, I always have a local copy saved. But there is so much out there that was yet to be found. I might try filing a FOIA request for the underlying data (it would be trivial for me to turn it into a useful database hosted on my own servers), but I’m not sure how well that will work out (it seems to go a bit beyond a normal FOIA).

After the Hanford database went down, I thought, what are the other public databases that my work depends on? The most important is DOE’s OpenNet database, which contains an incredibly rich (if somewhat idiosyncratic) collection of documents related to nuclear weapons development. Huge chunks of my dissertation were based on records found through it, as are most of the talks I give. If it went down tomorrow, I’d be pretty sunk. For that reason, while the government was going through its shutdown last October (I figured no one would be around to object), I made a reasonably complete duplicate of everything in OpenNet using what is known as a “scraper” script.2 Obviously as OpenNet gets updated, my database will fall out of sync, but it’s a start, and it’s better than nothing if it gets unplugged tomorrow.
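
(Footnote 2 describes the actual tool I used, Snoopy for PHP. As a purely illustrative sketch of what a “scraper” does, here is a minimal Python version of the same idea: walk a range of document pages, pull out the title and any PDF link, and save a local index. The URL and page structure below are hypothetical placeholders, not OpenNet’s real layout.)

    # Minimal scraper sketch in Python (illustrative only; URL pattern and page
    # structure are hypothetical placeholders, not OpenNet's actual layout).
    # Requires: pip install requests beautifulsoup4
    import csv
    import time
    import requests
    from bs4 import BeautifulSoup

    BASE_URL = "https://example.gov/documents"  # placeholder base URL

    def scrape_record(doc_id):
        """Fetch one document page and return its title plus any PDF link."""
        resp = requests.get(f"{BASE_URL}/{doc_id}", timeout=30)
        resp.raise_for_status()
        soup = BeautifulSoup(resp.text, "html.parser")
        title_tag = soup.find("h1")
        pdf_tag = soup.find("a", href=lambda h: h and h.lower().endswith(".pdf"))
        return {
            "id": doc_id,
            "title": title_tag.get_text(strip=True) if title_tag else "",
            "pdf_url": pdf_tag["href"] if pdf_tag else "",
        }

    def mirror(first_id, last_id, out_path="mirror_index.csv"):
        """Walk a range of document IDs and save a local index of what was found."""
        with open(out_path, "w", newline="") as fh:
            writer = csv.DictWriter(fh, fieldnames=["id", "title", "pdf_url"])
            writer.writeheader()
            for doc_id in range(first_id, last_id + 1):
                try:
                    writer.writerow(scrape_record(doc_id))
                except requests.RequestException:
                    continue  # skip failures; a real run would log them
                time.sleep(1)  # be polite to the server

    if __name__ == "__main__":
        mirror(1, 100)

The real work, of course, is in adapting the parsing to whatever idiosyncratic page structure a given database actually uses.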

The amazing thing about digital databases is that they take the archive everywhere at once, instantly. The terrible thing about them is that it only takes the pull of one plug to shut them down everywhere at once, instantly. Anyone who does research on nuclear history issues should be deeply disturbed by this rash of site closures, and should start thinking seriously about how to make copies of the government databases they rely on. (Private databases are more complicated, for copyright reasons.) The government gave, and the government has taken away.

  1. Which databases, you ask? 1. The CIA’s online FOIA database; 2. Gale’s DDRS database; 3. the DOD’s online FOIA database; 4. DTIC; 5. ProQuest’s Congressional hearing database; 6. the JFK Library’s online files; 7. the National Security Archive’s online database; 8. the Nuclear Testing Archive (DOE OpenNet); 9. the OHP Marshall Islands Database; 10. the ProQuest Historical Newspaper database; 11. the UN’s website; 12. the searchable Foreign Relations of the United States. The only other significant non-online archival sources were the Hansen papers at the National Security Archive and some files from the JFK Library that they provided me over e-mail. []
  2. I whipped something together using Snoopy for PHP, which allows you to do all sorts of clever database queries very easily. []
Meditations

Kilotons per kilogram

by Alex Wellerstein, published December 23rd, 2013

Nuclear weapons can be made to have pretty much as much of a bang as one wants to make them, but with increased explosive yield comes an increased weapon weight. We always talk vaguely about being able to make H-bombs to arbitrarily high yields, but recently I’ve been mulling over this fact somewhat quantitatively. I gave a talk last month at the History of Science Society Meeting on US interest in 50-100 MT bombs around the time of the Limited Test Ban Treaty, and while working on this paper I got slightly obsessed with what is known as the yield-to-weight ratio.

Little Boy — a big bang compared to a conventional bomb, but still a very crude nuclear bomb.

What makes nuclear weapons impressive and terrible is that their default yield-to-weight ratio — that is, the amount of bang per mass, usually expressed in terms of kilotons per kilogram (kt/kg) — is much, much higher than conventional explosives. Take TNT for example. A ton of TNT weighs, well, a ton. By definition. So that’s 0.001 kilotons per 1,000 kilograms; or 0.000001 kt/kg. By comparison, even a crude weapon like the Little Boy bomb that was dropped on Hiroshima was about 15 kilotons in a 4,400 kg package: 0.003 kt/kg. That means that the Little Boy bomb had an energy density three orders of magnitude higher than a regular TNT bomb would. Now, TNT isn’t the be-all and end-all of conventional explosives, but no conventional explosive gets that much boom for its buck compared to a nuke.

The Little Boy yield-to-weight ratio is much lower than the theoretical energy density of uranium-235. Every kilogram of uranium-235 that completely fissions releases about 17 kilotons of energy. That means that less than a kilogram of uranium-235 fissioned in the Little Boy bomb to release its 15 kilotons of energy. Knowing that there was 64 kg of uranium in the bomb, that means that something like 1.4% of the uranium in the weapon actually underwent fission. So right off the bat, one could intuit that this is something that could probably be improved upon.
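
(A quick back-of-the-envelope check of those figures, written out as a few lines of Python using only the numbers quoted above: 15 kilotons of yield, a 4,400 kg bomb, 64 kg of uranium, and roughly 17 kilotons per kilogram of U-235 completely fissioned.)

    # Back-of-the-envelope check using the Little Boy figures quoted above.
    yield_kt = 15.0             # total yield, kilotons
    bomb_weight_kg = 4400.0     # weight of the assembled bomb
    uranium_kg = 64.0           # uranium-235 loaded into the core
    kt_per_kg_fissioned = 17.0  # energy per kilogram of U-235 completely fissioned

    yield_to_weight = yield_kt / bomb_weight_kg              # roughly 0.003 kt/kg
    kg_actually_fissioned = yield_kt / kt_per_kg_fissioned   # under 1 kg
    fraction_fissioned = kg_actually_fissioned / uranium_kg  # only a percent or so

    print(f"yield-to-weight: {yield_to_weight:.4f} kt/kg")
    print(f"mass fissioned:  {kg_actually_fissioned:.2f} kg")
    print(f"fraction of core fissioned: {fraction_fissioned:.1%}")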

Fat Man — a lot better use of fissile material than Little Boy, but no more efficient in terms of yield-to-weight.

The Fat Man bomb made much better use of fissile material than Little Boy. Its yield wasn’t that much better (around 20 kilotons), but it managed to squeeze that (literally) out of only 6.2 kilograms of plutonium-239. Pu-239 releases around 19 kilotons per kilogram that completely fissions, so that means that around 15% of the Fat Man core (roughly 1 kg of plutonium) underwent fission. But the bomb itself still weighed 4,700 kg, making its yield-to-weight ratio a mere 0.004 kt/kg. Why, despite the improved efficiency and more advanced design of Fat Man, was the yield-to-weight ratio almost identical to Little Boy’s? Because in order to get that 1 kg of fissioning, it required a very heavy apparatus. The explosive lenses weighed something like 2,400 kilograms just by themselves. The depleted uranium tamper that held the core together and reflected neutrons added another 120 kilograms. The aluminum sphere that held the whole apparatus together weighed 520 kilograms. The ballistic case (a necessary thing for any actual weapon!) weighed another 1,400 kg or so. All of these things were necessary to make the bomb either work, or be a droppable bomb.
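
(The same sort of quick Python tally, using only the round component weights listed above, shows where Fat Man’s mass actually went, and why the overall ratio barely budged.)

    # Where Fat Man's weight went, using the approximate figures quoted above.
    yield_kt = 20.0
    bomb_weight_kg = 4700.0
    components_kg = {
        "explosive lenses": 2400,
        "depleted uranium tamper": 120,
        "aluminum sphere": 520,
        "ballistic case": 1400,
        "plutonium core": 6.2,
    }

    print(f"yield-to-weight: {yield_kt / bomb_weight_kg:.4f} kt/kg")  # ~0.004 kt/kg
    listed = sum(components_kg.values())
    print(f"listed components: {listed:.0f} kg out of {bomb_weight_kg:.0f} kg total")
    core_share = components_kg["plutonium core"] / bomb_weight_kg
    print(f"plutonium core as a share of total weight: {core_share:.2%}")  # ~0.13%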

So it’s unsurprising to learn that improving yield-to-weight ratios was a high order of business in the postwar nuclear program. Thermonuclear fusion ups the ante quite a bit. Lithium-deuteride (LiD), the most common and usable fusion fuel, yields 50 kilotons for every kilogram that undergoes fusion — so fusion is nearly three times more energetic per unit weight than fission. So the more fusion you add to a weapon, the better the yield-to-weight ratio, except for the fact that all fusion weapons require a fission primary and usually also have very heavy tampers.

I took all of the reported American nuclear weapon weights and yields from Carey Sublette’s always-useful website, put them into the statistical analysis program R, and created this semi-crazy-looking graph of American yield-to-weight ratios:

Yield-to-weight ratios of US nuclear weapons

The horizontal (x) axis is the yield in kilotons (on a logarithmic scale), the vertical (y) axis is the weight in kilograms (also on a log scale). In choosing which of the weights and yields to use, I’ve always picked the lowest listed weights and the highest listed yields — because I’m interested in the optimal state of the art. The individual scatter points represent models of weapons. The size of each point represents how many of them were produced; the color of them represents when they were first deployed. Those with crosses over them are still in the stockpile. The diagonal lines indicate specific yield-to-weight ratio regions.
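
(The original chart was made in R; for readers who want to re-create something similar, here is a rough Python/matplotlib sketch of the same encoding. The CSV file and column names are hypothetical stand-ins for whatever yield, weight, production, and deployment data one compiles.)

    # Rough sketch of the chart described above (the original was made in R).
    # Assumes a hypothetical CSV with columns:
    #   name, yield_kt, weight_kg, number_produced, year_deployed
    import numpy as np
    import pandas as pd
    import matplotlib.pyplot as plt

    df = pd.read_csv("us_warheads.csv")  # hypothetical data file

    fig, ax = plt.subplots(figsize=(8, 6))
    points = ax.scatter(df["yield_kt"], df["weight_kg"],
                        s=df["number_produced"] / 20,   # point size ~ production run
                        c=df["year_deployed"], cmap="viridis")
    fig.colorbar(points, ax=ax, label="year first deployed")

    # Diagonal guide lines of constant yield-to-weight ratio (kt/kg):
    # weight = yield / ratio is a straight line on log-log axes.
    yields = np.logspace(-2, 5, 100)
    for ratio in (0.001, 0.01, 0.1, 1, 6):
        ax.plot(yields, yields / ratio, linestyle="--", linewidth=0.5, color="gray")

    ax.set_xscale("log")
    ax.set_yscale("log")
    ax.set_xlabel("yield (kilotons)")
    ax.set_ylabel("weight (kilograms)")
    plt.show()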

A few points of interest here. You can see Little Boy (Mk-1), Fat Man (Mk-3), and the postwar Fat Man improvements (Mk-4 — same weight, bigger yield) at the upper left, between 0.01 kt/kg and 0.001 kt/kg. This is a nice benchmark for fairly inefficient fission weapons. At upper right, you can see the cluster of the first H-bomb designs (TX-16, EC-17, Mk-17, EC-24, Mk-24) — high yield (hence far to the right), but very heavy (hence very high). Again, a good benchmark for first generation high-yield thermonuclear weapons.

What a chart like this lets you do, then, is start to think in a really visual and somewhat quantitative way about the sophistication of later nuclear weapon designs. You can see quite readily, for example, that radical reductions in weight, like the sort required to make small tactical nuclear weapons, generally result in a real decrease in efficiency. Those are the weapons in the lower left corner, pretty much the only weapons in the Little Boy/Fat Man efficiency range (or worse). One can also see that there are a few general trends in design development over time if one looks at how the colors trend.

First there is a movement down and to the right (less weight, more yield — improved fission bombs); there is also a movement sharply up and to the right (high weight, very high yield — thermonuclear weapons) which then moves down and to the left again (high yield, lower weight — improved thermonuclear weapons). There is also the splinter of low-weight, low-yield tactical weapons as well that jots off to the lower left. In the middle-right is what appears to be a sophisticated “sweet spot,” the place where all US weapons currently in the stockpile end up, in the 0.1-3 kt/kg range, especially the 2-3 kt/kg range:

Yield-to-weight ratios -- trends

These are bombs like the W-76 or the B-61 — bombs with “medium” yield warheads (100s rather than 1,000s of kilotons) in relatively low-weight packages (100s rather than 1,000s of kilograms). These are weapons that take advantage of the fact that they are expected to be relatively accurate (and thus don’t need to be in the multi-megaton range to have strategic implications), along with what are apparently sophisticated thermonuclear design tricks (like spherical secondaries), to squeeze a lot of energy out of what is a relatively small amount of material. Take the W-76 for example: it manages to get 100 kilotons of yield out of 164 kilograms. If we assume that it has a 50/50 fission-to-fusion ratio, that means that it manages to fully fission about 5 kilograms of fissionable material, and to fully fuse about 2 kilograms of fusionable material. And it takes just 157 kg of other apparatus (and unfissioned or unfused material) to produce that result — which is just a little more than Shaquille O’Neal weighs.

Such weapons aren’t the most efficient. Weapon designer Theodore Taylor wrote in 1987 that 6 kilotons per kilogram had been pretty much the upper limit of what had ever been achieved.1 Only a handful of weapons got close to that. The most efficient weapon in the US stockpile was the Mk-41, a ridiculously high-yield weapon (25 megatons) that made up for its weight with a lot of fusion energy.

The components of the B-61 nuclear weapon — the warhead is the bullet-shape in the mid-left. The B-61 was designed for flexibility, not miniaturization, but it’s still impressive that it could get 20X the Hiroshima bomb’s output out of that garbage-can sized warhead.

But given that high efficiency is tied to high yields — and relatively high weights — it’s clear that the innovations that allowed for the placing of warheads on MIRVed, submarine-launched platforms are still pretty impressive. The really magical range seems to be weapons in the hundreds-of-kilotons range (more than 100 kilotons but under a megaton), yet under 1,000 kilograms in weight. Every one of those dates from after 1962, and probably involves the real breakthroughs in warhead design that were first used in the Operation Dominic test series (1962). This is the kind of strategic miniaturization that makes war planners happy.

What’s the payoff of thinking about these kinds of numbers? One is that it allows you to see where innovations have been made, even if you know nothing about how the weapon works. In other words, yield-to-weight ratios can provide a heuristic for making sense of nuclear design sophistication, comparing developments over time without caring about the guts of the weapon itself. It also allows you to make cross-national comparisons in the same fashion. The French apparently developed weapons in that same miniaturized yield-to-weight range as the United States by the 1970s — apparently with some help from the United States — and so we can probably assume that they know whatever the United States figured out about miniaturized H-bomb design in the 1960s.

The Tsar Bomba: a whole lot of boom, but a whole lot of weight. The US thought they could make the same amount of boom for half the weight.

Or, to take another tack, and returning to the initial impetus for me looking at this topic, we know that the famous “Tsar Bomba” of the Soviet Union weighed 27,000 kilograms and had a maximum yield of 100 Mt, giving it a yield-to-weight ratio of “only” about 3.7 kilotons per kilogram. That’s pretty high, but not for a weapon that used so much fusion energy. It was clear to the Atomic Energy Commission that the Soviets had just scaled up a traditional H-bomb design and had not developed any new tricks. By contrast, the US was confident in 1961 that they could make a 100 Mt weapon that weighed around 13,600 kg (30,000 lb) — an impressive 7.35 kilotons per kilogram, something well above the 6 kt/kg achieved maximum. By 1962, after the Dominic series, they thought they might be able to pull off 50 Mt in only a 4,500 kg (10,000 lb) package — a kind of ridiculous 11 kt/kg ratio. (In this estimate, they noted that the weapon might have an impractically large diameter as a result, perhaps because the secondary was spherical as opposed to cylindrical.) So we can see, without really knowing much about what the US had in mind, that it was planning something very, very different from what the Soviets set off.

It’s this black box approach that I find so interesting about these ratios. It’s a crude tool, to be sure, but a tool nonetheless. By looking at the broad trends, we get insights into the specifics, and peel back the veil just a tiny bit.

  1. Theodore B. Taylor, “Third Generation Nuclear Weapons,” Scientific American 256, No. 4 (April 1987), 30-39, on 34: “The yield-to-weight ratios of pure fission warheads have ranged from a low of about .0005 kiloton per kilogram to a high of about .1 kiloton per kilogram. […] The overall yield-to-weight ratio of strategic thermonuclear warheads has been as high as about six kilotons per kilogram. Although the maximum theoretical ratios are 17 and 50 kilotons per kilogram respectively for fission and fusion reactions, the maximum yield-to-weight ratio for U.S. weapons has probably come close to the practical limit owing to various unavoidable inefficiencies in nuclear weapon design (primarily arising from the fact that it is impossible to keep the weapon from disintegrating before complete fission or fusion of the nuclear explosive has taken place.” []
Visions

Art, Destruction, Entropy

by Alex Wellerstein, published December 13th, 2013

Are nuclear explosions art? Anyone who has taken even a glance into modern and contemporary art knows that the official mantra might as well be “anything goes,” but I found myself wondering this while visiting the exhibition “Damage Control: Art and Destruction since 1950” that is currently at the Hirshhorn Museum. The first thing one sees upon entering is a juxtaposition of two very different sorts of “work.” On the right is a fairly long loop of EG&G footage of nuclear test explosions, broadcast in high definition over the entirety of a wall. On the left is a piano that has been destroyed with an axe. This, I thought, is at least a provocative way to start things off.

Edgerton, Germeshausen, and Grier (EG&G) was a contractor for the federal government during the Cold War, responsible for documenting nuclear test explosions. Quite a lot of the famous Cold War nuclear detonation footage was taken by EG&G. They are perhaps most famous for their “Rapatronic” photographs, the ultimate expression of MIT engineer Harold “Doc” Edgerton’s work of slowing down time through photography, but this was only a part of their overall contribution. The film they have at the Hirshhorn is something of an EG&G “greatest hits” reel from the 1950s, and its effect on the others in the audience was palpable. Adults and children alike were drawn to the blasts, displayed one after another without commentary or explanation.1 Their reactions didn’t strike me as ones of disgust or horror, but of amazement and awe. Most of the footage was from the Nevada Test Site, so the bombs were generally just blowing up desert scrub, and occasionally houses constructed for effects testing.

The destroyed piano, by contrast, got reactions of shock and disgust. It was the remains of a piece of performance art conducted by Raphael Montañez Ortiz, one of several he’s done, apparently. My wife, a piano player and a nuclear historian, also found it disturbing. “If you know what goes into making a piano…,” she started to say. “But then again, if you know what goes into making a city…,” she caught herself. I overheard other people say similar things.

The difference in reactions isn’t too surprising — it’s a common theme that it is easy to appreciate the destruction of something at a human scale, and difficult to appreciate it at the scale of a nuclear bomb. A lot of what I’ve spent time doing, with the NUKEMAP and my writing, is to try to understand, and to impart, the scale of a nuclear explosion. A lot of this has involved looking at the attempts of others as well, from official Cold War visualizations made for secret committees to popular films, as they have tried to communicate this to their target audiences. The hardest thing is that our brains appear only to be wired for empathy at the individual level, and don’t readily apply it to large groups or large areas. The best work in these areas conveys the broad scope of destruction, but then ties it into the personal. It individualizes the experience of mass ruination.

And the EG&G footage isn’t trying to do that. It was data meant for very specific technical purposes. It was developed in order to further the US nuclear program, and defense against Soviet nuclear weapons. Which is why I somewhat question its inclusion, or, at least, its decontextualization. It is art only in the sense that it has aesthetics and it has been put into an art gallery. One can read into it whatever one wants, of course, but it wasn’t created to have deep meaning and depth in that sense. (Whether one cares about authorial intention, of course, is its own can of modern art worms.) Just as a small example of what I mean, Andy Warhol famously made a print of mushroom clouds for his own “disaster” series (a few of which, but not this print, were featured in the exhibit):

"Atomic Bomb," Andy Warhol, 1965.

“Atomic Bomb,” Andy Warhol, 1965.

Now Warhol is a complicated character, but since he was explicitly an artist I think it is always fair game to talk about his possible intentions, the aesthetics of the piece, the deeper meanings, and so on. Warhol’s art has generally been interpreted to be about commercialization and commodification. The mushroom cloud in repetition becomes a statement about our culture and its fascination with mass destruction, perhaps. Coming in the mid-1960s, after the close-call terrors of the early years of the decade, it was perhaps too little, too late, but still, it has an ominous aesthetic appeal, perhaps now more than then.

Because I don’t think this image was widely circulated at the time, I doubt that Warhol knew that Berlyn Brixner, the Trinity test photographer, had made very similar sorts of images of the world’s first nuclear fireball at “Trinity”:

“TR-NN-11,” Berlyn Brixner, 1945.

Brixner appreciated the aesthetics and craft of his work, to be sure. But the above photograph is explicitly a piece of technical data. It is designed to show the Trinity fireball’s evolution over the 15-26 millisecond range. Warhol’s instrument of choice was the silkscreen printer; Brixner’s was the 10,000 fps “Fastax” camera. There’s a superficial similarity in their atomic repetition. You could make a statement by putting them next to each other — as I am doing here! — but properly understood, I think, they are quite different sorts of works.

Don’t get me wrong. Re-appropriating “non-art” into “art” has been a common move over much of the 20th century, at the very least. The problem, for me, is not that people appreciate the aesthetics of the “non-art.” It’s that focusing on the aesthetics makes it easy to lose sight of the context. (As well as the craft — Brixner’s work was exponentially more difficult to produce than Warhol’s!) The EG&G footage in the exhibit doesn’t explain much of how, or why, it was made. It seems to be asking the viewer to appreciate it solely on aesthetic grounds. Which I think is the real problem. Many of the tests it shows resulted in significant downwind fallout for the populations near the Nevada Test Site. Many of them involved the development of new, ever-more elaborate ways of mass destruction. Many of them were the product of years of top scientific manpower, untold riches, and a deep political context. To appreciate them as simply big, bright booms robs them of something — no matter how aesthetically beautiful those big, bright booms actually are.

Gustav Metzger’s “auto-destructive” art.

What makes it more ironic is that the exhibit actually does give considerable context to some of the works that are explicitly “art.” You have to explain the context of Gustav Metzger’s “auto-destructive” art — it involves him filming himself painting on canvases with a strong acid, so the artwork destroys itself in the process. Without the context there, what is left is just a boring, not-very-comprehensible movie of a man destroying a blank canvas. But anyway.

In terms of the audience at the exhibit, which was fairly well-attended when I was there with my wife, the most interesting part was the handling of children. The Smithsonian museums are of course explicitly places that people take their children while visiting the city, so it’s no surprise that you probably find more of them at the Hirshhorn than you would at MOMA or other similar institutions. But children add a level of anxiety to an exhibit about destruction. They were wowed by the wall-o’-bombs but not, it seemed, by the piano. Parents seemed to let them wander free through most of it, but there were several films where I saw kids get yanked out by their parents once the parents realized the content was going to be disturbing. In one of these films, the “disturbing” content was of a variety that might have been hard for the children to directly understand — the famous film of the Hindenburg going up in flames, for example, where the violence was real but seen from enough of a distance to keep you from seeing actual injuries or bodies. The one I saw kids really getting removed from (by their parents, not the museum) was footage of the 2011 Vancouver riots. I wasn’t impressed too much by the footage itself (its content was interesting in a voyeuristic way, but there seemed to be nothing special about the filming or editing), but the immediacy of its violence was much more palpable than the violence-at-a-distance that one saw in most of the other such works. It’s a cliché to trot out that old quote attributed (probably wrongly) to Stalin that one death is a tragedy, a million is a statistic, but there’s something deeply true in it about how we perceive violence and pain.

Damage Control exhibit site

There are a lot of works in the exhibit. As one would expect, some hew to the theme very closely, some are a bit more tenuous. Overall, though, it was pretty interesting, and if you’re in town, you ought to check it out. The original comment my wife made about pianos and cities stuck with me as I looked at all of the various meditations on “destruction.” Mulling it over, I kept coming back to the second law of thermodynamics. On the face of it, it is a very clinical, statistical law: “the entropy of an isolated system never decreases.” It is actually quite profound, something that the 19th-century physicists who developed it knew. Entropy can be broadly understood as “disorder.” The second law of thermodynamics says, in essence, that without additional energy being put into it, everything eventually falls apart. It takes work to keep things “organized,” whether they are apartments, bodies, or cities.2 Ludwig Boltzmann, who helped formulate the law, stated gnomically in 1886 that:

The general struggle for existence of animate beings is not a struggle for raw materials – these, for organisms, are air, water and soil, all abundantly available – nor for energy, which exists in plenty in any body in the form of heat Q, but of a struggle for [negative] entropy, which becomes available through the transition of energy from the hot sun to the cold earth.

In other words, life itself is a struggle against entropy. Our bodies are constantly taking disordered parts of the world (heat energy, for example, and the remains of other living things) and using them to fuel the work of keeping us from falling apart.
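
(For the quantitatively inclined, the two statements being leaned on here have compact textbook forms; this is standard physics notation, nothing specific to the exhibit.)

    % The Clausius form of the second law: for an isolated system,
    % entropy never decreases.
    \Delta S \geq 0
    % Boltzmann's statistical reading: entropy is proportional to the
    % logarithm of the number of microstates W consistent with the
    % macroscopic state (k_B is Boltzmann's constant).
    S = k_B \ln W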

But the other way to think about this law is that generally it is easier to take things apart than it is to keep them together. It is easier to convert a piano into a low-energy state (through an axe, or perhaps a fire) than it is to make a piano in the first place. It is easier to destroy a city than it is to make a city. The three-year effort of the half-a-million people on the Manhattan Project was substantial, to be sure, but still only a fraction of the work it took to make the cities of Hiroshima and Nagasaki, and all that they contained, biological and material, in the first place.

Of course, the speed at which entropy increases is often controllable. The universe will eventually wear out — but not for a long time. Human civilization will necessarily go extinct — but it doesn’t have to happen anytime soon. What hits home with the “Damage Control” exhibit is how we as a species have to work so hard to keep everything together, while simultaneously working so hard to find ways to make everything fall apart. And in this, perhaps, it is a success, even if I left with many niggling questions about the presentation of some of the works in particular.

  1. Various guys in the audience would occasionally try to give explanation to their loved ones, and they were generally incorrect, alas. “That must be at Alamogordo… That’s got to be an H-bomb…” no, no, no. Of course, I was there with my wife, and I was talking up my own little storm (though less loudly than the wrong guys), but at least I know my stuff for the most part… []
  2. The key, confusing part about the second law is the bit about the “isolated system.” It doesn’t say that entropy always increases. It says that in an isolated system — that is, a system with no energy being input into it — entropy always increases. For our planet, the Sun is the source of that input, and you can trace, through a long series of events, its own negative entropy to the Big Bang itself. []