Posts Tagged ‘Graphic design’

Visions

Firebombs, U.S.A.

Wednesday, March 12th, 2014

After the atomic bombs were dropped on Japanese cities, it didn’t take long for the US public to start drawing what it would look like if atomic bombs went off over their own cities. PM, a New York City newspaper, may have inaugurated the genre with its August 7, 1945, issue, when it took what scant facts were known about Hiroshima and superimposed the data onto the Manhattan skyline:

PM - NYC atomic bomb - August 1945

This impulse — to see what the bomb did to others, and then to apply it to one’s own cities — worked on at least two levels. In one sense it was about making sense of the damage in intuitive terms, because maps of Hiroshima don’t make a lot of intuitive sense unless you know Hiroshima, the city. Which very few Americans would.

But it’s also a recognition that atomic bombs could possibly be dropped on the USA in the future. The atomic bomb was immediately seen as a weapon of the next war as well as the present one. It was a weapon that would, eventually, make the United States very vulnerable.

Considering how many non-atomic bombs the US dropped on Japan during the war, it’s a little interesting that nobody spent very much time worrying about what would happen if someone firebombed the United States. Why not? Because the U.S. has never imagined that any other nation would have the kind of air superiority needed to pull off sustained operations like that. No, if someone were going to bomb us, it would be a one-time, brief affair.

When the US did invoke American comparisons for firebombing, it was to give a sense of scale. So the Arnold report in 1945 included this evocative diagram of Japanese cities bombed, with American cities added to give a sense of relative size:

Arnold map - Japan firebombing

So I was kind of interested to find that the final, late-1945 issue of IMPACT, a US Army Air Forces magazine, contained a really quite remarkable map. They took the same data as the above map — the Japanese cities and their equivalent US cities — and projected them not onto Japan, but onto the continental United States.

It’s the only attempt I’ve seen to make a visualization that showed the damage of the ruinous American air campaign against Japan in such a vivid way:1

Click to enlarge.

The correspondences between US and Japanese cities were chosen based on the US Census of 1940 and presumably a Japanese census from around the same period. The above map isn’t, the text emphasizes, a realistic attack scenario. Rather, it is meant to show this:

If the 69 U.S. cities on the map at right had been battered by Jap bombers free to strike any time and anywhere in this country, you can vividly imagine the frightful impact it would have had upon our morale and war potential. Yet this is precisely what the B-29s did to Japan.

What’s remarkable is that this isn’t some kind of anti-bombing screed; it’s pro-bombing propaganda. Both of these images are bragging. The text goes on to emphasize that if someone were really targeting the US, they’d hit industrial centers like Detroit, Philadelphia, and Pittsburgh — to say nothing of Washington, DC, which is conspicuously absent and unmentioned.

IMPACT was classified “confidential” during the war, meaning it had a circulation of about 10,000 airmen. It’s a pretty wonderful read in general — it’s a vociferously pro-Air Forces rag, and is all about the importance of strategic bombing. As one might expect, it de-emphasizes the atomic bombings, in part to push back against the perception, very much still with us today, in which the last two major bombings are emphasized and the other 67 are forgotten. On the above maps, Hiroshima and Nagasaki are unremarkable, easily lost in the crowd.

I thought it would be interesting to copy out all of the data (city names and damage percentages), look up the corresponding US Census data, and put it all into an interactive visualization using a Javascript toolkit called D3. If you have a reasonably modern browser (one that supports SVG images), then check it out here:

Firebombs, USA, interactive

One thing you notice quickly when putting it this way is how large some of the metropolises were versus the relatively modest size of most of the other cities. The idea of someone bombing out 55% of Sacramento, or 64% of Stockton, or 96% of Chattanooga, is kind of mind-melting. Much less to consider what a New York City minus 40% of its land area would look like.2
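If you want a feel for where those numbers come from, the arithmetic behind the visualization is simple enough to sketch. Here is a minimal version in Python (the interactive itself is D3/Javascript), with a hypothetical file name and column names standing in for however the actual data file linked in the notes is laid out:

```python
import csv

# Hypothetical columns: city, percent_destroyed (from the IMPACT map),
# and population_1940 (from the US Census). The real data file may be
# laid out differently.
with open("firebombs_usa.csv", newline="") as f:
    rows = list(csv.DictReader(f))

for row in rows:
    pct = float(row["percent_destroyed"]) / 100.0
    pop = int(row["population_1940"])
    # A crude "people living in the destroyed area" figure; it assumes the
    # population was spread evenly over the city's land area, which it wasn't.
    row["population_in_destroyed_area"] = round(pct * pop)

# Largest shares destroyed first.
rows.sort(key=lambda r: float(r["percent_destroyed"]), reverse=True)
for row in rows[:5]:
    print(row["city"], row["percent_destroyed"] + "%",
          row["population_in_destroyed_area"])
```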

You can also see how cramped Japan is compared to the USA (they are at the same scale in the above image, though the projections are a bit tweaked for the layout). Even that understates the difference, as the text points out: because Japan is so mountainous, its inhabited area is only roughly the size of Montana. So it’s even smaller than it looks.

Still, for me it’s just remarkable that this mode of visualization would be used in an official publication. These guys wanted people to understand what they had done. They wanted people to know how bad it had been for Japan. They wanted credit. And I get why — I’m not naive here. They saw it as necessary for the fighting of the war. But it also shouldn’t have been surprising, or unexpected, to those at the time that people in the future might be taken aback by the scale of the burning. Even Robert McNamara, who helped plan the firebombing operations, later came to see them as disproportionate to the US aims in the war:

This sequence, from Errol Morris’s Fog of War, has been one of my favorites for a long time. But it wasn’t until recently that I realized its source was one of these maps used for postwar boasting. It’s an incredible re-appropriation, when looked at in that light. A document meant to impress an audience, now being used to horrify a different one.

Notes
  1. Regarding the image, I scanned it out of a reprint of the IMPACT issue. Because of the crease in the center of the pages I had to do some Photoshop wizardry to make it even — so there is a lot of cleaning up around the center of the image. The data hasn’t been changed, but some of the state outlines were retouched and things like that. Similar Photoshop wizardry was also applied to the Arnold Report image to make it look clean. I suspect that the IMPACT image may have come first and the Arnold report image was derived from it, just because the IMPACT caption goes into details about methodology whereas the Arnold report does not.
  2. But don’t confuse “destroyed” with casualties — I don’t have those numbers on hand, though if I can find them, I’ll add them to the visualization. The nice thing about D3 is that once you’ve got the basics set up, adding or tweaking the data is easy, since it is just read out of a spreadsheet file. The maddening thing about D3 is that getting the basics set up is much harder than you might expect, because the documentation is really not aimed at beginners. If you are interested in a copy of the data, here is the file.
Visions

Death dust, 1941

Friday, March 7th, 2014

One of the biggest misconceptions that people have about the Manhattan Project is that prior to Hiroshima, all knowledge of atomic energy and nuclear fission was secret — that the very idea of nuclear weapons was unthought of except inside classified circles. This is a side-effect of the narratives we tell about Manhattan Project secrecy, which emphasize how extreme and successful these restrictions on information were. The reality is, as always, more complicated, and more interesting. Fission had been discovered in 1939, chain reactions were talked about publicly a few months later, and by the early 1940s the subject of atomic power and atomic bombs had become a staple of science journalists and science fiction authors.

Leaks or speculation? Campbell’s magazine, Cartmill’s story. Image source.

John W. Campbell, Jr., was a prolific editor and publisher of science fiction throughout the mid-20th century. In the annals of nuclear weapons history, he is best known for publishing Cleve Cartmill’s story “Deadline” in March 1944, which talks about forming an atomic bomb from U-235. This got Cartmill and Campbell visits from the FBI, which was trying to figure out whether they had access to classified information. The FBI found nothing compromising (and, indeed, if you read Cartmill’s story, you can see that while it grasps — as many did — that you can make atomic bombs from separated U-235, it doesn’t get much right in the specifics), but it told Campbell to stop talking about atomic bombs.

But Campbell’s flirtation with the subject goes a bit deeper than that. Gene Dannen, who runs the wonderful Leo Szilard Online website, recently sent me a rare article from his personal collection. In July 1941, Campbell authored an article in PIC magazine with the provocative title, “Is Death Dust America’s Secret Weapon?” It’s a story about radiological warfare in what appears to be a rather middle-brow entertainment publication. Click here to download the PDF. I don’t know anything about PIC, and haven’t been able to find much on it, but from the cover one wouldn’t necessarily expect it to be a source for people looking for hard-hitting science reporting — though the juxtaposition of DEATH DUST, “world’s strangest child,” and the “DAY DREAM” woman is a wonderfully American tableau.


PIC magazine 1941 - Campbell - Death Dust - cover

The story itself starts off with what had even by then become a clichéd way of talking about atomic energy (“A lump of U-235 the size of an ordinary pack of cigarettes would supply power enough to run the greatest bomber in the world for three continuous years of unceasing flight”), though it is also one of the many publications that points out that, after an exciting few years of talk about fission, by 1941 the scientists of the United States had clammed up on the topic. The article itself admits none of this is really a secret, though — that all nations were interested in atomic energy to some degree. It vacillates between talking about using U-235 as a power source and using it to convert innocuous chemicals into radioactive ones.

Which is itself interesting — it doesn’t seem to be talking about fission products here, but “synthetic radium powders.” It’s a dirty bomb, but probably not a very potent one. Still, pretty exciting copy for 1941. (Campbell would much later write a book about the history of atomic energy, The Atomic Story, where he also spent a lot of time talking about “death dust.”)

The article contains a really wonderful, lurid illustration of what a city that had been sprayed with “horrible ‘death dust’” would look like:

"Even rats wouldn't survive the blue, luminescent radioactive dust. Vultures would be poisoned by their own appetites."

“Even rats wouldn’t survive the blue, luminescent radioactive dust. Vultures would be poisoned by their own appetites.”

The most interesting parts of the article are when it veers into speculation about what the United States might be doing:

With all the world seeking frantically for the secret of that irresistible weapon, what are America’s chances in the race?

It is a question of men and brains and equipment. Thanks to Hitler’s belief that those who don’t agree with him must be wrong, America now has nearly all the first-rank theoretical physicists of the world. Mussolini’s helped us somewhat, too, by exiling his best scientists. Niels Bohr, father of modern atomic theory, is at Princeton, along with Albert Einstein and others of Europe’s greatest.

The National Defense Research Committee is actively and vigorously supporting the research in atomic physics that seeks the final secrets of atomic power. Actively, because the world situation means that they must, yet reluctantly because they know better than anyone else can the full and frightful consequences of success. Dr. Vannevar Bush, Chairman of the Committee, has said: “I hope they never succeed in tapping atomic power. It will be a hell of a thing for civilization.”

Bohr was in fact still in occupied Denmark in July 1941 — he had his famous meeting with Heisenberg in September 1941 and wouldn’t be spirited out of the country until 1943. The photographs identify Harold Urey and Ernest Lawrence as American scientists who were trying to harness the power of atomic energy. Since Urey and Lawrence were, in fact, trying to do that, and since Vannevar Bush was, in fact, ostensibly in charge of the Uranium Committee work at this point, this superficially looks rather suggestive.

PIC magazine 1941 - death dust - scientists

But I think it’s just a good guess. Urey had worked on isotope separation years before fission was discovered (he got his Nobel Prize in 1934 for learning how to separate deuterium from regular hydrogen), so if you know that isotope separation is an issue, he’s your man. Lawrence was by that point known worldwide for his “atom smashing” particle accelerators, and had snagged the 1939 Nobel Prize for the work done at his Radiation Laboratory. If you were going to pick two scientists to be involved with nuclear weapons, those are the two you’d pick. As for Bush — he coordinated all of the nation’s scientific defense programs. So of course, if the US was working on atomic energy as part of their defense research, Bush would have to be in charge of it.

The other illustrations seem to be just generically chosen. They are particle accelerators of various sorts; one cyclotron and many electrostatic (e.g. Van de Graaff) accelerators. Cyclotrons did have relevance to isotope separation — they were used to develop the Calutrons used at Y-12 — but the captions don’t indicate that this is why these machines are featured.

I’ve never seen any evidence that Campbell’s story in PIC came to any kind of official attention. Why not? In the summer of 1941, there was a lot of talk about U-235 and atomic energy — and Campbell’s article really isn’t the most provocative of the bunch. There wasn’t any official press secrecy of any form on the topic yet. “Voluntary censorship” of atomic energy issues, which is what would get Cartmill and Campbell in trouble later, didn’t start up until early 1943. Mid-1941 was still a time when a journalist could speculate wildly on these topics and not get visits from the FBI.

The irony is, there were official fears of a German dirty bomb, but they didn’t really crop up until 1942. But the American bomb effort was starting to get rolling in the late summer of 1941. By the end of 1941, Bush would be a convert to the idea of making the bomb and would start trying to accelerate the program greatly. It wasn’t the Manhattan Project, yet, but it was on its way. Campbell’s article was, in this sense, a bit ahead of its time.

A Campbell publication from 1947 — where he apparently has a better understanding of atomic power. Here he seems to have just scaled down a Hanford-style “pile” and added a turbine to it. It took a little more effort than that in reality…

What I find most interesting about Campbell’s article is that it reveals what the informed, amateur view of atomic energy was like in this early period. Some aspects of it are completely dead-on — that U-235 is the important isotope, that isotope separation is going to matter, that places with particle accelerators are going to play a role, that the acquisition of uranium ore was about to get important, that fears of German use of atomic energy existed. But parts of it are completely wrong — not only would dirty bombs not play a role, he doesn’t seem to understand that fission products, not irradiated substances, would play the strongest role. He doesn’t really seem to understand how nuclear power would be harnessed in a reactor. He doesn’t really seem to get fission bombs at all.

This mixture of accuracy and confusion, of guess and folly, tells us a lot about the state of public knowledge at the time. Atomic energy was a topic, it was an idea — but it wasn’t yet something tangible, a reality. So when people found out, in 1945, that the United States had made and detonated atomic fission bombs, they were primed to understand this as the beginning of a “new era,” as the realization of something they had been talking about for a long time — even if the details had been secret.

Meditations

Kilotons per kilogram

Monday, December 23rd, 2013

Nuclear weapons can be made to have pretty much as much of a bang as one wants, but with increased explosive yield comes increased weapon weight. We always talk vaguely about being able to make H-bombs with arbitrarily high yields, but recently I’ve been mulling over this fact somewhat quantitatively. I gave a talk last month at the History of Science Society Meeting on US interest in 50-100 MT bombs around the time of the Limited Test Ban Treaty, and while working on this paper I got slightly obsessed with what is known as the yield-to-weight ratio.

Little Boy — a big bang compared to a conventional bomb, but still a very crude nuclear bomb.

What makes nuclear weapons impressive and terrible is that their default yield-to-weight ratio — that is, the amount of bang per mass, usually expressed in terms of kilotons per kilogram (kt/kg) — is much, much higher than conventional explosives. Take TNT for example. A ton of TNT weighs, well, a ton. By definition. So that’s 0.001 kilotons per 1,000 kilograms; or 0.000001 kt/kg. By comparison, even a crude weapon like the Little Boy bomb that was dropped on Hiroshima was about 15 kilotons in a 4,400 kg package: 0.003 kt/kg. That means that the Little Boy bomb had an energy density three orders of magnitude higher than a regular TNT bomb would. Now, TNT isn’t the be-all and end-all of conventional explosives, but no conventional explosive gets that much boom for its buck compared to a nuke.

The Little Boy yield is much lower than the hypothetical energy density of uranium-235. Every kilogram of uranium-235 that completely fissions releases about 17 kilotons of energy — 17 kt/kg. That means that less than a kilogram of uranium-235 fissioned in the Little Boy bomb to release its 15 kilotons of energy. Knowing that there was 64 kg of uranium in the bomb, that means that something like 1.3% of the uranium in the weapon actually underwent fission. So right off the bat, one could intuit that this is something that could probably be improved upon.

Fat Man — a lot better use of fissile material than Little Boy, but no more efficient in terms of yield-to-weight.

The Fat Man bomb made much better use of fissile material than Little Boy. Its yield wasn’t that much higher (around 20 kilotons), but it managed to squeeze that (literally) out of only 6.2 kilograms of plutonium-239. Pu-239 releases around 19 kilotons for every kilogram that completely fissions, so that means that around 15% of the Fat Man core (a little under 1 kg of plutonium) underwent fission. But the bomb itself still weighed 4,700 kg, making its yield-to-weight ratio a mere 0.004 kt/kg. Why, despite the improved efficiency and more advanced design of Fat Man, was its yield-to-weight ratio almost identical to Little Boy’s? Because in order to get that 1 kg of fissioning, it required a very heavy apparatus. The explosive lenses weighed something like 2,400 kilograms just by themselves. The depleted uranium tamper that held the core together and reflected neutrons added another 120 kilograms. The aluminum sphere that held the whole apparatus together weighed 520 kilograms. The ballistic case (a necessary thing for any actual weapon!) weighed another 1,400 kg or so. All of these things were necessary to make the bomb either work or be a droppable bomb.
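To make the arithmetic of the last few paragraphs concrete, here is a quick back-of-the-envelope check, as a sketch in Python; the yields, weights, and complete-fission energy densities are just the round numbers quoted above:

```python
# Complete-fission energy release, in kilotons per kilogram, as quoted above.
KT_PER_KG_U235 = 17.0
KT_PER_KG_PU239 = 19.0

def yield_to_weight(yield_kt, weight_kg):
    """Kilotons of yield per kilogram of total weapon weight."""
    return yield_kt / weight_kg

def fission_efficiency(yield_kt, fissile_kg, kt_per_kg):
    """Fraction of the fissile core that fissioned, assuming all yield came from fission."""
    return (yield_kt / kt_per_kg) / fissile_kg

# TNT baseline: one ton of TNT is 0.001 kt in 1,000 kg.
print(yield_to_weight(0.001, 1000))                   # 1e-06 kt/kg

# Little Boy: ~15 kt, ~4,400 kg total, 64 kg of U-235 in the core.
print(yield_to_weight(15, 4400))                      # ~0.0034 kt/kg
print(fission_efficiency(15, 64, KT_PER_KG_U235))     # ~0.014, on the order of 1%

# Fat Man: ~20 kt, ~4,700 kg total, 6.2 kg of Pu-239 in the core.
print(yield_to_weight(20, 4700))                      # ~0.0043 kt/kg
print(fission_efficiency(20, 6.2, KT_PER_KG_PU239))   # ~0.17
```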

So it’s unsurprising to learn that improving yield-to-weight ratios was a high order of business in the postwar nuclear program. Thermonuclear fusion ups the ante quite a bit. Lithium-deuteride (LiD), the most common and usable fusion fuel, yields 50 kilotons for every kilogram that undergoes fusion — so fusion is nearly 3 times more energetic per unit weight than fission. So the more fusion you add to a weapon, the better the yield-to-weight ratio, except for the fact that all fusion weapons require a fission primary and usually also have very heavy tampers.

I took all of the reported American nuclear weapon weights and yields from Carey Sublette’s always-useful website, put them into the statistical analysis program R, and created this semi-crazy-looking graph of American yield-to-weight ratios:

Yield-to-weight ratios of US nuclear weapons

The horizontal (x) axis is the yield in kilotons (on a logarithmic scale), the vertical (y) axis is the weight in kilograms (also on a log scale). In choosing which of the weights and yields to use, I’ve always picked the lowest listed weights and the highest listed yields — because I’m interested in the optimal state of the art. The individual scatter points represent models of weapons. The size of each point represents how many of them were produced; the color of them represents when they were first deployed. Those with crosses over them are still in the stockpile. The diagonal lines indicate specific yield-to-weight ratio regions.
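If you want to reproduce something like this chart, the plotting logic is straightforward to sketch. Here it is in Python/matplotlib rather than the R used above, with a few illustrative rows standing in for the full dataset (the production counts and deployment years below are rough placeholders, not the curated data):

```python
import matplotlib.pyplot as plt

# Illustrative rows only: (name, yield in kt, weight in kg, number produced, year deployed).
weapons = [
    ("Mk-1 (Little Boy)", 15, 4400, 5, 1945),
    ("Mk-3 (Fat Man)", 20, 4700, 120, 1945),
    ("W-76", 100, 164, 3000, 1978),
]

x = [w[1] for w in weapons]                # yield, kilotons
y = [w[2] for w in weapons]                # weight, kilograms
size = [20 + w[3] / 20 for w in weapons]   # marker area loosely scaled by number produced
color = [w[4] for w in weapons]            # colored by year first deployed

fig, ax = plt.subplots()
sc = ax.scatter(x, y, s=size, c=color, cmap="viridis")
ax.set_xscale("log")
ax.set_yscale("log")
ax.set_xlabel("Yield (kilotons)")
ax.set_ylabel("Weight (kilograms)")

# Guide lines of constant yield-to-weight ratio: weight = yield / ratio,
# which plots as a straight diagonal on log-log axes.
for ratio in (0.001, 0.01, 0.1, 1, 10):
    ax.plot([1e-2, 1e5], [1e-2 / ratio, 1e5 / ratio], color="gray", linewidth=0.5)

fig.colorbar(sc, label="Year first deployed")
plt.show()
```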

A few points of interest here. You can see Little Boy (Mk-1), Fat Man (Mk-3), and the postwar Fat Man improvements (Mk-4 — same weight, bigger yield) at the upper left, between 0.01 kt/kg and 0.001 kt/kg. This is a nice benchmark for fairly inefficient fission weapons. At upper right, you can see the cluster of the first H-bomb designs (TX-16, EC-17, Mk-17, EC-24, Mk-24) — high yield (hence far to the right), but very heavy (hence very high). Again, a good benchmark for first generation high-yield thermonuclear weapons.

What a chart like this lets you do, then, is start to think in a really visual and somewhat quantitative way about the sophistication of late nuclear weapon designs. You can see quite readily, for example, that radical reductions in weight, like the sort required to make small tactical nuclear weapons, generally result in a real decrease in efficiency. Those are the weapons in the lower left corner, pretty much the only weapons in the Little Boy/Fat Man efficiency range (or worse). One can also see that there are a few general trends in design development over time if one looks at how the colors trend.

First there is a movement down and to the right (less weight, more yield — improved fission bombs); there is also a movement sharply up and to the right (high weight, very high yield — thermonuclear weapons) which then moves down and to the left again (high yield, lower weight — improved thermonuclear weapons). There is also the splinter of low-weight, low-yield tactical weapons that juts off to the lower left. In the middle-right is what appears to be a sophisticated “sweet spot,” the place where all US weapons currently in the stockpile end up, in the 0.1-3 kt/kg range, especially the 2-3 kt/kg range:

Yield-to-weight ratios -- trends

These are bombs like the W-76 or the B-61 — bombs with “medium” yield warheads (100s rather than 1,000s of kilotons) in relatively low weight packages (100s rather than 1,000s of kilograms). These are weapons that take advantage of the fact that they are expected to be relatively accurate (and thus don’t need to be in the multi-megaton range to have strategic implications), along with what are apparently sophisticated thermonuclear design tricks (like spherical secondaries) to squeeze a lot of energy out of what is a relatively small amount of material. Take the W-76 for example: it manages to get 100 kilotons of yield out of 164 kilograms. If we assume that it is a 50/50 fission to fusion ratio, that means that it manages to fully fission about 5 kilograms of fissionable material, and to fully fuse about 2 kilograms of fusionable material. And it takes just 157 kg of other apparatus (and unfissioned or unfused material) to produce that result — which is just a little more than Shaquille O’Neal weighs.

Such weapons aren’t the most efficient. Weapon designer Theodore Taylor wrote in 1987 that 6 kilotons per kilogram had been pretty much the upper limit of what had ever been achieved.1 Only a handful of weapons got close to that. The most efficient weapon in the US stockpile was the Mk-41, a ridiculously high yield weapon (25 megatons) that made up for its weight with a lot of fusion energy.

The components of the B-61 nuclear weapon — the warhead is the bullet-shape in the mid-left. The B-61 was designed for flexibility, not miniaturization, but it’s still impressive that it could get 20X the Hiroshima bomb’s output out of that garbage-can sized warhead.

But given that high efficiency is tied to high yields — and relatively high weights — it’s clear that the innovations that allowed for the placing of warheads on MIRVed, submarine-launched platforms are still pretty impressive. The really magical range seems to be for weapons that are in the hundreds-of-kilotons range (more than 100 kilotons but under a megaton), yet under 1,000 kilograms. Every one of those dates from after 1962, and probably involves the real breakthroughs in warhead design that were first used with the Operation Dominic test series (1962). This is the kind of strategic miniaturization that makes war planners happy.

What’s the payoff of thinking about these kinds of numbers? One is that it allows you to see where innovations have been made, even if you know nothing about how the weapon works. In other words, yield-to-weight ratios can provide a heuristic for making sense of nuclear design sophistication, comparing developments over time without caring about the guts of the weapon itself. It also allows you to make cross-national comparisons in the same fashion. The French nuclear program had apparently developed weapons in that same miniaturized yield-to-weight range as the United States by the 1970s — apparently with some help from the United States — and so we can probably assume that they know whatever the United States figured out about miniaturized H-bomb design in the 1960s.

The Tsar Bomba: a whole lot of boom, but a whole lot of weight. The US thought they could make the same amount of boom for half the weight.

Or, to take another tack, and returning to the initial impetus for me looking at this topic, we know that the famous “Tsar Bomba” of the Soviet Union weighed 27,000 kilograms and had a maximum yield of 100 Mt, giving it a yield-to-weight ratio of “only” 3.43 kilotons per kilogram. That’s pretty high, but not for a weapon that used so much fusion energy. It was clear to the Atomic Energy Commission that the Soviets had just scaled up a traditional H-bomb design and had not developed any new tricks. By contrast, the US was confident in 1961 that they could make a 100 Mt weapon that weighed around 13,600 kg (30,000 lb) — an impressive 7.35 kilotons per kilogram, something well above the 6 kt/kg achieved maximum. By 1962, after the Dominic series, they thought they might be able to pull off 50 Mt in only a 4,500 kg (10,000 lb) package — a kind of ridiculous 11 kt/kg ratio. (In this estimate, they noted that the weapon might have an impractically large diameter as a result, perhaps because the secondary was spherical as opposed to cylindrical.) So we can see, without really knowing much about what the US had in mind, that it was planning something very, very different from what the Soviets set off.

It’s this black box approach that I find so interesting about these ratios. It’s a crude tool, to be sure, but a tool nonetheless. By looking at the broad trends, we get insights into the specifics, and peel back the veil just a tiny bit.

Notes
  1. Theodore B. Taylor, “Third Generation Nuclear Weapons,” Scientific American 256, No. 4 (April 1987), 30-39, on 34: “The yield-to-weight ratios of pure fission warheads have ranged from a low of about .0005 kiloton per kilogram to a high of about .1 kiloton per kilogram. [...] The overall yield-to-weight ratio of strategic thermonuclear warheads has been as high as about six kilotons per kilogram. Although the maximum theoretical ratios are 17 and 50 kilotons per kilogram respectively for fission and fusion reactions, the maximum yield-to-weight ratio for U.S. weapons has probably come close to the practical limit owing to various unavoidable inefficiencies in nuclear weapon design (primarily arising from the fact that it is impossible to keep the weapon from disintegrating before complete fission or fusion of the nuclear explosive has taken place).”
Visions

Art, Destruction, Entropy

Friday, December 13th, 2013

Are nuclear explosions art? Anyone who has taken even a glance into modern and contemporary art knows that the official mantra might as well be “anything goes,” but I found myself wondering this while visiting the exhibition “Damage Control: Art and Destruction since 1950,” currently at the Hirshhorn Museum. The first thing one sees upon entering is a juxtaposition of two very different sorts of “work.” On the right is a fairly long loop of EG&G footage of nuclear test explosions, projected in high definition across the entirety of a wall. On the left is a piano that has been destroyed with an axe. This, I thought, is at least a provocative way to start things off.

Edgerton, Germeshausen, and Grier (EG&G) was a contractor for the federal government during the Cold War, responsible for documenting nuclear test explosions. Quite a lot of the famous Cold War nuclear detonation footage was taken by EG&G. They are perhaps most famous for their “Rapatronic” photographs, the ultimate expression of MIT engineer Harold “Doc” Edgerton’s work of slowing down time through photography, but this was only a part of their overall contribution. The film they have at the Hirshhorn is something of an EG&G “greatest hits” reel from the 1950s, and its effect on the others in the audience was palpable. Adults and children alike were drawn to the blasts, displayed one after another without commentary or explanation.1 Their reactions didn’t strike me as ones of disgust or horror, but of amazement and awe. Most of the footage was from the Nevada Test Site, so the bombs were generally just blowing up desert scrub, and occasionally houses constructed for effects testing.

The destroyed piano, by contrast, got reactions of shock and disgust. It was the remains of a piece of performance art conducted by Raphael Montañez Ortiz, one of several he’s done, apparently. My wife, a piano player and a nuclear historian, also found it disturbing. “If you know what goes into making a piano…,” she started to say. “But then again, if you know what goes into making a city…,” she caught herself. I overheard other people say similar things.

The difference in reactions isn’t too surprising — it’s a common theme that it is easy to appreciate the destruction of something at a human scale, and difficult to appreciate it at the scale of a nuclear bomb. A lot of what I’ve spent time doing, with the NUKEMAP and my writing, is trying to understand, and to impart, the scale of a nuclear explosion. A lot of this has involved looking at the attempts of others as well, from official Cold War visualizations made for secret committees to popular films, as they have tried to communicate this to their target audiences. The hardest thing is that our brains appear only to be wired for empathy at the individual level, and don’t readily apply it to large groups or large areas. The best work in these areas conveys the broad scope of destruction but then ties it into the personal. It individualizes the experience of mass ruination.

And the EG&G footage isn’t trying to do that. It was data meant for very specific technical purposes. It was developed in order to further the US nuclear program, and defense against Soviet nuclear weapons. Which is why I somewhat question its inclusion, or, at least, its decontextualization. It is art only in the sense that it has aesthetics and it has been put into an art gallery. One can read into it whatever one wants, of course, but it wasn’t created to have deep meaning and depth in that sense. (Whether one cares about authorial intention, of course, is its own can of modern art worms.) Just as a small example of what I mean, Andy Warhol famously made a print of mushroom clouds for his own “disaster” series (a few of which, but not this print, were featured in the exhibit):

"Atomic Bomb," Andy Warhol, 1965.

“Atomic Bomb,” Andy Warhol, 1965.

Now Warhol is a complicated character, but since he was explicitly an artist I think it is always fair game to talk about his possible intentions, the aesthetics of the piece, the deeper meanings, and so on. Warhol’s art has generally been interpreted to be about commercialization and commodification. The mushroom cloud in repetition becomes a statement about our culture and its fascination with mass destruction, perhaps. Coming in the mid-1960s, after the close-call terrors of the early years of the decade, it was perhaps too little, too late — but still, it has an ominous aesthetic appeal, perhaps now more than then.

Because I don’t think this image was widely circulated at the time, I doubt that Warhol knew that Berlyn Brixner, the Trinity test photographer, had made very similar sorts of images of the world’s first nuclear fireball at “Trinity”:

“TR-NN-11,” Berlyn Brixner, 1945.

Brixner appreciated the aesthetics and craft of his work, to be sure. But the above photograph is explicitly a piece of technical data. It is designed to show the Trinity fireball’s evolution over the 15-26 millisecond range. Warhol’s instrument of choice was the silkscreen printer; Brixner’s was the 10,000 fps “Fastax” camera. There’s a superficial similarity in their atomic repetition. You could make a statement by putting them next to each other — as I am doing here! — but properly understood, I think, they are quite different sorts of works.

Don’t get me wrong. Re-appropriating “non-art” into “art” has been a common move over much of the 20th century at the very least. But the problem for me is not that people shouldn’t appreciate the aesthetics of the “non-art.” It’s that focusing on the aesthetics makes it easy to lose sight of the context. (As well as the craft — Brixner’s work was exponentially more difficult to produce than Warhol’s!) The EG&G footage in the exhibit doesn’t explain much of how, or why, it was made. It seems to be asking the viewer to appreciate it solely on its aesthetic grounds. Which I think is the real problem. Many of the tests they show resulted in significant downwind fallout for the populations near the Nevada Test Site. Many of them involved the development of new, ever-more elaborate ways of mass destruction. Many of them were the product of years of top scientific manpower, untold riches, and a deep political context. To appreciate them as simply big, bright booms robs them of something — no matter how aesthetically beautiful those big, bright booms actually are. 

Gustav Metzger’s “auto-destructive” art.

What makes it more ironic is that the exhibit actually does give considerable context to some of the works that are explicitly “art.” You have to explain the context of Gustav Metzger’s “auto-destructive” art — it involves him filming himself painting on canvases with a strong acid, so the artwork destroys itself in the process. Without the context there, what is left is just a boring, not-very-comprehensible movie of a man destroying a blank canvas. But anyway.

In terms of the audience at the exhibit, which was fairly well-attended when I was there with my wife, the most interesting part was the handling of children. The Smithsonian museums are of course explicitly places that people take their children while visiting the city, so it’s no surprise that you probably find more of them at the Hirshhorn than you would at MOMA or other similar institutions. But children add a level of anxiety to an exhibit about destruction. They were wowed by the wall-o’-bombs but not, it seemed, by the piano. Parents seemed to let them wander free through most of it, but there were several films where I saw kids get yanked out by their parents once the parents realized the content was going to be disturbing. In one of these films, the “disturbing” content was of a variety that might have been hard for the children to directly understand — the famous film of the Hindenburg going up in flames, for example, where the violence was real but seen from enough of a distance to keep you from seeing actual injuries or bodies. The one I saw kids really getting removed from (by their parents, not the museum) was footage of the 2011 Vancouver riots. I wasn’t too impressed with the footage itself (its content was interesting in a voyeuristic way, but there seemed to be nothing special about the filming or editing), but the immediacy of its violence was much more palpable than the violence-at-a-distance that one saw in most of the other such works. It’s cliché to trot out that old quote attributed (probably wrongly) to Stalin that one death is a tragedy, a million is a statistic, but there’s something deeply true to it about how we perceive violence and pain.

Damage Control exhibit site

There are a lot of works in the exhibit. As one would expect, some hew to the theme very closely, some are a bit more tenuous. Overall, though, it was pretty interesting, and if you’re in town, you ought to check it out. The original comment my wife made about pianos and cities stuck with me as I looked at all of the various meditations on “destruction.” In them, I kept coming back to the second law of thermodynamics. On the face of it, it is a very clinical, statistical law: “the entropy of an isolated system never decreases.” It is actually quite profound, something that the 19th-century physicists who developed it knew. Entropy can be broadly understood as “disorder.” The second law of thermodynamics says, in essence, that without additional energy being put into it, everything eventually falls apart. It takes work to keep things “organized,” whether they are apartments, bodies, or cities.2 Ludwig Boltzmann, who helped formulate the law, stated gnomically in 1886 that:

The general struggle for existence of animate beings is not a struggle for raw materials – these, for organisms, are air, water and soil, all abundantly available – nor for energy, which exists in plenty in any body in the form of heat Q, but of a struggle for [negative] entropy, which becomes available through the transition of energy from the hot sun to the cold earth.

In other words, life itself is a struggle against entropy. Our bodies are constantly taking disordered parts of the world (heat energy, for example, and the remains of other living things) and using them to fuel the work of keeping us from falling apart.

But the other way to think about this law is that generally it is easier to take things apart than it is to keep them together. It is easier to convert a piano into a low-energy state (through an axe, or perhaps a fire) than it is to make a piano in the first place. It is easier to destroy a city than it is to make a city. The three-year effort of the half-a-million people on the Manhattan Project was substantial, to be sure, but still only a fraction of the work it took to make the cities of Hiroshima and Nagasaki, and all that they contained, biological and material, in the first place.

Of course, the speed at which entropy increases is often controllable. The universe will eventually wear out — but not for a long time. Human civilization will necessarily go extinct — but it doesn’t have to happen anytime soon. What hits home with the “Damage Control” exhibit is how we as a species have to work so hard to keep everything together, while simultaneously working so hard to find ways to make everything fall apart. And in this, perhaps, it is a success, even if I left with many niggling questions about the presentation of some of the works in particular.

Notes
  1. Various guys in the audience would occasionally try to give explanations to their loved ones, and they were generally incorrect, alas. “That must be at Alamogordo… That’s got to be an H-bomb…” — no, no, no. Of course, I was there with my wife, and I was talking up my own little storm (though less loudly than the wrong guys), but at least I know my stuff for the most part…
  2. The key, confusing part about the second law is the bit about the “isolated system.” It doesn’t say that entropy always increases. It says that in an isolated system — that is, a system with no energy being input into it — entropy always increases. For our planet, the Sun is the source of that input, and you can trace, through a long series of events, its own negative entropy to the Big Bang itself.
News and Notes | Visions

The NUKEMAPs are here

Thursday, July 25th, 2013

I’m super excited to announce that last Thursday, at an event hosted by the Center for Nonproliferation Studies at the Monterey Institute of International Studies, I officially launched NUKEMAP2 and NUKEMAP3D. I gave a little talk, which I managed to record, but I haven’t had the time (more details below on why!) to get that up on YouTube yet. Soon, though.

A Soviet weapon from the Cuban Missile Crisis, centered on Washington, DC, with fallout and casualties shown.

NUKEMAP2 is an upgraded version of the original NUKEMAP, with completely re-written effects simulations codes that allow one a huge amount of flexibility in the nuclear detonation one is trying to model. It also allows fallout mapping and casualty counts, among other things. I wanted to make it so that the NUKEMAP went well beyond any other nuclear mapping tools on the web — I wanted it to be a tool that both the layman and the wonk could use, a tool that rewarded exploration, and a tool that, despite the limitations of a 2D visualization, could work to deeply impress people with the power of a nuclear explosion.

The codes that underlie the model are all taken from Cold War effects models. At some point, once it has been better documented than it is now, I’ll probably release the effects library I’ve written under an open license. I don’t think there’s anything quite like it out there at the moment available for the general public. For the curious, there are more details about the models and their sources here.

The mushroom cloud from a 20 kiloton detonation, centered on downtown DC, as viewed from one of my common stomping grounds, the Library of Congress.

NUKEMAP3D uses Google Earth to allow “3D” renderings of mushroom clouds and the nuclear fireball. Now, for the first time, you can visualize what a mushroom cloud from a given yield might look like on any city in the world, viewed from any vantage-point you can imagine. I feel like it is safe to say that there has never been a nuclear visualization tool of quite this nature before.

I got the idea for NUKEMAP3D while looking into a story for the Atlantic on a rare photo of the Hiroshima mushroom cloud. One of the issues I was asked about was how long after the detonation the photograph was taken — the label on the back of the photograph said 30 minutes, but there was some doubt. In the process of looking into this, I started to dig around the literature on mushroom cloud formation and the height of the Hiroshima cloud at various time intervals. I realized that I had no sense for what “20,000 feet” meant in terms of a cloud, so I used Google Earth to model a simple 20,000 foot column above the modern-day city of Hiroshima.

I was stunned at the size of it, when viewed from that perspective — it was so much larger than it even looked in photographs, because the distance that such photographs were taken from makes it very hard to get a sense of scale. I realized that modeling these clouds in a 3D environment might really do something that a 2D model could not. It seems to switch on the part of the brain that judges sizes and areas in a way that a completely flat, top-down overlay does not. The fact that I was surprised and shocked by this, despite the fact that I look at pictures of mushroom clouds probably every day (hey, it’s my job!), indicated to me that this could be a really potent educational tool.

That same 20 kiloton cloud, as viewed from airplane height.

I’m also especially proud of the animated mode, which, if I’m allowed to say, was a huge pain in the neck to program. Even getting a somewhat “realistic”-looking cloud model was a nontrivial thing in Google Earth, because its modeling capabilities are somewhat limited, and because it isn’t really designed to let you manipulate models in a detailed way. It lets you scale model sizes along the three axes, it allows you to rotate them, and it allows you to change their position in 3D space. So I had to come up with ways of manipulating these models in realtime so that they would approximate a semi-realistic view of a nuclear explosion, given these limitations.

It’s obviously not quite as impressive as an actual nuclear explosion (but what is?), and my inability to use light as a real property (as you could in a “real” 3D modeling program) diminishes things a bit (that is, I can’t make it blinding, and I can’t make it cast shadows), but as a first go-around I think it is still a pretty good Google Earth hack. And presumably Google Earth, or similar tools, will only get better and more powerful in the future.

Screen captures of the animation for a 20 kt detonation over DC. These screenshots were taken in 10 second intervals, but are accelerated 100X here. The full animation takes about five minutes to run, which is roughly how the cloud would grow in real life.

If you’ve been following my Twitter feed, you also probably have picked up that this has been a little bit of a saga. I tried to launch it last Thursday night, but the population database wasn’t really working very well. The reason is that it is very, very large — underneath it is a population density map of the entire planet, in a 1 km by 1 km grid, and that means it is about 75 million records (thank goodness for the oceans!). Optimizing the queries helped a bit, and splitting the database up helped a bit. I then moved the whole database to another server altogether, just to make sure it wasn’t dragging down the rest of the server. But on Monday, just when the stories about NUKEMAP started to go up, my hosting company decided it was too much traffic and that I had, despite “unlimited bandwidth” promises, violated the Terms of Service by having a popular website (at that point it was doing nothing but serving up vanilla HTML, Javascript, and CSS files, so it wasn’t really a processing or database problem). Sigh. So I frantically worked to move everything to a different server, learned a lot about systems administration in the process, and then had the domain name issue a redirect from the old hosting company. And all of that ended up taking a few days to finalize (the domain name bit was frustratingly slow, due to settings chosen by the old hosting company).
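For the curious, the expensive operation is easy to sketch in the abstract: sum the gridded population within some distance of ground zero, where the usual trick is to let an indexed bounding-box query do the cheap filtering first and only then apply the exact distance test. Here is a toy version in Python with SQLite; this is not the actual NUKEMAP code, and the table layout is made up for illustration:

```python
import math
import sqlite3

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

def population_within(conn, lat, lon, radius_km):
    """Sum gridded population within radius_km of (lat, lon)."""
    # Cheap, index-friendly bounding box first...
    dlat = math.degrees(radius_km / EARTH_RADIUS_KM)
    dlon = dlat / max(math.cos(math.radians(lat)), 1e-6)
    rows = conn.execute(
        "SELECT lat, lon, population FROM grid_1km "
        "WHERE lat BETWEEN ? AND ? AND lon BETWEEN ? AND ?",
        (lat - dlat, lat + dlat, lon - dlon, lon + dlon),
    )
    # ...then the exact distance test on the much smaller candidate set.
    return sum(pop for glat, glon, pop in rows
               if haversine_km(lat, lon, glat, glon) <= radius_km)

# Hypothetical usage:
# conn = sqlite3.connect("population_grid.db")
# print(population_within(conn, 38.9, -77.04, 5.0))  # people within 5 km of downtown DC
```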

But anyway. All’s well that ends well, right? Despite the technical problems, since moving the site to the new server there have been over 1.2 million new “detonations” with the new NUKEMAPs, which is pretty high for one week of sporadic operation! 62% of them are with NUKEMAP3D, which is higher than I’d expected, given the system requirements for running the Google Earth plugin. The new server works well most of the time, so that’s a good thing, though there are probably some tweaks that still need to be done for it to happily run both the blog and the NUKEMAPs. There is, though I don’t want to make it too intrusive or seem too irritating, a link now on the NUKEMAP for anyone who wants to chip in to the server fund. Completely optional, and not expected, but if you do want to chip in, I can promise you a very friendly thank-you note at the very least.

Now that this is up and “done” for now, I’m hoping to get back to a regular blogging schedule. Until then, try out the new NUKEMAPs!