News and Notes | Visions

The NUKEMAPs are here

by Alex Wellerstein, published July 25th, 2013

I’m super excited to announce that last Thursday, at an event hosted by the James Martin Center for Nonproliferation Studies at the Monterey Institute of International Studies, I officially launched NUKEMAP2 and NUKEMAP3D. I gave a little talk, which I managed to record, but I haven’t had the time (more details below on why!) to get that up on YouTube yet. Soon, though.

A Soviet weapon from the Cuban Missile Crisis, centered on Washington, DC, with fallout and casualties shown.

NUKEMAP2 is an upgraded version of the original NUKEMAP, with completely rewritten effects-simulation code that allows a huge amount of flexibility in the nuclear detonation being modeled. It also allows fallout mapping and casualty counts, among other things. I wanted the NUKEMAP to go well beyond any other nuclear mapping tool on the web — I wanted it to be a tool that both the layman and the wonk could use, a tool that rewarded exploration, and a tool that, despite the limitations of a 2D visualization, could work to deeply impress people with the power of a nuclear explosion.

The codes that underlie the model are all taken from Cold War effects models. At some point, once it has been better documented than it is now, I’ll probably release the effects library I’ve written under an open license. I don’t think there’s anything quite like it available to the general public at the moment. For the curious, there are more details about the models and their sources here.
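
To give a flavor of what these models look like under the hood, here is a minimal sketch of the cube-root scaling relationship that most Cold War blast-effects curves are built on (the reference values below are illustrative placeholders, not the library’s actual constants):

```javascript
// Minimal sketch of cube-root scaling (the textbook Glasstone &
// Dolan relationship): a blast radius measured at a reference yield
// scales as the cube root of the yield ratio. The reference values
// here are illustrative placeholders, not the library's constants.
function scaledRadiusKm(refRadiusKm, refYieldKt, yieldKt) {
  return refRadiusKm * Math.cbrt(yieldKt / refYieldKt);
}

// Example: if 5 psi overpressure reaches ~0.45 km for 1 kt (an
// illustrative figure), a 20 kt burst pushes that out by
// 20^(1/3) ≈ 2.71x:
console.log(scaledRadiusKm(0.45, 1, 20).toFixed(2) + " km"); // ~1.22 km
```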

The mushroom cloud from a 20 kiloton detonation, centered on downtown DC, as viewed from one of my common stomping grounds, the Library of Congress.

NUKEMAP3D uses Google Earth to allow “3D” renderings of mushroom clouds and the nuclear fireball. Now, for the first time, you can visualize what a mushroom cloud from a given yield might look like on any city in the world, viewed from any vantage point you can imagine. I feel it is safe to say that there has never been a nuclear visualization tool of quite this nature before.

I got the idea for NUKEMAP3D while looking into a story for the Atlantic on a rare photo of the Hiroshima mushroom cloud. One of the issues I was asked about was how long after the detonation the photograph was taken — the label on the back of the photograph said 30 minutes, but there was some doubt. In the process of looking into this, I started to dig around the literature on mushroom cloud formation and the height of the Hiroshima cloud at various time intervals. I realized that I had no sense for what “20,000 feet” meant in terms of a cloud, so I used Google Earth to model a simple 20,000 foot column above the modern-day city of Hiroshima.

I was stunned at the size of it, when viewed from that perspective — it was so much larger than it even looked in photographs, because the distance that such photographs were taken from makes it very hard to get a sense of scale. I realized that modeling these clouds in a 3D environment might really do something that a 2D model could not. It seems to switch on the part of the brain that judges sizes and areas in a way that a completely flat, top-down overlay does not. The fact that I was surprised and shocked by this, despite the fact that I look at pictures of mushroom clouds probably every day (hey, it’s my job!), indicated to me that this could be a really potent educational tool.

That same 20 kiloton cloud, as viewed from airplane height.

I’m also especially proud of the animated mode, which, if I’m allowed to say, was a huge pain in the neck to program. Even getting a somewhat “realistic”-looking cloud model was nontrivial in Google Earth, because its modeling capabilities are somewhat limited, and because it isn’t really designed to let you manipulate models in a detailed way. It lets you scale model sizes along the three axes, rotate them, and change their position in 3D space — and that’s about it. So I had to come up with ways of manipulating these models in real time so that they would approximate a semi-realistic view of a nuclear explosion, given those limitations.
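
For those curious what that looks like in code, here is a rough sketch of the kind of manipulation the Google Earth plugin API allows — position, rotation, and per-axis scale, and little else. The COLLADA model URL and the growth curve below are placeholders, not the actual NUKEMAP assets or cloud-rise equations:

```javascript
// Rough sketch of real-time model manipulation in the Google Earth
// plugin API. The model URL and growth rates are placeholders.
function animateCloud(ge, lat, lng) {
  var placemark = ge.createPlacemark('');
  var model = ge.createModel('');
  var loc = ge.createLocation('');
  loc.setLatitude(lat);
  loc.setLongitude(lng);
  model.setLocation(loc);
  model.setAltitudeMode(ge.ALTITUDE_RELATIVE_TO_GROUND);

  var link = ge.createLink('');
  link.setHref('http://example.com/mushroom_cloud.dae'); // placeholder model
  model.setLink(link);

  placemark.setGeometry(model);
  ge.getFeatures().appendChild(placemark);

  var start = Date.now();
  // 'frameend' fires after each rendered frame, so the growth stays
  // smooth regardless of the viewer's frame rate.
  google.earth.addEventListener(ge, 'frameend', function () {
    var t = (Date.now() - start) / 1000; // seconds since "detonation"
    var s = Math.min(1, t / 300);        // ~5 minutes to full size
    var scale = model.getScale();
    scale.setX(1 + 9 * s);               // widen the cloud...
    scale.setY(1 + 9 * s);
    scale.setZ(1 + 19 * s);              // ...but stretch it upward faster
  });
}
```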

It’s obviously not quite as impressive as an actual nuclear explosion (but what is?), and my inability to use light as a real property (as you could in a “real” 3D modeling program) diminishes things a bit (that is, I can’t make it blinding, and I can’t make it cast shadows), but as a first go-around I think it is still a pretty good Google Earth hack. And presumably Google Earth, or similar tools, will only get better and more powerful in the future.

Screen captures of the animation for a 20 kt detonation over DC. The screenshots were taken at 10-second intervals and are accelerated 100X here. The full animation takes about five minutes to run, which is roughly how long the cloud would take to grow in real life.

If you’ve been following my Twitter feed, you’ve probably also picked up that this has been a bit of a saga. I tried to launch it last Thursday night, but the population database wasn’t really working very well. The reason is that it is very, very large — underneath it is a population density map of the entire planet, in a 1km by 1km grid, and that means it is about 75 million records (thank goodness for the oceans!). Optimizing the queries helped a bit, and splitting the database up helped a bit. I then moved the whole database to another server altogether, just to make sure it wasn’t dragging down the rest of the server. But on Monday, just when the stories about NUKEMAP started to go up, my hosting company decided it was too much traffic and that I had, despite “unlimited bandwidth” promises, violated the Terms of Service by having a popular website (at that point it was doing nothing but serving up vanilla HTML, Javascript, and CSS files, so it wasn’t really a processing or database problem). Sigh. So I frantically worked to move everything to a different server, learned a lot about systems administration in the process, and then had the domain name issue a redirect from the old hosting company. All of that took a few days to finalize (the domain name bit was frustratingly slow, due to settings chosen by the old hosting company).
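
For the technically inclined: the standard trick for making queries against a grid that size survivable is to prefilter with an indexed latitude/longitude bounding box, and only then apply the exact distance test. A sketch of the idea — the table and column names are assumptions, not my actual schema:

```javascript
// Sketch of the bounding-box trick for querying a 1 km population
// grid: let an indexed lat/lng range scan do the heavy lifting,
// then filter the survivors by exact distance in application code.
// Table and column names are assumptions, not the actual schema.
function populationQuery(lat, lng, radiusKm) {
  var dLat = radiusKm / 111.32; // ~111.32 km per degree of latitude
  var dLng = radiusKm / (111.32 * Math.cos(lat * Math.PI / 180));
  return {
    sql: 'SELECT lat, lng, pop FROM pop_grid ' +
         'WHERE lat BETWEEN ? AND ? AND lng BETWEEN ? AND ?',
    params: [lat - dLat, lat + dLat, lng - dLng, lng + dLng]
    // The caller then discards rows farther than radiusKm from
    // (lat, lng) before summing pop.
  };
}
```

Splitting the database up, as described above, just spreads that same range scan across smaller tables and indexes.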

But anyway. All’s well that ends well, right? Despite the technical problems, since moving the site to the new server there have been over 1.2 million new “detonations” with the new NUKEMAPs, which is pretty high for one week of sporadic operation! 62% of them are with NUKEMAP3D, which is higher than I’d expected, given the system requirements for running the Google Earth plugin. The new server works well most of the time, so that’s a good thing, though there are probably some tweaks still needed for it to happily run both the blog and the NUKEMAPs. There is now a link on the NUKEMAP for anyone who wants to chip in to the server fund — though I don’t want it to be too intrusive or seem too irritating. Completely optional, and not expected, but if you do want to chip in, I can promise you a very friendly thank-you note at the very least.

Now that this is up and “done” for now, I’m hoping to get back to a regular blogging schedule. Until then, try out the new NUKEMAPs!

News and Notes

Presenting NUKEMAP2 and NUKEMAP3D

by Alex Wellerstein, published July 22nd, 2013

A longer post is coming later today, but in the meantime, I just wanted to make sure anyone on here knows that NUKEMAP2 and NUKEMAP3D are now online:

  • NUKEMAP2: sequel to the original NUKEMAP, with newly-derived effects equations and lots of brand-new options, including crater size, radioactive fallout plumes (with adjustable wind speeds and fission fractions!), and casualty counts! 
  • NUKEMAP3D: the next dimension of nuclear effects mapping, with 3D modeling and real-time animations of custom-built mushroom clouds and nuclear fireballs.
The mushroom cloud from a 20 kiloton explosion, centered on downtown San Francisco, as viewed from my old house in the Berkeley Hills. Estimated fatalities: 75,200.

Technically the sites went live last Thursday, July 18, but there were some technical issues that took until the weekend to finalize (if they are, indeed, finalized) due to the heavy database usage of the new features (e.g. the casualty database). But I’ve moved things around a bit, optimized some sloppy queries, and now things seem to be doing pretty well despite a very heavy user load. More information soon!

News and Notes | Visions

The new NUKEMAP is coming

by Alex Wellerstein, published July 12th, 2013

I’m excited to announce that, after a long development period, the new NUKEMAP will debut on Thursday, July 18th, 2013. There will be an event to launch it, hosted by the James Martin Center for Nonproliferation Studies of the Monterey Institute of International Studies in downtown Washington, DC, from 10-11:30 am, where I will talk about what it can do and why I’ve done it, and give a demonstration of how it works. Shortly after that, the whole thing will go live for the entire world.

Radioactive fallout dose contours from a 2.3 megaton surface burst centered on Washington, DC, assuming a 15 mph wind and 50% yield from fission. Colors correspond to 1, 10, 100, and 1,000 rads-per-hour at 1 hour. This detonation is modeled after the Soviet weapons in play during the Cuban Missile Crisis.

I don’t want to spill all of the beans early, but here’s a teaser. There is not just one new NUKEMAP. There are two new NUKEMAPs. One of them is a massive overhaul of the back-end of the old NUKEMAP, with much more flexible effects calculations and the ability to chart all sorts of other new phenomena — like radioactive fallout (finally!), casualty estimates, and airbursts versus ground bursts. All of these calculations are based on models developed by people working for the US government during the Cold War for use in government effects planning. So you will have a lot of data at your instant disposal, should you want it, but all within the smooth, easy-to-use NUKEMAP interface you know and love.
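
As a taste of the kind of calculation involved: fallout contours like those in the figure above are conventionally referenced to the dose rate at one hour after the burst, and mixed fission products decay approximately as t^-1.2 — the continuous form of the textbook “seven-ten” rule. A sketch of that decay law (standard Glasstone-and-Dolan material, not necessarily the precise model the new NUKEMAP uses):

```javascript
// Sketch of the standard t^-1.2 decay law for mixed fission
// products (Glasstone & Dolan). Contours are referenced to the
// dose rate at H+1 hour.
function doseRateRadPerHr(rateAtOneHour, hoursAfterBurst) {
  return rateAtOneHour * Math.pow(hoursAfterBurst, -1.2);
}

// Example: a 1,000 rad/hr (H+1) contour decays to roughly
// 1000 * 7^-1.2 ≈ 97 rad/hr by H+7 -- about a tenfold drop for
// every sevenfold increase in time (the "seven-ten" rule).
console.log(doseRateRadPerHr(1000, 7).toFixed(0) + " rad/hr");
```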

This has been a long time in development, and has involved me chasing down ancient government reports, learning how to interpret their equations, and converting them to Javascript and the Google Maps API. So you can imagine how “fun” (read: not fun) that was, and how Beautiful Mind my office and home got in the process. And as you’ve no doubt noticed in the last few weeks, doing obsessive, detailed, mathematical technical work in secret all week did not give me a lot of inspiration for historical blog posts! So I’ll be glad to move on from this, and to get it out in the light of day. (Because unlike the actual government planners, my work isn’t classified.)

Above is an image from the report which I used to develop the fallout model. Getting a readable copy of this involved digging up an original copy at the National Library of Medicine, because the versions available in government digital databases were too messed up to reliably read the equations. Some fun: none of this was set up for easy translation into a computer, because nobody had computers in the 1960s. So it was designed to help you draw these by hand, which made translating them into Javascript all the more enjoyable. More fun: many of these old reports had at least one typo hidden in their equations that I had to ferret out. Well, perhaps that was for the best — I feel I truly grok what these equations are doing at this point, and have a lot more confidence in them than in the old NUKEMAP scaling models (which, by the way, actually produce radii not that different from the new equations, for all of their simplifications).

But the other NUKEMAP is something entirely new. Entirely different. Something, arguably, without much historical precedent — because people today have more calculation and visualization power at their fingertips than ever before. It’s one thing for people to have the tools to map the bomb in two dimensions. There were, of course, websites even before the NUKEMAP that allowed you to do that to one degree or another. But I’ve found that, even as much as something like the NUKEMAP lets you visualize the effects of the bomb on places you know, there was something still missing. People, myself included, were still having trouble wrapping their heads around what it would really look like for something like this to happen. And while thinking about ways to address this, I stumbled across a new approach. I’ll go into it more next week, but here’s a tiny teaser screenshot to give you a bit of an indication of what I’m getting at.

Nukemap preview

That’s the cloud from a 10 kiloton blast — the same yield as North Korea’s 2013 test, and the model the US government uses for a terrorist nuclear weapon — on midtown Manhattan, as viewed from New York harbor. It gives you a healthy respect for even a “small” nuclear weapon. And this is only part of what’s coming.

Much more next week. July 18th, 2013 — two days after the 68th anniversary of the Trinity test — the new NUKEMAPs are coming. Tell your friends, and stay tuned.

Redactions

Shurcliff on Secrecy

by Alex Wellerstein, published July 5th, 2013

William A. Shurcliff is one of my favorite Manhattan Project dramatis personae. I’ve written about him before on here, some time back. In a nutshell, Shurcliff was a physicist who worked as a technical advisor to Vannevar Bush in the Office of Scientific Research and Development, and was connected to the bomb project only peripherally. In fact, his value to Bush was that he wasn’t really steeped in the work to make the bomb: he was a trusted, technically competent outsider. So he was the person they called, for example, when they needed a censor for atomic patents, because he could be “read in” on the secrets but wasn’t otherwise in a position to have conflicted interests. Among his other roles on the bomb project was to be the copyeditor of the Smyth Report, and he later was the “official historian” for Operation Crossroads.

A 1948 oil painting of William A. Shurcliff, age 39, by his father-in-law, the American artist Charles Hopkinson.

What I love about the Shurcliff one finds in the Manhattan Project files is that he shows up in the most unusual, unexpected places, and he loved to write unsolicited memos. I imagine him sitting around, thinking about some core problem related to the social and political future of atomic energy, writing his thoughts out in a methodical fashion, and sending them to the top. Occasionally there is evidence that these memos were read and circulated, though none were ever obviously used as the basis of policy going forward. Still, what’s really wonderful about someone like Shurcliff is that he wasn’t being exposed to all of the other scientists on the project, so he had a relatively independent outlook. This makes him a nice “barometer” for what kinds of thoughts were thinkable at the time, outside of the standard range of positions that the scientists took on the issues in front of them.

One of the issues that Shurcliff chimed in on was the prospects of long-term scientific secrecy. Late in the project (i.e. late 1944 and early 1945), the scientists at the University of Chicago had largely finished up their portion of the work (helping get the Hanford reactors designed and running), and had more free time for contemplation of long-term issues than those who were at Los Alamos. So they did things like write the Franck Report and other studies into the long-term prospects of nuclear energy, secrecy, the use of the bomb, and so on. A repeating theme in all of these reports is that long-term, postwar nuclear secrecy would not work. It is a position you will be familiar with from discussions today: secrecy would not prevent foreign nations (or “enemies” more broadly) from getting the bomb, it would inhibit and slow future American work, and the worst thing imaginable would be a “secret arms race” between nations.

Vannevar Bush and James Conant, despite being key people behind the secrecy procedures of the Manhattan Project (which started well before the Army got involved), thoroughly embraced the anti-secrecy line. As Bush put it to President Truman in September 1945: “A secret race on atomic bombs can lead to a very unhappy world.” In fact, almost every discussion I’ve found of postwar secrecy made during the Manhattan Project takes more or less this sort of position.

Shurcliff, however, approached it differently. I’m not sure how he picked up that these things were “in the air,” though he was exposed in limited doses to the Chicago scientists while doing his patent work. In December 1944, he wrote a seven-page memo to Richard Tolman, another OSRD scientist who worked as a personal technical advisor to General Groves (among other things), with the lengthy subject heading of: “Analysis of the theses: (A) Maintaining secrecy on the details of the present weapon will not insure security. (B) Secrecy will come from ‘keeping ahead.'”1

Click on the image above to view the full memo. Shurcliff’s memo was itself classified “Secret — Limited,” which basically meant that only the very top level of administrators and advisors were allowed to read it. The “Top Secret” classification was only just starting to be used in this period, and probably would only have been used here if the memo had given any insight into when the United States would have a bomb ready to use.

True to form, Shurcliff’s memo is highly structured and carefully argued. He starts it off with a statement of his motivations and his conclusions:

Explanation: Some analysis of these theses appears called for since they lie at the heart of the general secrecy policy which, in turn, is fundamental to the entire postwar policy. These theses have been endorsed by many persons heard by the [Interim] Committee.2 The writer knows of no one who has disagreed with these theses.

Conclusions: While it can be said that the theses are “more true than false,” it is apparent that they are seriously inadequate and to an appreciable extent misleading, since:

With regard to Thesis A, maintaining secrecy will make for security for a good many years at least — especially with respect to the many smaller countries incapable of developing nucleonics weapons independently.3 To place one’s faith in secrecy may be rash, but appreciably to dispense with secrecy may be even more rash.

With regard to Thesis B, even “keeping ahead” may prove futile when even “obsolete” nucleonics weapons can be employed by an enemy to wipe out our major centers, including nucleonics centers, in a single hour before declaration of war.

If you’re a regular reader of this blog, you’re probably recoiling from Shurcliff’s pro-secrecy arguments. They are pretty far distant from the “there is no secret” mantra of the postwar atomic scientists, but they are not bad arguments. Shurcliff’s approach is eminently pragmatic, not ideological. His memo is one about technology transfer between nations, with an eye beyond seeing things as just a competition between two powers. Of course, he says, you can’t maintain such secrets indefinitely. But if you can maintain them for a few decades, that’s not nothing — time is a valuable commodity.

Shurcliff also augments his analysis with practical experience of technical espionage. Shurcliff’s main job at the OSRD was to be a liaison with other branches regarding information seized about enemy technology. So if Allied soldiers found reports about, say, German tanks, they would send them to Shurcliff, and he’d figure out which of the OSRD divisions could make the most use of them. So unlike the scientists at Chicago, he actually knew a little bit about how difficult it was to construct technology based on knowledge alone:

Parenthetical note: The writer recalls many instances during 1943 and 1944 where, despite a wealth of fragmentary information from cooperative enemy prisoners, neutrals, and allied agents, the really significant technical engineering data on enemy devices remained wanting until uncomfortably late dates. Examples are: (a) technical characteristics of German infra-red search receivers and image tubes; (b) control frequencies for the German HS-293 glider bomb; (c) launching means, fuels, and radio control means of the German V-1 flying bomb. In all these cases the serious gaps in our knowledge were not filled until reasonably intact specimens of the weapons in question had been captured. The abundance of such situations is believed to show that there is a good chance that appreciable amounts of highly-technical engineering data on secret devices may be kept out of enemy hands for years — perhaps decades.

Shurcliff’s estimates on the possibility of real espionage were, in the end, more optimistic than the reality. Neither he nor anyone else suspected that Los Alamos was full of a number of relatively high-level spies, and that direct design information on the bombs would be so immediately and thoroughly compromised. But it is worth noting that Shurcliff’s above discussion about the difficulty of reconstructing a physical technology from design information alone is, in fact, shown to be reasonably on the mark when we look at the history of the Soviet program. Even though the Soviets did have very detailed design information on the atomic bomb, it still took a tremendous effort to turn that into an actual bomb, and it has become much more clear over the years that information was not the primary determinant of when the Soviets developed their first nuclear weapons.

William Shurcliff, 50 years later.

Lastly, Shurcliff’s views on “staying ahead” feel remarkably relevant to our modern day as well. Nukes, he argues, are not weapons where there is such a significant difference between the “best” and the “second-best.” Getting hit with an “obsolete” weapon is still going to be a disastrous thing. Does it matter that North Korea’s largest test was 10 kilotons, whereas the largest bomb in the US arsenal is megaton-range? To most people, probably not — 10 kilotons will still ruin your day.

Shurcliff ends his memo with a set of “Concluding Remarks”:

We are entering an age (starting, say, in 1960) in which even inferior arms (e.g., 1950 nucleonics bombs) may be used suddenly to cripple and perhaps conquer the most advanced country. The coming age may be further characterised (in the following over-simplified and over-dramatic terms!) thus:

An age in which surprise aggression can laugh at military defense;
An age in which nucleonics is the grand currency of military negotiations;
An age in which our scientists will no longer be able to contribute to the defense of the country;
An age in which international physical compulsion is possible, but in which international physical conflict is impossible;
An age in which international conflicts can only be moral conflicts;
An age in which the line separating international disagreement between two countries from sudden devastation of one of them may become vanishingly thin;
An age in which “balance of power” and “threat” are merely historical terms.

If the last war was a chemists’ war and the present war is a physicists’ war, the next war may be an “administrator’s war” — a war whose outcome may be determined by the mere formulation and concealment of the administrative decision as to whether and when to strike.

What a conclusion!

So what became of Shurcliff’s analysis? He sent it to Tolman, who forwarded it to Bush, and Bush in turn forwarded it to Harvey H. Bundy, an assistant to Secretary of War Stimson (and father of McGeorge Bundy), with the following note attached:

Here is the pessimistic viewpoint, and I think you ought to read it. I would add 1) while scientific interchange is inevitable, transmission of details of weapons is not. 2) A sudden strike will not prevent a riposte, if stores of weapons are well protected underground. The case as between two nations with hidden and ample supplies is of most interest, as it will be the case probably, and is not here treated.

I doubt Shurcliff ever knew that his memo had been forwarded up the chain like this — the secrecy, ironically, meant that he rarely had any indication of what was going on other than his own little corner of things. And perhaps even more ironically, that never kept him from speculating and dreaming about the possibilities of the future.

I don’t think anything more came of his memo. But I do treasure it, not because I necessarily agree with it — though I do find it better rooted in the realities of technology and epistemology than many of the statements of the anti-secrecy scientists of the time — but because it is a little indication of the fact that there were some nuclear physicists in 1944 who could find ways to defend secrecy (a rare thing!), and also find ways to see, arguably with some clarity, the shape of things to come.

  1. William A. Shurcliff to Richard C. Tolman (8 December 1944), Harrison-Bundy Files Relating to the Development of the Atomic Bomb, 1942-1946, microfilm publication M1108 (Washington, D.C.: National Archives and Records Administration, 1980), Roll 6, Target 4, Folder 75, “Interim Committee — Publicity.”
  2. The Interim Committee was the main administrative body planning for what to do once the bomb was a matter of public record, i.e. after it had been used on Japan.
  3. “Nucleonics” was at this time being floated as a new name for the entire field of nuclear technology, in analogy to “electronics.” It didn’t take off.
Redactions | Visions

Castle Bravo revisited

by Alex Wellerstein, published June 21st, 2013

No single nuclear weapons test did more to establish the grim realities of the thermonuclear age than Castle BRAVO. Detonated on March 1, 1954, it was the highest-yield test in the United States’ highest-yield nuclear test series, exploding with a force equivalent to 15 million tons of TNT. It was also the greatest single radiological disaster in American history.

Castle BRAVO, 3.5 seconds after detonation, photo taken from a distance of 75 nautical miles from ground zero, from an altitude of 12,500 feet. From DTRIAC SR-12-001.

Among BRAVO’s salient points:

  • It was the first “dry fuel” hydrogen bomb test by the United States, validating that lithium-deuteride would work fine as a fusion fuel and making thermonuclear weapons relatively easy to deploy.
  • It had a maximum predicted yield of only 6 megatons — so at 15 megatons it was two and a half times as explosive as expected.
  • And, of course, it became famous for raining nuclear fallout down on inhabited islands over a hundred miles downwind, and exposing a crew of Japanese fishermen to fatal levels of radiation.

It was this latter event that made BRAVO famous — so famous that the United States had to admit publicly that it had a hydrogen bomb. And accidentally exposing the Japanese fish supply to radiation, less than a decade after Hiroshima and Nagasaki, has a way of making the Japanese people understandably upset. So the shot led to some almost frank discussion about what fallout meant — that being out of the direct line of fire wasn’t actually good enough.

Animation showing the progression of BRAVO’s fallout exposure, at 1, 3, 6, 12, and 18 hours. Original source.

I say “almost frank” because there was some distinct lack of frankness about it. Lewis Strauss, the secrecy-prone AEC Chairman at the time and an all-around awful guy, gave some rather misleading statements about the reasons for the accident and its probable effects on the exposed native populations. His goal was reassurance, not truth. But, as with so many things in the nuclear age, the narrative got out of his control pretty quickly, and the fear of fallout was intensified whether he wanted it to be or not.

We now know that the Marshallese suffered quite a lot of long-term harm from the exposures, and that the contaminated areas were contaminated for a lot longer than the AEC guessed they would be. Some of this discrepancy comes from honest ignorance — the AEC didn’t know what they didn’t know about fallout. But a lot of it also came from a willingness to appear on top of the situation, when the AEC was anything but.

"Jabwe, the Rongelap health practitioner, assists Nurse Lt. M. Smith and Dr. Lt. J. S. Thompson, during a medical examination on Kwajalein, 11 March 1954." From DTRIAC SR-12-001.

“Jabwe, the Rongelap health practitioner, assists Nurse Lt. M. Smith and Dr. Lt. J. S. Thompson, during a medical examination on Kwajalein, 11 March 1954.” From DTRIAC SR-12-001.

I’ve been interested in BRAVO lately because I’ve been interested in fallout. It’s no secret that I’ve been working on a big new NUKEMAP update (I expect it to go live in a month or so) and that fallout is but one of the amazing new features that I’m adding. It’s been a long time coming, since I had originally wanted to add a fallout model a year ago, but it turned out to be a non-trivial thing to implement. It’s not hard to throw up a few scaled curves, but coming up with a model that satisfies the aesthetic needs of the general NUKEMAP user base (that is, the people who want it to look impressive but aren’t interested in the details) and also has enough technical chops that the informed don’t just immediately dismiss it (because I care about you, too!) involved digging up some rather ancient fallout models from the Cold War (even going out to the National Library of Medicine to get one rare one in its original paper format) and converting them all to Javascript so they can run in modern web browsers. But I’m happy to say that as of yesterday, I’ve finally come up with something that I’m pleased with, and so I can now clean up the Beautiful Mind-style filing system in my office and living room.

The most famous version of BRAVO’s total-dose exposure contours, from Glasstone and Dolan. It looks great on a mug, by the way.

I was recently sent a PDF of a January 2013 report by the Defense Threat Reduction Information Analysis Center (DTRIAC) that looked back on the history of BRAVO. It doesn’t seem to be easily available online (though it is unclassified), so I’ve posted it here: “Castle Bravo: Fifty Years of Legend and Lore (DTRIAC SR-12-001).” I haven’t had time to read the whole thing, but skipping around has been rewarding — it takes a close look at the questions of fallout prediction, contamination, and several “myths” that have circulated since 1954. It notes that the above fallout contour plot, for example, was originally created by the USAF Air Research and Development Command (ARDC), and that “it is unfortunate that this illustration has been so widely distributed, since it is incorrect.” The plume, they explain, actually under-represents the extent of the fallout — the worst of the fallout went further and wider than in the above diagram.

You can get a sense of the variation by looking at some of the other plots created of the BRAVO plume:

BRAVO fallout contours produced by the Armed Forces Special Weapons Project, Naval Radiological Defense Laboratory, and the RAND Corporation. Source. Click image to enlarge.

The AFSWP diagram on the left is relatively long and narrow; the NRDL one in the middle is fat and horrible. The RAND one at the right is something of a compromise. All three, though, show the fallout going further than the ARDC model — some 50-100 miles further. On the open ocean that doesn’t matter so much, but apply that to a densely populated part of the world and it’s pretty significant!

DTRIAC SR-12-001 is also kind of amazing in that it has a lot of photographs of BRAVO and the Castle series that I’d never seen before, some of which you’ll see around this post. One of my favorites is this one, of Don Ehler (from Los Alamos) and Herbert York (from Livermore) in General Clarkson’s briefing room on March 17, 1954, with little mockups of the devices that were tested in Operation Castle:

Ehler and York - Operation Castle devices

There’s nothing classified there — the shapes of the various devices have long been declassified — but it’s still kind of amazing to see all of their bombs on the table, as it were. They look like thermoses full of coffee. (The thing at far left might be a cup of coffee, for all that I can tell — unfortunately the image quality is not great.)

It also has quite a lot of discussion of several persistent issues regarding the exposure of the Japanese crew and the Marshallese natives. I didn’t see anything especially new here, other than the suggestion that the fatality from the Fortunate Dragon fishing boat might have been at least partially because of the very aggressive-but-ineffective treatment regime prescribed by the Japanese physicians, which apparently included the very dubious procedure of repeatedly drawing his blood and then re-injecting it into muscle tissue. I don’t know enough of the details to know what to think of that, but at least they do a fairly good job of debunking the notion that BRAVO’s contamination of the Marshallese was deliberate. I’ve seen that floating around, even in some fairly serious forums and publications, and it’s just not supported by real evidence.

Castle BRAVO, 62 seconds after detonation. “This image was taken at a distance of 50 [nautical miles] north GZ from an altitude of 10,000 feet. The lines running upward to the left of the stem and below the fireball are smoke trails from small rockets. At this time the cloud stem was about 4 mi in diameter.” From DTRIAC SR-12-001.

One thing that I hadn’t appreciated as well before is that BRAVO is pretty much a worst-case scenario from a radiological point of view. It was a very high-yield weapon that was very “dirty” right out of the box: 10 of its 15 megatons (67%) were from fission.1

It was detonated as a surface burst, which automatically means quite a significant fallout problem. Nuclear weapons that detonate so that their fireball does not come into contact with the ground release “militarily insignificant” amounts of fallout, even if their yields are very high. (They are not necessarily “humanly insignificant” amounts, but they are far, far, far less than surface bursts — it is not a subtle difference.2)

But even worse, it was a surface burst in a coral reef, which is just a really, really bad idea. Detonating nuclear weapons on a desert floor, like in Nevada, still presents significant fallout issues. But a coral reef is really an awful place to set them off, and not just because coral reefs are awesome and shouldn’t be blown up. They are an ideal medium for creating and spreading contamination: they break apart with no resistance, but do so in big enough chunks that they rapidly fall back to Earth. Particle size is a big deal when it comes to fallout; small particles go up with the fireball and stay aloft long enough to lose most of their radioactive energy and diffuse into the atmosphere, while heavy particles fall right back down again pretty quickly, en masse. So blowing up and irradiating something like coral is just the worst possible thing.3

Castle BRAVO, 16 minutes after detonation, seen from a distance of 50 nautical miles, at an altitude of 10,000 feet. From DTRIAC SR-12-001.

Note that the famous 50 Mt “Tsar Bomba” lacked a final fission stage, so only 3% of its total yield — 1.5 Mt — was from fission. So despite the fact that the Tsar Bomba was 3.3 times as explosive as Castle Bravo, it produced almost 7 times fewer fission products. And its fireball never touched the ground (in fact, it was reflected upwards by its own shock wave, which is kind of amazing to watch), so it was a very “clean” shot radiologically. The “full-sized,” 100 Mt Tsar Bomba would have been 52% fission — a very dirty bomb indeed.
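
The arithmetic behind that comparison is worth spelling out — for fallout, what matters is the absolute fission yield, not the total yield:

```javascript
// The fission-product comparison above, worked out directly, using
// the figures given in the text.
function fissionYieldMt(totalYieldMt, fissionFraction) {
  return totalYieldMt * fissionFraction;
}

var bravo = fissionYieldMt(15, 0.67);     // Castle BRAVO: ~10 Mt fission
var tsar = fissionYieldMt(50, 0.03);      // Tsar Bomba:   1.5 Mt fission
var fullTsar = fissionYieldMt(100, 0.52); // "Full" Tsar:   52 Mt fission
console.log(bravo / tsar); // ~6.7: BRAVO made almost 7x the fission products
```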

In the end, what I’ve come to take away from BRAVO is that it actually was a mistake even more colossal than one might have originally thought. It was a tremendously bad idea from a human health standpoint, and turned into a public relations disaster that the Atomic Energy Commission never really could kick. 

In retrospect the entire “event” seems to have been utterly avoidable as a radiological disaster, even with all of the uncertainties about yield and weather. It’s cliché to talk about nuclear weapons in terms of playing with “forces of nature beyond our comprehension,” but I’ve come to feel that BRAVO is a cautionary tale about hubris and incompetence in the nuclear age — scientists setting off a weapon whose size they did not know, whose effects they did not correctly forecast, whose legacy will not soon be outlived.

  1. Chuck Hansen, Swords of Armageddon, IV-299.
  2. The dose-rate difference is about three orders of magnitude, judging by shots like Redwing CHEROKEE. That’s still a few rads, but the difference between 1,000 and 1 rad/hr is pretty significant.
  3. Couldn’t they have foreseen this? In theory, yes — they had already blown up a high-yield, “dirty” fission hydrogen bomb on a coral reef in 1952, the MIKE test. But somewhere a number of AEC planners seem to have gotten their wires crossed, because a lot of them thought that MIKE had very little fallout, when in fact it also produced a lot of very similar contamination. Unlike BRAVO, however, MIKE’s fallout blew out over open sea. The only radiation monitoring seems to have been done on the islands, and so they don’t seem to have ever drawn up one of those cigar-shaped plumes for it. See e.g. the discussion here on page 51.