Archive for the ‘News and Notes’ Category

Webcast: “What’s become of our nuclear golden age?”

Monday, September 9th, 2013

A 1959 advertisement for Union Carbide in the Saturday Evening Post.

We no longer live in the nuclear age, or, at least, we don’t think we do — so I concluded a while back. But that won’t stop me from talking about it! This Wednesday, September 11th, 2013, I will be participating in a live webcast at the Chemical Heritage Foundation in Philadelphia:

On Sept. 11, 2013 the Chemical Heritage Foundation will present a live online video discussion, “Power and Promise: What’s become of our nuclear golden age?” Guests Alex Wellerstein and Linda Richards will take stock of our turbulent nuclear past and look at how it has shaped our current attitudes, for better and for worse.

Some say we are on the verge of a bright nuclear future in which nuclear power will play a major role in responding to climate change. Others say that we should expect more Fukushimas. Whichever way our nuclear future goes, there will be energy and environmental tradeoffs. On CHF’s blog you can decide on the tradeoffs you are willing to make. Tweet to vote your choices. Viewers can also tweet questions to the guests before or during the show by using the hashtag #HistChem.

“Power and Promise: What’s Become of Our Nuclear Golden Age?” will air at 6 p.m. EST. Watch the livecast episode at www.chemheritage.org/live.

Guest Bios:

Alex Wellerstein is an associate historian at the Center for History of Physics at the American Institute of Physics. He holds a Ph.D. in the history of science from Harvard University and his research interests include the history of Cold War technology, including nuclear technology. He blogs at http://blog.nuclearsecrecy.com/.

Linda M. Richards is a former CHF fellow and will be returning in 2014 as a Doan Fellow. She is working on a Ph.D. on nuclear history at Oregon State University. Her dissertation is titled “Rocks and Reactors: The Origins of Radiation Exposure Disparity, 1941-1979.” In 2012 she received a National Science Foundation grant that took her to the International Atomic Energy Agency (IAEA) in Vienna, UN agencies and archives in Geneva, and to North American indigenous uranium mining sites.

About the Show:

#HistChem is a monthly interactive livestreamed show produced by the Chemical Heritage Foundation. It features topically compelling issues that intersect science, history, and culture. Hosts are Michal Meyer, editor of Chemical Heritage Magazine, and Bob Kenworthy, a CHF staff member and chemist. The first episode, “How We Learned to Stop Worrying and Love the Zombie Apocalypse,” debuted in August 2013. Follow the show and related news at chemheritage.org/media.

About the Chemical Heritage Foundation:

The Chemical Heritage Foundation is a collections-based nonprofit organization that preserves the history and heritage of chemistry, chemical engineering, and related sciences and technologies. The collections are used to create a body of original scholarship that illuminates chemistry’s role in shaping society. In bridging science with the humanities, arts, and social sciences, CHF is committed to building a vibrant, international community of scholars; creating a rich source of traditional and emerging media; expanding the reach of our museum; and engaging the broader society through inventive public events.

This should be a fun thing, as Linda and I take somewhat different approaches (both interesting) to many nuclear issues, and the CHF team asks great questions. You can Tweet in questions for the show with the right hashtag (#HistChem) and it may somehow magically get to us while we’re talking. And hey, I’ll be wearing a suit!

Update: The video has been posted online, enjoy!

The NUKEMAPs are here

Thursday, July 25th, 2013

I’m super excited to announce that last Thursday, at an event hosted by the Center for Nonproliferation Studies at the Monterey Institute of International Studies, I officially launched NUKEMAP2 and NUKEMAP3D. I gave a little talk, which I managed to record, but I haven’t had the time (more details below on why!) to get that up on YouTube yet. Soon, though.

A Soviet weapon from the Cuban Missile Crisis, centered on Washington, DC, with fallout and casualties shown.

NUKEMAP2 is an upgraded version of the original NUKEMAP, with completely re-written effects simulation codes that allow a huge amount of flexibility in the nuclear detonation you are trying to model. It also allows fallout mapping and casualty counts, among other things. I wanted the NUKEMAP to go well beyond any other nuclear mapping tools on the web — I wanted it to be a tool that both the layman and the wonk could use, a tool that rewarded exploration, and a tool that, despite the limitations of a 2D visualization, could deeply impress people with the power of a nuclear explosion.
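
The basic relationship underneath any tool like this is worth a concrete illustration: blast radii for a given overpressure grow with the cube root of yield (the standard scaling from Glasstone and Dolan's The Effects of Nuclear Weapons). Here is that idea sketched in Javascript; the reference values are illustrative, not NUKEMAP's actual constants:

```javascript
// Cube-root scaling of blast radii with yield: a radius known at a
// reference yield can be scaled to any other yield via
//   r(Y) = r_ref * (Y / Y_ref)^(1/3)
// The reference radius and yields passed in below are illustrative only.
function scaledRadius(refRadiusMiles, refYieldKt, yieldKt) {
  return refRadiusMiles * Math.pow(yieldKt / refYieldKt, 1 / 3);
}

// Doubling the yield grows each radius by only ~26%; it takes an
// eightfold increase in yield to double a damage radius.
const r20 = scaledRadius(1.0, 20, 20); // 1.0 by construction
const r40 = scaledRadius(1.0, 20, 40); // ~1.26
```

This is part of why "small" bombs are less unimpressive than they sound: damage radii climb much more slowly than yield does.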

The codes that underlie the model are all taken from Cold War effects models. At some point, once it has been better documented than it is now, I’ll probably release the effects library I’ve written under an open license. I don’t think there’s anything quite like it available to the general public at the moment. For the curious, there are more details about the models and their sources here.

The mushroom cloud from a 20 kiloton detonation, centered on downtown DC, as viewed from one of my common stomping grounds, the Library of Congress.

NUKEMAP3D uses Google Earth to allow “3D” renderings of mushroom clouds and the nuclear fireball. Now, for the first time, you can visualize what a mushroom cloud of a given yield might look like over any city in the world, viewed from any vantage point you can imagine. I feel it is safe to say that there has never been a nuclear visualization tool of quite this nature before.

I got the idea for NUKEMAP3D while looking into a story for the Atlantic on a rare photo of the Hiroshima mushroom cloud. One of the issues I was asked about was how long after the detonation the photograph was taken — the label on the back of the photograph said 30 minutes, but there was some doubt. In the process of looking into this, I started to dig around the literature on mushroom cloud formation and the height of the Hiroshima cloud at various time intervals. I realized that I had no sense for what “20,000 feet” meant in terms of a cloud, so I used Google Earth to model a simple 20,000 foot column above the modern-day city of Hiroshima.

I was stunned at the size of it, when viewed from that perspective — it was so much larger than it even looked in photographs, because the distance that such photographs were taken from makes it very hard to get a sense of scale. I realized that modeling these clouds in a 3D environment might really do something that a 2D model could not. It seems to switch on the part of the brain that judges sizes and areas in a way that a completely flat, top-down overlay does not. The fact that I was surprised and shocked by this, despite the fact that I look at pictures of mushroom clouds probably every day (hey, it’s my job!), indicated to me that this could be a really potent educational tool.

That same 20 kiloton cloud, as viewed from airplane height.

I’m also especially proud of the animated mode, which, if I’m allowed to say, was a huge pain in the neck to program. Even getting a somewhat “realistic”-looking cloud model was a nontrivial thing in Google Earth, because its modeling capabilities are somewhat limited, and because it isn’t really designed to let you manipulate models in a detailed way. It lets you scale models along the three axes, rotate them, and change their position in 3D space. So I had to come up with ways of manipulating these models in realtime so that they would approximate a semi-realistic view of a nuclear explosion, given these limitations.
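
Those constraints push you toward a keyframe approach: tabulate the cloud's height and radius at a few time points, interpolate between them each animation tick, and apply the results as per-axis scale factors on the model. A minimal sketch of that idea is below; the keyframe numbers are invented for illustration, and NUKEMAP3D's actual growth curves differ:

```javascript
// Hypothetical keyframes: [seconds after detonation, cloud height (ft),
// cloud radius (ft)]. Values are made up for illustration.
const keyframes = [
  [0, 0, 0],
  [30, 8000, 2500],
  [120, 18000, 6000],
  [300, 20000, 7000], // cloud stabilizes around five minutes in
];

// Linearly interpolate between the bracketing keyframes; clamp outside
// the table's range.
function cloudSizeAt(t) {
  if (t <= keyframes[0][0]) {
    return { height: keyframes[0][1], radius: keyframes[0][2] };
  }
  for (let i = 1; i < keyframes.length; i++) {
    const [t1, h1, r1] = keyframes[i - 1];
    const [t2, h2, r2] = keyframes[i];
    if (t <= t2) {
      const f = (t - t1) / (t2 - t1);
      return { height: h1 + f * (h2 - h1), radius: r1 + f * (r2 - r1) };
    }
  }
  const last = keyframes[keyframes.length - 1];
  return { height: last[1], radius: last[2] };
}
```

The height and radius pair would then be applied as the vertical and horizontal scale factors on the Google Earth model on every tick of the animation loop.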

It’s obviously not quite as impressive as an actual nuclear explosion (but what is?), and my inability to use light as a real property (as you could in a “real” 3D modeling program) diminishes things a bit (that is, I can’t make it blinding, and I can’t make it cast shadows), but as a first go-around I think it is still a pretty good Google Earth hack. And presumably Google Earth, or similar tools, will only get better and more powerful in the future.

Screen captures of the animation for a 20 kt detonation over DC. These screenshots were taken in 10 second intervals, but are accelerated 100X here. The full animation takes about five minutes to run, which is roughly how the cloud would grow in real life.

If you’ve been following my Twitter feed, you’ve probably picked up that this has been a bit of a saga. I tried to launch it last Thursday night, but the population database wasn’t really working very well. The reason is that it is very, very large — underneath it is a population density map of the entire planet, on a 1 km by 1 km grid, which comes to about 75 million records (thank goodness for the oceans!). Optimizing the queries helped a bit, and splitting the database up helped a bit. I then moved the whole database to another server altogether, just to make sure it wasn’t dragging down the rest of the site. But on Monday, just when the stories about NUKEMAP started to go up, my hosting company decided it was too much traffic and that I had, despite “unlimited bandwidth” promises, violated the Terms of Service by having a popular website (at that point it was doing nothing but serving up vanilla HTML, Javascript, and CSS files, so it wasn’t really a processing or database problem). Sigh. So I frantically worked to move everything to a different server, learned a lot about systems administration in the process, and then had the domain name issue a redirect from the old hosting company. All of that took a few days to finalize (the domain name bit was frustratingly slow, due to settings chosen by the old hosting company).
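
For the curious, one standard trick for keeping radius queries over a grid that size tolerable is to prefilter with a latitude/longitude bounding box, which an ordinary database index can serve, before doing any exact distance math on the survivors. A sketch of the idea in Javascript; the table and column names are hypothetical, not the actual NUKEMAP schema:

```javascript
// Compute a lat/lon bounding box around a point, for use as a cheap
// indexed prefilter before exact great-circle distance checks.
function boundingBox(latDeg, lonDeg, radiusKm) {
  const dLat = radiusKm / 111.32; // ~km per degree of latitude
  // Degrees of longitude shrink with latitude, so widen accordingly.
  const dLon = radiusKm / (111.32 * Math.cos((latDeg * Math.PI) / 180));
  return {
    minLat: latDeg - dLat, maxLat: latDeg + dLat,
    minLon: lonDeg - dLon, maxLon: lonDeg + dLon,
  };
}

// Hypothetical query against a 1 km population grid table:
const box = boundingBox(38.9, -77.04, 10); // 10 km around downtown DC
const sql = `SELECT SUM(population) FROM pop_grid
  WHERE lat BETWEEN ${box.minLat} AND ${box.maxLat}
    AND lon BETWEEN ${box.minLon} AND ${box.maxLon}`;
```

The BETWEEN clauses can use a composite index and discard the vast majority of the 75 million rows before any per-row arithmetic happens, which is the difference between a query that returns in milliseconds and one that drags the whole server down.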

But anyway. All’s well that ends well, right? Despite the technical problems, since moving the site to the new server, there have been over 1.2 million new “detonations” with the new NUKEMAPs, which is pretty high for one week of sporadic operation! 62% of them were with NUKEMAP3D, which is higher than I’d expected, given the system requirements for running the Google Earth plugin. The new server works well most of the time, so that’s a good thing, though there are probably some tweaks that still need to be done for it to happily run the blog and the NUKEMAPs. There is now a link on the NUKEMAP, though I don’t want to make it too intrusive or irritating, for anyone who wants to chip in to the server fund. Completely optional, and not expected, but if you do want to chip in, I can promise you a very friendly thank-you note at the very least.

Now that this is up and “done” for now, I’m hoping to get back to a regular blogging schedule. Until then, try out the new NUKEMAPs!

Presenting NUKEMAP2 and NUKEMAP3D

Monday, July 22nd, 2013

A longer post is coming later today, but in the meantime, I just wanted to make sure anyone on here knows that NUKEMAP2 and NUKEMAP3D are now online:

  • NUKEMAP2: sequel to the original NUKEMAP, with newly-derived effects equations and lots of brand-new options, including crater size, radioactive fallout plumes (with adjustable wind speeds and fission fractions!), and casualty counts! 
  • NUKEMAP3D: the next dimension of nuclear effects mapping, with 3D modeling and real-time animations of custom-built mushroom clouds and nuclear fireballs.
The mushroom cloud from a 20 kiloton explosion, centered on downtown San Francisco, as viewed from my old house in the Berkeley Hills. Estimated fatalities: 75,200.

Technically the sites went live last Thursday, July 18, but there were some technical issues that took until the weekend to finalize (if they are, indeed, finalized), due to the heavy database usage of the new features (e.g. the casualty database). But I’ve moved things around a bit, optimized some sloppy queries, and now things seem to be holding up pretty well despite a very heavy user load. More information soon!

The new NUKEMAP is coming

Friday, July 12th, 2013

I’m excited to announce that, after a long development period, the new NUKEMAP will debut on Thursday, July 18th, 2013. There will be an event to launch it, hosted by the James Martin Center for Nonproliferation Studies of the Monterey Institute of International Studies in downtown Washington, DC, from 10-11:30 am, where I will talk about what it can do and why I’ve made it, and give a demonstration of how it works. Shortly after that, the whole thing will go live for the entire world.

Nukemap preview - fallout

Radioactive fallout dose contours from a 2.3 megaton surface burst centered on Washington, DC, assuming a 15 mph wind and 50% yield from fission. Colors correspond to 1, 10, 100, and 1,000 rads-per-hour at 1 hour. This detonation is modeled after the Soviet weapons in play during the Cuban Missile Crisis.

I don’t want to spill all of the beans early, but here’s a teaser. There is not just one new NUKEMAP. There are two new NUKEMAPs. One of them is a massive overhaul of the back-end of the old NUKEMAP, with much more flexible effects calculations and the ability to chart all sorts of other new phenomena — like radioactive fallout (finally!), casualty estimates, and the ability to specify airbursts versus ground bursts. All of these calculations are based on models developed by people working for the US government during the Cold War for use in government effects planning. So you will have a lot of data at your instant disposal, should you want it, but all within the smooth, easy-to-use NUKEMAP interface you know and love.
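
One detail behind fallout figures like the "rads-per-hour at 1 hour" in the caption above: early fallout dose rates are conventionally referenced to one hour after the burst (H+1) and scaled to other times with the t^-1.2 approximation, which is also the basis of the civil-defense "7/10 rule" (a sevenfold increase in time cuts the dose rate by roughly a factor of ten). This is a sketch of that textbook relationship, not NUKEMAP's actual fallout code:

```javascript
// Way-Wigner style decay approximation for early fallout:
//   R(t) = R(1 hr) * t^-1.2, with t in hours after the burst.
function doseRate(rateAtH1, hoursAfterBurst) {
  return rateAtH1 * Math.pow(hoursAfterBurst, -1.2);
}

doseRate(1000, 1); // 1000 rads/hr at the H+1 reference time
doseRate(1000, 7); // roughly a tenth of that, ~97 rads/hr
```

The approximation only holds for the first several months after a detonation, but that is exactly the window that matters for contour maps like the one above.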

This has been a long time in development, and has involved me chasing down ancient government reports, learning how to interpret their equations, and converting them to Javascript and the Google Maps API. So you can imagine how “fun” (read: not fun) that was, and how Beautiful Mind my office and home got in the process. And as you’ve no doubt noticed in the last few weeks, doing obsessive, detailed, mathematical technical work in secret all week did not give me a lot of inspiration for historical blog posts! So I’ll be glad to move on from this, and to get it out in the light of day. (Because unlike the actual government planners, my work isn’t classified.)

Above is an image from the report which I used to develop the fallout model. Getting a readable copy of this involved digging up an original copy at the National Library of Medicine, because the versions available in government digital databases were too messed up to reliably read the equations. Some fun: none of this was set up for easy translation into a computer, because nobody had computers in the 1960s. So it was designed to help you draw these by hand, which made translating them into Javascript all the more enjoyable. More fun: many of these old reports had at least one typo hidden in their equations that I had to ferret out. Well, perhaps that was for the best — I feel I truly grok what these equations are doing at this point and have a lot more confidence in them than the old NUKEMAP scaling models (which, by the way, are actually not that different in their radii from the new equations, for all of their simplifications).

But the other NUKEMAP is something entirely new. Entirely different. Something, arguably, without as much historical precedent — because people today have more calculation and visualization power at their fingertips than ever before. It’s one thing for people to have the tools to map the bomb in two dimensions. There were, of course, even websites before the NUKEMAP that allowed you to do that to one degree or another. But I’ve found that, even as much as something like the NUKEMAP allows you to visualize the effects of the bomb on places you know, there was something still missing. People, myself included, were still having trouble wrapping their heads around what it would really look like for something like this to happen. And while thinking about ways to address this, I stumbled across a new approach. I’ll go into it more next week, but here’s a tiny teaser screenshot to give you a bit of an indication of what I’m getting at.

Nukemap preview

That’s the cloud from a 10 kiloton blast — the same yield as North Korea’s 2013 test, and the model the US government uses for a terrorist nuclear weapon — over mid-town Manhattan, as viewed from New York harbor. It gives you a healthy respect for even a “small” nuclear weapon. And this is only part of what’s coming.

Much more next week. July 18th, 2013 — two days after the 68th anniversary of the Trinity test — the new NUKEMAPs are coming. Tell your friends, and stay tuned.

NUKEMAP at one year and 10 million blasts

Friday, February 8th, 2013

A year ago this week, I launched the NUKEMAP. It’s perhaps fitting that this week, NUKEMAP also (coincidentally) hit 10 million “detonations.” That corresponds with just over 2.25 million pageviews (1.96 million unique). Which is pretty crazy. I attribute a lot of the success I’ve had with this blog to the NUKEMAP, as a driver of traffic. A few percent of the visitors look at the blog; a few percent of them become regular readers. A few percent of two million is a lot of people.

The map of where people bombed doesn’t look significantly different than it did after the first million, so I won’t post another one of those images. But here’s some fun-with-data for you: below is a heatmap of all 10 million detonations. The “hotter” it is (e.g. red or orange), the more times a given place or region was nuked. I shaved a few decimal places off the latitude and longitude coordinates so that repeated nukes in the same basic area were lumped together (and so you don’t have to worry if you nuked your neighbor’s house a million times), but it is still pretty granular.
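
The binning described above, rounding coordinates so that nearby detonations lump together, can be sketched in a few lines. One decimal place of latitude is roughly 11 km; the precision used here is illustrative, not necessarily what the actual heatmap used:

```javascript
// Round lat/lon to a fixed number of decimal places and use the result
// as a bin key, so detonations in the same basic area count together.
function binKey(lat, lon, decimals = 1) {
  const f = Math.pow(10, decimals);
  return `${Math.round(lat * f) / f},${Math.round(lon * f) / f}`;
}

// Tally detonations per bin; the counts drive the heatmap's colors.
function heatmapBins(detonations) {
  const bins = new Map();
  for (const { lat, lon } of detonations) {
    const key = binKey(lat, lon);
    bins.set(key, (bins.get(key) || 0) + 1);
  }
  return bins;
}
```

Besides anonymizing repeat-nukers, the binning cuts 10 million raw records down to a number of distinct cells small enough for a browser to render interactively.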

NUKEMAP at 10 million

If you click on the image, you’ll go to an interactive version.1

For people who are into metrics, here are the daily, weekly, and monthly pageview graphs of the NUKEMAP from Google Analytics. After an initial big burst, it died down a bit (to 2,000 hits or so a day, mind you), punctuated by occasional new bursts whenever it landed on the Reddit front page.

Hey, even Jon Stewart was into it:

Jon asks: “When did lower Manhattan become the standard unit of destruction measurement?” Answer: Certainly by the late 1940s, probably even earlier.

OK, so Jon Stewart posted something that was originally from ABC News, so technically ABC News was into it, but it’s still Jon Stewart! I’ll take what I can get in that department!

A while back I did a write-up of NUKEMAP usage patterns for WMD Junction, an online journal: So Long, Mom, I’m Off to Drop the Bomb: A Case Study in Public Usage of an Educational Tool. Check it out if you are curious about who-bombed-who.

People have also done some pretty cool things with it. The infographic shown by Jon Stewart derives from a setting that was sent around on Reddit and elsewhere showing the effects of a 6 kiloton bomb on lower Manhattan, with 6 kilotons being one of the yield estimates of the 2009 North Korean test. 6 kilotons doesn’t sound like a lot by modern standards, unless you happen to be right underneath it, and then it’s probably worth taking seriously.

An engineer in the U.K. (who has asked to be credited only as “RLBH”) recently made and sent me an incredibly elaborate map modeling “Probable Nuclear Targets in the United Kingdom” as imagined by the Joint Intelligence Committee of the British Ministry of Defence in 1967:2

NUKEMAP UK targets, 1967

That’s pretty neat, and is actually very much related to the original project of which NUKEMAP was a spin-off (dubbed TARGETMAP, which I’ve put on indefinite hold for lack of time).

There’s only one lesson that I’ve been a little disturbed by. An awful lot of people are amazed at how small the Hiroshima and Nagasaki bombs were compared to thermonuclear weapons. That’s true — but it’s because the megaton-range weapons were insane, not because the Hiroshima and Nagasaki bombs were small. By human standards, 10-20 kilotons should still be horrifying. From a view of 100,000 feet, though, it’s a lot less impressive than the Tsar Bomba, even though the latter was a lot less of a realistic threat than weapons of “smaller” yields, and is certainly a lot less of a threat today. When you put “small” nukes next to monstrous nukes, it is easy to lose perspective. That’s not my goal — my goal is to help people get a sense of scale, something that I think is even more important in a post-Cold War age.

So I’m excited to announce that I’m deep in the coding of a successor to NUKEMAP. It isn’t quite ready for prime time yet, but it’s well past the proof-of-concept stage. It works. I’m trying to incorporate the lessons I learned with the use and reception of the first NUKEMAP into the new one, and trying to provide a very different sort of user experience. The details are still hush-hush. I’ve told a handful of people about it in person, to gauge reactions, and have a few beta testers lined up, but I’m confident enough to say that this is something entirely new. The new NUKEMAP will do things that no other online nuclear effects simulator does. So keep an eye out for it. There is no estimated-time-of-arrival — it’ll be up when it’s good and ready — but it will probably be up by the end of spring 2013.

Notes
  1. Note: the underlying dataset for the 10 Million browser is static. So it would not be worth your time trying to influence how it looks at this point by bombing all over the place. []
  2. RLBH sent me some details on how he made his map:

    I’m sure you’re familiar with Professor Peter Hennessy’s book The Secret State: Preparing for the Worst, 1945-2010 (London: Penguin, 2003), which contains (amongst other things) a list of ‘Probable Nuclear Targets in the United Kingdom’ drawn up by the Joint Intelligence Committee of the British Ministry of Defence in 1967. This list suggests the use of some 377 nuclear devices against 100 targets in the United Kingdom, none of less than 500 kilotons yield and with a total yield between 272.5 and 362.5 megatons.

    I know that a Swedish gent has used your NUKEMAP tool to generate his own targeting plan against Sweden, but I’ve not heard of it being used to illustrate a ‘real’ war plan before. For my own elucidation, I’ve modelled the JIC’s targeting plan for the UK in NUKEMAP, with the following caveats applying to my method.

    - Where multiple devices are programmed for a single point target, I’ve only modelled the largest. Some such targets were overkilled to a remarkable extent, even allowing for delivery system unreliability – most command & control centres, for instance, were allocated two missile warheads of 3 megatons each, and two 1 megaton gravity bombs.

    - For the industrial area targets, I’ve selected DGZs on the basis of my own best judgment, generally seeking to maximise the industry receiving 20 psi of overpressure. Unsurprisingly, this results in significant overkill against the housing and population of the targeted cities. This also means that some surprisingly large cities are totally untouched by the initial strike, which would certainly be targeted in a pure countervalue ‘dehousing’ strike. I’ve similarly eyeballed the attack on London, assuming here that the eight one-megaton warheads would be dual-targeted on four DGZs.

    - I’ve not made any allowance for devices initiating over other than their programmed DGZ. This means, in effect, that two or three devices are ‘wasted’ against some targets, which could in fact be more profitably used elsewhere. This is especially the case, of course, for the bomber-carried devices, as these can more readily be retargeted.

    - Where the yield of devices is specified as a range, I’ve used the simple arithmetic mean of the maximum and minimum. This means a few unusually sized weapons are used.

    - I’ve treated all devices as airbursts, because of the limitations of NUKEMAP. This isn’t meant as a criticism; it’s far and away the best tool of its kind that I’ve seen, and there’s obviously a tradeoff between usability and flexibility. In any case, some 140 devices directed against 70 targets (bunkers, dockyards and airfields) ought to be ground bursts.

    - I’ve also interpreted the central government target at Cheltenham to mean the BURLINGTON bunker at Corsham, rather than GCHQ as Hennessy does. Both would be viable targets, but GCHQ is out of keeping with the rest of the list, whereas BURLINGTON was thought highly likely to be compromised and it’s unlikely that RSGs would be hit and the Government bunker ignored. []