Posts Tagged ‘Declassification’


General Groves’ secret history

Friday, September 5th, 2014

The first history of the Manhattan Project that was ever published was the famous Smyth Report, which was made public just three days after the bombing of Nagasaki. But the heavily-redacted Smyth Report understandably left a lot out, even if it did give a good general overview of the work that had been done to make the bomb. Deep within the secret files of the Manhattan Project, though, was another, classified history of the atomic bomb. This was General Leslie Groves’ Manhattan District History. This wasn’t a history that Groves ever intended to publish — it was an internal record-keeping system for someone who knew that over the course of his life, he (and others) would need to be able to occasionally look up information about the decisions made during the making of the atomic bomb, and that wading through the thousands of miscellaneous papers associated with the project wouldn’t cut it.

Cover of the Manhattan District History, Book 2, Volume 5.

Groves’ concern with documentation warms this historian’s heart, but it’s worth noting that he wasn’t making this for posterity. Groves repeatedly emphasized both during the project and afterwards that he was afraid of being challenged after the fact. With the great secrecy of the Manhattan Project, and its “black” budget, high priority rating, and lack of tolerance for any external interference, came a great responsibility. Groves knew that he had made enemies and was doing controversial things. There was a chance, even if everything worked correctly (and heaven help him if it didn’t!), that all of his actions would land him in front of Congress, repeatedly testifying about whether he made bad decisions, abused public trust, and wasted money. And if he was asked, years later, about the work of one part of the project, how would he know how to answer? Better to have a record of decisions put into one place, should he need to look it up later, and before all of the scientists scattered to the winds in the postwar. He might also have been thinking about the memoir he would someday write: his 1962 book, Now It Can Be Told, clearly leans heavily on his secret history in some places.

Groves didn’t write the thing himself, of course. Despite his reputation for micromanagement, he had his limits. Instead, the overall project was managed by an editor, Gavin Hadden, a civilian employee of the Army Corps of Engineers. Individual chapters and sections were written by people who had worked in the various divisions in question. Unlike the Smyth Report, the history chapters were not necessarily written near-contemporaneously with the work — most of the writing appears to have been started after the war ended, and some parts appear not to have been finished until 1948 or so.

General Groves, not amused.

In early August 1945 — before the bombs had been dropped — a guide outlining the precise goals and form of the history was finalized. It explained that:

The purpose of the history is to serve as a source of historical information for War Department officials and other authorized individuals. Accordingly, the viewpoint of the writer should be that of General Groves and the reader should be considered as a layman without any specialized knowledge of the subject who may be critical of the Department or the project.

Which is remarkably blunt: write as if Groves himself were saying these things (because someday he might!), and write as if the reader is someone looking for something to criticize. Later, the guide gives some specific examples of how to spin problematic things, like the chafing effect of secrecy:

For example, the rigid security restrictions of the project in many cases necessitated the adoption of unusual measures in the attainment of a local objective but the maintenance of security has been recognized throughout as an absolute necessity. Consequently, instead of a statement such as, “This work was impeded by the rigid security regulations of the District,” a statement such as, “The necessity of guarding the security of the project required that operations be carried on in — etc.” would be more accurate.1

This was the history that Groves grabbed whenever he did get hauled in front of Congress in the postwar (which happened less than he had feared, but it still happened). This was the history that the Atomic Energy Commission relied upon whenever it needed to find out what its predecessor agencies had done. It was a useful document to have around, because it contains all manner of statistics, technical details, legal details, and references to other documents in the archive.

"Dante's Inferno: A Pocket Mural" by Louis C. Anderson, a rather wonderful and odd drawing of the Calutron process. From Manhattan District History, Book 5, "Electromagnetic Project," Volume 6.

“Dante’s Inferno: A Pocket Mural” by Louis C. Anderson, a rather wonderful and odd drawing of the Calutron process. From Manhattan District History, Book 5, “Electromagnetic Project,” Volume 6.

The Manhattan District History first became partially available to the general public in 1977, when a partial version was released on microfilm through the National Archives and University Publications of America as Manhattan Project: Official History and Documents. The Center for Research Libraries has a digital version that you can download if you are part of a university affiliated with them (though the scan quality sometimes renders it unreadable), and I’ve had a digital copy for a long time now as a result.2 The 1977 microfilm version was missing several important volumes, however, including the entire book on the gaseous diffusion project, a volume on the acquisition of uranium ore, and many technical volumes and chapters about the work done at Los Alamos. All of these were listed as “Restricted” in the guide that accompanied the 1977 version.3

I was talking with Bill Burr of the National Security Archive sometime in early 2013 and it occurred to me that it might be possible to file a Freedom of Information Act request for the rest of these volumes, and that this might be something that his archive would want to do. I helped him put together a request for the missing volumes, which he filed. The Department of Energy got back pretty promptly, telling Bill that they were already beginning to declassify these chapters and would eventually put them online.


Manhattan Project uranium production flow diagram, from Manhattan District History, Book 7, “Feed materials.”

The DOE started to release them in chunks in the summer of 2013, and got the last files up this past summer. You can download each of the chapters individually on their website, but the file names are such that they won’t automatically sort in a sensible way in your file system, and they are not full-text searchable. The newly-released files have their issues — a healthy dose of redaction (and one wonders how valuable that still is, all these years — and proliferations — later), and some of the images have been run through a processor that has made them extremely muddy, to the point of illegibility (lots of JPEG artifacts). But don’t get me started on that. (The number of corrupted PDFs on the NNSA’s FOIA website is pretty ridiculous for an agency that manages nuclear weapons.) Still, it’s much better than the microfilm, if only because it is rapidly accessible.

But you don’t need to do that. I’ve downloaded them all, run them through an OCR program so they are searchable, and given them sortable filenames. Why? Because I want people — you — to be able to use these (and I do not trust the government to keep this kind of thing online). They’ve still got loads of deletions, especially in the Los Alamos and diffusion sections, and the pro-Groves bent to things is so heavy-handed it’s hilarious at times. And they are not all necessarily accurate, of course. I have found versions of chapters that were heavily marked up by someone who was close to the matter, who thought there were lots of errors. In the volumes I’ve gone over most closely in my own research (e.g. the “Patents” volume), I definitely found some places where I thought they got it a little wrong. But all of this aside, they are incredibly valuable, important volumes nonetheless, and I keep finding all sorts of unexpected gems in them.
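If you want to do the same kind of thing yourself, here is a minimal sketch of the sort of batch processing involved, assuming the open-source ocrmypdf tool is installed (the directory names and the filename pattern below are placeholders of my own, not the DOE's actual naming scheme):

```python
# A rough sketch: add a searchable text layer to each PDF with OCR, and give
# the output a zero-padded filename so the volumes sort in order.
# Assumes the ocrmypdf command-line tool is installed; directory names
# here are hypothetical placeholders.
import re
import subprocess
from pathlib import Path

SRC = Path("mdh_raw")      # PDFs as downloaded, with unsortable names
DST = Path("mdh_sorted")   # OCR'd copies with sortable names
DST.mkdir(exist_ok=True)

for i, pdf in enumerate(sorted(SRC.glob("*.pdf")), start=1):
    # Zero-pad an index so that "Book 10" doesn't sort before "Book 2".
    clean = re.sub(r"\W+", "_", pdf.stem).strip("_")
    out = DST / f"{i:02d}_{clean}.pdf"
    # --skip-text leaves any page that already has a text layer alone.
    subprocess.run(["ocrmypdf", "--skip-text", str(pdf), str(out)], check=True)
```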

You can download all of the 79 PDF files in one big ZIP archive on Archive.org. WARNING: the ZIP file is 760MB or so. You can also download the individual files below, if you don’t want them all at once.


Statistics on the ages of Los Alamos employees, May 1945, from the young spy, Ted Hall (19), to the old master, Niels Bohr (59). From Manhattan District History, Book 8.

What kinds of gems are hidden in these files? Among other things:

And a lot more. As you can see, I’ve drawn on this history before for blog and Twitter posts — I look through it all the time, because it offers such an interesting view into the Manhattan Project, and one that cuts through a lot of our standard narratives about how it worked. There are books and books’ worth of fodder in here, spread among some tens of thousands of pages. Who knows what might be hidden in there? Let’s shake things up a bit, and find something strange.


Below is the full file listing, with links to my OCR’d copies, hosted on Archive.org. Again, you can download all of them in one big ZIP file by clicking here (760 MB), or pick them individually from below.

Notes
  1. E.H. Marsden, “Manhattan District History Preparation Guide,” (1 August 1945), copy in the Nuclear Testing Archive, Las Vegas, Nevada, accession number NV0727839.
  2. In fact, I used portions of it — gasp! — on actual microfilm very early in my grad school career, when you still had to do that sort of thing. The volume on the patenting program was extremely useful when I wrote on Manhattan Project patent policies.
  3. Some of the Los Alamos chapters were later published in redacted form as Project Y: The Los Alamos Story, in 1983.

The Problem of Redaction

Friday, April 12th, 2013

Redaction is one of those practices we take for granted, but it is actually pretty strange if you think about it. I mean, who would imagine that the state would say, “well, all of this is totally safe for public consumption, except for a part right here, which is too awful to be legally visible, so I’ll just blot out that part. Maybe I’ll do it in black, maybe in white, maybe I’ll add DELETED in big bold letters, just so I know that you saw that I deleted it.”


From Hans Bethe’s “Memorandum on the History of the Thermonuclear Program” (1952), which features some really provocative DELETED stamps. A minimally-redacted version assembled from many differently redacted copies by Chuck Hansen is available here.

From a security perspective, it’s actually rather generous. The redactor is often giving us the context of the secret, the length of the material kept from us (a word? a sentence? a paragraph? a page?), and helpfully drawing our eye to the parts of the document that still contain juicy bits. The Onion’s spoof from a few years back, “CIA Realizes It’s Been Using Black Highlighters All These Years,” is only slightly off from the real truth. Blacking something out is only a step away from highlighting its importance, and the void makes us curious. In fact, learning what was actually in there can be quite anticlimactic, just as learning how a magician does their trick (“the guy in the audience is in on the trick”).

And, of course, the way the US declassification system is set up virtually guarantees that multiple, differently-redacted copies of documents will eventually exist. Carbon copies of the same documents exist in multiple agencies; each agency can be separately petitioned for copies of its files; each sends the request to its own reviewers; and each reviewer consults the guides and tries to interpret them. There’s very little centralization, and lots of individual discretion in interpreting the guides.

The National Security Archive recently posted an Electronic Briefing Book that was very critical of this approach. In their case, they pointed out that a given paragraph in a once-secret document that was deemed by the redactor to be completely safe in 2001 was in 2003 deemed secret again, and then, in 2007, reaffirmed safe, and then, in 2012, again secret. “There often seems little logic to redaction decisions, which depend on the whim of the individual reviewer, with no appreciation of either the passage of time or the interests of history and accountability,” writes Michael Dobbs.

This sort of thing happens all the time, of course. In the National Security Archive’s Chuck Hansen papers there are bundles of little stapled “books” he created out of multiple, differently-redacted copies of the same document. They are a fun thing to browse through, viewing four different versions of the same page, each somewhat differently hacked up.


A page from a 1951 meeting transcript of the General Advisory Committee, from the Hansen files. Animated to show how he stapled three different copies together. Some documents contain five or more separate versions of each page. For a closer inspection of the page, click here.

In the case of Hansen’s papers, these differences came about because he was filing Freedom of Information Act requests (or looking at the results of others’ requests) over extended periods of time to different agencies. The passage of time is important, because guides change in the meantime (usually towards making things less secret; “reclassification” is tricky). And the multiple sites mean you are getting completely different redactors looking at it, often with different priorities or expertise.

Two different redactors, working with the exact same guides, can come up with very different interpretations. This is arguably inherent to any kind of classifying system, not just one for security classifications. (Taxonomy is a vicious profession.) The guides that I have seen (all historical ones, of course) are basically lists of statements and classifications. Sometimes the statements are very precise and technical, referencing specific facts or numbers. Sometimes they are incredibly broad, referencing entire fields of study. And they vary in kind as well: sometimes they cover specific technical facts, sometimes broad programmatic facts, sometimes just information about meetings that have been held. There aren’t any items that, from a distance, resemble flies, but it’s not too far off from Borges’ mythical encyclopedia.

The statements try to be clear, but if you imagine applying them to a real-life document, you can see where lots of individual discretion would come into the picture. Is fact X implied by sentence Y? Is it derivable, if paired with sentence Z? And so on. And there’s a deeper problem, too: if two redactors identify the same fact as being classified, how much of the surrounding context do they also snip out with it? Even a stray preposition can give away information, like whether the classified word is singular or plural. What starts as an apparently straightforward exercise in cutting out secrets quickly becomes a strange deconstructionist enterprise.

One of my favorite examples of differently redacted documents came to me through two Freedom of Information Act requests to the same agency at about the same time. Basically, two different people (I presume) at the Department of Energy looked at this document from 1970, and this was the result:

1970 AEC declassification guide redactions

In one, the top excerpt is deemed declassified and the bottom classified. In the other, the reverse. Put them together, and you have it all.  (While I’m at it, I’ll also just add that a lot of classified technical data looks more or less like the above: completely opaque if you aren’t a specialist. That doesn’t mean it isn’t important to somebody, of course. It is one of the reasons I am resistant to any calls for “common sense” classification, because I think we are well beyond the “common” here.) In this case, the irony is double, because what they’re de/classifying are excerpts from classification guides… very meta, no?1

What’s going on here? Did the redactors really interpret their guidelines in exactly the opposite ways? Or are both of these borderline cases where discretion was required? Or was it just an accident? Any of these could be plausible explanations, though I suspect they are each borderline cases and their juxtaposition is just a coincidence. I don’t actually see this as a symptom of dysfunction, though. I see it as a natural result of the kind of declassification system we have. It’s the function, not the dysfunction — it’s just that the function is set up to have these kinds of results.

The idea that you can slot all knowledge into neat little categories that perfectly overlap with our security concerns is already a problematic one, as Peter Galison has argued. Galison’s argument is that security classification systems assume that knowledge is “atomic,” which is to say, comes in discrete bundles that can be disconnected from other knowledge (read “atomic” like “atomic theory” and not “atomic bomb”). The study of knowledge (either from first principles or historically) shows exactly the opposite — knowledge is constituted by sending out lots of little tendrils to other bits of knowledge, and knowledge of the natural world is necessarily interconnected. If you know a little bit about one thing you often know a little bit about everything similar to it.


For this archive copy of a 1947 meeting of the General Advisory Committee, all of the raw numbers were cut out with X-Acto knives. Somewhere, one hopes, is an un-mutilated version. In some cases, numbers like these were initially omitted in drawing up the original documents, and a separate sheet of numbers would be kept in a safe, to be produced only when necessary.

This is a good philosophical point, one that is arguably a lot stronger for scientific facts than many others (the number of initiators, for example, is a lot less easily connected to other facts than is, say, the chemistry of plutonium), but I would just add that layered on top of this is the practical problem of trying to get multiple human beings to agree on the implementations of these classifications. That is, the classifications are already problematic, and now you’re trying to get people to interpret them uniformly? Impossible… unless you opt for maximum conservatism and a minimum of discretion. Which isn’t what anybody is calling for.


In theory, you can read the classification history of a document from all of its messy stamps and scribblings. They aren’t just for show; they tell you what it’s been through, and how to regard it now.

Declassification can be arbitrary, or at least appear arbitrary to those of us locked outside of the process. (It is one of the symptoms of secrecy that the logic of the redactor is itself usually secret.) But to me, the real sin of our current system is the lack of resources put towards it, which makes the whole thing run slow and leads to huge backlogs. When the system is running at a swift pace, you can at least know what it is they’re holding back from you, compare it to other sources, file appeals, draw attention to it, and so on. When it takes years to start processing requests (as is the case with the National Archives, in my experience; it varies a lot by agency), much less actually declassify them, there is a real impediment to research and public knowledge. I’d rather declassification be arbitrary and fast than conservative and slow.

That individual redactors, individually interpreting the guidelines according to the standards they are told to use, come up with different results doesn’t bother me as much. There is going to be a certain amount of error in any large system, especially one that deals with borderline cases and allows individual discretion. Sometimes you win, sometimes you lose, but it’s being able to play the game in the first place that matters the most to me.

Notes
  1. The document is a discussion of instances in which classification guidelines are based on strict numerical limits, as opposed to general concepts. Citation: Murray L. Nash to Theos Thomson (3 November 1970), “AEC Classification Guidance Based on Numerical Limits,” part of SECY-625, Department of Energy Archives, RG 326, Collection 6 Secretariat, Box 7832, Folder 6, “O&M 7 Laser Classification Panel.” The top excerpt was received in response to a FOIA request I made in 2008, the bottom in response to another one in 2010. Both were part of FOIA requests relating to declassification decisions on inertial confinement fusion; the memo in question was part of information given to a panel of scientists regarding creating new fusion classification guidelines.

Bethe on SUNSHINE and Fallout (1954)

Wednesday, June 27th, 2012

Project SUNSHINE definitely takes the prize for “most intentionally-misleading title of a government program.” The goal of SUNSHINE (co-sponsored by the Atomic Energy Commission and RAND) was to figure out what the impact of radioactive fallout from American nuclear testing was on the world population. The initial study was started in 1953, and involved checking biological material for the radioactive fission product Strontium-90, with an attempt to correlate Sr-90 levels with various known nuclear test series. Not exactly what you think of when you hear the term “sunshine,” eh?

It actually gets much creepier than just the confusing name. The “biological material” they were studying was, well, dead organic matter. What kind of organic matter, specifically? The dataset for a “pre-pilot” study on Strontium-90 intake was a real witches’ brew:

  • “Wisconsin cheese (1 month old)”
  • “clam shells (Long Island)”
  • “Wisconsin cat bone”
  • “Montana cat (6 months, fed on milk from free-range cows)”
  • “stillborn, full term baby (Chicago)”
  • “rib from a Harvard man” 

Pardon me while I count my ribs… and cats… and… well… yuck. You can’t make this stuff up. Well, I can’t, anyway. Here’s your creepy meeting transcript of the week, from the planning of SUNSHINE: “Dr. Libby commented on the difficulty of obtaining human samples, and suggested that stillborn babies, which are often turned over to the physician for disposal, might be a practical source.”1

As an aside to an aside, in the full study, they did use samples from corpses — corpses of children seemed of particular interest — in getting their data. It’s a bit gory to read through their data sets as they describe the Sr-90 they found in the ribs or vertebrae of the dead. The US scientist Shields Warren seemed to have quite a lot of access to the bones of young children through the Cancer Research Institute in Boston, Massachusetts. Not a job I’d envy.2

Anyway — the document I wanted to share had nothing to do with the sample sources, but I got a little distracted while poking around in the SUNSHINE literature, and couldn’t not pass that on.

Hans Bethe and W.F. Libby

The letter in question comes from December 1954, after SUNSHINE had been completed. It’s a request from the well-coifed Hans Bethe to the aforementioned Willard F. Libby, the physical chemist best known as the inventor of radiocarbon dating (for which he would win a Nobel Prize in 1960), and in 1954 one of the five Commissioners of the AEC.3 In the letter, Bethe is arguing in favor of SUNSHINE’s declassification — and his justifications are not necessarily what you might expect.4

Click to view PDF (yes, it’s in color!)

Bethe started out by noting that even in the summer of 1953, when SUNSHINE was being finished up, they (it seems that Bethe and Libby were both there) thought that it would “be highly desirable to declassify a large part of project SUNSHINE.” Bethe thought the matter had gotten rather urgent:

I still feel the same way about this, and I think the arguments for declassification have become far stronger than they were in 1953. There is real unrest both in this country and abroad concerning the long-range as well as short-range radioactivity, and it would, in my opinion, greatly allay the fears of the public if the truth were published.

There’s the kicker: Bethe was convinced that SUNSHINE would show that fallout from testing wasn’t as big a problem as people thought it was. Releasing SUNSHINE wouldn’t be a matter of propaganda (and holding it back wasn’t a matter of covering it up), in Bethe’s mind — it would simply be getting the facts out.

And why might people suddenly be getting concerned about nuclear fallout?

Map showing points (X) where contaminated fish were caught or where the sea was found to be excessively radioactive, following the Castle Bravo nuclear test.

No doubt because of all of the attention that the Castle BRAVO nuclear test had gotten with respect to the high amounts of fallout that found their way into all sorts of biological systems far from the test site — like the radioactive tuna that was caught for weeks afterwards in the waters off Japan.

Bethe understood, though, that the classification reasons holding back the publication of SUNSHINE were non-trivial. SUNSHINE studied the deposition of fission products following testing, and to make much sense of that, you had to know the fission yields from the tests. If you knew the fission yields, you’d know quite a lot about American nuclear weapons — especially if you knew the fission yield of the Ivy MIKE test, the first H-bomb.

Why? Because knowing the fission component of the first H-bomb test would possibly give away all sorts of information about the Teller-Ulam design. Multi-stage H-bombs have a reasonably large fission trigger that ignites the fusion fuel, which in turn induces more fission in a “natural” uranium tamper. In the case of MIKE, 77% of the total 10.4 megaton yield came from the final fission stage. Knowing that would be a good hint as to the composition of American H-bombs, and was not something they wanted to share with the USSR.

But Bethe thought you could get around this:

I believe the story of SUNSHINE could be published without giving away any information about our H-bombs: it is merely necessary to put the permissible accumulated yield in terms of fission yield rather than total yield.

In other words, if you just talked of fission yield — and didn’t give the total yield — you wouldn’t be able to figure out how much of the yield was not fission, and thus the high disparity (which would be a big red flag for a weapons designer) would be hidden.
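To make the arithmetic of Bethe's suggestion concrete, here is a worked version using only the MIKE figures quoted above (the notation is mine, purely for illustration):

```latex
% Using the MIKE figures quoted above: total yield 10.4 Mt, of which 77% was fission.
\[
  Y_{\text{fission}} \approx 0.77 \times 10.4\ \text{Mt} \approx 8.0\ \text{Mt}
\]
% Publishing Y_fission alone gives no way to reconstruct the ratio
% Y_fission / Y_total (about 0.77 here), and it is that large fission
% fraction that would tip off a weapons designer to the design.
```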

Bethe also thought that they should publish the fallout data from the H-bomb tests (likely including those from the CASTLE series). He didn’t think that information would give away any design secrets, but it was clear that others were suspicious. So Bethe put the question to a test: he asked Philip Morrison to try and figure out how an H-bomb worked from just the published stories about the Castle BRAVO fallout accident.

A youngish Philip Morrison, courtesy of the Emilio Segrè Visual Archives.

Morrison at that point had no access to classified information. He had been part of the Manhattan Project, and so knew quite a bit about fission weapons, but had been cut out of the classified world by the time the H-bomb had come along. (More on Morrison’s security clearance another time — lots of interesting stories there.)

Morrison’s conclusions (oddly titled “FISSION ENERGY IN IVY,” even though it was about BRAVO) are attached to Bethe’s letter. In many ways it is an analysis typical of a somewhat cocky physicist: things are described as “easy,” conclusions follow “clearly,” and everything is stated as if it were pretty obvious and pretty straightforward. Morrison concludes that the total fission yield of BRAVO (again, misidentified as IVY) was between 0.2 Mt and 0.6 Mt, and that most of the fission must have been from the fission primary that started the reactions. In reality, 10 Mt of the 15 Mt total yield was from fission (so Morrison was off by a factor of somewhere between fifteen and fifty), which is why it was such a “dirty” shot.

Bethe took this as evidence that indeed, looking at just the fallout alone, you couldn’t figure out how much of the explosion was from fission yield, and thus the design information was safe: “As Morrison’s report shows, it seems to be easy to draw entirely wrong conclusions from the fall-out data.”

Why Morrison got this wrong is a little mysterious to me. Ralph Lapp had managed to conclude, more or less correctly, that there was a third “dirty” fission stage, and had popularized the idea enough that it trickled into Life magazine in December 1955. But Bethe thought Morrison’s analysis was more or less sound, given his lack of detailed information. It’s a weird thing to conclude, based on one study, that some piece of information is fundamentally unknowable, when you already know what the piece of information is.

Life magazine, 1955: not quite right, not entirely wrong.

Speaking of speculating based on missing information, part of Bethe’s letter is redacted, for reasons I do not know. His conclusion makes it pretty clear it has to do with this absolute vs. fission yield/fallout issue, though.

Bethe concludes: “I believe it would greatly improve international feeling about our Pacific tests if we were to publish the correct story of SUNSHINE and of fall-out.”

Libby would come around to Bethe’s position and push for declassification. In Libby’s mind, like Bethe’s, SUNSHINE showed that the world wasn’t going to become mutated just because of a little testing in the Pacific. Furthermore, he came to believe that you could head off a lot of the demands to end nuclear testing just by showing people that you were paying close attention to this sort of thing — by the time of Operation Redwing (1956), he felt that this sort of disclosure had already made the international community more friendly to US testing.

It wasn’t until 1956 that the declassification occurred, however, and even then, a lot of things were removed. (The “Amended*” in the RAND report cover page above is because it was “Amended to remove classified data; otherwise the report remains unchanged and represents the 1953 estimate of the fallout problem.”) Of course, by that point it was clear that the Soviets had already figured out how to make an H-bomb work.


Also! I will be giving a talk this Friday at the annual meeting of the Society for Historians of American Foreign Relations (SHAFR) in Hartford, CT. Just putting that out there.

Notes
  1. Minutes of the 36th Meeting of the General Advisory Committee to the U.S. Atomic Energy Commission (17, 18, and 19 August 1953), copy in the OHP Marshall Islands Document Collection.
  2. E.g., E.A. Martell, “Strontium-90 Concentration Data for Biological Materials, Soils, Waters and Air Filters,” Project Sunshine Bulletin No. 12, [AECU-3297(Rev.)], (1 August 1956); human bone data listings start on page 29.
  3. Libby was also the husband of Leona Woods Marshall, which I didn’t realize. She was the only woman who had a role in the development of CP-1, the first nuclear reactor, and stands out quite conspicuously in the Met Lab photographs.
  4. Citation: Hans Bethe to W.F. Libby (17 December 1954), copy in Nuclear Testing Archive, Las Vegas, NV, document NV0032161.

Declassifying ARGUS (1959)

Wednesday, May 23rd, 2012

One of the strangest — and perhaps most dangerous — nuclear tests ever conducted was Operation ARGUS, in late 1958.

The basic idea behind the tests was proposed by the Greek physicist Nicholas Christofilos, then at Livermore. If you shot a nuclear warhead off in the upper atmosphere, Christofilos argued, it would create an artificial radiation field similar to the Van Allen radiation belts that surround the planet. In essence, it would create a “shell” of electrons around the planet.

Frame from a government film showing the electron shell going around the planet.

The tactical advantage to such a test is that you could hypothetically use this knowledge to knock out incoming enemy missiles and satellites. So they gave it a test, and indeed, it worked! (With some difficulty; it involved shooting nuclear weapons very high into the atmosphere on high-altitude rockets, launched from a boat in the middle of the rough South Atlantic Ocean. One wonders what happened to the warheads on them. They also had some difficulty positioning the rockets. The video linked to discusses this around the 33-minute point. Also, around the 19-minute mark is footage of various Navy equator-crossing hazing rituals, with pirate garb!)

It created artificial belts of electrons that surrounded the planet for weeks. Sound nutty yet? No? Well, just hold on — we’ll get there.

(Aside: Christofilos is an interesting guy; he had worked as an elevator repairman during World War II, studying particle physics in his spare time. He independently came up with the idea for the synchrotron and eventually was noticed by physicists in the United States. He later came up with a very clever way to allow communication with submarines submerged deep underwater, which was implemented in the late 20th century.)

James Van Allen kissing Explorer IV (a satellite used in Argus) good-bye

In early 1959 — not long after the test itself — none other than James Van Allen (of the aforementioned Van Allen radiation belts) argued that the United States should rapidly declassify and release information on the Argus experiment.1

Click for the PDF.

Van Allen wanted it declassified because he was a big fan of the test, and thought the US would benefit from the world knowing about it:

As you will note, my views are (a) that continued security classification of the Argus-Hardtack tests is of little practical avail, (b) that a prompt and full public report of the tests and observations will contribute greatly to the international prestige of the United States as a leader in the application of atomic devices to scientific purposes, and (c) that if we fail to do (b) the U.S. will quite likely be again ‘Sputniked’ in the eyes of the world by the Soviets.

Basically, Van Allen argued that the idea of doing an Argus-type experiment was widely known, even amongst uncleared scientists, and that the Soviets could pull off the same test themselves and get all the glory.

But here’s the line that makes me cringe: “The U.S. tests, already carried out successfully, undoubtedly constitute the greatest geophysical experiment ever conducted by man.” 

This was an experiment that affected the entire planet — “the greatest geophysical experiment ever conducted by man” — and it was approved, vetted, and conducted under a heavy, heavy veil of secrecy. What if the predictions had been wrong? It’s not an impossibility that such a thing could have been the case: the physics of nuclear weapons are in a different energy regime than most other terrestrial science, and as a result there have been some colossal miscalculations that were only revealed after the bombs had gone off and, oh, contaminated huge swathes of the planet, or, say, accidentally knocked out satellite and radio communications. (The latter incident linked to, Starfish Prime, was a very similar test that did cause a lot of accidental damage.)

There’s some irony in that the greatest praise, in this case, is a sign of how spooky the test was. At least to me, anyway.

This is the same sort of creepy feeling I get when I read about geoengineering, those attempts to purposefully use technology to affect things at the global scale, now in vogue again as a last-ditch attempt to ameliorate the effects of climate change. It’s not just the hubris — though, as an historian, that’s something that’s easy to see as an issue, given that unintended consequences are rife even with technologies that don’t purposefully try to remake the entire planet. It’s also the matter of scale. Something happens when you go from small-scale knowledge (produced in the necessarily artificial conditions that laboratory science requires) to large-scale applications. Unpredicted effects and consequences show up with a vengeance, and you get a rapid education in how many collective and chaotic effects you do not really understand. It gives me the willies to ramp things up to new scales and new energy regimes without the possibility of doing intermediate stages.

(Interestingly, my connection between Argus and geoengineering has been the subject of at least one talk by James R. Fleming, a space historian at Colby College, who apparently argued that Van Allen later regretted disrupting the Earth’s natural magnetosphere. Fleming has a paper on this in the Annals of Iowa, but I haven’t yet tracked down a copy.)

As for Argus’s declassification: while the Department of Defense was in the process of declassifying Argus, per Van Allen’s recommendations, they got a call from the New York Times saying that the paper was about to publish on it. (The Times claimed to have known about Argus well before the tests took place.) It’s not clear who leaked it, but leaked it was. The DOD decided that they should declassify as much as they could and send it out to coincide with the story, and the news of Argus hit the front pages in March 1959.

Notes
  1. Citation: James Van Allen to James R. Killian (21 February 1959), copy in the Nuclear Testing Archive, Las Vegas, NV, document NV0309054.

Declassifying the Ivy Mike film (1953)

Wednesday, February 8th, 2012

Every good nuclear wonk has seen the delightfully over-the-top film that the government made about the Operation Ivy test in 1952. If you’ve seen any films involving nuclear test footage, you’ve probably seen parts of it, even if you didn’t recognize it as such. It probably ranks second among the all-time-most-viewed nuclear weapons films.1

Ivy Mike was, of course, the first test of a staged, multi-megaton thermonuclear weapon: the first hydrogen bomb. With an explosive yield of 10.4 million tons of TNT, it was a grim explication of how tremendously destructive nuclear arms could be. Even Congressmen had difficulty making sense of its power.

A 17-minute version (down from 28 minutes, which is already down from the hour-plus version now available from Archive.org, embedded above) of the Operation Ivy film was released for American citizens on April 1, 1954. The domestic and international reactions were immediate. The Soviet Union warned its people that these weapons could destroy “the fruits of a thousand years of human toil”; Premier Nehru of India called for the US and USSR to cease all hydrogen bomb tests. It was replayed two days later in the United Kingdom with an estimated 8 million viewers, even though supposedly the film was not meant to be distributed overseas, to avoid inflaming international opinion against nuclear testing.

The New York Times’ television critic, Jack Gould, reviewed it negatively: “A turning point in history was treated like another installment of ‘Racket Squad.'”2 The problem, Gould explained, was that it used “theatrical tricks” to talk down to the audience. Now the irony here is that the Operation Ivy film wasn’t made for a television audience. It was made for the President of the United States and top military brass and folks like that. Which makes the “talking down” even more disturbing, no?

This week’s document concerns the internal deliberations by the Atomic Energy Commission regarding the declassification and sanitizing of the Operation Ivy film. This report, AEC 483/47, outlines the opinion of the AEC directors of Classification and the Information Service about whether the film could and should be declassified.3

Click the image for the full PDF.

This isn’t the story of how it ends up on American television, but it is moving in that direction. The document goes over a proposal to release an edited (sanitized) version of the film for usage at a Conference of Mayors that President Eisenhower had assembled. The goal was to convince the mayors that Civil Defense was important: you’d better act now, before your city gets nuked.

The problem: the AEC didn’t really want to release the precise yield of the Mike shot. That’s a hard thing to hide when you’re obliterating an island with it. They also weren’t keen on releasing the fact that this wasn’t a deliverable weapon yet, but they couldn’t see a way of getting around that without seriously cutting it down to nothing. But at least they managed to cut out everything about its design, and the Ivy King shot (the largest pure-fission explosion, at half a megaton).


Notes
  1. First place likely goes to the Crossroads Baker test, which aside from being used everywhere is featured very prominently, repeatedly, at the end of Dr. Strangelove.
  2. Note that the Operation Ivy narrator was Reed Hadley, from the aforementioned “Racket Squad.”
  3. Citation: Report by the Directors of Classification and Information Service regarding the Film on Operation Ivy (AEC 483/47), (8 December 1953), copy in Nuclear Testing Archive, Las Vegas, NV, document #NV0074012.