Posts Tagged ‘Declassification’


Oppenheimer, Unredacted: Part I – Finding the Lost Transcripts

Friday, January 9th, 2015

I wrote this piece up several months ago, and was thinking about what to do with it, where to try to publish it, and so forth. Eventually I came to the conclusion that it would require a whole lot of cutting for anyone to take it up, especially as the "news" aspect of it slipped away. So I've decided to publish it on the blog, in two parts. Click here for Part II.

Oppenheimer photo courtesy of the Emilio Segrè Visual Archives; photo of the hearing transcript by Alex Wellerstein.

Last October, the US Department of Energy released the full, un-redacted, uncensored transcripts from J. Robert Oppenheimer’s 1954 security board hearing. Oppenheimer, the “father of the atomic bomb,” had his security clearance revoked in late 1953 after accusations were made that he had been a Communist spy. He appealed the revocation, and set into motion the trial of his life. Over the course of four weeks, the details of Oppenheimer’s actions, allegiances, opinions, and personal failures were rehashed and scrutinized under the pretense of evaluating his “character, associations, and loyalty.” At issue was whether Oppenheimer could have continued access to atomic secrets. The government’s judgment was negative, effectively excluding Oppenheimer from any further government service. The transcripts were published shortly thereafter, but with considerable deletions made in the name of security. Does the unmasking of 60-year-old secrets change our understanding of Oppenheimer and his hearing? And why did it take until now for them to be released?

The Oppenheimer security hearing took place behind closed doors, in a temporary building on the National Mall. But the world soon learned of its contents when the transcripts were published by the US Government Printing Office (GPO). This was rather remarkable: normally the contents of a security board review were considered confidential information, for fairly obvious reasons relating to both privacy and national security. Each of the forty witnesses called to testify (including Oppenheimer himself) was told that what he or she said would be treated as “strictly confidential,” and that the Atomic Energy Commission (AEC) would “take no initiative in the public release of any information relating to these proceedings.” And yet, within days of the conclusion of the hearing, the transcripts had become part of the public record.

The circumstances behind the publication of the hearings were unusual. Shortly after the Oppenheimer hearing concluded, the Atomic Energy Commission thought they had lost a copy of the transcripts. Assuming they would be leaked to the press anyway, they decided to preemptively publish them. But just before publication, the lost copy was located, yet they decided to publish them anyway. The real reason for the publication was that the primary antagonist of the Oppenheimer affair — AEC Chairman Lewis Strauss — thought that Oppenheimer was getting too much public sympathy. In his mind, if the public could actually see what the decision to deny Oppenheimer’s clearance had been based on, they would see Oppenheimer as the unreliable villain that Strauss felt he was. Strauss’ view was, in retrospect, shortsighted. Almost nobody has read the entire hearing (it is nearly 1,000 pages of dense Government Printing Office typeface, often with no indication of who is answering questions at any given time, and is very repetitive), but the overall tone of the thing is that of an inquisitional character assassination.

First page of the Government Printing Office edition of the Oppenheimer security hearing transcript, which was published soon after the final decision had been made.

Some of the antagonism was inherent to the nature of this sort of hearing. Oppenheimer wasn’t on trial for anything he had specifically done; rather, it was his “character” that was being explicitly evaluated. But some of it was because of dirty dealing on the part of the Atomic Energy Commission: they were treating it as a prosecutorial trial, except that Oppenheimer was not given any of the legal protections that normally exist in actual criminal prosecutions, such as the presumption of innocence or even prior knowledge of which witnesses would be called. Even worse, as the historian Priscilla McMillan documented in her 2005 book, The Ruin of J. Robert Oppenheimer, the FBI was wiretapping conversations between Oppenheimer and his counsel and giving them, illegally, to the prosecutor. It is not a coincidence that the overall impression one gets from the hearing transcript is that Oppenheimer was set up.

Oppenheimer himself was not even entitled to a full, unexpurgated copy of the hearing transcript for his own personal use during the hearing. This is perhaps one of the most curious aspects of the hearing and its subsequent release and recent declassification. The hearing was a consequence of Oppenheimer’s appealing the suspending of his security clearance. Because his security clearance was suspended, including during the hearing itself, he was not allowed to have access to classified information — even classified information that he was involved in producing. So while a stenographer recorded every utterance said during the hearing itself, a censored copy had to be made daily for use by Oppenheimer. That Oppenheimer was in the room when most of these “classified” utterances occurred made no difference. There were a few “classified sessions” where Oppenheimer was not present, but otherwise he was present throughout the hearing sessions — including the one that took up the entirety of his 50th birthday.

So in principle the hearings were meant to be unclassified, as the defendant, Oppenheimer, was no longer cleared to hear classified information. The information was meant to be “confidential” but not legally secret (it did not require a security clearance to hear). Given the nature of the material under discussion, which involved at times quite subtle technical disagreements over the history of the American thermonuclear program and the military’s plans for using nuclear weapons against the Soviet Union, quite a lot of classified information did end up being discussed, and thus had to be deleted from the transcripts before Oppenheimer could see them. These deletions were initially made by means of a razor blade or pen-knife, literally cut out of the transcript pages themselves. At the National Archives facility in College Park, Maryland, there is a folder that contains these little cut out secrets — the detritus of Cold War secrecy.

I.I. Rabi denouncing the suspension of Oppenheimer's clearance, as seen in the GPO version of the hearing transcripts. "We have an A-bomb and a whole series of it, * * * and what more do you want, mermaids?" The asterisks indicate a removal by the unnamed and unseen censor.

When the decision was made to publish the hearings in 1954, the classification officers of the Atomic Energy Commission went over the transcripts one more time. The version released to the public contains many conspicuous deletions, indicated with a series of asterisks. For example, one supporter of Oppenheimer’s, I.I. Rabi, is famously recorded in the published transcript as saying: “We have an A-bomb and a whole series of it, * * * and what more do you want, mermaids?” Rabi was expressing exasperation at the persecution of Oppenheimer. The asterisks here denote something that did not survive the censor’s blade — whether the removal was minor or major could not be known.

That the hearings contained omissions was of course noticed by commentators and later historians. What was missing from the Oppenheimer hearing transcript? Did the censors remove only technical information, or much more? Were the censors themselves biased in their operation? Were the technical omissions crucial or minor? Without access to the originals, one could never know. The problem is, nobody seemed to know where the original, unexpurgated transcript was, or whether it had even been kept.

* * *

I first started looking for the uncensored transcript in 2004. I was at a conference on the 100th anniversary of Oppenheimer’s birth, held at the University of California, Berkeley, the spring before I started graduate school. One of the speakers was Richard Polenberg, a historian who had edited an abridged version of the Oppenheimer hearings. One of Polenberg’s remarks before the conference audience was that the original, uncensored version of the transcript appeared to be lost, and he issued something of an open challenge for people to find it. As a budding historian, I was interested in such challenges. Five years later, in 2009, I was doing research at the National Archives facility (“Archives II”) in College Park, Maryland, where most of the records of US federal agencies are kept. By this point I was in the final stages of writing a dissertation on the history of nuclear secrecy in the United States, and had been on many trips to the National Archives and was used to its idiosyncrasies.

The inner storage carousels at the National Archives II facility, where most of the US federal records are kept. These stacks are off-limits to researchers. Image source.

People who have not done research in the National Archives either imagine that it is organized and user-friendly or that it is similar to the sprawling warehouse shown at the end of Raiders of the Lost Ark. The reality is somewhere in between these extremes. The archives are indeed vast and sprawling, though they are kept in neat, clean, moveable, high-volume storage shelving. Researchers are not allowed to browse the stacks (I was allowed to see them once, briefly). Instead, the researcher consults paper “Finding Aids” that are bound (sometimes haphazardly) in three-ring binders. The Finding Aids give some hints at what is kept in the stacks, but they can be a crude guide. They are often photocopies-of-photocopies of documents prepared decades previously using typewriters, with handwritten annotations.

Once one has identified something useful from the Finding Aid, one then has to cross-reference the entry with something called a “Master Location Register” (MLR), a different set of materials in a different three-ring binder. The Master Location Register tells researchers and archivists which shelves the boxes in question are supposed to be on. Having acquired that information, the researcher can then fill out a records request (“pull”) form, which has its own idiosyncratic rules for how it should be filled out. Having written out the request, the researcher then presents it to a reference librarian who scrutinizes it for adherence to a set of unwritten formal requirements. If it is judged to be formally sound, it is time-stamped and put into a bin.

The researcher, mind you, cannot simply request as many boxes as they want. There are limits to how many boxes of records will be retrieved, how many collections you can request records from at once, and how many “pull” requests you can make over the course of a day. There are five designated pull times on weekdays. The earliest is at 10:00am, the last is at 3:30pm. Any requests submitted between the hours of 11:05am and 1:30pm will not start to be processed until the 1:30pm pull time. Miss the 3:30pm pull time, and your records will not be pulled until 10:00am the next day. It can take between 20 minutes and an hour to actually see the fruits of any given records request. Sometimes the results are the records you asked for. Sometimes they are yellow slips of paper indicating that the records were not found in the place you said they should be, or that you violated a rule in filling out the request form, or that another researcher is already using the records (sometimes said researcher is a government employee working in a separate, inaccessible part of the facility).

A familiar sight to Archives II researchers: "You done messed up."

If this sounds like a complicated system where a lot can go wrong, it is, and it is unusual among American archives in its complexity. When novice researchers ask me about using the records at the College Park facility, I tell them to factor in about twice as much research time as they might take at a “normal” archive, and to expect to spend the entirety of the first day learning how the system works to the point where they can actually file research requests that get useful records as a result. Of course, for the researcher, the real work only begins once the records have arrived. The downsides of such a system are obvious. The upside for a historian, though, is that in such a maze of paper there are sometimes still treasures to be found.

I had been at the archives for a week, and had exhausted all of my pull requests, except one. Because a set of records I had wanted to see proved useless, I found myself unexpectedly with some free time. Rather than going back to my hotel, I decided to do a little “fishing.” Sometimes Finding Aids are wrong, and sometimes they are out of sync with the MLR records. Sometimes records in the MLR lack Finding Aids, making them rarely used by researchers. A trick I had found over the years was to go over the MLR very carefully and look for anomalies: things that were in one database but not another, or were mislabeled. Doing this for the records of the Atomic Energy Commission, I found an unusual entry of files relating to the Oppenheimer hearing that was labeled as classified (and thus off-limits to me), but it was housed in a part of the facility that was for unclassified or declassified records. The Finding Aid provided no useful information about it.

I thought it was worth a chance to try to request it, because it seemed like the MLR might simply contain incorrect information, and it wasn’t at all clear what these files actually were. In the worst-case scenario, the pull request would come back as invalid, or the entry might just point to one of the copious files relating to Oppenheimer that scholars had looked at dozens of times over the years.

The cover and first page of the original Oppenheimer hearing transcript. In the left photo, I am holding back a "Top Secret/Restricted Data" cover sheet. I have cropped out my declassification slug. The color photos of the transcripts are from a 2011 follow-up trip to NARA I made; the original photos I took in 2009 were grayscale (as is my usual archival practice), which is why I am illustrating this post with the 2011 photographs. Note that the transcriber, on the first page at right, got Oppenheimer's name wrong at the top — "Oppenheim" plus a handwritten "er."

When the “pull” came back, I was genuinely surprised to find that this mysterious, erroneous entry contained many of the original, un-redacted volumes of the Oppenheimer hearing. These were small blue stenography books produced by the Alderson Reporting Company for the use of the security hearing board itself, not for publication. On their covers were the stamps that characterize government documents: “ORIGINAL,” “SECRET,” “RESTRICTED DATA.” Hold your breath, open the cover: instead of asterisks denoting classified omissions, they contained the missing text, circled in the red-orange pencil of the censor. I took photographs of all of the pages with removals on them, glancing over my shoulder the whole time, not wanting to let on my excitement.

There was only one problem: not all of the volumes had been declassified. Of 25 books’ worth of material, I had 17. Enough to see that I had found something wonderful, but not enough to do anything with it — nobody cares about finding most of the original Oppenheimer transcripts. Those that were missing had in their place “Withdrawal Notices,” pieces of card stock which say, in essence, “Sorry, you can’t see this.” Fortunately, these contain notations that tell the archivists where the still-officially-secret originals are being kept in some other, more tightly-guarded part of the archive, and they can be used to aid Freedom of Information Act (FOIA) requests that compel the government to review the materials for release.

In theory, all classified documents contain the record of their classification status, and how it changes over the years, stamped on them. (This procedure, which is now common throughout the American classification system, was begun with Manhattan Project records in early 1946 at the recommendation of a committee of scientists that included Oppenheimer.) A close look at the documents and their containers revealed that the Department of Energy had transferred them to the National Archives in 1991, and that in May 1992, someone had started declassifying them. But after 17 volumes, they stopped. Why? It isn’t clear. The early 1990s were a period of classification reform and “Openness” under President Clinton’s Secretary of Energy, Hazel O’Leary, and it would be poetic if the Oppenheimer transcripts fell by the wayside because the reviewers were too busy declassifying other things (and, eventually, fighting back against the Congressional Republicans who stopped the “Openness Initiative” in its tracks). In any case, it looks like things got stopped mid-declassification, and that this was responsible for the sort of “limbo” these records ended up in — with an incorrect MLR entry and nobody quite knowing what had happened to them, until I stumbled across them 17 years later.

The stamps on a declassified document can tell you its classification history, if you know how to decode them. The cover of the Oppenheimer transcript indicates evidence of its original review in 1954 for publication, the record of it being catalogued by the AEC office of history (where it stayed until 1991, when it was transferred to NARA), and evidence of its declassification in 1992 by a DOE contractor. The "X"s through the "Restricted Data" stamp and the original 1954 note are meant to indicate that they are no longer valid (they should have drawn a line through the "SECRET" stamp, too, but this is often neglected).

When I got back from my archive trip, I immediately filed a Freedom of Information Act request for the remaining volumes. I knew this entailed a little risk of being scooped. FOIA requests are not confidential; there are people who file FOIA requests to find out what other people are filing FOIA requests on. In principle there is nothing wrong with this. Scholars have no proprietary claims on information that the government created, and once the government declassifies something it is available to everyone. But as with scientists, priority matters for historians: we like to take credit for what we find. So until I had the missing volumes, I felt I had to be fairly quiet about the entire thing, telling only a few trusted colleagues.

The speed of response to a FOIA request can vary by the material and by the agency. The FBI, in my experience, is quite fast, despite (or maybe because of) their reputation for secrecy. They manage to process even quite large files usually within six months to a year. The Department of Energy is also relatively efficient. Waiting a year or two when you are trying to finish a dissertation or a book is a long time, but one cultivates a sense of patience about these things. Unfortunately, to get records that are already in the National Archives declassified, you have to file a FOIA request with the National Archives and Records Administration (NARA), which in turn has to turn them over to the actual declassifying agency (whichever agency, or its heirs, made the information classified in the first place). As the caretaker of all government archives, NARA receives huge volumes of FOIA requests on all topics, and so has a massive backlog. So my FOIA request for the Oppenheimer hearings would have to worm its way through the NARA system before being forwarded to the Department of Energy, the institutional heir of the Atomic Energy Commission, so that the actual work of declassification review could begin.

Like Oppenheimer, one must cultivate a sense of Zen while waiting for classified documents to be reviewed. Photo source:  Emilio Segrè Visual Archives at the AIP Niels Bohr Library. (The first Oppenheimer photo at the top of this post is also from the ESVA.)

So I expected this to be a slow process. But it was much slower than I’d have guessed. For three years, NARA did nothing with my request. At regular intervals I checked in, via e-mail, on its status, and every time was told that it was simply in a very long queue. The NARA employee I corresponded with was sympathetic and friendly, but insisted that he could do nothing to improve the speed of the system. So I waited — not for them to actually declassify the records, but for them to even start processing them, so that they could be sent to the Department of Energy for the actual declassification effort.

Finally, in 2012, I was told they were “out for review,” having finally been sent to the Department of Energy. It seemed like things might finally pick up. Still, I heard almost nothing for another two years. That is, until October, when I saw that the Department of Energy had declassified the entire transcript and posted it onto their OpenNet website… without informing me. A contact of mine at the Department of Energy has assured me that they did not realize there was a FOIA request associated with these records, and my contact at the National Archives has apologized over e-mail for the way this got handled.1

As you can imagine, I was more than a little surprised that a process that made no obvious steps forward over the course of six years suddenly burst into the public eye in the most rapid way possible. In their defense, NARA seemed just as surprised as I was that the files had been posted online, which complicated their own screening process — as often happens in the Federal Government, the left hand didn't quite know what the right hand was doing. As someone who studies the history of nuclear secrecy, I have allowed myself to be amused by the way this has all transpired. To have my priority claims on finding a secret document dashed by excessive openness on the part of the government is perhaps an appropriately ironic fate, is it not? One of the key points of my (someday) forthcoming book is that revelation can be as much a problematic activity as concealment, though in this case it was a bit more personal than usual!

Part II, which contextualizes the newly revealed content and its impact on Oppenheimer's legacy, is available here.

  1. In full disclosure, I worked briefly for the Department of Energy while in graduate school — I was the Edward Teller Graduate Fellow in Science and Security Studies for 2007-2008. This gets me no advantages other than knowing who to contact in their history division when I have questions (and I have good relations with the history division), and knowing a bit about how their system works.

General Groves’ secret history

Friday, September 5th, 2014

The first history of the Manhattan Project that was ever published was the famous Smyth Report, which was made public just three days after the bombing of Nagasaki. But the heavily redacted Smyth Report understandably left a lot out, even if it did give a good general overview of the work that had been done to make the bomb. Deep within the secret files of the Manhattan Project, though, was another, classified history of the atomic bomb. This was General Leslie Groves' Manhattan District History. This wasn't a history that Groves ever intended to publish — it was an internal record-keeping system for someone who knew that over the course of his life, he (and others) would need to be able to occasionally look up information about the decisions made during the making of the atomic bomb, and that wading through the thousands of miscellaneous papers associated with the project wouldn't cut it.

Manhattan District History - Book 2 - Vol 5 - cover

Groves' concern with documentation warms this historian's heart, but it's worth noting that he wasn't making this for posterity. Groves repeatedly emphasized both during the project and afterwards that he was afraid of being challenged after the fact. With the great secrecy of the Manhattan Project, and its "black" budget, high priority rating, and its lack of tolerance for any external interference, came a great responsibility. Groves knew that he had made enemies and was doing controversial things. There was a chance, even if everything worked correctly (and heaven help him if it didn't!), that all of his actions would land him in front of Congress, repeatedly testifying about whether he made bad decisions, abused public trust, and wasted money. And if he was asked, years later, about the work of one part of the project, how would he know how to answer? Better to have a record of decisions put into one place, should he need to look it up later, and before all of the scientists scattered to the wind in the postwar. He might also have been thinking about the memoir he would someday write: his 1962 book, Now it Can Be Told, clearly leans heavily on his secret history in some places.

Groves didn't write the thing himself, of course. Despite his reputation for micromanagement, he had his limits. Instead, the overall project was managed by an editor, Gavin Hadden, a civil employee of the Army Corps of Engineers. Individual chapters and sections were written by people who had worked in the various divisions in question. Unlike the Smyth Report, the history chapters were not necessarily written near-contemporaneously with the work — most of the work appears to have been started after the war ended, and some parts appear not to have been finished until 1948 or so.

General Groves not amused

In early August 1945 — before the bombs had been dropped — a guide outlining the precise goals and form of the history was finalized. It explained that:

The purpose of the history is to serve as a source of historical information for War Department officials and other authorized individuals. Accordingly, the viewpoint of the writer should be that of General Groves and the reader should be considered as a layman without any specialized knowledge of the subject who may be critical of the Department or the project.

Which is remarkably blunt: write as if Groves himself was saying these things (because someday he might!), and write as if the reader is someone looking for something to criticize. Later the guide gives some specific examples on how to spin problematic things, like the chafing effect of secrecy:

For example, the rigid security restrictions of the project in many cases necessitated the adoption of unusual measures in the attainment of a local objective but the maintenance of security has been recognized throughout as an absolute necessity. Consequently, instead of a statement such as, "This work was impeded by the rigid security regulations of the District," a statement such as, "The necessity of guarding the security of the project required that operations be carried on in — etc." would be more accurate.1

This was the history that Groves grabbed whenever he did get hauled in front of Congress in the postwar (which happened less than he had feared, but it still happened). This was the history that the Atomic Energy Commission relied upon whenever it needed to find out what its predecessor agencies had done. It was a useful document to have around, because it contains all manner of statistics, technical details, legal details, and references to other documents in the archive.

"Dante's Inferno: A Pocket Mural" by Louis C. Anderson, a rather wonderful and odd drawing of the Calutron process. From Manhattan District History, Book 5, "Electromagnetic Project," Volume 6.

The Manhattan District History first became partially available to the general public in 1977, when a version of it was made available on microfilm through the National Archives and University Publications of America as Manhattan Project: Official History and Documents. The Center for Research Libraries has a digital version that you can download if you are part of a university that is affiliated with them (though the scan quality sometimes renders it unreadable), and I've had a digital copy for a long time now as a result.2 The 1977 microfilm version was missing several important volumes, however, including the entire book on the gaseous diffusion project, a volume on the acquisition of uranium ore, and many technical volumes and chapters about the work done at Los Alamos. All of this was listed as "Restricted" in the guide that accompanied the 1977 version.3

I was talking with Bill Burr of the National Security Archive sometime in early 2013 and it occurred to me that it might be possible to file a Freedom of Information Act request for the rest of these volumes, and that this might be something that his archive would want to do. I helped him put together a request for the missing volumes, which he filed. The Department of Energy got back pretty promptly, telling Bill that they were already beginning to declassify these chapters and would eventually put them online.

Manhattan Project uranium production flow diagram, from Manhattan District History, Book 7, "Feed materials."

The DOE started to release them in chunks in the summer of 2013, and got the last files up this past summer. You can download each of the chapters individually on their website, but their file names are such that they won't automatically sort in a sensible way in your file system, and they are not full-text searchable. The newly-released files have their issues — a healthy dose of redaction (and one wonders how valuable that still is, all these years — and proliferations — later), and some of the images have been run through a processor that has made them extremely muddy, to the point of illegibility (lots of JPEG artifacts). But don't get me started on that. (The number of corrupted PDFs on the NNSA's FOIA website is pretty ridiculous for an agency that manages nuclear weapons.) Still, it's much better than the microfilm, if only because it is rapidly accessible.

But you don't need to do that. I've downloaded them all, run them through an OCR program so they are searchable, and given them sortable filenames. Why? Because I want people — you — to be able to use these (and I do not trust the government to keep this kind of thing online). They've still got loads of deletions, especially in the Los Alamos and diffusion sections, and the pro-Groves bent to things is so heavy-handed it's hilarious at times. And they are not all necessarily accurate, of course. I have found versions of chapters that were heavily marked up by someone who was close to the matter, who thought there were lots of errors. In the volumes I've gone over most closely in my own research (e.g. the "Patents" volume), I definitely found some places where I thought they got it a little wrong. But all of this aside, they are incredibly valuable, important volumes nonetheless, and I keep finding all sorts of unexpected gems in them.
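If you ever want to do the same kind of cleanup on the raw DOE downloads yourself, the renaming step is easy to script. Here is a minimal sketch in Python — the filenames shown are hypothetical stand-ins, not the DOE's actual ones — which just zero-pads every number so that the files sort in book-and-volume order:

```python
import re

def sortable_name(filename: str) -> str:
    """Zero-pad every number in a filename so that plain
    lexicographic sorting matches numeric order."""
    return re.sub(r"\d+", lambda m: m.group().zfill(3), filename)

# Hypothetical examples -- without padding, "Book 10" sorts before "Book 2".
files = [
    "Book 10 - Special Procurement.pdf",
    "Book 2 - Gaseous Diffusion.pdf",
    "Book 1 - General.pdf",
]
renamed = sorted(sortable_name(f) for f in files)
```

The OCR pass itself would be handled by whatever tool you have at hand (Acrobat, tesseract, and so on); the point is only that a consistent, padded naming scheme makes a set this large actually browsable.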

You can download all 79 PDF files in one big ZIP archive. WARNING: the ZIP file is 760MB or so. You can also download the individual files below, if you don't want them all at once.

Statistics on the ages of Los Alamos employees, May 1945, from the young spy, Ted Hall (19), to the old master, Niels Bohr (59). From Manhattan District History, Book 8.

What kinds of gems are hidden in these files? Among other things:

And a lot more. As you can see, I've drawn on this history before for blog and Twitter posts — I look through it all the time, because it offers such an interesting view into the Manhattan Project, and one that cuts through a lot of our standard narratives about how it worked. There are books and books worth of fodder in here, spread among some tens of thousands of pages. Who knows what might be hidden in there? Let's shake things up a bit, and find something strange.

Below is the full file listing, with links to my OCR'd copies. Again, you can download all of them in one big ZIP file by clicking here (760 MB), or pick them individually from below. Items marked with an asterisk are, as far as I know, wholly new — the others have been available on microfilm in one form or another since 1977. Read the full post »

  1. E.H. Marsden, "Manhattan District History Preparation Guide," (1 August 1945), copy in the Nuclear Testing Archive, Las Vegas, Nevada, accession number NV0727839. []
  2. In fact, I used portions of it — gasp! — on actual microfilm very early in my grad school career, when you still had to do that sort of thing. The volume on the patenting program was extremely useful when I wrote on Manhattan Project patent policies. []
  3. Some of the Los Alamos chapters were later published in redacted form as Project Y: The Los Alamos Story, in 1983. []

The Problem of Redaction

Friday, April 12th, 2013

Redaction is one of those practices we take for granted, but it is actually pretty strange if you think about it. I mean, who would imagine that the state would say, "well, all of this is totally safe for public consumption, except for a part right here, which is too awful to be legally visible, so I'll just blot out that part. Maybe I'll do it in black, maybe in white, maybe I'll add DELETED in big bold letters, just so I know that you saw that I deleted it."

From Hans Bethe's "Memorandum on the History of the Thermonuclear Program" (1952), which features some really provocative DELETED stamps. A minimally-redacted version assembled from many differently redacted copies by Chuck Hansen is available here.

From a security perspective, it's actually rather generous. The redactor is often giving us the context of the secret, the length of the material kept from us (a word? a sentence? a paragraph? a page?), and helpfully drawing our eye to the parts of the document that still contain juicy bits. The Onion's spoof from a few years back, "CIA Realizes It's Been Using Black Highlighters All These Years," is only slightly off from the real truth. Blacking something out is only a step away from highlighting its importance, and the void makes us curious. In fact, learning what was actually in there can be quite anticlimactic, just as learning how a magician does their trick ("the guy in the audience is in on the trick").

And, of course, the way the US declassification system is set up virtually guarantees that multiple, differently-redacted copies of documents will eventually exist. Carbon copies of the same documents sit in multiple agencies; each agency can be separately petitioned for copies of its files, and each will hand them to its own reviewers, who consult their guides and interpret them as best they can. There's very little centralization, and lots of individual discretion in interpreting the guides.

The National Security Archive recently posted an Electronic Briefing Book that was very critical of this approach. In their case, they pointed out that a given paragraph in a once-secret document that was deemed by the redactor to be completely safe in 2001 was in 2003 deemed secret again, and then, in 2007, reaffirmed safe, and then, in 2012, again secret. "There often seems little logic to redaction decisions, which depend on the whim of the individual reviewer, with no appreciation of either the passage of time or the interests of history and accountability," writes Michael Dobbs.

This sort of thing happens all the time, of course. In the National Security Archive's Chuck Hansen papers there are bundles of little stapled "books" he created out of multiple, differently-redacted copies of the same document. They are a fun thing to browse through, viewing four different versions of the same page, each somewhat differently hacked up.

A page from a 1951 meeting transcript of the General Advisory Committee, from the Hansen files. Animated to show how he staples three different copies together. Some documents contain five or more separate versions of each page. For closer inspections of the page, click here.

In the case of Hansen's papers, these differences came about because he was filing Freedom of Information Act requests (or looking at the results of others' requests) over extended periods of time to different agencies. The passage of time is important, because guides change in the meantime (usually towards making things less secret; "reclassification" is tricky). And the multiple sites mean you are getting completely different redactors looking at it, often with different priorities or expertise.

Two different redactors, working with the exact same guides, can come up with very different interpretations. This is arguably inherent to any kind of classifying system, not just one for security classifications. (Taxonomy is a vicious profession.) The guides that I have seen (all historical ones, of course) are basically lists of statements and classifications. Sometimes the statements are very precise and technical, referencing specific facts or numbers. Sometimes they are incredibly broad, referencing entire fields of study. And they can vary quite a bit — sometimes they are specific technical facts, sometimes they are broad programmatic facts, sometimes they are just information about meetings that have been held. There aren't any items that, from a distance, resemble flies, but it's not too far off from Borges' mythical encyclopedia.

The statements try to be clear, but if you imagine applying them to a real-life document, you can see where lots of individual discretion would come into the picture. Is fact X implied by sentence Y? Is it derivable, if paired with sentence Z? And so on. And there's a deeper problem, too: if two redactors identify the same fact as being classified, how much of the surrounding context do they also snip out with it? Even a stray preposition can give away information, like whether the classified word is singular or plural. What starts as an apparently straightforward exercise in cutting out secrets quickly becomes a strange deconstructionist enterprise.

One of my favorite examples of differently redacted documents came to me through two Freedom of Information Act requests to the same agency at about the same time. Basically, two different people (I presume) at the Department of Energy looked at this document from 1970, and this was the result:

1970 AEC declassification guide redactions

In one, the top excerpt is deemed declassified and the bottom classified. In the other, the reverse. Put them together, and you have it all. (While I'm at it, I'll also just add that a lot of classified technical data looks more or less like the above: completely opaque if you aren't a specialist. That doesn't mean it isn't important to somebody, of course. It is one of the reasons I am resistant to any calls for "common sense" classification, because I think we are well beyond the "common" here.) In this case, the irony is double, because what they're de/classifying are excerpts from classification guides... very meta, no?1

What's going on here? Did the redactors really interpret their guidelines in exactly the opposite ways? Or are both of these borderline cases where discretion was required? Or was it just an accident? Any of these could be plausible explanations, though I suspect they are each borderline cases and their juxtaposition is just a coincidence. I don't actually see this as a symptom of dysfunction, though. I see it as a natural result of the kind of declassification system we have. It's the function, not the dysfunction — it's just that the function is set up to have these kinds of results.

The idea that you can slot all knowledge into neat little categories that perfectly overlap with our security concerns is already a problematic one, as Peter Galison has argued. Galison's argument is that security classification systems assume that knowledge is "atomic," which is to say, comes in discrete bundles that can be disconnected from other knowledge (read "atomic" like "atomic theory" and not "atomic bomb"). The study of knowledge (either from first principles or historically) shows exactly the opposite — knowledge is constituted by sending out lots of little tendrils to other bits of knowledge, and knowledge of the natural world is necessarily interconnected. If you know a little bit about one thing you often know a little bit about everything similar to it.

For this archive copy of a 1947 meeting of the General Advisory Committee, all of the raw numbers were cut out with X-Acto knives. Somewhere, one hopes, is an un-mutilated version. In some cases, numbers like these were initially omitted in drawing up the original documents, and a separate sheet of numbers would be kept in a safe, to be produced only when necessary.

This is a good philosophical point, one that arguably is a lot stronger for scientific facts than many others (the number of initiators, for example, is a lot less easily connected to other facts than is, say, the chemistry of plutonium), but I would just add that layered on top of this is the practical problem of trying to get multiple human beings to agree on the implementations of these classifications. That is, the classifications are already problematic, and now you're trying to get people to interpret them uniformly? Impossible... unless you opt for maximum conservatism and a minimum of discretion. Which isn't what anybody is calling for.

In theory, you can read the classification history of a document from all of its messy stamps and scribblings. They aren't just for show; they tell you what it's been through, and how to regard it now.

Declassification can be arbitrary, or at least appear arbitrary to those of us locked outside of the process. (It is one of the symptoms of secrecy that the logic of the redactor is itself usually secret.) But to me, the real sin of our current system is the lack of resources put towards it, which makes the whole thing run slow and leads to huge backlogs. When the system is running at a swift pace, you can at least know what it is they're holding back from you, compare it to other sources, file appeals, draw attention to it, and so on. When it takes years to start processing requests (as is the case with the National Archives, in my experience; it varies a lot by agency), much less actually declassify them, there is a real impediment to research and public knowledge. I'd rather declassification be arbitrary and fast than conservative and slow.

That individual redactors, each interpreting the guidelines according to the standards they are told to use, come up with different results doesn't bother me as much. There is going to be a certain amount of error in any large system, especially one that deals with borderline cases and allows individual discretion. Sometimes you win, sometimes you lose, but it's being able to play the game in the first place that matters the most to me.

  1. The document is a discussion of instances in which classification guidelines are based on strict numerical limits, as opposed to general concepts. Citation is: Murray L. Nash to Theos Thomson (3 November 1970), "AEC Classification Guidance Based on Numerical Limits," part of SECY-625, Department of Energy Archives, RG 326, Collection 6 Secretariat, Box 7832, Folder 6, "O&M 7 Laser Classification Panel." The top was received as the response to a FOIA request I made in 2008, the bottom as the response to another in 2010. Both were part of FOIA requests relating to declassification decisions relating to inertial confinement fusion; the memo in question was part of information given to a panel of scientists regarding creating new fusion classification guidelines. []

Bethe on SUNSHINE and Fallout (1954)

Wednesday, June 27th, 2012

Project SUNSHINE definitely takes the prize for "most intentionally-misleading title of a government program." The goal of SUNSHINE (co-sponsored by the Atomic Energy Commission and RAND) was to figure out what impact radioactive fallout from American nuclear testing was having on the world population. The initial study was started in 1953, and involved checking biological material for the radioactive fission product Strontium-90, with an attempt to correlate Sr-90 levels with various known nuclear test series. Not exactly what you think of when you hear the term "sunshine," eh?

It actually gets much creepier than just the confusing name. The "biological material" they were studying was, well, dead organic matter. What kind of organic matter, specifically? The dataset for a "pre-pilot" study on Strontium-90 intake was a real witches' brew:

  • "Wisconsin cheese (1 month old)"
  • "clam shells (Long Island)"
  • "Wisconsin cat bone"
  • "Montana cat (6 months, fed on milk from free-range cows)"
  • "stillborn, full term baby (Chicago)"
  • "rib from a Harvard man" 

Pardon me while I count my ribs... and cats... and... well... yuck. You can't make this stuff up. Well, I can't, anyway. Here's your creepy meeting transcript of the week, from the planning of SUNSHINE: "Dr. Libby commented on the difficulty of obtaining human samples, and suggested that stillborn babies, which are often turned over to the physician for disposal, might be a practical source."1

As an aside to an aside, in the full study, they did use samples from corpses — corpses of children seemed of particular interest — in getting their data. It's a bit gory to read through their data sets as they describe the Sr-90 they found in the ribs or vertebrae of the dead. US scientist Shields Warren in particular seemed to have quite a lot of access to the bones of young children through the Cancer Research Institute in Boston, Massachusetts. Not a job I'd envy.2

Anyway — the document I wanted to share had nothing to do with the sample sources, but I got a little distracted while poking around in the SUNSHINE literature, and couldn't not pass that on.

Hans Bethe and W.F. Libby

The letter in question comes from after SUNSHINE had been completed. It's a December 1954 request from the well-coifed Hans Bethe to the aforementioned Willard F. Libby, the physical chemist best known as the inventor of radiocarbon dating (for which he would win a Nobel Prize, in 1960), and in 1954 one of the five Commissioners of the AEC.3 In the letter, Bethe is arguing in favor of SUNSHINE's declassification — and his justifications are not necessarily what you might expect.4

Click to view PDF (yes, it's in color!)

Bethe started out by noting that even in the summer of 1953, when SUNSHINE was being finished up, they (it seems that Bethe and Libby were both there) thought that it would "be highly desirable to declassify a large part of project SUNSHINE." Bethe thought the matter had gotten rather urgent:

I still feel the same way about this, and I think the arguments for declassification have become far stronger than they were in 1953. There is real unrest both in this country and abroad concerning the long-range as well as short-range radioactivity, and it would, in my opinion, greatly allay the fears of the public if the truth were published.

There's the kicker: Bethe was convinced that SUNSHINE would show that fallout from testing wasn't as big a problem as people thought it was. Releasing SUNSHINE wouldn't be a matter of propaganda (and holding it back wasn't a matter of covering it up), in Bethe's mind — it would simply be getting the facts out.

And why might people suddenly be getting concerned about nuclear fallout?

Map showing points (X) where contaminated fish were caught or where the sea was found to be excessively radioactive, following the Castle Bravo nuclear test.

No doubt because of all of the attention that the Castle BRAVO nuclear test had gotten with respects to high amounts of fallout finding its way into all sorts of biological systems far from its source — like the radioactive tuna that was caught for weeks afterwards off the waters of Japan.

Bethe understood, though, that the classification reasons holding back the publication of SUNSHINE were non-trivial. SUNSHINE studied the deposition of fission products following testing, and to make much sense of that, you had to know the fission yields from the tests. If you knew the fission yields, you'd know quite a lot about American nuclear weapons — especially if you knew the fission yield of the Ivy MIKE test, the first H-bomb.

Why? Because knowing the fission component of the first H-bomb test would possibly give away all sorts of information about the Teller-Ulam design. Multi-stage H-bombs have a reasonably large fission trigger that ignites the fusion fuel, which in turn induces more fission in a "natural" uranium tamper. In the case of MIKE, 77% of the total 10.4 megaton yield came from the final fission stage. Knowing that would be a good hint as to the composition of the American H-bombs, and was not something they wanted to share with the USSR.

But Bethe thought you could get around this:

I believe the story of SUNSHINE could be published without giving away any information about our H-bombs: it is merely necessary to put the permissible accumulated yield in terms of fission yield rather than total yield.

In other words, if you just talked of fission yield — and didn't give the total yield — you wouldn't be able to figure out how much of the yield was not fission, and thus the high disparity (which would be a big red flag for a weapons designer) would be hidden.
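The arithmetic behind Bethe's point, using only the yields mentioned in this post, shows just how much the ratio itself gives away. This is a back-of-the-envelope sketch of the reasoning, not anything from the letter:

```python
# Yields as given in this post, in megatons.
mike_total = 10.4                  # Ivy MIKE total yield
mike_fission = 0.77 * mike_total   # ~8 Mt came from the final fission stage
bravo_total = 15.0                 # Castle BRAVO total yield
bravo_fission = 10.0               # BRAVO's fission component

# The telltale number: fission as a fraction of total yield.
# A fraction this high is the red flag for a large fission tamper.
mike_ratio = mike_fission / mike_total     # 0.77
bravo_ratio = bravo_fission / bravo_total  # ~0.67

# Publish only the fission yield, and the denominator (total yield)
# needed to compute these ratios is gone -- which was Bethe's fix.
```

That is the whole trick: the fission yield alone is just a number, but fission divided by total yield is design information.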

Bethe also thought that they should publish the fallout data from the H-bomb tests (likely including those from the CASTLE series). Bethe didn't think that information would give away any design information, but it was clear that others were suspicious. Bethe put the question to a test: he asked Philip Morrison to try and figure out how an H-bomb worked from just published stories about the Castle BRAVO fallout accident.

A youngish Philip Morrison, courtesy of the Emilio Segrè Visual Archives.

Morrison at that point had no access to classified information. He had been part of the Manhattan Project, and so knew quite a bit about fission weapons, but had been cut out of the classified world by the time the H-bomb had come along. (More on Morrison's security clearance another time — lots of interesting stories there.)

Morrison's conclusions (oddly titled "FISSION ENERGY IN IVY," even though it was about BRAVO) are attached to Bethe's letter. In many ways it is an analysis typical of a somewhat cocky physicist: things are described as "easy" and conclusions are led to "clearly" and everything is stated as if it is pretty obvious and pretty straightforward. Morrison concludes that the total fission yield of BRAVO (again, misidentified as IVY) is between 0.2Mt and 0.6Mt, and that most of the fission must have been from the fission primary that started the reactions. In reality, 10Mt of the 15Mt total yield was from fission, which is why it was such a "dirty" shot.

Bethe took this as evidence that indeed, looking at just the fallout alone, you couldn't figure out how much of the explosion was from fission yield, and thus the design information was safe: "As Morrison's report shows, it seems to be easy to draw entirely wrong conclusions from the fall-out data."

Why Morrison got this wrong is a little mysterious to me. Ralph Lapp had managed to conclude, more or less correctly, that there was a third "dirty" fission stage, and had popularized the idea enough that it trickled into Life magazine in December 1955. But Bethe thought Morrison's analysis was more or less sound, given his lack of detailed information. It's a weird thing to conclude, based on one study, that some piece of information is fundamentally unknowable, when you already know what the piece of information is.

Life magazine, 1955: not quite right, not entirely wrong.

Speaking of speculating based on missing information, part of Bethe's letter is redacted, for reasons I do not know. His conclusion makes it pretty clear it has to do with this absolute vs. fission yield/fallout issue, though.

Bethe concludes: "I believe it would greatly improve international feeling about our Pacific tests if we were to publish the correct story of SUNSHINE and of fall-out."

Libby would come around to Bethe's position and push for declassification. In Libby's mind, like Bethe's, SUNSHINE showed that the world wasn't going to become mutated just because of a little testing in the Pacific. Furthermore, he also came to believe that you could shut down a lot of the anti-nuclear testing demands by just showing people that you were paying close attention to this sort of thing — by the time of Operation Redwing (1956), he felt that this sort of disclosure had already made the international community more friendly to US testing.

It wasn't until 1956 that the declassification eventually occurred, however, and even then, a lot of things were removed. (The "Amended*" in the RAND report cover page above is because it was "Amended to remove classified data; otherwise the report remains unchanged and represents the 1953 estimate of the fallout problem.") Of course, by that point it was clear that the Soviets had already figured out how to make an H-bomb work.

Also! I will be giving a talk this Friday at the annual meeting of the Society for Historians of American Foreign Relations (SHAFR) in Hartford, CT. Just putting that out there.

  1. Minutes of the 36th Meeting of the General Advisory Committee to the U.S. Atomic Energy Commission (17, 18, and 19 August 1953), copy in the OHP Marshall Islands Document Collection. []
  2. E.g. E.A. Martell, "Strontium-90 Concentration Data for Biological Materials, Soils, Waters and Air Filters," Project Sunshine Bulletin No. 12, [AECU-3297(Rev.)], (1 August 1956); human bone data listings start on page 29. []
  3. Libby was also the husband of Leona Woods Marshall, which I didn't realize. She was the only woman who had a role in the development of CP-1, the first nuclear reactor, and stands out quite conspicuously in the Met Lab photographs. []
  4. Citation: Hans Bethe to W.F. Libby (17 December 1954), copy in Nuclear Testing Archive, Las Vegas, NV, document NV0032161. []

Declassifying ARGUS (1959)

Wednesday, May 23rd, 2012

One of the strangest — and perhaps most dangerous — nuclear tests ever conducted was Operation ARGUS, in late 1958.

The basic idea behind it was proposed by the Greek physicist Nicholas Christofilos, then at Livermore. If you shot a nuclear warhead off in the upper atmosphere, Christofilos argued, it would create an artificial radiation field similar to the Van Allen radiation belts that surround the planet. In essence, it would create a "shell" of electrons around the planet.

Frame from a government film showing the electron shell going around the planet

The tactical advantage to such a test is that hypothetically you could use this knowledge to knock out enemy missiles and satellites that were coming in. So they gave it a test, and indeed, it worked! (With some difficulty; it involved shooting nuclear weapons very high into the atmosphere on high-altitude rockets off of a boat in the middle of the rough South Atlantic Ocean. One wonders what happened to the warheads on them. They also had some difficulty positioning the rockets. The video linked to discusses this around the 33 minute point. Also, around the 19 minute mark is footage of various Navy equator-crossing hazing rituals, with pirate garb!)

It created artificial belts of electrons that surrounded the planet for weeks. Sound nutty yet? No? Well, just hold on — we'll get there.

(Aside: Christofilos is an interesting guy; he had worked as an elevator repairman during World War II, studying particle physics in his spare time. He independently came up with the idea for the synchrotron and eventually was noticed by physicists in the United States. He later came up with a very clever way to allow communication with submerged submarines deep under water, which was implemented later in the 20th century.)

James Van Allen kissing Explorer IV (a satellite used in Argus) good-bye

In early 1959 — not long after the test itself — none other than James Van Allen (of the aforementioned Van Allen radiation belts) argued that the United States should rapidly declassify and release information on the Argus experiment.1

Click for the PDF.

Van Allen wanted it declassified because he was a big fan of the test, and thought the US would benefit from the world knowing about it:

As you will note, my views are (a) that continued security classification of the Argus-Hardtack tests is of little practical avail, (b) that a prompt and full public report of the tests and observations will contribute greatly to the international prestige of the United States as a leader in the application of atomic devices to scientific purposes, and (c) that if we fail to do (b) the U.S. will be quite likely be again 'Sputniked' in the eyes of the world by the Soviets.

Basically, Van Allen argued, the idea of doing an Argus-type experiment was widely known, even amongst uncleared scientists, and the Soviets could pull off the same test themselves and get all the glory.

But here's the line that makes me cringe: "The U.S. tests, already carried out successfully, undoubtedly constitute the greatest geophysical experiment ever conducted by man." 

This was an experiment that affected the entire planet — "the greatest geophysical experiment ever conducted by man" — that was approved, vetted, and conducted under a heavy, heavy veil of secrecy. What if the predictions had been wrong? It's not an impossibility that such a thing could have been the case: the physics of nuclear weapons are in a different energy regime than most other terrestrial science, and as a result there have been some colossal miscalculations that were only revealed after the bombs had gone off and, oh, contaminated huge swathes of the planet, or, say, accidentally knocked out satellite and radio communications. (The latter incident linked to, Starfish-Prime, was a very similar test that did cause a lot of accidental damage.)

There's some irony in that the greatest praise, in this case, is a sign of how spooky the test was. At least to me, anyway.

This is the same sort of creepy feeling I get when I read about geoengineering, those attempts to purposefully use technology to affect things at the global scale, now in vogue again as a last-ditch attempt to ameliorate the effects of climate change. It's not just the hubris — though, as an historian, that's something that's easy to see as an issue, given that unintended consequences are rife even with technologies that don't purposefully try to remake the entire planet. It's also the matter of scale. Something happens when you go from small-scale knowledge (produced in the necessarily artificial conditions that laboratory science requires) to large-scale applications. Unpredicted effects and consequences show up with a vengeance, and you get a rapid education in how many collective and chaotic effects you do not really understand. It gives me the willies to ramp things up into new scales and new energy regimes without the possibility of doing intermediate stages.

(Interestingly, my connection between Argus and geoengineering has been the subject of at least one talk by James R. Fleming, a space historian at Colby College, who apparently argued that Van Allen later regretted disrupting the Earth's natural magnetosphere. Fleming has a paper on this in the Annals of Iowa, but I haven't yet tracked down a copy.)

As for Argus's declassification: while the Department of Defense was in the process of declassifying Argus, per Van Allen's recommendations, they got a call from the New York Times saying that they were about to publish on it. (The Times claimed to have known about Argus well before the tests took place.) It's not clear who leaked it, but leaked it did. The DOD decided that they should declassify as much as they could and send it out to coincide with this, and the news of Argus hit the front pages in March 1959.

  1. Citation: James Van Allen to James R. Killian (21 February 1959), copy in the Nuclear Testing Archive, Las Vegas, NV, document NV0309054. []