Posts Tagged ‘Declassification’

Meditations | Redactions

The Problem of Redaction

Friday, April 12th, 2013

Redaction is one of those practices we take for granted, but it is actually pretty strange if you think about it. I mean, who would imagine that the state would say, “well, all of this is totally safe for public consumption, except for a part right here, which is too awful to be legally visible, so I’ll just blot out that part. Maybe I’ll do it in black, maybe in white, maybe I’ll add DELETED in big bold letters, just so I know that you saw that I deleted it.”

From Hans Bethe’s “Memorandum on the History of the Thermonuclear Program” (1952), which features some really provocative DELETED stamps. A minimally-redacted version assembled from many differently redacted copies by Chuck Hansen is available here.

From a security perspective, it’s actually rather generous. The redactor is often giving us the context of the secret, the length of the material kept from us (a word? a sentence? a paragraph? a page?), and helpfully drawing our eye to the parts of the document that still contain juicy bits. The Onion’s spoof from a few years back, “CIA Realizes It’s Been Using Black Highlighters All These Years,” is only slightly off from the truth. Blacking something out is only a step away from highlighting its importance, and the void makes us curious. In fact, learning what was actually in there can be quite anticlimactic, much like learning how a magician does their trick (“the guy in the audience is in on it”).

And, of course, the way the US declassification system is set up virtually guarantees that multiple, differently-redacted copies of documents will eventually exist. Carbon copies of the same document sit in the files of multiple agencies; each agency can be separately petitioned for copies of its files; and each agency hands requests to its own individual reviewers, who consult the classification guides and interpret them as best they can. There’s very little centralization, and lots of individual discretion in interpreting the guides.

The National Security Archive recently posted an Electronic Briefing Book that was very critical of this approach. In their case, they pointed out that a given paragraph in a once-secret document that was deemed by the redactor to be completely safe in 2001 was in 2003 deemed secret again, and then, in 2007, reaffirmed safe, and then, in 2012, again secret. “There often seems little logic to redaction decisions, which depend on the whim of the individual reviewer, with no appreciation of either the passage of time or the interests of history and accountability,” writes Michael Dobbs.

This sort of thing happens all the time, of course. In the National Security Archive’s Chuck Hansen papers there are bundles of little stapled “books” he created out of multiple, differently redacted copies of the same document. They are a fun thing to browse through: four different versions of the same page, each somewhat differently hacked up.

A page from a 1951 meeting transcript of the General Advisory Committee, from the Hansen files. Animated to show how he staples three different copies together. Some documents contain five or more separate versions of each page. For closer inspections of the page, click here.

In the case of Hansen’s papers, these differences came about because he was filing Freedom of Information Act requests (or looking at the results of others’ requests) with different agencies over extended periods of time. The passage of time is important because the guides change in the meantime (usually towards making things less secret; “reclassification” is tricky). And the multiple sites mean you are getting completely different redactors looking at the same document, often with different priorities or expertise.

Two different redactors, working with the exact same guides, can come up with very different interpretations. This is arguably inherent to any kind of classifying system, not just one for security classification. (Taxonomy is a vicious profession.) The guides that I have seen (all historical ones, of course) are basically lists of statements and their classifications. The statements vary quite a bit: sometimes they are precise and technical, referencing specific facts or numbers; sometimes they are incredibly broad, referencing entire fields of study; sometimes they are programmatic; sometimes they are just information about meetings that have been held. There aren’t any items that, from a distance, resemble flies, but it’s not too far off from Borges’ mythical encyclopedia.

The statements try to be clear, but if you imagine applying them to a real-life document, you can see where lots of individual discretion would come into the picture. Is fact X implied by sentence Y? Is it derivable, if paired with sentence Z? And so on. And there’s a deeper problem, too: if two redactors identify the same fact as being classified, how much of the surrounding context do they also snip out with it? Even a stray preposition can give away information, like whether the classified word is singular or plural. What starts as an apparently straightforward exercise in cutting out secrets quickly becomes a strange deconstructionist enterprise.
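
To see how quickly this gets tricky, here is a toy sketch (the sentence, the “secret,” and both policies are entirely hypothetical, not drawn from any real guide) of two redactors who agree on what the classified fact is, but not on how much context has to go with it:

    import re

    # Toy illustration of the context problem: both redactors agree that the
    # word "polonium" is classified; they disagree on how much surrounding
    # text must be snipped out along with it.
    SENTENCE = "The device uses polonium initiators at its core."
    SECRET = "polonium"

    def redact_word_only(text, secret):
        # Policy A: blot out just the secret term itself.
        return text.replace(secret, "[DELETED]")

    def redact_with_context(text, secret):
        # Policy B: take the neighboring words too, since even a stray
        # plural noun next door gives something away.
        return re.sub(r"\S+\s+" + re.escape(secret) + r"\s+\S+", "[DELETED]", text)

    print(redact_word_only(SENTENCE, SECRET))
    # The device uses [DELETED] initiators at its core.
    print(redact_with_context(SENTENCE, SECRET))
    # The device [DELETED] at its core.

Policy A leaks the plural noun next door; Policy B hides it, at the cost of cutting perfectly unclassified words. Neither is obviously “correct,” which is the whole problem.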

One of my favorite examples of differently redacted documents came to me through two Freedom of Information Act requests to the same agency at about the same time. Basically, two different people (I presume) at the Department of Energy looked at this document from 1970, and this was the result:

1970 AEC declassification guide redactions

In one, the top excerpt is deemed declassified and the bottom classified. In the other, the reverse. Put them together, and you have it all.  (While I’m at it, I’ll also just add that a lot of classified technical data looks more or less like the above: completely opaque if you aren’t a specialist. That doesn’t mean it isn’t important to somebody, of course. It is one of the reasons I am resistant to any calls for “common sense” classification, because I think we are well beyond the “common” here.) In this case, the irony is double, because what they’re de/classifying are excerpts from classification guides… very meta, no?1
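
(For what it’s worth, the logic of combining such copies is trivial to state; the hard part in practice is aligning the differently photocopied pages. Here is a minimal sketch in code, with hypothetical placeholder text rather than the actual memo:)

    # Assemble a "minimally-redacted" version from differently redacted
    # copies of the same document, in the spirit of Hansen's stapled
    # booklets. Copies are modeled as aligned lists of segments, with
    # None standing in for a redacted segment.
    def merge_copies(*copies):
        merged = []
        for segments in zip(*copies):
            visible = [s for s in segments if s is not None]
            # Keep any reading that survives, unredacted, in at least one copy.
            merged.append(visible[0] if visible else "[STILL DELETED]")
        return merged

    # Two FOIA responses with complementary redactions, as above:
    response_a = ["top excerpt", None]
    response_b = [None, "bottom excerpt"]
    print(merge_copies(response_a, response_b))
    # ['top excerpt', 'bottom excerpt']: put them together, you have it all.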

What’s going on here? Did the redactors really interpret their guidelines in exactly the opposite ways? Or are both of these borderline cases where discretion was required? Or was it just an accident? Any of these could be plausible explanations, though I suspect they are each borderline cases and their juxtaposition is just a coincidence. I don’t actually see this as a symptom of dysfunction, though. I see it as a natural result of the kind of declassification system we have. It’s the function, not the dysfunction — it’s just that the function is set up to have these kinds of results.

The idea that you can slot all knowledge into neat little categories that perfectly overlap with our security concerns is already a problematic one, as Peter Galison has argued. Galison’s argument is that security classification systems assume that knowledge is “atomic,” which is to say, comes in discrete bundles that can be disconnected from other knowledge (read “atomic” like “atomic theory” and not “atomic bomb”). The study of knowledge (either from first principles or historically) shows exactly the opposite — knowledge is constituted by sending out lots of little tendrils to other bits of knowledge, and knowledge of the natural world is necessarily interconnected. If you know a little bit about one thing you often know a little bit about everything similar to it.

For this archive copy of a 1947 meeting of the General Advisory Committee, all of the raw numbers were cut out with X-Acto knives. Somewhere, one hopes, is an un-mutilated version. In some cases, numbers like these were initially omitted in drawing up the original documents, and a separate sheet of numbers would be kept in a safe, to be produced only when necessary.

This is a good philosophical point, one that is arguably a lot stronger for scientific facts than many others (the number of initiators, for example, is a lot less easily connected to other facts than is, say, the chemistry of plutonium). But I would just add that layered on top of it is the practical problem of trying to get multiple human beings to agree on the implementation of these classifications. That is, the classifications are already problematic, and now you’re trying to get people to interpret them uniformly? Impossible… unless you opt for maximum conservatism and a minimum of discretion. Which isn’t what anybody is calling for.

In theory, you can read the classification history of a document from all of its messy stamps and scribblings. They aren’t just for show; they tell you what it’s been through, and how to regard it now.

Declassification can be arbitrary, or at least appear arbitrary to those of us locked outside of the process. (It is one of the symptoms of secrecy that the logic of the redactor is itself usually secret.) But to me, the real sin of our current system is the lack of resources put towards it, which makes the whole thing run slow and leads to huge backlogs. When the system is running at a swift pace, you can at least know what it is they’re holding back from you, compare it to other sources, file appeals, draw attention to it, and so on. When it takes years to start processing requests (as is the case with the National Archives, in my experience; it varies a lot by agency), much less actually declassify them, there is a real impediment to research and public knowledge. I’d rather declassification be arbitrary and fast than conservative and slow.

That individual redactors, interpreting the guidelines according to the standards they are told to use, come up with different results doesn’t bother me as much. There is going to be a certain amount of error in any large system, especially one that deals with borderline cases and allows individual discretion. Sometimes you win, sometimes you lose, but it’s being able to play the game in the first place that matters most to me.

Notes
  1. The document is a discussion of instances in which classification guidelines are based on strict numerical limits, as opposed to general concepts. Citation: Murray L. Nash to Theos Thomson (3 November 1970), “AEC Classification Guidance Based on Numerical Limits,” part of SECY-625, Department of Energy Archives, RG 326, Collection 6 Secretariat, Box 7832, Folder 6, “O&M 7 Laser Classification Panel.” The top excerpt was received in response to a FOIA request I made in 2008, the bottom in response to another made in 2010. Both requests concerned declassification decisions relating to inertial confinement fusion; the memo in question was part of the information given to a panel of scientists charged with creating new fusion classification guidelines.
Redactions

Bethe on SUNSHINE and Fallout (1954)

Wednesday, June 27th, 2012

Project SUNSHINE definitely takes the prize for “most intentionally-misleading title of a government program.” The goal of SUNSHINE (co-sponsored by the Atomic Energy Commission and RAND) was to figure out the impact of radioactive fallout from American nuclear testing on the world population. The initial study was started in 1953, and involved checking biological material for the radioactive fission product Strontium-90, with an attempt to correlate Sr-90 levels with various known nuclear test series. Not exactly what you think of when you hear the term “sunshine,” eh?

It actually gets much creepier than just the confusing name. The “biological material” they were studying was, well, dead organic matter. What kind of organic matter, specifically? The dataset for a “pre-pilot” study on Strontium-90 intake was a real witches’ brew:

  • “Wisconsin cheese (1 month old)”
  • “clam shells (Long Island)”
  • “Wisconsin cat bone”
  • “Montana cat (6 months, fed on milk from free-range cows)”
  • “stillborn, full term baby (Chicago)”
  • “rib from a Harvard man” 

Pardon me while I count my ribs… and cats… and… well… yuck. You can’t make this stuff up. Well, I can’t, anyway. Here’s your creepy meeting transcript of the week, from the planning of SUNSHINE: “Dr. Libby commented on the difficulty of obtaining human samples, and suggested that stillborn babies, which are often turned over to the physician for disposal, might be a practical source.”1

As an aside to an aside, in the full study they did use samples from corpses — corpses of children seemed to be of particular interest — in getting their data. It’s a bit gory to read through their data sets as they describe the Sr-90 they found in the ribs or vertebrae of the dead. The US scientist Shields Warren in particular seemed to have quite a lot of access to the bones of young children through the Cancer Research Institute in Boston, Massachusetts. Not a job I’d envy.2

Anyway — the document I wanted to share had nothing to do with the sample sources, but I got a little distracted while poking around in the SUNSHINE literature, and couldn’t not pass that on.

Hans Bethe and W.F. Libby

The letter in question comes from December 1954, after SUNSHINE had been completed. It’s a request from the well-coifed Hans Bethe to the aforementioned Willard F. Libby, the physical chemist best known as the inventor of radiocarbon dating (for which he would win a Nobel Prize in 1960), and in 1954 one of the five Commissioners of the AEC.3 In the letter, Bethe is arguing in favor of SUNSHINE’s declassification — and his justifications are not necessarily what you might expect.4

Click to view PDF (yes, it’s in color!)

Bethe started out by noting that even in the summer of 1953, when SUNSHINE was being finished up, they (it seems that Bethe and Libby were both there) thought that it would “be highly desirable to declassify a large part of project SUNSHINE.” By late 1954, Bethe thought the matter had gotten rather urgent:

I still feel the same way about this, and I think the arguments for declassification have become far stronger than they were in 1953. There is real unrest both in this country and abroad concerning the long-range as well as short-range radioactivity, and it would, in my opinion, greatly allay the fears of the public if the truth were published.

There’s the kicker: Bethe was convinced that SUNSHINE would show that fallout from testing wasn’t as big a problem as people thought it was. Releasing SUNSHINE wouldn’t be a matter of propaganda (and holding it back wasn’t a matter of covering anything up), in Bethe’s mind — it would simply be getting the facts out.

And why might people suddenly be getting concerned about nuclear fallout?

Map showing points (X) where contaminated fish were caught or where the sea was found to be excessively radioactive, following the Castle Bravo nuclear test.

No doubt because of all of the attention that the Castle BRAVO nuclear test had gotten with respect to high amounts of fallout finding their way into all sorts of biological systems far from the source — like the radioactive tuna that was caught for weeks afterwards in the waters off Japan.

Bethe understood, though, that the classification concerns holding back the publication of SUNSHINE were non-trivial. SUNSHINE studied the deposition of fission products following testing, and to make much sense of that, you had to know the fission yields of the tests. If you knew the fission yields, you’d know quite a lot about American nuclear weapons — especially if you knew the fission yield of the Ivy MIKE test, the first H-bomb.

Why? Because knowing the fission component of the first H-bomb test could give away all sorts of information about the Teller-Ulam design. Multi-stage H-bombs have a reasonably large fission trigger that ignites the fusion fuel, which in turn induces more fission in a “natural” uranium tamper. In the case of MIKE, 77% of the total 10.4 megaton yield came from that final fission stage. Knowing that would be a good hint as to the composition of American H-bombs, and was not something they wanted to share with the USSR.

But Bethe thought you could get around this:

I believe the story of SUNSHINE could be published without giving away any information about our H-bombs: it is merely necessary to put the permissible accumulated yield in terms of fission yield rather than total yield.

In other words, if you reported only the fission yield — and never the total yield — nobody could work out how much of the yield was not fission, and thus the high disparity between the two (which would be a big red flag for a weapons designer) would stay hidden.
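
To put rough numbers on it, here is a back-of-the-envelope sketch using only the figures already quoted above; the framing in code is mine, not Bethe’s:

    # Ivy MIKE figures from above: 10.4 Mt total yield, ~77% of it fission.
    total_yield_mt = 10.4
    fission_fraction = 0.77

    fission_yield_mt = total_yield_mt * fission_fraction   # ~8.0 Mt
    fusion_yield_mt = total_yield_mt - fission_yield_mt    # ~2.4 Mt

    # Publishing only the fission yield gives SUNSHINE what it needs
    # (fission products scale with fission yield) while hiding the total,
    # and with it the fission/fusion split that would flag a multi-stage
    # design to a weapons designer.
    print(f"publishable: fission yield = {fission_yield_mt:.1f} Mt")
    print(f"still secret: total = {total_yield_mt} Mt, "
          f"fusion = {fusion_yield_mt:.1f} Mt ({1 - fission_fraction:.0%} of yield)")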

Bethe also thought that they should publish the fallout data from the H-bomb tests (likely including data from the CASTLE series). He didn’t think that information would give away any design information, but it was clear that others were suspicious. So Bethe put the question to the test: he asked Philip Morrison to try to figure out how an H-bomb worked from nothing but the published stories about the Castle BRAVO fallout accident.

A youngish Philip Morrison, courtesy of the Emilio Segrè Visual Archives.

Morrison at that point had no access to classified information. He had been part of the Manhattan Project, and so knew quite a bit about fission weapons, but had been cut out of the classified world by the time the H-bomb had come along. (More on Morrison’s security clearance another time — lots of interesting stories there.)

Morrison’s conclusions (oddly titled “FISSION ENERGY IN IVY,” even though they were about BRAVO) are attached to Bethe’s letter. In many ways it is an analysis typical of a somewhat cocky physicist: things are described as “easy,” conclusions are reached “clearly,” and everything is stated as if it were pretty obvious and pretty straightforward. Morrison concluded that the total fission yield of BRAVO (again, misidentified as IVY) was between 0.2 and 0.6 megatons, and that most of the fission must have come from the fission primary that started the reactions. In reality, 10 of the total 15 megatons of yield came from fission, which is why it was such a “dirty” shot.

Bethe took this as evidence that indeed, looking at just the fallout alone, you couldn’t figure out how much of the explosion was from fission yield, and thus the design information was safe: “As Morrison’s report shows, it seems to be easy to draw entirely wrong conclusions from the fall-out data.”

Why Morrison got this so wrong is a little mysterious to me. Ralph Lapp had managed to conclude, more or less correctly, that there was a third, “dirty” fission stage, and had popularized the idea enough that it trickled into Life magazine in December 1955. But Bethe thought Morrison’s analysis was more or less sound, given his lack of detailed information. It’s a weird thing to conclude, based on one study, that some piece of information is fundamentally unknowable, when you already know what the piece of information is.

Life magazine, 1955: not quite right, not entirely wrong.

Speaking of speculating based on missing information, part of Bethe’s letter is redacted, for reasons I do not know. His conclusion makes it pretty clear it has to do with this absolute vs. fission yield/fallout issue, though.

Bethe concludes: “I believe it would greatly improve international feeling about our Pacific tests if we were to publish the correct story of SUNSHINE and of fall-out.”

Libby would come around to Bethe’s position and push for declassification. In Libby’s mind, like Bethe’s, SUNSHINE showed that the world wasn’t going to become mutated just because of a little testing in the Pacific. Furthermore, he also came to believe that you could shut down a lot of the anti-nuclear testing demands by just showing people that you were paying close attention to this sort of thing — by the time of Operation Redwing (1956), he felt that this sort of disclosure had already made the international community more friendly to US testing.

It wasn’t until 1956 that the declassification eventually occurred, however, and even then, a lot of things were removed. (The “Amended*” in the RAND report cover page above is because it was “Amended to remove classified data; otherwise the report remains unchanged and represents the 1953 estimate of the fallout problem.”) Of course, by that point it was clear that the Soviets had already figured out how to make an H-bomb work.


Also! I will be giving a talk this Friday at the annual meeting of the Society for Historians of American Foreign Relations (SHAFR) in Hartford, CT. Just putting that out there.

Notes
  1. Minutes of the 36th Meeting of the General Advisory Committee to the U.S. Atomic Energy Commission (17, 18, and 19 August 1953), copy in the OHP Marshall Islands Document Collection.
  2. E.g., E.A. Martell, “Strontium-90 Concentration Data for Biological Materials, Soils, Waters and Air Filters,” Project Sunshine Bulletin No. 12, [AECU-3297(Rev.)], (1 August 1956); human bone data listings start on page 29.
  3. Libby was also the husband of Leona Woods Marshall, which I didn’t realize. She was the only woman who had a role in the development of CP-1, the first nuclear reactor, and stands out quite conspicuously in the Met Lab photographs.
  4. Citation: Hans Bethe to W.F. Libby (17 December 1954), copy in Nuclear Testing Archive, Las Vegas, NV, document NV0032161.
Redactions

Declassifying ARGUS (1959)

Wednesday, May 23rd, 2012

One of the strangest — and perhaps most dangerous — nuclear tests ever conducted was Operation ARGUS, in late 1958.

The basic idea behind the tests was proposed by the Greek physicist Nicholas Christofilos, then at Livermore. If you shot a nuclear warhead off in the upper atmosphere, Christofilos argued, it would create an artificial radiation field similar to the Van Allen radiation belts that surround the planet. In essence, it would create a “shell” of electrons around the planet.

Frame from a government film showing the electron shell going around the planet

The tactical advantage to such a test is that hypothetically you could use this knowledge to knock out enemy missiles and satellites that were coming in. So they gave it a test, and indeed, it worked! (With some difficulty; it involved shooting nuclear weapons very high into the atmosphere on high-altitude rockets off of a boat in the middle of the rough South Atlantic Ocean. One wonders what happened to the warheads on them. They also had some difficulty positioning the rockets. The video linked to discusses this around the 33 minute point. Also, around the 19 minute mark is footage of various Navy equator-crossing hazing rituals, with pirate garb!)

It created artificial belts of electrons that surrounded the planet for weeks. Sound nutty yet? No? Well, just hold on — we’ll get there.

(Aside: Christofilos is an interesting guy; he had worked as an elevator repairman during World War II, studying particle physics in his spare time. He independently came up with the idea for the synchrotron and eventually was noticed by physicists in the United States. He later came up with a very clever way to communicate with submarines submerged deep under water, which was implemented in the late 20th century.)

James Van Allen kissing Explorer IV (a satellite used in Argus) good-bye

In early 1959 — not long after the test itself — none other than James Van Allen (of the aforementioned Van Allen radiation belts) argued that the United States should rapidly declassify and release information on the Argus experiment.1

Click for the PDF.

Van Allen wanted it declassified because he was a big fan of the test, and thought the US would benefit from the world knowing about it:

As you will note, my views are (a) that continued security classification of the Argus-Hardtack tests is of little practical avail, (b) that a prompt and full public report of the tests and observations will contribute greatly to the international prestige of the United States as a leader in the application of atomic devices to scientific purposes, and (c) that if we fail to do (b) the U.S. will be quite likely be again ‘Sputniked’ in the eyes of the world by the Soviets.

Basically, Van Allen argued, the idea of doing an Argus-type experiment was widely known, even amongst uncleared scientists, and the Soviets could pull off the same test themselves and get all the glory.

But here’s the line that makes me cringe: “The U.S. tests, already carried out successfully, undoubtedly constitute the greatest geophysical experiment ever conducted by man.” 

This was an experiment that affected the entire planet — “the greatest geophysical experiment ever conducted by man” — and it was approved, vetted, and conducted under a heavy, heavy veil of secrecy. What if the predictions had been wrong? It’s not an impossibility that such a thing could have been the case: the physics of nuclear weapons occupy a different energy regime than most other terrestrial science, and as a result there have been some colossal miscalculations that were only revealed after the bombs had gone off and, oh, contaminated huge swathes of the planet, or, say, accidentally knocked out satellite and radio communications. (The latter incident linked to, Starfish Prime, was a very similar test that did cause a lot of accidental damage.)

There’s some irony in that the greatest praise, in this case, is a sign of how spooky the test was. At least to me, anyway.

This is the same sort of creepy feeling I get when I read about geoengineering, those attempts to purposefully use technology to affect things at the global scale, now in vogue again as a last-ditch attempt to ameliorate the effects of climate change. It’s not just the hubris — though, as an historian, that’s something that’s easy to see as an issue, given that unintended consequences are rife even with technologies that don’t purposefully try to remake the entire planet. It’s also the matter of scale. Something happens when you go from small-scale knowledge (produced in the necessarily artificial conditions that laboratory science requires) to large-scale applications. Unpredicted effects and consequences show up with a vengeance, and you get a rapid education in how many collective and chaotic effects you do not really understand. It gives me the willies to ramp things up into new scales and new energy regimes without the possibility of doing intermediate stages.

(Interestingly, my connection between Argus and geoengineering has been the subject of at least one talk by James R. Fleming, a space historian at Colby College, who apparently argued that Van Allen later regretted disrupting the Earth’s natural magnetosphere. Fleming has a paper on this in the Annals of Iowa, but I haven’t yet tracked down a copy.)

As for Argus’s declassification: while the Department of Defense was in the process of declassifying Argus, per Van Allen’s recommendations, they got a call from the New York Times saying that they were about to publish on it. (The Times claimed to have known about Argus well before the tests took place.) It’s not clear who leaked it, but leaked it did. The DOD decided that they should declassify as much as they could and send it out to coincide with this, and the news of Argus hit the front pages in March 1959.

Notes
  1. Citation: James Van Allen to James R. Killian (21 February 1959), copy in the Nuclear Testing Archive, Las Vegas, NV, document NV0309054.
Redactions

Declassifying the Ivy Mike film (1953)

Wednesday, February 8th, 2012

Every good nuclear wonk has seen the delightfully over-the-top film that the government made about the Operation Ivy test in 1952. If you’ve seen any films involving nuclear test footage, you’ve probably seen parts of it, even if you didn’t recognize it as such. It probably ranks second among the all-time most-viewed nuclear weapons films.1

Ivy Mike was, of course, the first test of a staged, multi-megaton thermonuclear weapon: the first hydrogen bomb. With an explosive yield of 10.4 million tons of TNT, it was a grim explication of how tremendously destructive nuclear arms could be. Even Congressmen had difficulty making sense of its power.

A 17-minute version (down from 28 minutes, which is already down from the hour-plus version now available from Archive.org, embedded above) of the Operation Ivy film was released for American citizens on April 1, 1954. The domestic and international reactions were immediate. The Soviet Union warned its people that these weapons could destroy “the fruits of a thousand years of human toil”; Premier Nehru of India called for the US and USSR to cease all hydrogen bomb tests. It was replayed two days later in the United Kingdom with an estimated 8 million viewers, even though supposedly the film was not meant to be distributed overseas, to avoid inflaming international opinion against nuclear testing.

The New York Times’ television critic, Jack Gould, reviewed it negatively: “A turning point in history was treated like another installment of ‘Racket Squad.’”2 The problem, Gould explained, was that it used “theatrical tricks” to talk down to the audience. Now the irony here is that the Operation Ivy film wasn’t made for a television audience. It was made for the President of the United States and top military brass and folks like that. Which makes the “talking down” even more disturbing, no?

This week’s document concerns the internal deliberations by the Atomic Energy Commission regarding the declassification and sanitizing of the Operation Ivy film. This report, AEC 483/47, outlines the opinion of the AEC directors of Classification and the Information Service about whether the film could and should be declassified.3

Click the image for the full PDF.

This isn’t the story of how it ends up on American television, but it is moving in that direction. The document goes over a proposal to release an edited (sanitized) version of the film for usage at a Conference of Mayors that President Eisenhower had assembled. The goal was to convince the mayors that Civil Defense was important: you’d better act now, before your city gets nuked.

The problem: the AEC didn’t really want to release the precise yield of the Mike shot. That’s a hard thing to hide when you’re obliterating an island with it. They also weren’t keen on releasing the fact that this wasn’t a deliverable weapon yet, but they couldn’t see a way of getting around that without seriously cutting it down to nothing. But at least they managed to cut out everything about its design, and the Ivy King shot (the largest pure-fission explosion, at half a megaton).


Notes
  1. First place likely goes to the Crossroads Baker test, which aside from being used everywhere is featured very prominently, repeatedly, at the end of Dr. Strangelove.
  2. Note that the Operation Ivy narrator was Reed Hadley, from the aforementioned “Racket Squad.”
  3. Citation: Report by the Directors of Classification and Information Service regarding the Film on Operation Ivy (AEC 483/47), (8 December 1953), copy in Nuclear Testing Archive, Las Vegas, NV, document #NV0074012.
Redactions

Implosion: To Declassify or Not to Declassify? (1945)

Wednesday, February 1st, 2012

The implosion design of the atomic bomb is considered the ultimate secret triumph of Los Alamos. Unlike the relatively simple gun-type design, the implosion design required innovation on all manner of scientific fronts: nuclear physics, metallurgy, chemistry, ordnance engineering, electronics… the list goes on. Making explosive lenses that could precisely compress a solid sphere of plutonium into a supercritical state wasn’t easy.

The very idea of implosion — much less the specifics of its implementation — wasn’t declassified after the bombings of Japan. As I’ve mentioned previously, it wasn’t until 1951 (that is, well after the Soviets had demonstrated their own ability to do it) that a bare-basics idea of implosion was declassified, as part of the evidence in the Rosenberg trial.

But that isn’t to say that people hadn’t thought it might be safe to declassify it earlier than that. This week’s document is a memo on this very subject from then-Commodore William S. Parsons (USN) to Norris Bradbury, scientific director of Los Alamos, from late October 1945. Parsons you will remember as the weaponeer on the Enola Gay, and a truly key figure at multiple junctures in the development of the practical ordnance engineering of the atomic bomb. In this memo, Parsons was arguing against the declassification of implosion — something he felt he needed to do because a considerable number of folks were arguing for it:1

Click the image to view the PDF.

As you’ve probably picked up on by now, I like anything that helps me get inside the head of a classifier (or declassifier) and see how they saw the world in their time and place. It’s the grist for my historical mill: it’s how I understand nuclear secrecy as a never-quite-stable category, one that is always evolving, one whose logic is always up for grabs and thus needs to be articulated and re-articulated repeatedly.


Notes
  1. Citation: William S. Parsons to Norris E. Bradbury, “Declassification of Implosion,” (30 October 1945), copy in the National Security Archive, George Washington University, Chuck Hansen Papers, Box 11, “1945-1949,” Folder 3.