Posts Tagged ‘Language’


The End of the Nuclear Age

Friday, August 17th, 2012

In the twentieth century, Americans in particular seemed to pick up a bug for defining themselves by the technologies they used. We are always, apparently, living in an "Age" of something. In and of itself, defining your own "Age" while you are living in it isn't brand new (there was that whole "Age of Enlightenment" thing, of course), but our amazement at the apparently changed pace of life brought on by science and technology sped this up considerably once things really got hopping in the last century.

New York Times Magazine, August 12, 1945.

Starting in August 1945, we officially began living in the "Atomic Age." Which is to say, really, that people started saying that they were living in the Atomic Age. We did this for a while, and at some point transitioned to the "Nuclear Age," the "Jet Age," the "Space Age," and so on.

A nice set of historical questions follows: When did those transitions between "Ages" happen? Which "Ages" were more influential as terms of self-identification? Do "Ages" die, or just fade away?

Google's Ngram Viewer makes this sort of thing quite fun to track, though the results aren't necessarily straightforward. Basically, the Ngram Viewer can track the usage of specific (case-sensitive) phrases across the Google Books corpus over time, normalizing them to a relative frequency of use (so that the results don't just reflect how many books were being published at any given time). It's a nice way to get preliminary information about linguistic disputes, though it has plenty of obvious methodological difficulties.1
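The normalization step is worth making concrete. A minimal sketch, using entirely made-up numbers (the real Google Books corpus totals differ): divide a phrase's yearly match count by the total number of words in the corpus for that year, so that a growing corpus doesn't masquerade as a growing trend.

```python
def relative_frequency(phrase_counts, corpus_totals):
    """Map year -> raw phrase count onto year -> fraction of all corpus words."""
    return {
        year: phrase_counts[year] / corpus_totals[year]
        for year in phrase_counts
        if year in corpus_totals and corpus_totals[year] > 0
    }

# Hypothetical counts for the 2-gram "atomic age" (illustrative only):
phrase_counts = {1944: 10, 1945: 4_000, 1950: 6_000}
corpus_totals = {1944: 2_000_000_000, 1945: 2_100_000_000, 1950: 3_000_000_000}

freqs = relative_frequency(phrase_counts, corpus_totals)
# 1945 shows a far larger share than 1944, even though the corpus itself grew.
```

The point of dividing rather than plotting raw counts is exactly the caveat above: book production rose enormously over the century, so raw counts for almost any phrase trend upward.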

A wonderful case study: when did "atomic" lose traction to "nuclear"? Google Ngrams gives a fairly unambiguous result:

Click image to see data, details.

In the beginning (1945), "atomic" was king. In 1958 or so, it was surpassed by "nuclear." This coincides nicely with the startup of the first nuclear power plant in the United States, the Shippingport Atomic Power Station. It's hard not to conclude that the shift from "atomic" to "nuclear" was caused by the growth of the nuclear power industry (even if Shippingport itself carried "Atomic Power" in its name).

While "atomic" was in free-fall from then on, "nuclear" enjoyed two peaks: one in the mid-1960s, then one in the mid-1980s. This maps fairly well onto the cultural history of nuclear weapons in the United States.2 With only a slight understanding of nuclear history, the 1960s (Cuban Missile Crisis, ICBMs, Limited Test Ban Treaty, Non-Proliferation Treaty) and the 1980s (Reagan, Gorbachev, Pershing missiles, Reykjavik, "Star Wars," Chernobyl) conjure up periods of high cultural interest in many things nuclear.

Atomic Dining Room, 1952, Augusta, Georgia. Would anybody want to eat in a Nuclear Dining Room?

Interesting side-note: we all know that scientists and other precise-minded people consider "atomic" to be an inferior designation to "nuclear." The energy we care about is not "atomic" in nature (the atom also includes the electrons); it's specifically involved in the fissioning or fusing of nuclei. And yet "atomic" was what was plastered across the official government statements in 1945; the Smyth Report was originally meant to be titled "Atomic Bombs," as I discussed on Wednesday. An interesting wrinkle is that Smyth himself hated the use of the term "atomic" when "nuclear" was meant, but was overruled by Groves and others. "Nuclear" just wasn't a word well-known to the general public in 1945, whereas "atomic" had been common currency for a long time.

During the Manhattan Project, the scientists at the University of Chicago thought that they ought to use a completely new term to describe what they were doing:

We propose to use the word "nucleonics" as a name for this field. Reflecting the modern trend toward close correlation between science and industry, and following the lead of "electronics", we propose that the word "nucleonics" shall refer to both science and industry in the nuclear field.3

"Nucleonics" didn't really take off, though. It was occasionally used by scientists, and there was a journal with that title, but in the public mind it never had any traction.

Returning to our question about the "Ages," I ran a whole bunch of "age" phrases through the Ngram viewer. Here are the interesting results, methodological caveats notwithstanding:4

An interesting conclusion: We no longer live in the "nuclear age." Which is to say, we no longer define our times by the fact of our using nuclear technology -- which we still do, in abundance. (The United States still has well over 100 operating commercial nuclear reactors, providing around 20% of the nation's electricity generation. The world still has thousands of nuclear weapons in it. Nuclear issues still appear on the front pages of newspapers with alarming regularity.) But since the mid-1990s, "information" has defined us overwhelmingly.

Methodological issues with these kinds of keyword searches aside, these results jibe with the general feeling that our having specifically "nuclear" technology is no longer a distinguishing — or at least novel — characteristic of the times in which we live, in the same way that calling attention to the engines in our airplanes, or places we visited on a handful of occasions (outer space), soon ceased to be definitional of our times.

There's an easy narrative one can make about this — perhaps too easy. New, disruptive technologies enter into our world. They seem to change everything. Machines completely changed the way labor worked and the nature of manufactured goods. The atomic bomb seemed to change everything about security, diplomacy, and war. The jet suddenly made distances very small indeed. Nuclear power and nuclear weapons became a mainstay of modern life. And information gradually began more and more to define how we operated in the world.

And yet, not one of these technologies replaced the others. We still have machines. We still have jets. We still have nuclear weapons and nuclear plants. But all of the others have long since ceased to impress us. Information still impresses us — we're still in the middle of its thrall, we're still shocked and surprised by the things it does for better and worse. So even though the information age feels a little old hat at this point, as a phrase, it's still going strong in the zeitgeist. Until the next revolution.

But lest we feel that Information is something terribly new and shocking, take a look at that graph again: none of these, even the Information Age, holds a candle to how people talked about living in the Machine Age. One might be tempted, were one to take a long view of things, to say that the twentieth century was bracketed on one side by Machines, and on the other by Information. In between, we flirted with the Bomb.

  1. Transcription fidelity and dating fidelity are two major systemic issues with the Google Books corpus; the inability to tell in what sense a given word is being used is an issue with any kind of "dumb" concordance approach. []
  2. I haven't separated out "American English" from "British English" on these charts, though Ngrams lets you do it. Frankly, I just don't trust the results — I don't know how it is claiming to tell one from the other, and I fear it has to do with publisher location, which is very misleading. In this particular graph, though, there are some interesting differences in the "British English" version from the "English" and "American English" versions, which are basically the same. Specifically, the rise of "nuclear" occurs slightly later, and slower in "British English," and has a much more impressive peak around 1985. []
  3. Z. Jeffries, Enrico Fermi; James Franck, T.R. Hogness, R.S. Mulliken, R.S. Stone, C.A. Thomas, "Prospectus on Nucleonics," (18 November 1944), Bush-Conant File Relating the Development of the Atomic Bomb, 1940-1945, Records of the Office of Scientific Research and Development, RG 227, microfilm publication M1392, National Archives and Records Administration, Washington, D.C., n.d. (ca. 1990), Roll 3, Target 4, Folder 17, "S-1 Technical Reports (1942-44)." []
  4. In all cases except Information Age, lower-case capitalization increased prevalence. One might suspect that indicates weird false-positives, but a perusal of the data shows that indeed, people did refer to, say, the "machine age" in lower case: "With the coming of the machine age" ... "For the sake of the argument, it may be conceded that the machine age has produced nothing comparable with the best of the painting, sculpture, and architecture of antiquity and the middle ages" ... "Further, if we are to preserve our adolescents from the banal mechanizing of a machine age"... and so on. "Motor Age," "Plastic Age," and anything related to the biological sciences don't really chart compared to the others. "Sex Age" charts about as much as "jet age," albeit a decade later, but sex isn't exactly a technology, so I've left it off. And the biggest difficulty here, of course, is the fact that you can't tell from word counts whether people are self-identifying — thus a search for "Industrial Revolution" tells you little about what people called themselves at that point, given that the phrase takes off pretty much after the Revolution in question as a way of talking about the period itself. Ditto "Middle Ages," "Age of Exploration," and other such phrases which are descriptive of past times rather than present times. Additionally, it should be noted that the dataset only goes through 2008. []

Do We Want Another Manhattan Project?

Monday, April 2nd, 2012

Some time back, I set up a Google Alert to forward me, on a daily basis, any news stories that mention the phrase "Manhattan Project." The not-terribly-surprising result is that I get to see exactly how often people call for new Manhattan Project-like efforts to solve various technical or political problems of our times.

It turns out to be something that is very often called for. More than I had expected. Each and every day, more or less, someone, somewhere, is invoking the Manhattan Project as the be-all and end-all for intense problem solving.

One of the silver Manhattan Project pins issued after the war to individuals who worked on the Manhattan Project for over a year. From my personal collection.

Here is what people seem to mean by calling for a new Manhattan Project:

  • Round up all the experts on a given subject. All of them.
  • Give them near unlimited resources and allow them to follow each and every possible lead towards solving the problem. Each and every one of them. Even redundantly.
  • Expect positive results in three to five years, give or take. What else could the outcome of such an effort be?

Well, that sounds fine and dandy if you support whatever the outcome is meant to be — renewable energy, bird flu studies, understanding autism, fertilizer, nuclear fusion, fighting cancer (just to name a few such ideas floated in the last two weeks alone) — and you believe that it actually works that way.

But here's the rub: this is a really lousy way to think about what the Manhattan Project was.

Here's what the Manhattan Project means to me, as an historian of, well, the Manhattan Project:

  • Spend a fantastical amount of resources — diverting it from other necessary projects — on something which is actually a huge gamble. We tend to gloss over the fact today that when they started pouring money into the bomb work in 1942 and 1943, there were really quite long odds that it would pay off, and it was a lot harder to pull off than they had originally estimated when they OK'd the project.
  • Invest so much into it that the penalty for calling off the work, even if the initial purpose is no longer relevant or feasible, would be institutionally impossible. In fact, invest so much effort into it that the question of whether it ought to be pushed towards completion is essentially unraisable amongst the highest echelon of planners.
  • Do all of the above with essentially no external oversight or independent auditing. This is actually a requirement for spending money on such a wildly improbable project at any time when resources are considered scarce — which is always.1
  • Keep all long-term or big-picture planning in the hands of a deliberately small cabal, and rigidly work to keep dissenting views out of view of those who might actually care about them.
  • After completing your goal, instead of dismantling the infrastructure and moving on to other problems, expand it considerably, making it several times larger than anyone would have guessed during the initial project.

Now, I hear you saying, "well, Alex, obviously no one who invokes the Manhattan Project means any of that!" Oh, I know. That's the problem — we've turned "the Manhattan Project" into a vague metaphor for scientific effort. But it's highly misleading. The Manhattan Project was an unusual, somewhat dubious enterprise that had massive, world-affecting consequences. Ignoring that not only misunderstands the Manhattan Project, it misunderstands what happens when you pour essentially unlimited resources into a given field — which actually is the primary goal of those who use this metaphor.

The problem is, the Manhattan Project "worked," if by "worked" you mean, "produced atomic bombs for use in the Pacific theatre during World War II." It almost didn't work — there are plenty of reasons to believe that the war would have been over fairly soon with or without the bombs (the main historical question is not whether it would soon end, but on what terms and at what costs). But because it did "work," it seems inevitable and even like a good gamble. But the odds were actually incredibly long, and it was an incredible accomplishment to get that much done in that amount of time — a flattering fact for those who worked on it, but something we tend to gloss over since we know how the story ends.2

The other problem is that many of the other models one might use had much more ambiguous results. Do we want a "War on Cancer" for our present problems? Probably not, since that didn't actually end cancer. (It did many good things, to be sure! But it also showed us that "cancer" is a fantastically bigger medical problem to surmount than many people had realized.)

Do we want an "Apollo program" instead? Maybe — getting to the moon first is seen as a more or less good thing today by most Americans. But it also has overtones of technical innovation for propaganda's sake, and in its own time, the technical accomplishments were noticeably complicated by a rupturing domestic scene (riots, Civil Rights, Vietnam, etc.). Of course, the fact that the popular attitude is something along the lines of "we went to the moon, and then space got boring," doesn't help. Still, it is probably a more realistic model for what one might want, in terms of time horizons, expenditures, and programs that are actually compatible with democratic institutions.

Ironically, a public call for a new "Manhattan Project" is also something of a logical contradiction. There was no public call for the original Manhattan Project — it was a very private, secret effort by a small group of insiders running off of specialized knowledge that most of the world wasn't aware even existed. General Groves would have loathed (and would have attempted to censor) an actual public call for making atomic bombs in the 1940s, because it would have completely defeated the goals of the Manhattan Project as they actually existed — secrecy, winning the "race," avoiding audits, and so on. Calling for a Manhattan Project makes about as much sense as ordering up your own surprise party.

  1. See my previous post on the Origins of the Nuclear Black Budget for a discussion of these kinds of issues. []
  2. I'd be remiss not to cite Michael Gordin's provocative book, Five Days in August: How World War II Became a Nuclear War, which interrogates the "what do we mean by work?" question in wonderful detail. I wrote a review of it awhile back, which you can read here if you are interested. []

Liquid Metaphors of Classification (1965)

Friday, January 6th, 2012

One of the most interesting courses I took as an undergraduate at UC Berkeley was a class on cognitive science from the famed linguist George Lakoff. The course was essentially just us reading through his classic book, Women, Fire, and Dangerous Things: What Categories Reveal About the Mind (University Of Chicago Press, 1987), and then going to lectures where Lakoff would talk about the big points and answer questions. It was a pretty great way to get introduced to Lakoff's approach to the mind: metaphor and frame analysis.

For Lakoff, metaphors are not just linguistic trickery, but are deep ways of understanding how the human mind operates. It's the Sapir-Whorf hypothesis with a cognitive science spin, more or less. The conceptual metaphor is a hint as to how our brains actually process the world. If we think that a discussion is a battle, we approach it differently than we might if we thought that a discussion is a dance (as one of my old, dear Berkeley friends liked to put it). At its most basic level, the conceptual metaphor is the borrowing of properties from a source domain (battles or dancing, in my example) and their application to something in a target domain (the discussion). The argument that it is "just" a metaphor usually means, "well, we all know that metaphors have limitations, and we don't literally bring over all properties from the source to the target domain," but Lakoff would argue that the brain isn't quite so "logical" as that, and that the metaphors we pick do matter.
