One of the most interesting courses I took as an undergraduate at UC Berkeley was a class on cognitive science from the famed linguist George Lakoff. The course was essentially just us reading through his classic book, Women, Fire, and Dangerous Things: What Categories Reveal About the Mind (University Of Chicago Press, 1987), and then going to lectures where Lakoff would talk about the big points and answer questions. It was a pretty great way to get introduced to Lakoff’s approach to the mind: metaphor and frame analysis.
For Lakoff, metaphors are not just linguistic trickery, but are deep ways of understanding how the human mind operates. It’s the Sapir-Whorf hypothesis with a cognitive science spin, more or less. The conceptual metaphor is a hint as to how our brains actually process the world. If we think that a discussion is a battle, we approach it differently than we might if we thought that a discussion is a dance (as one of my old, dear Berkeley friends liked to put it). At its most basic level, the conceptual metaphor is the borrowing of properties from a source domain (battles or dancing, in my example) and applying them to something in a target domain (the discussion). The argument that it is “just” a metaphor usually means, “well, we all know that metaphors have limitations, and we don’t literally bring over all properties from the source domain to the target domain,” but Lakoff would argue that the brain isn’t quite so “logical” as that, and that the metaphors we pick do matter.
Historians have found conceptual metaphors useful tools because they help us make the argument that the way in which historical actors talk about something is actually important. Paul Edwards, in his classic The Closed World: Computers and the Politics of Discourse in Cold War America (MIT Press, 1996), uses Lakoff quite explicitly in this way, arguing that the hegemonic metaphors that Cold Warriors like Robert McNamara used in trying to articulate their technological systems were more than just figures of speech — they actually tell us a lot about how someone like McNamara conceptualized the Cold War itself. There are no doubt limits to this line of analysis, as many folks have pointed out, and perhaps we historians should be suspicious of adopting too many forms of deconstruction that make us focus on the style rather than the substance of our historical actors, but when done right (and I find Edwards’ book useful and compelling), it does enlighten us a bit.
All of this is prelude, as usual, for my image of the week. One of the most common metaphors for talking about secrecy is information is a liquid (to put it in the Lakoff-style conceptual metaphor format). Information flows; secrets leak. Don’t drown in that sea of data! And so on.
Hugh Gusterson has recently taken issue with the question of how technology “leaks,” in reference to a recent New York Times article about commercialization of the Silex method of laser isotopic enrichment. Sayeth The Times: “He and two other former government officials concluded that the laser secrets had a low chance of leaking and that a clandestine laser plant stood a high chance of being detected.” Gusterson comments on this line:
It conjures an image of a fluid escaping containment to go somewhere it should not. If there is agency involved it belongs not to people but to the fluid secrets themselves, which have, however, a “low chance of leaking.”
Gusterson then goes on to point out that this information as liquid metaphor is really quite at odds with how science studies scholars have come to understand technology transfer: it’s not about the information so much as it is about the people. Focusing on the “leaking” as a property of the information, rather than of the folks who work on it (and leak it), encourages looking at it all the wrong way.
I have my own problems with the information as liquid metaphor. The truth is, information doesn’t really act like a liquid much of the time. Liquid can’t duplicate itself into perfect copies; liquid can’t suddenly be “rediscovered” by multiple people; liquid can’t be pieced together from tiny parts to form a coherent whole; and so on. Liquid is an extremely lousy metaphor in the Information Age in particular, where our data literally can be sent through the air, through solid walls. (Perhaps information as radiation is a little too cheeky to be a replacement?)
I’ve actually given talks where I’ve railed on about how silly the information as liquid metaphor is — as if you could put it into a pool, or behind a giant dam. So I was pretty floored when I found out that the AEC’s Director of Classification, Charles L. Marshall, was using the following image on his letterhead in the early 1970s:
The idea here is to illustrate how the different categories of classification interact with each other — “Restricted Data” is a special sub-pool of secret defense information, and moving information between these separate “pools,” or outside of the “secrecy pool” itself, can only be done by specific legal means. (The movement of “Restricted Data” to “Formerly Restricted Data,” which is still a separate little secret pool of its own, is especially confusing to the uninitiated. FRD is basically information relating to nuclear weapons that you can safely give members of the armed forces for the purpose of actually using the bombs. It’s still a special category, but it’s no longer “Restricted Data,” strictly speaking. And despite its “formerly” modifier, it is not declassified.)
Further digging through the files of the Joint Committee on Atomic Energy turned up the source of this drawing, as well as another version of the information as liquid metaphor:
This one shows the action of declassification as some sort of immense plumbing system, moving “Restricted Data” from behind the “dam” of secrecy into the bustling civilian world of science and industry. Both of these sketches were drawn by Howard F. Gunlock, a Senior Classification Analyst with the AEC in the early 1960s. They accompanied an article by AEC Classification Director Charles L. Marshall on “Classification/Declassification” that ran in Nuclear News, a publication of the American Nuclear Society, in the February 1965 issue.
So there you have it: the information as liquid metaphor taken to its graphical extreme by members of the Atomic Energy Commission themselves, as a way of making graphical sense of how classification/declassification works, drawn by actual AEC employees. You can’t make this stuff up, folks.
Great post! Your introduction on how the metaphors used can shed some light on how historical actors viewed a particular issue reminds me a great deal of Phil Zelikow’s discussion of analogical vs. deductive reasoning in this AWESOME lecture that he delivered on the Manhattan Project during the summer 2010 Hertog Global Strategy Initiative’s “Nuclear Summer.” Absolutely worth a watch.