As We May Think, 2012

 

In July of 1945, just as the conclusion of World War II was coming into view, Vannevar Bush, former dean of engineering at MIT, then administrator of the twin war-time revolutions of radar and the atomic bomb, co-founder of Raytheon (the future Cold War military-industrial powerhouse), and soon-to-be advocate of a National Research Foundation (to support basic research in the United States), asks in a provocative article in The Atlantic, “what are the scientists to do next?” Acknowledging that “it has been a war in which all have had a part,” he contemplates what comes after a total mobilization of society (involving the military, industry, science, journalism, politics, and civilians) against a global enemy. What happens to all that expertise, all that social energy, once the war is over? His radical answer, of course, is to recommend that U.S. institutions simply stay mobilized, confronting the future itself as a project requiring drastic new forms of technological engineering for the good of both commerce and security. Thus, even before the atomic bombings of Hiroshima and Nagasaki pushed the Japanese leadership to surrender, Bush is already thinking of the next act for the fledgling U.S. military-industrial economy, of how the extraordinary productivity of a new kind of technoscientific war machine could be extended into a post-war future.

 
In a fascinating move, he focuses his “As We May Think” article on information technologies, and specifically on the problems of archiving and retrieving knowledge as it proliferates across scientific fields. He examines emerging modes of image making (television, photography, film) and imagines future calculating machines that are eerily prescient of the coming world of computers, hypertext communications, and digitization. Bush considers information overload as a barrier to social progress, and contemplates how to actualize the full potential of knowledge production across the sciences and professions in real time. Widely remembered today as a premonition of the coming information economy, his essay can also be read as a roadmap for a permanent military-scientific mobilization, as a basic means of engineering the American future. For example, it is crucial to know that much of the technological innovation in photography over the next generation would come from military-scientific efforts to photograph the exploding atomic bomb, a process that revolutionized high-speed photography, color and specialty film stocks, and ultimately produced high-definition digital imaging. Similarly, our contemporary world of information technologies was founded on the Cold War investment in digital computing by the Defense Department, which invented networked digital communication — the foundation for today’s internet — as a means of maintaining command and control during nuclear war. Moreover, work on the atomic bomb has pushed the frontier of supercomputing for decades, with the fastest computers in the world residing at the national laboratories devoted to the U.S. nuclear stockpile. Thus, while Bush contemplates the revolutions for law, medicine, chemistry, and history promised by instantly available archives of expert knowledge — or, as he puts it, “science may implement the ways in which man produces, stores, and consults the record of the race” — he also approaches science itself as an instrumental state project, a necessary means of crafting the future by achieving ever greater technical capacities across the range of expert practices. In this way, science as instrumental reason is implicitly nationalized in Bush’s presentation, a game played for American advantage in the dangerous world of competing corporations and nation-states.
 
Thus, in addition to the specific revolutions in information technologies he forecasts in 1945, Bush offers a more structural — and profound — articulation of the emerging U.S. social relationship to knowledge production itself. In this account, knowledge has meaning in the first instance as a form of military power, making war both the ideal form of, and the unending motivation for, science and technology. We can imagine that his managerial overview of spectacular war-time achievements — radar and the atomic bomb — cemented this concept of a permanent technological revolution for Bush, who projects constant innovation across the sciences into a deep and dangerous future. He concludes his essay by stating:
 
The applications of science have built man a well-supplied house, and are teaching him to live healthily therein. They have enabled him to throw masses of people against one another with cruel weapons. They may yet allow him truly to encompass the great record and to grow in the wisdom of race experience. He may perish in conflict before he learns to wield that record for his true good. Yet, in the application of science to the needs and desires of man, it would seem to be a singularly unfortunate stage at which to terminate the process, or to lose hope as to the outcome.
 
The application of science to needs and desires is a race to the future. But what are these needs and desires “of the race” in a world of real and imaginary existential threats? How are they to be bounded or pursued, and at what cost? When can one know that the future has been secured and wisdom achieved?
 
As Bush predicted, U.S. science has been one of the extraordinary achievements of post-World War II American society, leading to continuing revolutions across health, industry, technology, and communication. But this massive investment in science and engineering was achieved not as a structure of peace but rather through the Cold War — a permanent militarization of American society. Americans have not thought enough about what it means to give up so completely on the idea of peace, or considered how deeply U.S. institutions are imbricated with militarism. I would suggest this is because permanent mobilization is so materially productive and psychologically seductive, and because the Cold War turned militarism itself into a form of normality.
 
In a project of total social mobilization, the boundary of war itself expands to include all parties. Thus, a civilian National Science Foundation (as advocated by Bush) was conceived also as a supplement to ongoing military research expenditures — which in the 1950s accounted for roughly two-thirds of all U.S. spending on science. Similarly, a revolutionary investment in the social sciences under the rubric of Area Studies focused scholarly energies on those areas of the world of interest to the emerging U.S. Cold War geopolitical strategy. Corporate alliances involved not only military-industrial companies like Raytheon, but also Kodak, IBM, Bell, and GM. The ever-deeper imbrication of militarization within American institutions after World War II makes peace a structural problem for American society rather than its goal. Indeed, a recurring problem for this defense establishment is that of endings: how to reorient from a permanent mobilization against existential danger toward an everyday filled with mundane forms of low-level violence (e.g., accidents, disasters, calamities), while also recognizing the deep challenges to infrastructure and sustainability.
 
In 1993, I went to Los Alamos imagining I would write a book about the end of the Cold War, thinking I would talk with nuclear weapons scientists about winning a war and retooling their scientific capacities for peace. Instead, I found an expert community tooling up, both politically and conceptually, for a future structured by continued nuclear threat. At that moment, it was a scientific community deeply confused (with the demise of the Soviet Union) about where that threat might come from, but also deeply convinced of the existence of an unnamable yet totalizing danger. Nuclear fear was thus simultaneously eliminated, with the fall of the great Cold War enemy, and proliferating in the immediate post-Cold War moment, as experts realized they no longer had the capacity to think without an existential danger. Permanent mobilization remained the answer because no other answer seemed possible — a drastic foreshortening of the possible futures of the national laboratory system in favor of threat proliferation and nuclear weapons. Just as Americans confronted the problem of winning in 1945 by simply redefining “war” as “defense,” in the post-Cold War moment U.S. society did not demobilize but rather constituted the future itself as threatening. Unlike war, defense is a potentially unending commitment, as there is no specific enemy to defeat but rather the challenge of a proliferating universe of imaginary dangers to be defeated in the future.
 
Today, when we think about the costs and accomplishments of the U.S. commitment to militarism — unprecedented in human history in both its scale and planetary reach — the most difficult aspect to consider is the counterfactual: to imagine an American society interested in, and capable of, demobilizing, of redirecting the remarkable creative and financial commitments now devoted to the war machine toward non-militarized projects. If war is now part of the DNA of American science and technology, it is also because conflict itself has been crafted to be largely invisible at home, allowing those with no overt war-like interests to nonetheless participate in the industries, institutions, academies, and processes (e.g., elections) that are deeply imbricated in the structural reality of a permanent U.S. global mobilization.
 
With the end of the war on terror potentially in sight, if we were to revisit Vannevar Bush’s question today — what should the scientists now do? — we would have to ask it of a broader section of American society: of taxpayers as well as experts, of politicians as well as soldiers. For how is demobilization thinkable after war itself has been naturalized as the basis for American power, a multigenerational commitment to “defense” that has survived every kind of domestic crisis, as well as lost conflicts (Korea, Vietnam, Iraq, Afghanistan), while allowing national infrastructures to become both frayed and outdated?
 
As we may think, 2012: what is to be done with the U.S. war machine, and how do we articulate its boundaries? Unlike in 1945 (and partly because of the great information technologies Bush saw coming), we now know that permanent militarization has a high social cost: the diversion of funds, expertise, and energy to war extracts those resources from domestic life while also chilling the political potential of a democracy. Today, after the near annihilation of the Al Qaeda leadership behind the 9/11 attacks, including Osama Bin Laden, the U.S. has the ability to declare an end to the war on terror at any time it wishes. Doing so would enable a long-overdue recalibration of American society, allowing its extraordinary resources to be directed at non-militarizable dangers — climate, energy, health, and education — all of which require technoscientific understanding and innovation. Bush seems to suggest that this outcome is his ultimate goal in 1945, once the nation overcomes the implied but not yet formally acknowledged danger of nuclear crisis. But at that moment, war was still war, peace was still thinkable, and American institutions were structured by neither the Cold War “balance of terror” nor the U.S. “war on terror.” In the 21st century, American society has naturalized technological revolution — living in an age of constantly changing technical capacities (a register of the incredible vitality and creativity of contemporary science and engineering) — but it has also naturalized permanent war as the basis for American success. Perhaps now, with our perfect digital archives and instant systems of information retrieval, we could focus on remembering a time, not actually so long ago, before war became defense, and when peace was a national good.

Joseph Masco