Innovation through Economic States of Exception

“America is somewhere between dire straits and dead.”

— Peter Thiel

Innovation across almost every American industry, which had seen steady upward progress in the late 20th century, reached a plateau at the turn of the millennium. As the dual dogmas of globalization and financialization took root, the American economy shifted its focus away from innovation and toward rent-seeking as a means of generating profit. This paper argues that the period of tremendous strides in American innovation from the 1940s to the 1990s was possible in part because of a continuous state of war, beginning with World War II and running through the Cold War. Impending existential conflict pushed the American people, government, and society to enter, out of desperation, an “economic state of exception,” in which they could ignore economic rationality and invest their time, money, and effort in radical causes (many of which led to technological innovation) that would be infeasible or too risky for profit-seeking entities. The end of the Cold War freed America from the existential threats of Nazi Germany and Soviet Russia, allowing it to shift back into normal capitalistic tendencies, which promote rent-seeking behavior over investment in technological innovation.

The half-century from 1940 to 1990 saw unprecedented technological innovation that pushed humanity forward through a multitude of new inventions, from nuclear power to the Green Revolution, from the television to the computer, from the Internet to the cell phone. This generation saw a man on the moon, smallpox eradicated, and air travel brought to the masses. Anthropologist David Graeber notes that for many who grew up in the fast-changing world of the second half of the 20th century, such rapid innovation came to be expected, fueling grand visions of the future of technology.

The preceding generation, who grew up reading Jules Verne or H.G. Wells around the turn of the 20th century, were told that by mid-century they would have things as far-fetched as flying machines, submarines, rockets, and televisions; the second half of the 20th century, indeed, delivered. In contrast, the modern world has not delivered the technological progress that Cold War-era science fiction predicted. Back to the Future Part II told us we would have flying cars and hoverboards by 2015. Kubrick’s 2001: A Space Odyssey imagined that by the turn of the millennium we would be taking commercial flights to the moon, conversing with strong-AI computer personalities, and building city-sized space stations. The original Star Trek imagined that by the 1990s we would already have fought our first war against genetically engineered superhumans. So why didn’t progress deliver the world it was expected to? It is necessary, perhaps, to first uncover where progress originates.

Graeber, despite being an avowed anarchist, admits that much of the innovation of the era was due to government action. He explains that “the Apollo program was a Big Government project, Soviet-inspired in the sense that it required a national effort coordinated by government bureaucracies.” Indeed, many of the innovations mentioned so far have roots in government projects or public-private collaborations. This pro-state narrative is echoed by Italian-American economist Mariana Mazzucato, who claims that even innovations typically thought of as feats of private industry tend to have roots in state investment. For example, the personal computer is often considered the creation of Silicon Valley, its invention credited to the efforts of companies like Apple, HP, and IBM. But this ignores the massive influence the government exerted by establishing “new computer science departments at various universities in the USA” and funding the research that led to the creation and growth of Silicon Valley. According to John Aubrey Douglass’s paper The Cold War, Technology and the American University, “The federal government was limping toward a sort of industrial policy…Since American industry was failing to invest in sufficient research and development to bring new products to market that could compete internationally…the government provided public funds to universities to help move the fruits of basic research into the marketplace.”

Government funding is the bedrock not only of computer technology but also of numerous other fields, such as biotechnology and nanotechnology. The same pattern of private industry building off the work of government continues even today. The self-driving car industry is built upon decades of research done by DARPA. Companies associated with “future visionary” Elon Musk, such as Tesla and SpaceX, owe much of their existence to government funding and investment. The iPhone, while a brilliant product by Apple, was built by amalgamating a number of government-created technologies, such as cellular technology, GPS, the Internet, and Siri.

What is especially interesting to note, however, is that not only were these component technologies created by the state, most of them were created by a specific sector of the government: the military. Cellular technology grew out of military communications research. The Global Positioning System was created for military and intelligence applications. The precursor to the Internet, ARPANET, was built by the Department of Defense to share information across its research and military networks. Looking slightly further back in history, Alan Turing laid the groundwork for the general-purpose computer, and the codebreaking machines he helped design at Bletchley Park were built to break the Nazi Enigma cipher. Nuclear energy came from the weaponry designed to end World War II. The rocketry built for space exploration started as rocketry built for missiles. Even the airplane, though invented by the Wright brothers, private individuals, only truly reached its potential once its use as a weapon was discovered during World War I, and it advanced again when the German military fielded the first jet-powered aircraft during World War II.

It is no coincidence that the military is responsible for so much innovation; the investment required to innovate at such a scale is often too risky, even irrational, for private enterprise to pursue. Non-monopolistic, profit-seeking firms in a capitalist system have little ability to take risks and pursue innovation. Peter Thiel, in his book Zero to One, notes the economic principle that “under perfect competition, in the long run no company makes an economic profit.” Without long-term profit, most companies are unable to accumulate enough capital to invest in high-risk innovation. Thiel continues, “In perfect competition, a business is so focused on today’s margins that it can’t possibly plan for a long-term future.” Essentially, it makes little sense for a profit-seeking company to invest in innovation; it must instead focus either on building a monopoly to escape competition (extremely difficult for most firms) or on chasing short-term profits. Historical sociologist Greta Krippner notes that accumulation aimed at short-term profits leans toward rent-seeking rather than production and innovation. In Capitalizing on Crisis: The Political Origins of the Rise of Finance, she writes that, given the opportunity, “profit making occurs increasingly through financial channels rather than through trade and commodity production.” So, if private capitalistic firms are unable to pursue innovation, why are the government and military able to do so?
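Thiel is invoking the standard zero-profit condition of introductory microeconomics. A minimal worked statement of that textbook result (the notation below is ours, not Thiel’s) runs as follows:

\[
\pi = \big(P - ATC(q)\big)\, q, \qquad \text{free entry} \;\Longrightarrow\; P \to \min_{q} ATC(q) \;\Longrightarrow\; \pi \to 0,
\]

where \(P\) is the market price, \(q\) the firm’s output, \(ATC(q)\) its average total cost, and \(\pi\) its economic profit. With \(\pi = 0\) in the long run, there is no surplus left over to fund high-risk research, which is precisely the squeeze Thiel describes.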

Simply put, they had the ability to operate outside the sphere of traditional economics. A nation at war gave the military the capacity to spend money on things it could not have justified in peacetime. For example, the Manhattan Project cost some $2 billion at the time (roughly $23 billion in today’s dollars) and was, to an extent, a gamble as to whether it would work at all. Such a risky investment would not normally have been made; given the circumstances, however, it was deemed necessary because of the dangerous possibility that Nazi Germany might build an atomic weapon first. To override economic rationality, the war need not even be physical; an ideological conflict suffices. Going to the moon, arguably one of the greatest feats of mankind, was ultimately an extremely economically irrational thing to do; manned space travel is not even an efficient way to conduct scientific research. However, the ideological battle for “global supremacy” with the Soviet Union made the space race a political imperative, whatever the economic cost. The rivalry also drove investment in consumer technology meant to “prove that our society was better.” Graeber writes that “America’s rivalry with the Soviet Union made innovation appear to accelerate. There [were]…frenetic efforts by U.S. industrial planners to apply existing technologies to consumer purposes, to create an optimistic sense of burgeoning prosperity…”. This sense of fundamental rivalry pushed both sides to invest in highly optimistic (and ultimately unsuccessful) grand projects: the United States attempted to build its “Star Wars” missile-defense system, while the Soviet Union tried to solve its energy problems “by launching hundreds of gigantic solar-power platforms into orbit and beaming the electricity back to earth.” These projects may seem insane, but it was exactly this kind of moonshot thinking that the state of conflict enabled; had the Cold War lasted longer, some of them might even have been completed.

It is important to remember that the state remains subservient to society, and its actions respond to society’s sentiment. What, then, convinced a capitalistic-minded American society to accept high taxes to fund such extravagant government spending? Throughout history, powerful forces and ideologies, such as religion or Manifest Destiny, have pushed people past economic rationality; Manifest Destiny, for instance, helped propel American expansionism through much of the 19th century. During the 20th century, however, it was the state of war that superseded society’s economic rationality.

To understand how this works, we can turn to Carl Schmitt’s Political Theology: Four Chapters on the Concept of Sovereignty. In this work, Schmitt introduces the concept of the “state of exception,” in which the “sovereign” may exercise complete control over the state, suspending its political systems, its constitution, and individual liberties for the sake of the common good. We can narrow Schmitt’s idea into a “political state of exception,” in which a crisis permits subversion of the political norm. From this we can theorize an economic parallel, which we will aptly call an “economic state of exception”: a situation in which a crisis permits subversion of the economic norm. Schmitt defines a crisis, or exception, as “a case of extreme peril, a danger to the existence of the state, or the like.” From there it is easy to argue that the Cold War, which kept people under the constant peril of nuclear attack, created an economic state of exception that temporarily suspended the capitalistic economic norm. Both kinds of state of exception are dangerous tools. Political states of exception can produce “good sovereigns” such as Julius Caesar or “bad sovereigns” such as Adolf Hitler. In the same vein, economic states of exception can result in positive or negative outcomes: both the United States and the Soviet Union invested heavily to win the Cold War, but in the end the United States made massive technological advances while the Soviet Union overspent itself into destruction. Regardless, it was this economic state of exception that enabled American society, and thus the American government, to pursue its innovative moonshot projects.

Once the peril associated with the Cold War ended, the state of exception ended with it. Once society decided that unrestrained investment was no longer necessary or in the national interest, the state lost its mandate to operate in the exception and had to accommodate society’s sentiment. One of Newt Gingrich’s first acts after becoming Speaker of the House in 1995 was to defund the Office of Technology Assessment (OTA), citing it as an example of useless government extravagance. Since the end of the 1990s, the government has become steadily less innovative, turning its focus away from innovation and toward activities like redistributing money; in every year since the end of the Cold War, government entitlement spending has eclipsed discretionary spending. After the Cold War, the government even began to mimic some of the neoliberal values espoused by industry and society, which caused it to falter further in the realm of innovation. Graeber writes that “The increasing interpenetration of government, university, and private firms has led everyone to adopt the language, sensibilities, and organizational forms that originated in the corporate world. Although this might have helped in creating marketable products, since that is what corporate bureaucracies are designed to do, in terms of fostering original research, the results have been catastrophic.” A similar period of stagnation occurred in the interwar 1920s, when innovation slowed and America retreated into isolationism and financialization; this contributed in part to the depression of the 1930s, which lasted until the attack on Pearl Harbor triggered a new state of exception.

However, to say that innovation has come to a complete halt since the end of the Cold War would be disingenuous. One industry has actually accelerated its rate of innovation: information technology. To explain this, we must remember that the end of the Cold War did not mean the end of ideological conflict and war forever; Schmitt argued that fundamental conflict will always exist between philosophically opposed entities. The opportunity for a new state of exception came on September 11, 2001, when America was attacked by a type of enemy it had never faced before. Unlike in previous conventional wars, or even the Cold War, in which America fought a known enemy, it was now at war with an enemy it could not easily trace or find, one that used the new digital realm to its advantage. Eliminating this new peril required innovation not in rocketry, nuclear weaponry, or transportation systems, but in mastery over information and the digital realm. That is exactly where governmental innovation has been focused in the 21st century: communications and surveillance technologies.

What can we learn from this narrative linking the rapid governmental innovation of the latter half of the 20th century to the state of war through the theory of economic states of exception? Does it mean that the only solution to technological stagnation is to start a new global war and place ourselves in Cold War-style peril once again? Not quite. Economic states of exception are simply one way to overcome societal economic norms. Any ideology that can convince society to value something beyond short-term economic profit can achieve similar results. As discussed earlier, much of America’s rapid 19th-century innovation, in industries such as railroads, was propelled not by fear of an existential threat but by the more positive belief in ideals such as Manifest Destiny. It may be possible to reorient societal values toward, for example, scientific curiosity over economic concerns. Even if we adopt the pessimistic view that only states of exception can motivate our society, the “crisis” we face does not have to be another nation or group; it merely has to be an existential threat to society. If we could get society to rally around fighting the existential threat of climate change, we could overcome our natural capitalistic tendencies and use an economic state of exception to both advance technology and save the planet.

Sources

Chow, Denise. “DARPA and Drone Cars: How the US Military Spawned Self-Driving Car Revolution.” LiveScience, March 21, 2014. Accessed April 14, 2017. http://www.livescience.com/44272-darpa-self-driving-car-revolution.html.

Cowen, Tyler. The Great Stagnation: How America Ate All the Low-Hanging Fruit of Modern History, Got Sick, and Will (Eventually) Feel Better. New York: Dutton, 2011.

Douglass, John Aubrey. The Cold War, Technology and the American University. CSHE Research and Occasional Paper Series, July 1999. http://www.cshe.berkeley.edu/sites/default/files/shared/publications/docs/PP.JD.Sputnik_Tech.2.99.pdf.

Graeber, David. “Of Flying Cars and the Declining Rate of Profit.” The Baffler, March 2012.

Hanlon, Michael. “The Golden Quarter.” Aeon, December 3, 2014.

“Has the Ideas Machine Broken Down?” The Economist, January 12, 2013.

Krippner, Greta R. Capitalizing on Crisis: The Political Origins of the Rise of Finance. Harvard University Press, 2012.

“Manhattan Project.” CTBTO Preparatory Commission. Accessed April 18, 2017. https://www.ctbto.org/nuclear-testing/history-of-nuclear-testing/manhattan-project/.

Mazzucato, Mariana. The Entrepreneurial State: Debunking Public vs. Private Sector Myths. New York: PublicAffairs, 2015.

Schmitt, Carl. Political Theology: Four Chapters on the Concept of Sovereignty. University of Chicago Press, 2006.

Thiel, Peter, and Blake Masters. Zero to One: Notes on Startups, or How to Build the Future. New York: Crown Business, 2014.