Abstract
One of the problems with assessing the emerging threats from advances in biotechnology and the life sciences is that there are two competing narratives for understanding biotechnology development: one based on the notion of a biotech revolution and the other based on biotech evolution. The biotech revolution is a dystopian tale in which scientific advances lead to rapid changes in biotechnology, its applications, and its potential threats. The biotech evolution, however, describes slower and more complex trajectories for biotechnology development and threats. These two narratives are based on different assumptions and analytic methods, which lead to fundamentally different conclusions. The US intelligence and policy communities use the biotech revolution story line to make sense of today’s bioweapons threats, but this approach fails to reflect the complex social, economic, scientific, and technical factors that shape biotechnology and life science developments. The author argues that more critical perspectives on biotechnology are needed in order to improve intelligence assessments of present and future bioweapons threats and policy regarding them.
In 1998, President Bill Clinton read a novel about biological warfare that deeply disturbed him. In fact, the story reportedly kept him up all night. It’s one of the reasons that Clinton became personally invested in protecting the United States from bioterrorism threats. The book was The Cobra Event (Preston, 1998), a sci-fi thriller by journalist and novelist Richard Preston that told of a mad scientist who brewed a lethal, genetically engineered virus in his New York City apartment. Preston’s tale highlighted the potential ease with which individuals or small groups with access to advanced bioweapons capabilities could launch attacks on major US cities. 1 After reading The Cobra Event, Clinton called several advisory meetings and ordered classified assessments and simulation exercises to examine the threat depicted in the story. As a result of these deliberations, by the end of his administration Clinton had increased funding for biodefense preparedness efforts fourfold, to more than $400 million per year (Schuler, 2004: 87).
Clinton’s heightened concern about bioweapons illustrates the power of storytelling—even in policy making at the highest levels. And it’s not just Preston’s book. Other dystopian stories about bioweapons threats have also played powerful roles in shaping intelligence and policy. 2 Scientists, biosecurity experts, and intelligence and policy officials drawing on these plot lines have often emphasized how advances in the life sciences will lead to revolutionary changes in the power of biotechnology, its applications, and its potential bioweapons threats (Block, 1999; Bowman et al., 2011; Brent, 2006; Carlson, 2003; Central Intelligence Agency, 2003; Choffnes et al., 2006; Commission, 2005; Dando, 2010; Danzig, 2003; Epstein, 2005; Institute of Medicine and National Research Council, 2006; Nouri and Chyba, 2009; Petro et al., 2003). In recent years, the US intelligence community has focused increasing attention and resources on assessing these emerging biotechnology threats (Bhattacharjee, 2007; Central Intelligence Agency, 2003; Commission, 2005; National Intelligence Council, 2004).
Although scary stories about biotechnology have become popular and are often taken at face value in biosecurity discussions, do they reflect reality? Or has everyone become captive to tales that should be confined to the realm of science fiction? Are there alternative ways to think about biotechnology and its possible implications for bioterrorism that can lead to better understanding for the purposes of intelligence assessment and policy making?
There are two contrasting narratives for understanding biotechnology development: one based on the notion of a biotech revolution and another based on biotech evolution. To date, the US intelligence and policy communities’ reliance on the revolution-based approach for their understanding of emerging bioweapons threats—with its frightening assumptions of rapidly advancing technical capabilities and escalating threats—fails to take into account the complex social, economic, scientific, and technical factors that shape biotechnology and life science developments. The evolution-based approach, however, which describes slower and more complex trajectories for biotechnology development, pinpoints important social and technical dimensions of biotechnology that have been overlooked in intelligence and policy assessments.
These two narratives are based on different assumptions and analytic methods, which lead to fundamentally different conclusions about the character and trajectory of biotechnology developments and their implications for bioweapons threats. An exploration of both views demonstrates that, instead of instinctively accepting the revolution story, intelligence analysts and policy officials need to consider more critical perspectives and to bring broader sets of expertise to intelligence and policy discussions to improve their understanding of bioweapons threats.
The revolution narrative
The biotechnology-as-revolution story emphasizes the importance of codified knowledge and the material aspects of biotechnology, with a fixed technological trajectory. Gerald Epstein (2012), Homeland Security Department deputy assistant secretary for chemical, biological, radiological, and nuclear policy, has identified four tenets of the revolution paradigm: biotechnology is becoming more powerful, more available, more familiar, and more decentralized. As a result, the bioweapons threat is expected to grow in the future.
The conventional wisdom says that biotechnology is growing in power because scientists and other technical experts possess an increased ability to understand and manipulate nature. Moreover, the rate of new biotechnology developments is increasing rapidly, creating new possibilities for security threats. Biotechnology information, materials, infrastructure, and expertise are spreading across a wide range of commercial and academic settings—and becoming accessible to individuals working on their own, in small groups, or within large organizations. With barriers falling, biological weapons threats are judged to be more likely and potentially more devastating than in the past.
The historical record shows little precedent for terrorists demonstrating the technical capability or desire for biological weapons. Nevertheless, the revolution tale suggests that the capabilities of terrorists will inevitably rise to the level needed for a substantial biological attack. Proponents of this view assume that technology is the primary driver and that terrorists or other non-state actors will readily exploit modern biological materials and techniques to lower technical barriers, avoid existing controls, and create vulnerabilities.
The evolution narrative
An alternative story line comes from scholars in the field of science and technology studies, who focus on the social context and character of technology, rather than simply the material aspects. While the dystopian view of biotechnology assumes a rapid pace of change and an ease of uptake for specific applications, Paul Martin—a sociologist of biotechnology at the University of Sheffield who has spent several decades studying the development of new biotechnologies in the commercial sector—points to the high level of uncertainty involved with and the difficulty of predicting future biotech trends. Longitudinal case studies spanning 20 to 30 years, and covering a range of biotechnologies, reveal a complex and nonlinear model for biotechnology development (Martin, 2012; Nightingale, 2004; Nightingale and Martin, 2004).
For example, in the 1990s and 2000s, there were great expectations that genomics and biotech would transform pharmaceutical innovation, support a new industrial biotech sector, create a large number of new drugs, and increase pharmaceutical productivity. By the mid-2000s, however, these expectations had largely failed to materialize. There are several explanations: a high attrition rate of genomic drug targets across the development process; a lack of the long-term, publicly funded research that was needed to make drug targets viable; development processes that were slow and difficult; and market failure. Each of these problems is the result of a complex mix of social, economic, scientific, and technical factors.
In the small number of cases where specific products and innovations emerged, success was the result of many decades of incremental collaborative research; typically, it has taken 35 years for new biotechnology innovations to mature. While these case studies focused on commercial biotechnology rather than biological weapons development, they still revealed patterns of innovation, diffusion, translation, and uptake that may be common to all life sciences.
Curiously, although many high expectations for biotechnology have not been met, there remain constant efforts to keep that hope alive, with biotech revolutions still promised. A plethora of popular books and articles continue to tout the “biotech revolution,” “genome revolution,” and “genetics revolution”—branding phrases that ultimately are used in a variety of policy documents. Those who have conducted in-depth studies of the social and organizational dimensions of biotechnology argue that the biotech revolution is the wrong paradigm for understanding change and innovation in biotechnology and bioweapons threats (Ben Ouagrham-Gormley, 2012; Martin, 2012; McLeish, 2006; Vogel, 2008). Their view of biotechnology is one of slow, incremental innovation.
The technological determinism model
These opposing interpretations draw on different analytic models. The biotech revolution story is largely based on a popular but dated model of technological development known as technological determinism. In this model, technological development consists of a set of cumulative technical results, each with its own momentum or trajectory, leading to an end product such as a blueprint, tool, or gadget. The trajectory of development is fairly predictable, whether it be linear or exponential, and technology flows directly from basic and applied scientific research. Technologies are largely independent from social influence or control, and their success is primarily based on their inherent material properties or problem-solving processes. Technologically deterministic thinking often incorporates a “technological imperative,” which assumes that technological developments, once set in motion, are unstoppable.
MIT historian of technology Merritt Roe Smith (Smith and Marx, 1994) has documented how technological determinism is deeply embedded in US culture and has underpinned many popular chronicles since the Industrial Revolution. 3 In particular, technologically deterministic thinking has played a powerful role in US security policy and understandings of weapons technologies. In the 1950s and 1960s, for example, American security analysts and policy makers advanced concerns about missile and bomber gaps (York, 1970). Their writings described the Soviets as having ballistic missile and bomber systems superior to those of US military forces. Although they were later shown to be in error, these accounts provided some analysts and policy makers with justification for certain avenues of defense planning and spending.
From the late 1990s to the early 2000s, policy elites have invoked the “revolution in military affairs” theme, derived from advances in computational and related technologies, to help shape defense planning and spending for US force modernization (Krepinevich, 1994; Rumsfeld, 2002). Political scientist Jacques Hymans (2012: 6–9, 24–25) has also noted how a “techno-centric” view has dominated thinking in the government and nongovernment policy communities on nuclear proliferation. Technological determinism is popular in policy circles because it provides a simple, common-sense framework for understanding technological development.
The sociotechnical model
As historians and social scientists have observed, developing and using biotechnologies for specific purposes is not as simple or as automatic as the determinism model claims. In an alternative model of technological development that is often referred to as the sociotechnical model, technologies emerge within social, natural, economic, and political contexts, and as a result, technological trajectories can be multidirectional. 4 This analytic approach studies local technical practices, as well as the larger laboratory, institutional, industrial, and environmental settings in which science and technology are developed and used. It assumes that the potentially disruptive character of biotechnologies is modulated by their sociotechnical elements. In this model, there are a variety of possibilities for how equipment could be developed and used—or, alternatively, fail and not be used. Over the past four decades, scholars in the humanities and social sciences have shown that the determinism model gives a distorted picture of the complex interactions that occur between technology and society. 5 Intelligence and policy practitioners, however, remain unaware or uncertain of this social science literature or how to apply it to bioweapons issues.
To counter technical-oriented thinking in assessments, sociologist of science Steven Flank argues for using the language of systems or networks to highlight that a successful working technology and its applications are the result of a complex matrix of people, materials, financing, infrastructure, and so forth—including regulatory, security, and safety regimes—that have to be mobilized, organized, and stabilized in specific ways (Flank, 1993). Technology should not be seen as a single, final-stage product, but rather as an entire networked system that allows that technology to be realized. Flank argues against using technical superiority or cost-effectiveness as simple explanations for success; instead, he says, analysis must closely examine why a particular technology does or does not become superior or cost effective, and how access to resources (such as authority or consensus) supports a particular technological development.
It can be instructive to look at the alliances that have brought a technology to fruition, how these alliances were formed, and how they became stable enough to help a particular product or application succeed. How, for example, do alliances and networks (with their varied sociotechnical elements) form and stabilize in state and terrorist programs in ways that would sustain the development and use of emerging biotechnologies? Also, what are the important factors that allow for successful uptake and use of emerging biotechnologies by states or terrorists operating in different contexts? The sociotechnical model does not assume that, simply because new science and new technologies emerge and are available, they will be readily adopted and used.
Red-teaming the revolution
One way to counter the dominant biotech revolution narrative in intelligence assessments is to use “red-teaming” techniques, in which a problem is viewed from the perspective of a competitor or adversary. This type of exercise is not currently being done for competing technology models, but there are precedents for it. For example, in the late 1990s, the Defense Advanced Research Projects Agency (DARPA) funded a project to assess the quantity of biological agents that a terrorist could produce. A defense contractor was selected to conduct the assessment, with advisers available for consultation. 6 University of Pittsburgh security expert Dennis Gormley was brought in to evaluate the findings. 7
Upon reviewing the contractor’s method and conclusions, Gormley realized that the firm had used only quantitative techniques and data. Because of a series of conversations he had had over several years with the head of worldwide biotech research and development for a multinational pharmaceutical company, Gormley doubted that this approach would accurately capture the complexity of working with biological agents. The biotech executive (who has asked to remain anonymous) told Gormley that none of his company’s $25-million start-up projects had led to a viable commercial product, even though considerable manpower and time had been devoted to these projects.
Gormley asked the executive to join his DARPA team, and together they outlined the need for qualitative measures to take into account the difficulty and unpredictability of advanced biological work. Interestingly, it was Gormley and the executive who raised these issues—not the advisers involved in the project who had had long careers in bioweapons development or policy. This example illustrates the value of a contrarian technology perspective—one that is supported and monitored as part of the assessment. In several well-known threat assessments, a true contrarian position was not represented or practiced (Gormley, 2008; Leitenberg, 2005).
Reaching out to social scientists and historians
Qualitative assessments can reveal the complexities that are overlooked when analysts assume that technology marches steadily forward, undeterred by social and technical circumstances. Arthur C. Clarke (1951) dramatized the dangers of deterministic thinking in his science fiction story “Superiority,” which tells of a futuristic battle between two adversaries. One side is focused on pursuing advanced technologies to anticipate a “revolution in warfare,” and assumes that its enemy will pursue these new technologies as well. Instead, the enemy chooses to pursue more rudimentary military technologies. As the narrator of the story relates, the side focused on advanced technologies is ultimately routed: “We were defeated by one thing only—by the inferior science of our enemies.” The advanced technologies worked in the laboratory but not in wartime conditions.
Studying technology in its sociotechnical context requires the US intelligence community to reach out to social scientists and historians, as well as to those who can provide contrarian technological perspectives. 8 In July 2008, the Office of the Director of National Intelligence issued a directive charging intelligence analysts to engage outside experts in order to “explore ideas and alternative perspectives, gain new insights, generate new knowledge, or obtain new information” (Office of the Director of National Intelligence, 2008). 9 Although this directive is vague, it could authorize red-teaming activities and new forms of expert advising. Currently, however, the directive remains underused because, as one intelligence analyst explains, it is “pretty much an unfunded mandate.” 10 The largest intelligence organization that consistently uses outside experts is the National Intelligence Council, but its reports on biotechnology threats largely draw on technologically deterministic perspectives, rather than alternative models (National Intelligence Council, 2004).
Intelligence analysts must still reach out to new experts in order to develop holistic assessments that move beyond the conventional, albeit popular, wisdom of the biotech revolution paradigm. Doing so will enable analysts to integrate information about the broader social, economic, and political context with technical analyses of bioweapons programs and emerging biotechnologies. 11 This is a tall order, but one that must be filled if we want more accurate assessments of emerging biotechnologies to inform policy making.
Acknowledgements
The author thanks Sonia Ben Ouagrham-Gormley, Stephen Hilgartner, Gerald Epstein, Dennis Gormley, and Christine Knight for their feedback on an early version of this article.
Funding
This work was supported by grants from the United Kingdom’s Economic and Social Research Council and from the US National Science Foundation.
