Abstract
The natural world is simply wondrous, unless you're bioengineer Drew Endy. Then it's also inefficiently designed.
Drew Endy is hungry to build. He needs to build: standard biological parts, biological organisms, a community of bioengineering enthusiasts, an open learning environment for his students. The 37-year-old engineer and synthetic biology pioneer is determined not to waste his “life energy.” His time is running out, or at least it feels that way to him.
But first, some ice cream. After dinner on a cold night in late January, Endy hungers for ginger snap molasses ice cream from Toscanini's, a famed cafe blocks from his MIT lab. While waiting in line he works the room, recording in his BlackBerry the name of the photographer whose work decorates the walls and snooping at customers' laptop screens. Ice cream in hand, he heads back to his lab, through the well-lit canyons between campus buildings (“It's light enough to read a newspaper while walking at night,” he says), to work on research grants past midnight.
Endy finds it hard to curb his enthusiasms. When he came to MIT in 2002, his willingness to collaborate buried him under a pile of commitments and left him with little time to pursue his primary interests. A colleague suggested he learn how to say no. He taped notes next to his phone, encouraging him to turn down commitments; he jerry-rigged a spring to his phone's headset that encouraged him to keep calls short. But little helped. “I don't like to say no,” he says. “When I say no to something it drains me of energy.” Instead, he managed to construct a world where “I only say yes.” He set his priorities and if projects didn't fit into one of those areas, he didn't get involved. This meant not returning some phone calls, a practice he admits could be considered rude. “It's not a perfect system,” he says, “but it's a practical system.”
Practicality drove Endy into synthetic biology in the first place. He grew up near Valley Forge, Pennsylvania, and had little interest in biology until college. In fact, high school biology proved an unsuccessful exercise in memorization. “I got a D in biology,” he says. His interest in the subject grew out of a molecular genetics class he took as a structural engineering student at Lehigh University. Endy began modeling bacterial viruses as an engineering graduate student at Dartmouth College and then tested his models as a post-doctoral student at the University of Texas-Austin. Predicting the behavior of biological systems turned out to be complicated, so he decided to take a more engineering-minded approach and build systems that would be easier to model instead.
Endy soon learned that building biological systems posed its own challenges. “Engineers have not been spending a lot of energy trying to get better at it, and construction–tedious, slow, and an experiment in itself–is still at its heart,” he says. Yet, he is convinced that by focusing on simplifying and abstracting biological parts from their applications, researchers will be able to do wondrous things with the constituents of life–DNA.
Endy and his peers have yet another motivation, best characterized by geneticist Roger Brent, the director of the Molecular Sciences Institute in Berkeley, California, where Endy worked for three years. Brent sees a quality in synthetic biologists, not found in the biologists who “hack DNA” for the purposes of understanding how living things work, or to create medicines, or to make money. “Through Drew, I was acquainted with the ethos and folkways of the high-self-regarding engineers, who want to do clean designs because a clean design is important for its own sake. They want to build something that is good and sweet and beautiful,” says Brent. “My closest acquaintance to that is computer programming culture.”
Synthetic biologists embrace the computer programming analogy. Endy and his peers at MIT, Harvard, and the University of California-San Francisco founded the nonprofit BioBricks Foundation in 2005 to encourage the development and use of standard DNA parts–genetic material with a stripped-down, prescribed function–to “program living organisms in the same way a computer scientist can program a computer.” In Endy's ideal world, scientists and amateur researchers working in their homes will be an engine for a range of future biotech applications, just as garage computer clubs propelled the information technology boom of the last quarter century.
This fresh, egalitarian approach to applying biotech tools, such as DNA sequencing and synthesis, pushes several frightening scenarios to the public forefront. Whereas biologists initially confronted the potential unintended hazards of these technologies in the lab and surrounding environments, biologists and bioengineers now face even tougher questions about the deliberate abuse of these technologies. What if someone creates a strain of flu that can evade existing vaccines? Or recreates an enhanced smallpox?
When scientists convened for the first annual synthetic biology conference in 2004, the conversation eventually turned to all the bad things that you could do with biology. “It turns out there's no end to that exercise,” Endy says. “You can just imagine bad things in combination with more and more combinations of bad things.” To avoid getting weighed down by the flurry of concerns, the synthetic biology community began addressing some of the security issues raised by their research, much as nuclear scientists contemplated the ramifications of their work on nuclear weapons and nuclear energy during and after World War II.
It hasn't been easy. For starters, the tools commonly used to address nuclear, chemical, and biological arms control–treaties and material control regimes–are inadequate to confront a discipline where the materials and know-how are as widely available as they are with biotechnology. DNA sequencing machines are sold on eBay, and genetic sequences are available from vendors worldwide.
Some approaches to addressing security concerns are alien to scientists eager to protect their research and academic independence, as well as their funding. But other approaches are gaining momentum. Improvements in screening software that allow sequencing companies to look for hazardous DNA orders are forthcoming, and scientists have discussed employing their economic clout and dealing only with companies that adequately screen sequencing orders. The conflicting opinions about establishing research norms frustrate Endy. “I'm worried that we're selecting for the perfect storm of hard technology development issues and human practice issues and we're not actually interested in doing this work, so much as working on all these intractable problems,” he says. And this will not do.
To Endy, the precocious bioengineer, not building is not an option. “I like to make stuff, I like to skateboard, I like existing, I like improbable things, I like things I don't understand, or at least I'm curious about them, but most of all, I like to make stuff.”
Bulletin senior editor Jonas Siegel sat down with Endy in Boston to probe the repercussions of building with biology.
There's a smaller community of engineers who are trying to figure out how to make stuff in biology, and get much better at building biological systems. We're interested in learning with biologists while also somehow avoiding getting sucked into solving all of the interesting, compelling scientific questions in biology.
However, there is no effective form of governance that doesn't start with self-governance. I don't care to rehash all of the conversations around recombinant DNA and self-governance with respect to biological safety from 30 years ago, until there are pressing reasons to do so, because the safety record isn't bad and the benefits in terms of discovery and applications are pretty compelling.
But there are also technology gradients. For example, I have access to DNA synthesis technology and most people in the world do not. Because there are advances in technology, there are localized concentrations of power, as defined by access to technology. Some people have access to nuclear weapons, most people don't–thank goodness. But is that gradient in the access to technology good or bad?
I had a conversation about bio-bricks [standard biological parts] and the assumption was that they will never be accessible to a farmer in the countryside in Brazil. I wonder whether that's actually true, I bet we could make them accessible. The question is, is the technology actually useful to such a person? Are technology gradients by themselves bad or is it the misapplication of the technology gradient that is bad? Labeling technology gradients as bad would seem to be incompatible with discovery and innovation.
With respect to biotechnology, I am trying to remove and debase technology gradients and make the technology more accessible and enable garage biotechnology, so that we can have more innovation.
Then the question is: What about things that are engineered to be worse, for example, a putative IL-4 smallpox? Should we be making such work public or should we keep it secret? My thoughts are: Who is making such things and why are they making them? They shouldn't be making them. The natural things should be made public because they exist already and sooner or later somebody will discover them. The engineered things shouldn't need to be kept secret because they shouldn't be getting made.
You would hope such people are as few in number and as limited in skill and equipment as possible. The important observation is that this number is never going to be zero. Eventually somebody will make something for the purpose of causing harm and go out and cause harm with it. How do we best prepare for this future? For example, how do we develop a technical and social framework for responding in a way that's strategic rather than paranoid and reactive?
If you look at the U.S. response to the anthrax attacks in 2001, it's fair to describe it as paranoid and reactive, not strategic. Many people fairly label our investments in biodefense over the last five and a half years as increasing the risk of biological attack, by calling attention to the problem, by stimulating international interest in biological weapons, and by, in the worst case, suggesting to other nation states that we're restarting our biological weapons programs.
We really need to develop a thoughtful, careful strategy for being able to respond to intentional misapplication of biological technology. We need to do so in a way that's open and inclusive and that actually works. It isn't enough to stockpile several tens of millions of units of some therapy, which is going to be readily circumvented by any thoughtful weaponeer.
We can try to liberate ourselves from the tyranny of evolution–it's the starting point in the synthetic evolution. We just don't understand the designs of evolved systems. Nature is not selecting for ease of understanding. Intelligent design, from an engineer's perspective, would have documentation, and we don't see that.
The challenge to future biological engineers is going to be to deliver solutions to such requests in the same way that we respond to requests for a bridge that can support 20 metric tons. How our solutions behave, how they get implemented, and what they look like, I don't think anybody knows. But these are physical problems, they are solvable problems, we just have to go do it.
Synthetic biology uncouples the designs of biological systems from the natural constraints of direct descent and replication with error. As a result, we can make disposable systems, systems that only have to exist for one generation and for which, when we need the next generation, we construct it anew and so on. These systems wouldn't have to be designed to evolve or not to evolve, and so they could be much simpler. Disposable isn't a pejorative here, it's a feature.
All the evolution will take place at the information layer, subject to human whim. Do we want to go hurt people or do we want to go make beautiful things?
Rather than writing a specific genetic program, I'm trying to make better programming languages, so that more people can program in DNA. Because biotechnology is so important, folks tend to work on applications and don't pause to develop a legacy infrastructure that makes the next job easier. There's a big gap in foundational biological engineering research. I've purposely disengaged from pursuing direct application work in the hopes of beginning to fill the gap.
Do we ever not talk about the potential risks? It's exactly the opposite. The problem is that more people aren't talking about risks. Many researchers seem inclined to hide from discussions of risks, out of fear that they will slow down research. We don't want to hinder research. But we do need to develop frameworks for constructively developing future biological technologies and for developing a biological defense system that is strategic and not reactive. We need to have a lot more open conversations about this.
Biology as it's classically presented is God's domain, and biotechnology is dangerous turf so we need to keep it safe, lock it up in the lab so that only scientists at institutions that have biosafety committees can access it and do stuff. I've heard such opinions as: Heaven forbid we ever have garage biotech, that would be horrifying. Such people couldn't possibly be up to any good.
A lot of our technology gets developed and presented in a social framework that is explicitly exclusionary and hands off, and this promotes passive consumerism. What if instead we made biotechnology available, so that more people could learn about what was actually possible and could be involved in figuring out what to do? What if we made biotechnology not scary? Recall that John von Neumann developed one of the first electronic computers to help design atomic bombs, but that later Steve Wozniak and Steve Jobs built the first Apple personal computers in the Jobs family garage.
