Just because we can, does that mean we should?
In a hard-hitting editorial in Science, three Stanford thinkers – Stanford microbe wizard David Relman, MD; synthetic biologist Megan Palmer, PhD, of Stanford’s Center for International Security and Cooperation; and political theorist Francis Fukuyama, PhD, of the Freeman Spogli Institute for International Studies – have issued a scathing wake-up call to the scientific community and the federal government, sternly questioning the latter’s current plans for ensuring biosafety and biosecurity in the United States.
“Our strategies and institutions for managing biological risk in emerging technologies have not matured much in the last 40 years,” they write, adding:
With the advent of recombinant-DNA technology, scientific leaders resorted to halting research when confronted with uncertainty and public alarm about the risks of their work. To determine a framework for managing risk, they gathered at the now-fabled 1975 Asilomar meeting. Their conclusions led to the recombinant DNA guidelines still used today, and Asilomar is often invoked as a successful model for scientific self-governance.
But, the authors suggest, Asilomar’s legacy may not be all it’s cracked up to be:
Asilomar created risky expectations: that leading biological scientists are best suited for and wholly capable of designing their own systems of governance and that emerging issues can be treated as primarily technical matters.
“Unfortunately,” the editorial goes on to say, “today’s leadership on biological risk reflects Asilomar’s risky legacy: prioritizing scientific and technical expertise over expertise in governance, risk management, and organizational behavior.” Political leaders have largely ceded a strategic leadership role, leaving it up to the scientific community itself to judge the ethical and social implications of its own work.
“Leadership biased toward those that conduct the work in question can promote a culture dismissive of outside criticism and embolden a culture of invincibility” regarding emerging biotechnology risks, the authors write.
The world of today is not the world of 1975. Since then, the scope and scale of biological science and technology have changed radically. To wit: The increased ease of reading and writing genetic information means that securing materials in a handful of established labs is not feasible, the editorial states. Like it or not, the tools for putting potentially dangerous knowledge into practice are increasingly portable.
For a scary scenario of what such newfound facility portends, please see this article I wrote a couple of years ago, which begins with the rhetorical question: “What if nuclear bombs could reproduce?”
With so much at stake, we may not want to restrict oversight of scientific advances to those who are making the advances. There’s knowledge, and there’s wisdom.
Previously: How-to manual for making bioweapons found on captured Islamic State computer, Microbial mushroom cloud: How real is the threat of bioterrorism? (Very) and Stanford bioterrorism expert comments on new review of anthrax case
Photo by Mirko Tobias Schäfer