The Covid pandemic exacerbated fear and panic about the potential for a future act of bioterrorism. As the lab leak theory continues to stir debate, politicians want to be able to tell their constituents that they are solving the problem by adding more oversight to biological research. But if all they are doing is adding more burden, bureaucracy, and box-checking, is it really making anyone more secure?

For half a century, efforts to build a governance system around the security of biology have largely focused on the development and use of biological weapons, starting with the Biological Weapons Convention, which opened for signature in 1972 and entered into force in 1975. It wasn’t until the early 2000s that the research side of biology started getting more attention from the security community, with experts creating methods to determine which security issues we need to worry about, and what we should do about them. After 9/11 and the anthrax mailings, the federal government placed heavy restrictions on research with agents deemed to be biological weapons of mass destruction — referred to as select agents. Then, beginning in 2007, new attention was paid to dual use research, or experiments that could both benefit and harm society, agriculture, or the environment.

These governance systems are increasingly not up to the task of managing biosecurity risks. States, industry, and academia have been too focused on the technical frontiers of biotechnology, heralding cheaper, more efficient, and more sophisticated tools for conducting biological research, but not putting the same degree of curiosity or funding into how we might direct these advances in ways that protect the vulnerable and prevent catastrophe. Scientific advances such as CRISPR, gene drives, synthetic viruses, and increased pathogen capabilities are proceeding rapidly, while innovation in our collective ability to govern their security concerns is not.

In fact, there are few incentives in the United States to even identify and manage the security concerns we currently know about in biological research, let alone to think outside the box on where new concerns may arise, and what new governing capabilities may be needed to address them. This has led to a system of governance where rules get made but are rarely revised, biosafety manuals are written and sit on the shelf to collect dust, and inspection forms are created for a once-a-year assessment that doesn’t reflect the daily activities happening in the lab. Without significant change, new policies to prevent biosecurity risks won’t actually make anyone safer.

The current governance system is based on the idea that biological research is separate from its uses. It’s grounded in a mid-20th century notion that the best research comes from governments giving academics money to pursue the research the experts think is most appropriate and then getting out of the way. This system is designed to have only a few points where high-level policy can directly touch basic research, which presents a challenge because the current oversight infrastructure was purposefully designed not to address biosecurity concerns. As a result, every time a new security concern is identified, policymakers scramble to contain it. This reactive model lacks the ability to adapt to new challenges over time.

Efforts over the past decade — such as U.S. policies on dual use research of concern or on the care and oversight of pathogens of pandemic potential — have focused more on the interconnectedness of idea and application, of the work in the lab and the new biological entity in our fields, our foods, our air, and ourselves. But they are being shoehorned into an infrastructure that was designed to keep science apart from society. At some point, the seams are going to burst, unless we develop a better infrastructure. These shoes don’t fit.

The distance between the concerns of scientists and those of security professionals is collapsing, but states, universities, and industry aren’t cultivating the methods they need to actually connect the goals of oversight with the conduct of laboratory work. The lab leak theory highlights the political sensitivities surrounding modern biological research. Because of the hyper-focused attention on the possibility that the Covid-19 coronavirus emerged from a laboratory accident in China, a critical lens is now being applied to all experiments conducted in high-containment laboratories. As the U.S. government grapples with how to manage “gain-of-function” experiments, such as those that have the potential to create pandemic pathogens, energy is being wasted on a finger-pointing game instead of being spent on finding creative solutions to manage the risks.

In an effort to address a wider range of biosecurity threats, the U.S. National Science Advisory Board for Biosecurity, or NSABB, recently released updated recommendations for how to get a handle on dual use experiments. The recommendations have been hailed as an advance for the oversight of life sciences research, and stricter rules for gain-of-function research may be coming soon. If this framework becomes policy, the bulk of the implementation will most likely fall to existing institutional biosafety committees. These committees, along with the position of institutional biosafety officer, were created by the National Institutes of Health’s Guidelines for Research Involving Recombinant DNA Molecules in the 1970s, but they were set up to deal with laboratory safety issues, not security concerns.

As an institutional biosafety officer and a biosecurity expert, respectively, we have seen this first-hand in the way local institutional oversight committees are run, the types of data they collect and review, the types of experiments they oversee, and the manner in which they choose to implement biosecurity policies. For example, the dual use research of concern policy calls for an Institutional Review Entity to conduct biosecurity reviews, but the vast majority of universities use existing Institutional Biosafety Committees instead. This may seem like a small distinction, but it demonstrates the rigidity in how we think about security threats, and how we keep running back to antiquated biological governance systems not designed to manage the size, scale, and timeframes of current biosecurity threats.

Now that biosecurity is at the forefront of people’s minds, it is an opportune moment to think differently. Not only do we have the NSABB’s recommendations on the table, but there is an unusually high level of attention focused on biosecurity governance issues across the government. Last September, the Biden administration announced the Executive Order on Advancing Biotechnology and Biomanufacturing Innovation for a Sustainable, Safe, and Secure American Bioeconomy, which among other things requires the seven main government agencies that will leverage or oversee future life science research to outline how they will support biosecurity. These plans are currently being finalized and will then be reviewed by the White House’s Office of Science and Technology Policy, leading to an implementation plan to be released by July. This will likely be complemented by the work of the National Security Commission on Emerging Biotechnology, which is focused specifically on the Department of Defense’s life sciences efforts.

The danger is that government agencies, industry, and academia will merely try once more to squeeze new ideas of what should count as a security concern, and who should be responsible for governing it, into the old shoes of the biosafety governance infrastructure. They should instead focus on new methods for conducting day-to-day oversight, and that requires listening to and working with those on the ground who will be responsible for implementing and maintaining whatever policies come about. Biosecurity is a governance issue just as much as it is a technical one. If states, industry, and universities work together, they could incentivize experiments with alternative biosecurity systems, so the burden of solving the problem wouldn’t rest solely on the government’s shoulders.

If this mindset of experimentation in governance is to thrive, it will require at least three elements. The first is top cover for practitioners who want to try something different. From government agencies to corporate labs to universities, those in charge need to see that achieving biosecurity means being curious and innovative with oversight design. The second is funding for small-scale experiments in governance that directly address the structural issues limiting our current systems. Government departments and industry consortiums could use funding incentives to encourage research institutions to test out different oversight systems over a set period of time. The goal would be to meet or exceed existing policies and to report back on each institution’s progress on objectives and key results. And finally, there needs to be a process in place for those engaged in these experiments in governance to share their learning across communities, and for that learning from below to feed back into policy design.

Building a capability to experiment with new biosecurity infrastructure means that deep-seated changes can be tried out, and their limitations identified and mitigated, before they get locked in. If we want to find the right-sized shoes, it is essential that those responsible for implementing and sustaining future policies have a larger role in designing biosecurity oversight.

Sam Weiss Evans is a senior research fellow in the Program on Science, Technology & Society at the Harvard Kennedy School. David Gillum is the assistant vice president of environmental health and safety at Arizona State University, an associate editor of Applied Biosafety, and past president of the American Biological Safety Association International. 
