- May 2, 2012
Fears that bioterrorists could learn from controversial experiments that make H5N1 avian influenza more virulent have overshadowed a more pressing danger: accidental releases, laboratory infections and disgruntled workers.
Dozens of all-too-human mistakes have occurred in just the last decade inside high-security laboratories, and many experts say new H5N1 flu strains engineered to infect mammals have not been handled with the care required to minimize chances of unintentional catastrophe.
In fact, research on less-threatening pathogens is conducted at higher security levels than research on the new bird flu and other strains made artificially more virulent.
Accidental infections and disgruntled workers “are by far the most realistic threats associated with these viruses,” said Rutgers University microbiologist Richard Ebright, a vocal critic of the new research. “But almost all of the discussion to date has focused on whether to publish the details. That only addresses lower-order risks.”
H5N1 engineering is conducted so that scientists can study what might happen in nature, giving them early warnings of what to expect in future pandemics. Virologists, epidemiologists and public health experts have argued for years over this strategy’s scientific value and potential risks, but those debates went largely unnoticed by the public.
That changed late in 2011 when two federally funded research teams, one led by University of Wisconsin virologist Yoshihiro Kawaoka and the other by Dutch virologist Ron Fouchier, submitted scientific papers describing how they’d tweaked H5N1 into strains contagious enough to easily infect ferrets, a standard research model for human flu infection.
Until now, H5N1 has struggled to gain traction in humans, requiring close contact with infected animals or people, though infection is deadly when it does occur: Mortality estimates range from 60 to 80 percent. Those figures may be inflated by unreported, non-fatal low-grade infections, but even lower estimates are plenty scary. The flu pandemic of 1918 had just a 2.5 percent mortality rate and killed 50 million people.
Vaccines and medicines are better now, but as demonstrated by the swine flu pandemic of 2009 and 2010, highly infectious influenza is extraordinarily difficult to control. With this in mind, a federal U.S. biosecurity watchdog in November flagged the Kawaoka and Fouchier studies for further precautionary review. After a storm of public outrage and scientific misgivings, the researchers announced a 60-day pause on their research, with formal publication delayed until the flu community agreed it was safe.
During this time, researchers who supported Kawaoka and Fouchier, and even some of their critics, discussed objections in terms of bioterror and whether the findings’ details should be published. They debated whether rogue researchers might use them to design weaponized influenza strains.

The May 2 publication of Kawaoka’s study in Nature, and the Dutch government’s green light for Fouchier’s paper, signal that bioterror fears have been allayed. According to the prevailing view, the potential benefits outweigh the risks: Bioterrorists would find the findings very hard to exploit, while scientists can learn from them immediately.
But many epidemiologists and public health experts say poor handling inside laboratories, rather than bioterror, is the real threat. More than 100 accidents in high-security labs took place between 2003 and 2009, involving everything from flu-infected ferret bites to dropped vials of encephalitis, slips with Ebola needles and lost shipments of bubonic plague. The 1977 “Russian flu” epidemic may have involved a lab escape. Less accidentally, anthrax used in the 2001 attacks almost certainly originated in U.S. military laboratories.
Such events have been downplayed during the current engineered-flu controversy, and public regulatory debates over the research have largely excluded such views. Michael Osterholm, director of the Center for Infectious Disease Research and Policy at the University of Minnesota, and a member of the federal biosecurity committee that reviewed the research, wrote in an open letter that its public hearings provided “a very ‘one sided’ picture of the risk-benefit.” Nature’s “independent review” of risks associated with publishing Kawaoka’s research doesn’t even mention accidental release.
Supporters of the controversial experiments say the researchers are careful. But dozens of labs and hundreds if not thousands of researchers may eventually handle the new flu strains and others like them. “There’s too high a probability of escape if there are 40 or 50 labs working on something,” said Lynn Klotz, a senior science fellow at the Center for Arms Control and Non-Proliferation. In a commentary published in January in Nature, Klotz estimated a 1 percent chance of accidental release per year at each lab working on SARS, the 1918 influenza or the new H5N1 strains. “We are creating a risk that is much greater than that posed by nature. Laboratories need stronger precautions,” wrote Klotz.
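Klotz’s point can be illustrated with back-of-the-envelope arithmetic: if each lab independently carries a 1 percent chance of accidental release per year, the chance of at least one release across many labs over many years compounds quickly. The sketch below is our own illustration of that compounding, not Klotz’s actual model, and it assumes independent labs with a constant annual risk.

```python
# Illustrative sketch (our assumptions, not Klotz's model):
# independent labs, constant 1% annual escape probability each.

def prob_at_least_one_escape(p_per_lab_year, n_labs, years):
    """Probability of at least one accidental release over the period."""
    p_no_escape = (1.0 - p_per_lab_year) ** (n_labs * years)
    return 1.0 - p_no_escape

# Klotz's scenario of 40 labs, run for a decade at 1% per lab per year:
p = prob_at_least_one_escape(0.01, 40, 10)
print(f"{p:.0%}")  # roughly 98%
```

Even over a single year, 40 labs at 1 percent each already imply about a one-in-three chance of a release somewhere, which is the intuition behind Klotz’s warning.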
Research on H5N1 and the new strains was conducted at what’s known as Biosafety Level 3, one step down from the moon-suited, multiple-airlock, ultra-high security of Biosafety Level 4, which is typically reserved for highly lethal pathogens with no known cure. That’s precisely what engineered H5N1 strains might turn out to be: While Kawaoka’s strains proved non-lethal to ferrets, there was no way to know that before the experiments began.
Of the pathogens currently handled at BSL-4, including hemorrhagic fevers, Ebola, tick-borne encephalitis and hantavirus, few are contagious. They primarily threaten people who work directly with them. “If they get out into the environment, most of them are fortunately not very transmissible,” said disease epidemiologist Stephen Morse of Columbia University. With more-contagious pathogens, “you’d want to be very concerned about lab accidents and how they’re being handled in the laboratory.”
“They should use Biosafety Level 4. It’s what BSL-4 was developed for,” said geneticist Steven Salzberg of Johns Hopkins University. “The avian flu people have been saying this disease is incredibly dangerous, kills about 60 percent of the people it infects, and could infect hundreds of millions of people if it goes pandemic — but they don’t say it needs BSL-4? How could they say those two things?”
According to Ian Lipkin, a virus surveillance expert and director of Columbia University’s Center for Infection and Immunity, handling procedures are balanced against the obstacles they raise against conducting research in the first place.
“Some people are concerned. They may push to have this work done at higher levels of containment and restrict the investigation to a few key laboratories. That’s not been decided yet. But the push back, which has happened in the past with SARS, is that doing so would slow down the pace at which research can be pursued,” Lipkin said.
Biosafety recommendations are set by the federal Centers for Disease Control, National Institutes of Health and Department of Agriculture, and formalized in a document known as Biosafety in Microbiological and Biomedical Laboratories, or BMBL. According to the NIH and CDC, there are no plans to change the recommendations for how engineered H5N1 should be handled.
Deborah Wilson, director of the NIH’s Division of Occupational Health and Safety, said “this virus is safely worked with under enhanced BSL-3 conditions.” And Amy Patterson, director of the NIH’s Office of Science Policy, said both Kawaoka and Fouchier’s labs were inspected in 2011 and found to be secure.
A policy that may be changing, however, is the classification of H5N1 under the Select Agent Program, a system established after 9/11 to limit access to especially dangerous pathogens. Researchers who work with select agents must register their projects, and receive background checks and extra federal oversight. It’s a cumbersome process, and researchers have asked that less-dangerous pathogens be moved into a second, less-restrictive category, with only “tier 1” pathogens receiving continued close monitoring.
According to Ebright and others, engineered H5N1 will not be tier 1, vastly increasing the number of people allowed to work with it and reducing oversight upon those people. “There are over 400 institutions and 15,000 people with access to select agents,” said Ebright. “After the stratification goes into effect, the non-tier 1 number would grow larger. How much larger? That’s driven entirely by funding.”
Ebright emphasized that, while Kawaoka and Fouchier’s manipulations of H5N1 are high-profile, they represent a multi-laboratory, long-term research effort coordinated by the NIH and CDC to engineer more virulent strains of H5N1, the 1918 flu and SARS. Concerns about the new flu strains should be multiplied across these programs.
“If it hadn’t been generated in these labs, it would have been done in a half-dozen other labs that received funding for this project,” Ebright said. “This research was funded to increase national security, but by its very nature, it creates risks to U.S. health, agriculture, economic security and national security. The risks are not unforeseen. They are completely foreseeable.”
Update 5/3: The quote from Michael Osterholm’s letter to the National Institutes of Health was added to the article.
1 comment:
I think that a different biosafety level could be created for transmissible novel influenza, because of the urgency to accomplish rapid research. The researchers should be sequestered in continuous quarantine as opposed to going home to kiss a spouse at the end of the day. But to avoid encumbering and slowing the researchers, use BSL-3 barrier equipment and procedures. That would address a lot of issues, wouldn't it?