Published on July 23, 2012
Last fall, two research groups announced that they had submitted papers describing their studies on H5N1, "bird flu," to the journals Nature and Science. Both groups, with funding from the National Institutes of Health, had identified mutant strains that are more easily transmissible among ferrets. The experiments, which aimed to demonstrate how the virus might mutate and become more dangerous to humans, raised red flags for some scientists and experts on biosecurity and public health, who argued that knowledge of the mutant flu presented a grave threat to public safety and national security.
The U.S. National Science Advisory Board for Biosecurity initially recommended that the full details of the researchers' methodologies be withheld from publication. But after months of debate, a majority of the board decided that the benefits outweighed the risks, and the first study, led by Yoshihiro Kawaoka of the University of Wisconsin–Madison, was published in Nature in May. The second study, by Dutch researcher Ron Fouchier, was published in Science in June. Yet debate continues about whether, and how, research on H5N1 and other deadly pathogens should proceed.
Vincent Racaniello, a professor of microbiology and immunology at Columbia University who produced the first infectious DNA copy of an animal virus genome in the 1980s, sees greater benefit than risk. “The study in Nature tells us a lot about what’s important for aerosol transmission of avian flu among mammals,” he says. “The better we understand that, the better chance we have to interrupt it.” On the question of risk, Racaniello says that lab-created viruses tend to be less dangerous than natural ones. And he believes existing regulations and facilities provide a high level of protection. “People work on plenty of dangerous pathogens, and there has never been an accident leading to dire consequences,” he says.
Richard H. Ebright, a professor of chemistry at Rutgers University who serves on the school’s Institutional Biosafety Committee, disagrees, recalling a 1977 pandemic of the H1N1 flu strain that many experts believe started with the escape of a virus from a Russian laboratory. “No facility will provide complete protection,” he says. Even if the current mutant H5N1 strains are debilitated? “That’s not always the case,” says Ebright. “Furthermore, the next generation of research will aim to increase virulence and transmissibility.”
Racaniello and Ebright agree that publishing the studies makes sense, but for different reasons. “If you only give data to 100 flu scientists,” says Racaniello, “you eliminate the chance that a person from another field sees it and sees something new.” The fear of a terrorist using the research to create a weapon is unfounded, he says. “Why would a terrorist want to use something that worked on a ferret without having a clue if it would do the same thing in people?”
For Ebright, whether to publish the studies was a moot point. “The virus and the information already existed,” he says. “Restricting publication would infuriate researchers, and it wouldn’t address the primary security risk, which is release by a person who was authorized to work with it—which is what happened with the 2001 anthrax mailings.”
To mitigate some risks of similar research, a federal policy announced March 29 will require NIH and other funding agencies to review portfolios for research that could cause public harm. In the meantime, critics are sure to keep a close eye on what comes of the H5N1 work.