We Have Ways To Stop Rogue Scientists. They Don’t Always Work.

How do you stop a mad scientist?

We’ve been doing it in fiction for centuries. Doctor Faustus was carried off to hell. Pneumonia and an Arctic ice floe ended Victor Frankenstein. Doctor Moreau’s own creations ultimately did him in, which is a more poetic way of saying that he ended up on the wrong end of a fight with a puma-woman.

But all of these fictional comeuppances are proxies for our frustration with science’s potentially unchecked power. Lacking the devil, or a deus ex machina, we nonfictional people are still faced with the challenge of stopping rogue researchers. So can the public control science that leaves us with permanent and unenviable consequences? Recent news suggests that the answer is “not really.” There are tools that we can use to place limits on scientists and the choices they make. But none of them can fully and reliably put the public in the driver’s seat. And maybe that’s OK.

The mad scientists of the past — both the imaginary and the all too real — have been on my mind. On Nov. 25, MIT Technology Review broke the story of He Jiankui, a Chinese researcher who claims to have created the first genetically edited human babies. He says he used a gene-editing technology called CRISPR to deactivate a gene called CCR5, granting twin baby girls resistance to HIV, smallpox and cholera.

Whether he actually did this is unclear. He apparently experimented largely in secret. The Associated Press has reported that he deceived employers, colleagues and even some of his own research subjects. He hasn’t published any of his work the way scientists typically do, so it can’t be independently confirmed. At an international genetics conference on Nov. 28, He told other scientists that he was proud of his work. And last week, he told the Harvard Crimson that he was working on a rebuttal paper to address the ethical concerns.

This kind of edgy science will happen again, and not just in the biological sciences. The same week that news of He’s experiments came out, the journal Nature ran a story about a team at Harvard that is planning the first real-world experiments in geoengineering — releasing particles of calcium carbonate into the high atmosphere over part of the southwestern United States. It’s part of a project to develop a way to block sunlight and hopefully reduce the global average temperature (but not by too much). The tests could start as early as next year.

To be clear, these are very different situations — ethically and experimentally. The Chinese scientist broke the established international norms of his profession and violated Chinese national research guidelines. There’s good reason to think the gene editing he did — if he did it — won’t work as he intended, could have consequences that reverberate through generations, and wasn’t necessary enough to justify the risk. The geoengineering study, in contrast, is being done through established protocol: starting at a small scale, in public, and preceded by review from an external advisory committee. And there is a case to be made that it’s necessary. Without it, we might have to do something truly wild to avoid the consequences of climate change, like reduce dependence on fossil fuels.

But the cases do share something. Each illustrates how the creativity of individual scientists can butt up against the safety (and sensibilities) of the public — and how hard it is for the public to gain the upper hand in that struggle.

Science today is more publicly regulated than it has ever been, said Alta Charo, professor of law and bioethics at the University of Wisconsin. The law, the marketplace and science’s community norms each provide a layer of protection. But each is imperfect.

Legal restrictions on science basically began as a byproduct of World War II. Emerging from the revelations of grotesque Nazi research, the 1947 Nuremberg Code marked the first time that anybody really tried to set international rules for what you could and could not do in medical experiments, Charo said. But that document didn’t have the force of law. Consider the concept of informed consent, which is the idea that research subjects not only have to be told that they are research subjects but that they also must agree to participate, understand what they’re agreeing to and be free to say “no.” It’s part of the Nuremberg Code. Seems pretty basic. But it wasn’t written into U.S. law until 1974, when Congress passed the National Research Act, said C. K. Gunsalus, director of the National Center for Professional and Research Ethics.

And experiments that fall outside biology, medicine or weapons development often fall through the cracks completely. There aren’t any international rules on the use of geoengineering, for example, and it’s not totally clear whether any U.S. laws apply. An American scientist who wanted to hack the planet might have to first produce an environmental impact statement and get the idea through the bureaucracy laid out in the National Environmental Policy Act. But that system has loopholes for research and technical feasibility studies — and for “extraordinary circumstances.” Meanwhile, there are countries that have basically marketed themselves as places where scientists can go to do research that is banned at home. Want to experiment on primates without the strict laws imposed by governments in North America and Europe? Your lab can visit beautiful Mauritius.

Restrictions on science imposed by financial incentives are also slippery. Governments have exerted control over research through rules around what they will and won’t fund, Charo said. When we talk about restrictions on stem cell research in the U.S. that were put in place by the George W. Bush administration, that wasn’t so much a ban as a declaration that federal funding couldn’t be used to do it.

That’s especially important when we’re thinking about technologies like CRISPR. One of the things that makes this tool a big deal is that it’s cheap, relatively speaking. That’s why many expected under-the-table CRISPR research: scientists may not need to depend as much on government funding to finance their work.

The final limiting force is one you might recognize from politics: norms. You might be able to skirt the law. You may find the funding to do ethically dubious research. But will you be able to look your peers in the eyes — and keep your job — in the morning?

There are lots of different ways this system has real power. Publication in a peer-reviewed journal marks the difference between legitimacy and laughingstock. Peer review controls governmental and non-governmental funding sources. A scientist, in whatever field, who violates the codes of ethics established by his research institution or his professional organization could find himself friendless and jobless … and labless.

But He risked it anyway. Norms aren’t foolproof, in part because they can conflict with each other. Scientists say He violated the ethical norms of their industry. But intense competition is also one of science’s norms, said Deborah Johnson, emeritus professor of applied ethics at the University of Virginia. So there’s an incentive to push as hard — and as creatively — as possible.

Johnson was recently part of a committee on responsible science updating some of these normative statements of ethics for the National Academies of Sciences, Engineering, and Medicine. “The idea was to shift from blaming individuals when they did something wrong to recognizing the system that created those people,” she told me. “What made it more likely for them to behave in the way they did.” To her, it matters that there’s a culture of thinking about science as a profit center and of all research as potentially marketable. It matters that scientists are competing intensely for jobs, research dollars, publication spots. Splashy results matter more now, she said.

In other words, you could say that He is as much a product of the community of science as he is in violation of it. The mad scientist will never go away, either as trope or reality. This dark side of science is too much a part of how science works. We’re still not that far away from a time when science answered to no one, save God (and the occasional puma-woman).

But the leaky sieve of stopgaps I’m describing here is not purely a bad thing. There’s a lot of research that once was reviled but turned out to be not so bad after someone slipped through the cracks and tried it. In vitro fertilization was once extremely controversial and its inventors were denied funding from the U.K.’s Medical Research Council. Today, IVF is a fixture in fertility treatments. Whether we end up considering a scientist mad or genius often depends more on the cultural context of how we balance risks and rewards than it does on the technology itself, Charo told me.

That doesn’t make it easier to make sense of radical scientific ideas in the moment, however. Back in 1975, a landmark gathering of scientists, lawyers and ethicists hashed out a series of voluntary guidelines that would govern research on genetic engineering for decades to come. To scientists, Gunsalus said, the Asilomar conference on recombinant DNA represented a groundbreaking moment for transparency and ethics. Here was science — an industry that had only recently decided medical research required informed consent of patients — coming together to consider the risks of research before it happened and make rules about what was and wasn’t OK to pursue.

But politicians felt differently. To them, Asilomar was a bunch of unelected people making decisions that affected the public with no input from the public. Scientists saw successful self-regulation in the public interest. Sen. Ted Kennedy saw public policy being made in private.

And maybe the real trouble here is that both those perspectives are true.

Maggie Koerth was a senior reporter for FiveThirtyEight.
