Not long ago–just a millennium or two–medical investigators clashed with conservative critics in an imbroglio that bears a curious resemblance to the firestorm over embryonic stem cell research.
As in the current debate, scientists wanted to use a controversial technique that could improve their understanding of human biology, with the promise of new treatments down the road. Opponents of the approach argued that it would violate important religious tenets and cheapen human life.
The practice that drew such dire condemnation was the dissection of human cadavers, which was outlawed in most places from the time of ancient Greece until the Renaissance.
It’s possible that stem cell research will one day be so routine that our battles over it will seem as quaint as the dissection dispute does now. Indeed, medical techniques that rang urgent ethical alarms just decades ago–such as organ transplantation and genetic research–these days are settling into respectable middle age.
That’s not to say that such advances were free of ethical problems. For example, there is nothing in principle stopping genetic researchers today from attempting some truly obscene creations–such as people with dog genes, or a full-blown clone of Jerry Springer.
In fact, it’s remarkable that researchers do not attempt more ethically dubious work, considering that sometimes the only barrier to such experiments is an individual’s conscience.
But for all the questionable choices that researchers have made, the recent history of medical ethics also abounds with slippery slopes that no one ever quite slid down, horrors that never happened.
In some cases, scientists restrained themselves from pursuing unseemly applications of their work. At other times, public mores simply shifted to accept new approaches that seemed unsavory at first.
The greatest danger, experts say, comes when innovative work proceeds out of public view, so that people cannot evaluate the changes that medical technology offers. That’s hardly a risk with stem cell research, given the prominence of President Bush’s recent deliberations on whether to grant federal funding to work on embryonic stem cells.
One of the best examples of scientific self-discipline happened in the 1970s with genetic engineering–a technology that easily could have unfolded in more troubling ways than it has.
In 1975, just as genetic research was starting to heat up, its leading stars realized the tremendous potential for abuse, said Alexander Capron, a professor of law and medicine at the University of Southern California. At a much-heralded meeting that year in Asilomar, Calif., 140 top genetic researchers agreed to suspend their work while they hashed out ethical and safety guidelines.
“There was a voluntarily imposed set of toeholds at the top of the slippery slope, before they slid down into anything more dangerous,” said Capron, who attended the landmark meeting and issued a strong warning about the personal legal liabilities attached to such research. “It’s a model for scientific self-restraint.”
Even in the frontier world of in vitro fertilization, doctors have not followed every troubling path that technology allowed.
Doctors at one time discussed the feasibility of taking donor eggs for IVF from female fetuses that were aborted, said Lori Andrews, a specialist in biotechnology law at Chicago-Kent College of Law. Yet so far, that option has proven too ghoulish even to be attempted.
“In part, that’s because of the potential impact of knowing that the mother of the child not only died, but in fact never lived,” Andrews said.
IVF itself was the subject of intensive hand-wringing when the first test-tube babies were born more than 20 years ago. In some ways, the technology turned into the opposite of a slippery slope–what began as a fearsome threat to natural reproduction has evolved into a widely popular alternative for couples who can’t have children of their own in any other way.
But the idea of using donor eggs from fetuses fits the classic definition of a slippery slope: the degeneration of seemingly acceptable technology into applications that the public never would have approved at the outset.
Some abortion opponents say that’s exactly what’s happening already with embryonic stem cell research.
Creating embryos
Although most cells used in such research come from spare embryos left over from IVF procedures, a Virginia group announced recently that it had created embryos solely for the purpose of stem cell research. A Massachusetts biotech company said soon after that it is trying to make human embryonic clones that could be harvested for stem cells.
Those studies created a challenge for stem cell advocates. The same American public that supports work on embryonic stem cells, according to public opinion polls, might be far less enthusiastic about a specialty industry that churns out embryos for research.
That’s why supporters of embryonic stem cell research, such as bioethicist Arthur Caplan of the University of Pennsylvania, were among those most alarmed by the Virginia and Massachusetts teams’ work. Their fear was that the questionable techniques might turn opinion against all embryonic research.
History suggests that the early stages of new medical techniques can be critical. Sometimes the only difference between a decisive breakthrough and an ethical nightmare is luck.
That was the case with heart transplants, which met a far different fate in Japan than in other industrialized nations.
In 1968, one year after Dr. Christiaan Barnard performed the first heart transplant, Dr. Juro Wada did the first such procedure in Japan.
But in contrast to the glory heaped on Barnard, Wada was investigated for murder because of suspicions that he had removed the donor’s heart before the donor was dead.
Although Wada was not indicted, the “Wada transplant” became so infamous in Japan that for decades the country shunned all transplants from brain-dead donors. No surgeon in Japan performed another heart transplant until 1999.
Such disasters illustrate the great importance of having the stem cell debate now, even though widespread uses of the technology are years in the future.
“When you don’t have a wider discussion beforehand, everything hinges on the first court case,” Andrews said.
Slippery-slope arguments account for just some critiques of new medical practices. Most opponents of embryonic research, for example, say their main objection is that it violates a bedrock principle: the sanctity of human life. The real horror, they say, is not what embryonic research could lead to, but the research itself.
Then again, that’s what the ancient Greeks said about dissecting cadavers.
When the Greek physician Hippocrates practiced during the 4th Century B.C., most Greeks believed that the immortal soul could not flourish unless the body was intact. Legal and religious bans also kept the great Roman physician Galen from performing autopsies.
Well into the 13th Century, the Catholic Church issued edicts that seemed to prohibit surgery on the living or the dead.
But those rules slowly dissolved in the 1300s as it became clear that physicians needed to do dissections to learn about the body’s inner workings.
When life begins
The passing centuries have brought changes to even the most basic ideas about when life begins.
Until the 19th Century, Catholic doctrine resembled that of Greek and Jewish scholars, who believed that a fetus becomes a person about 40 days into pregnancy–about the time when the mother feels the first movements in her womb. In 1869, Pope Pius IX moved the onset of personhood to the moment of conception, threatening excommunication for Catholics who had abortions at any stage of pregnancy.
In its 1973 Roe v. Wade decision, the Supreme Court ruled that the fetus’ legal status changes over time, with the most protection given after the developing child would be viable outside the womb–usually around six months into pregnancy.
Researchers today believe another crucial time is around two weeks after conception, when the embryo’s cells begin to differentiate into specialized functions. Until that time, the embryo still can split into twins, raising the question of whether it is an individual organism.
Many bioethicists say the embryo’s unsettled nature helps justify stem cell research on embryos younger than two weeks old.
Another proposal for when life begins came last year from former Rep. Tom Coburn (R-Okla.), an abortion foe who introduced a bill defining life as the presence of a heartbeat and brain waves. That would put the onset of human life back to around 40 days after conception.
With embryonic stem cell research, science has given new form to an old question: Do the benefits of medical technology outweigh the ethical problems that arise whenever someone devises a new way of tinkering with the human body?
The stakes
The question may be the same one that ancient Greek physicians faced with dissection, but the stakes are much greater now.
IVF has tamed the reproductive process and turned it into a breeding ground for new treatments. Advances in genetic research and stem cells have given scientists the ability to fundamentally change human biology, with applications that could be wondrous or dreadful.
Such thoughts may have been on the mind of Pope John Paul II last week when he argued against stem cell research during a visit with Bush.
By opposing such research, the pope said, “America can show the world the path to a truly humane future, in which man remains the master, not the product, of his technology.”
Yet medical science has advanced so fast that we already are, in fundamental ways, the products of our technology. Whether we remain its masters as well–whether we can prevent ghastly uses of promising research–is a question that will endure long after the embryonic stem cell debate is over.