
How evil can prevail in state-sanctioned biowarfare research

By Laura H. Kahn, June 16, 2008

Some people consider physician Wouter Basson South Africa’s Josef Mengele. During the 1998 Truth and Reconciliation hearings on Project Coast, South Africa’s apartheid-era chemical and biological warfare program, Schalk Janse van Rensburg, a veterinarian, stated that Basson, the program’s head, wanted to devise ways of killing individuals that would be undetectable by a forensic laboratory. Van Rensburg confirmed that a front company for the South African bioweapons program produced coffee chocolates laced with anthrax, peppermints laced with cyanide, and beer laced with botulinum toxin.

Similarly, Adrian Jacobus Goosen, a veterinarian involved in the research, testified that Basson once told him, “One day when the black people take over the country, and my daughter asks me, ‘Daddy, what did you do to prevent this?’ my conscience would be clean.” Jack Bothma, an orthopedic surgeon and former colleague, claimed that after experimenting on three black prisoners with an ointment in 1983, he and Basson injected them with a lethal drug before loading their bodies onto a plane to be dumped into the Indian Ocean.


For these and other alleged deeds, in the late 1990s, the South African government charged Basson with 61 counts of murder, fraud, and drug trafficking. His trial lasted two years, and ultimately, he was acquitted of all charges.

Sadly, Basson isn’t the first physician to be accused of such atrocities.

Earlier biowarfare programs

Germany developed one of the world’s earliest biowarfare programs during World War I, which Robert Koenig documented in his book, The Fourth Horseman. Horses and mules were the intended targets because the military relied on them for transportation.

The lead scientist was Anton Dilger, an American-born, German-educated physician who conducted biowarfare research in the basement of a house in Chevy Chase, Maryland. Koenig described Dilger as a fervent German supporter, extremely cultured, extroverted, and attractive. He didn’t harbor prejudice against any particular group, and his biowarfare efforts focused solely on animals–for example, he injected guinea pigs with glanders to test its effectiveness at killing. His efforts appeared to be somewhat successful, since horses en route from the United States to Britain had to be thrown overboard because they were sick with glanders.

During World War II, Ishii Shiro, a Japanese physician, convinced his superiors that a bioweapons program would be strategic and cost-effective. Once given the go-ahead, Ishii built up a massive program, including the notorious Unit 731 in Ping Fan, where his experiments on humans included studies on frostbite, poisons, and electrocution. Even subjects who survived the experiments were killed to ensure secrecy. According to Sheldon Harris’s Factories of Death, Ishii and his fellow researchers referred to their human subjects as “marutas”–meaning “logs.” By dehumanizing their Chinese captives, the Japanese had no ethical qualms about killing untold numbers of them. Ishii was a racist and womanizer who frequently stepped on others for career advancement.

Difficult as it is to comprehend, Mengele committed even worse atrocities against helpless prisoners. An ardent believer in the Nazi ideology of genetic “purity,” Mengele was an extreme racist and anti-Semite. He had his own financing, laboratory, and staff to conduct his sadistic experiments, which included inducing gangrene, injecting chemicals into people’s eyes, and murdering people so he could perform postmortem examinations. He was described as popular, handsome, and well-groomed. Survivors later told of how he would show great affection to many of his child twin subjects and then, without hesitation or remorse, kill them.

Unlike the German and Japanese programs during World War II, the subsequent U.S. and Soviet bioweapon programs weren’t led by physicians or scientists determined to wipe out or commit atrocities against a specific group of people because of race, religion, or other prejudices. Still, there were serious unintended consequences. In the early 1950s, one death and numerous illnesses in San Francisco were attributed to an open-air test of the bacterium Serratia marcescens, which U.S. scientists wrongly believed to be harmless. And a 1971 smallpox outbreak in Aralsk, Kazakhstan, has been attributed to open-air testing the Soviets conducted on Vozrozhdeniye Island. The outbreak led to 10 infections and three deaths; 50,000 residents of the city had to be vaccinated within two weeks to contain it. Eight years later, an unintentional leak of anthrax spores from a production facility in Sverdlovsk, Russia, caused the deaths of more than 60 people, 7 sheep, and 1 cow.

The role of prejudice and dehumanization

Prejudice is typically defined as a negative prejudgment against a group of people. For some, prejudice is so extreme that it leads to hatred and a desire to kill. But prejudice isn’t necessarily a bias against another group; rather, it’s a preference for one’s own group.

There’s evidence that people are hardwired for prejudice. From an evolutionary perspective, it would make sense to be wary of strangers outside of one’s small band or tribe. (See “The Genetic/Evolutionary Basis of Prejudice and Hatred.”) Indeed, children are taught not to go with strangers. But when prejudice leads to physical, psychological, sociological, economic, or political trauma for those belonging to the “outside” group, it becomes pathological and dangerous.

In complex, heterogeneous societies, people encounter individuals of many different races, ethnicities, religions, and nationalities; therefore, prejudice and discrimination are counterproductive for a harmonious existence. Some societies, particularly in Europe, have dealt with the issue by purging their ethnic minorities. In a March/April 2008 Foreign Affairs essay, Jerry Muller argues that ethno-nationalism has defined post-World War II Europe. While the United States clearly has problems with prejudice and discrimination, it has an advantage in that virtually everyone–save for Native Americans–is from somewhere else. As a result, ethno-nationalism could be considered less of a force in national life than it is in some other countries.

But in some heterogeneous societies, people from one group consider those from other groups as the cause of their ills. At the individual level, this prejudicial response can be an attempt to maintain self-esteem in difficult times; instead of taking responsibility for their failures or misfortunes, some people prefer to blame others.

Unethical research on dehumanization and torture

Sometimes social context determines human thought and behavior. In extreme circumstances, seemingly normal people can become monsters. For example, the 1971 Stanford Prison Experiment demonstrated that if given absolute power and control, normal people are capable of dehumanizing and torturing their fellow humans. In this case, young men answered an advertisement to earn money in a two-week experiment. They were randomly divided into two groups–prisoners and guards.

After being “arrested” and placed in “prison,” the prisoners were forced to strip and were given identifying numbers. The guards wore identical khaki uniforms and sunglasses and carried clubs. After only six days, the experiment had to be stopped. About one-third of the guards showed delight in the power they wielded and were particularly inventive and sadistic in how they humiliated and tortured the prisoners.

While ethically abhorrent, the Stanford Prison Experiment suggests that anyone can turn into a “monster” if given unrestricted power. In this context, the horrors of Abu Ghraib shouldn’t be surprising.

A decade earlier, Stanley Milgram conducted experiments at Yale University that showed how normal people could be coerced into torturing their fellow humans. In Milgram’s study, 40 men between 20 and 50 years of age answered an advertisement to participate in a memory and learning experiment. They were paid $4.50 just for showing up. The study’s purported goal was to see what effect punishment had on learning. In actuality, it sought to determine how far the subjects would be willing to go in torturing their fellow humans.

Those conducting the experiment rigged up a fake electric generator and hooked up a “learner”–in actuality, a confederate in the study–to make it look as though he would receive electric shocks each time he gave an incorrect answer. The subject was asked to increase the shock’s voltage with each wrong answer. The voltage ranged from 15 volts (slight shock) to 450 volts (severely dangerous shock).

The study found that 65 percent of the subjects obeyed the experimenter and administered shocks all the way up to 450 volts–despite the “learner’s” screaming. The rest continued to at least 300 volts before refusing to go further at some point thereafter. Many of the subjects showed symptoms of extreme stress–for instance, one subject experienced a convulsive seizure. Despite the questionable ethics involved, the study was reportedly repeated in Australia and Italy with similar results.

Unethical medical research

From 1932 to 1972, researchers with the U.S. Public Health Service conducted experiments on 399 poor black men suffering from syphilis. In what became known as the Tuskegee Syphilis Experiment, the researchers told the subjects that they had “bad blood”; the actual aim was to see how syphilis affected blacks. Even when penicillin became available to cure the disease, treatment was withheld from the subjects. Over the course of the experiment, 40 wives became infected, and 19 children were born with congenital syphilis. A whistleblower contacted the media, and the story broke in the Washington Star in July 1972. Only then did the government stop the study.

While the Tuskegee study was under way, between April 1945 and July 1947, physicians working for the U.S. atomic weapons program deliberately injected plutonium into 18 Americans–including a railroad porter, a janitor, a housewife, and a storeowner. All but one died long, agonizing deaths. In 1987, Eileen Welsome, a reporter for the Albuquerque Tribune, discovered files about the experiment in a safe at Kirtland Air Force Base in New Mexico. The identities of the experiment’s victims would have remained unknown if not for her determined efforts to track down retired scientists and submit countless Freedom of Information Act requests for technical documents.

Unfortunately, such experiments were commonplace during the early decades of the atomic age. For example, the Defense Atomic Support Agency contracted with the University of Cincinnati to expose more than 90 cancer patients to total-body irradiation. At Vanderbilt University, 829 pregnant women were given drinks containing radioactive iron and told that the drinks would benefit them and their babies. More than 100 prisoners in Oregon and Washington had their testicles exposed to high levels of radiation. Most of the victims of these experiments were poor and/or disenfranchised minorities. (See Welsome’s The Plutonium Files.)

Human-subject research protections

The atrocities committed against human subjects during World War II led to the Nuremberg Code, which recommended that all human subjects be given information about a prospective study before providing their voluntary consent and that a study’s benefits outweigh its risks. In 1949, the World Medical Assembly adopted an International Code of Medical Ethics, which spelled out the duties of physicians to their patients. Fifteen years later, the World Medical Assembly adopted the Helsinki Declaration, which outlined basic principles for protecting human subjects.

On July 12, 1974, the U.S. Congress passed the National Research Act, which established a national commission to identify the ethical principles that should underlie all human-subject research. The commission issued the Belmont Report, which set out three basic principles–respect for persons, beneficence, and justice; in other words, respect people, do no harm, and treat people equally and fairly. Since then, additional U.S. regulations and international guidelines pertaining to human-subject protections have been enacted. For example, FDA regulations require institutions to have institutional review boards (IRBs) that review and monitor biomedical research involving human subjects. Still, this system has its weaknesses: IRBs often have too many studies to review and too little time or expertise, and some IRB members have conflicts of interest.

Preventing future research atrocities

It’s important to note that South Africa’s clandestine chemical and biowarfare program developed after the Nuremberg Code and the Helsinki Declaration, showing that, while important, such codes aren’t sufficient to protect people from dangerous political regimes. It could be argued that a democratic form of governance can prevent research abuses. But the Tuskegee and plutonium experiments show that research atrocities can occur even in democracies. Democracies do, however, allow for a free and independent press–a critical component in exposing unethical research programs, as Welsome’s journalistic tenacity proved.

Professional human subjects–volunteering to be a subject for the pharmaceutical industry is now a lucrative endeavor–have taken matters into their own hands. A print and online magazine exists for members of the human “guinea pig” community. Similarly, there’s an alliance between researchers and subjects to protect human rights. But it’s unclear if these groups would have the ability to investigate, expose, and stop unethical research.

And what about physicians in state-sanctioned bioweapon programs? Could another Ishii or Mengele appear in the future?

In some ways, medical training inadvertently encourages those who are predisposed to dehumanizing others. For example, from the first day of anatomy lab, some medical students must mentally dehumanize their cadavers in order to dissect them. (See “Cadavers Give Docs Leg Up in Training.”) And the long, grueling hours of internship and residency can lead to anger and frustration–particularly when dealing with difficult, abusive, and sometimes violent patients.

Occasionally, physicians-in-training begin to refer to their patients in pejorative terms. The House of God, a novel that gives an extremely cynical view of medical training, has become a cult classic among medical trainees because of its satirical portrait of the psychological traumas they must endure to become physicians.

The challenge of striking the right balance is considerable: Too much empathy, and physicians can become overwhelmed by the human tragedy they witness daily. But if they distance themselves too much, they can become cold automatons.

Today, some medical schools are addressing this challenge. For example, Columbia University maintains a program in narrative medicine that combines the practice of medicine with literature in the hope of giving physicians and other health care professionals the tools to care for their patients in a holistic manner. Proponents argue that literature allows readers to step into the lives of others–to understand their struggles, hopes, and fears. Narrative medicine seeks to reinvigorate the “art” of medicine by encouraging practitioners to listen to their patients’ stories and, in doing so, acknowledge their humanity. (See “The Writing Cure.”)

Thus, the real challenge–whether during war or peace–is to prevent unethical activities before they become crimes against humanity. It’s a task that requires constant vigilance, education, and public awareness.
