The battle against germs never ends. It only escalates.

A century ago, a patient entering the hospital for surgery had a better-than-even chance of picking up an infection. Now, only 5 percent of surgical patients acquire a new infection during their stay.

When hospitals turned to single-use supplies starting in the 1970s, the trend seemed like the ultimate solution to worries about sterilization. Simply throw the problem away.

A century ago, hospitals began setting up sterilization programs to clean equipment such as surgical instruments and wound dressings. Autoclaves, heat sterilizers that kill germs with steam or with air heated by steam, became common after 1900.

The ability to fight infection dawned slowly on the medical profession. In 1847, Ignaz Semmelweis, a Viennese obstetrician, made the first connection between childbed fever, a maternity ward killer, and doctors who went straight from performing autopsies to delivering babies without washing their hands.

Semmelweis required physicians to wash in a chloride of lime solution after autopsies and with soap and water between patients.

Though the number of deaths dropped immediately, Viennese doctors ridiculed and insulted him. He was forced to leave the city and died in an insane asylum.

In the 1860s, a patient undergoing an amputation in a large urban hospital had a 2-in-5 chance of dying as a result of the operation, while in small rural hospitals the same operation carried just a 1-in-9 risk.

British surgeon Joseph Lister (whose name was later used for Listerine) took the first major steps to correct the situation in large urban hospitals. Drawing on the germ theory developed by Louis Pasteur, Lister set about preventing infections after surgery.

Zeroing in on microbes, Lister cleaned wounds and operating instruments with a carbolic acid solution, dressed wounds with lint soaked in the solution, and even sprayed the solution into the air. Later he began washing his hands in the solution before surgery.

The result: The death rate after amputations dropped to 15 percent from 46 percent.

From killing germs with chemicals, known as antisepsis, it was a short step to asepsis: keeping germs away from the patient altogether by sterilizing with heat. The new methods meant that more could be sterilized, and in a more systematic way: sheets, dressings, instruments.

Another important advance was the use of gloves. William Halsted, a surgeon at Baltimore’s Johns Hopkins Hospital, asked the Goodyear Rubber Company to make two pairs of rubber gloves for his operating room scrub nurse, whose hands were irritated by the chemical solution he used to kill germs.

By 1899, surgeons at Johns Hopkins began wearing gloves not to protect their hands but to protect the patient.

The switch from chemicals to heat at the turn of the century forced surgeons to begin using the shiny instruments we know today, made of stainless steel or chrome-plated steel.

“Surgeons at that time were using instruments that had elaborate ebony or ivory handles, which were porous and could hide germs,” said James Edmonson, curator of the Dittrick Museum of Medical History at Cleveland’s Case Western Reserve University. “The handles couldn’t stand up to the heat in the autoclaves.”

In an effort to eliminate places for germs to lurk, operating rooms in the early 1900s were stripped of carpets and curtains; furniture became strictly utilitarian. The modern operating room had white tiled walls and floors, white enameled metal furniture and powerful overhead lights.

By the 1920s, however, doctors complained that operating rooms were too bright; the glare strained their eyes. After experimenting with different colors, doctors settled on one that seemed to complement the bright red of blood freshly exposed to oxygen: the drab green still familiar in scrubs today.

The 1930s and 1940s brought antibiotics: first the sulfa drugs, later penicillin. After World War II, GIs brought home more resistant strains of gonorrhea from Asia. Stronger antibiotics and more sophisticated use of statistics helped in the fight against infectious diseases.

Then came AIDS. In the 1980s, hospitals, with the encouragement of the Centers for Disease Control, instituted “Universal Precautions.” The standard advised that all blood and most body fluids, from any patient, should be considered potentially infectious.

“That’s when you started seeing more rigorous handwashing protocols,” said Elizabeth Kinion, director of faculty practice at the University of Akron’s School of Nursing. “Instead of putting the cap back on a syringe, just put both in the red container.”

The contents of the red container were classified as hazardous waste, regulated by the U.S. Environmental Protection Agency.