by Ananya Sen
Figure 1. War armor: this garb was worn by doctors who treated the plague. The beak was stuffed with herbs to keep away the bad odors that were believed to be the cause of infectious diseases. Source
History is rife with techniques and materials used to prevent microbial growth. The Egyptians are famed for their process of mummification, which preserves the skin and organs of deceased bodies. They circumvented putrefaction by dehydrating the body and using chemicals such as natron (a salt) that effectively sterilize the body and prevent bacterial decomposition (similar to how beef jerky is made). There are several other examples: the Greeks used silver and copper containers to store food and wine, explorers like Christopher Columbus traded spices because of their use as preservatives (or to disguise bad tastes), and soldiers in the American Revolution were given three pints of spruce beer every day because of the lack of potable water. Furthermore, hypochlorous acid and hydrogen peroxide were used as household disinfectants long before it was known that the immune system uses those same chemicals to destroy invading pathogens.
The main purpose of disinfection and sterilization, then and now, is to counter the spread of microbes, especially pathogens. Although the causative mechanism of disease progression was unknown, scientists postulated a connection between putrefaction and the dissemination of disease. Therefore, they set out to eliminate the effects of decomposition to reduce the incidence of infections. These attempts led to a multipronged approach. I will trace the history of each approach separately.
Chemical combat
One of the first formal studies on sterilization was carried out by Dutch merchant and scientist Antonie van Leeuwenhoek in 1676. Peering into his microscopes, Leeuwenhoek realized that chemicals such as pepper and wine-vinegar caused bacterial death, as evinced by the bacteria's loss of motility. His work was followed up by English physician Edmund King (in 1693), who added sulfuric acid, salt, sugar, wine, ink, and blood to the list of chemicals that could be used as sterilization agents. The 18th century saw the discovery of chlorine in 1774 by the Swedish chemist Carl Scheele and the discovery of hypochlorites in 1789 by the French chemist Claude Berthollet. These chemicals had the distinction of countering the odors associated with putrefaction. Unfortunately, they reinforced the mistaken belief that disease was caused by decay and noxious smells.
Figure 2. Listerine, developed in 1879 by Joseph Lawrence and named in honor of Joseph Lister, was originally formulated as an alcohol-based surgical antiseptic. Source
It was Louis Pasteur's work from 1860 to 1864 that cemented the germ theory of disease. Following that, microbiologists scoured their surroundings to find chemicals that could be used to kill microbes. In 1865 Joseph Lister tested phenol as a disinfectant to treat a fracture wound. He chose phenol because he had heard of its ability to prevent the odor of sewage in a nearby village. However, the credit for using phenol is not Lister's alone. He had been unaware of the work of several others (Kuchenmeister in 1860, Lemaire in 1860) who had already shown that phenol is effective in treating wounds (and that is why it is important to read the current scientific literature!). To Lister's credit, he succeeded in convincing several surgeons to adopt his antiseptic technique; his persuasiveness can be partly attributed to his success in treating Queen Victoria's armpit abscess with phenol. He also worked out the concentrations of phenol that were sufficient to treat wounds, since undiluted phenol is highly caustic and inflames tissues. In 1881 Robert Koch published a detailed report on the preparation of over 70 chemicals that could be used to kill anthrax spores, including iodine, potassium permanganate, formic acid, quinine, and oil of turpentine. The final cornerstone in establishing the principles of chemical disinfection was the work done by Kronig and Paul in 1897. They recognized that bacteria are not all killed simultaneously; the rate of killing depends on the concentration of the chemical and the temperature used. These principles led to the development of the famous phenol coefficient test for disinfectants (Rideal and Walker, 1903). The test measures the bactericidal activity of a chemical relative to phenol using a fixed contact time (7.5 minutes), a diluent (distilled water), and a standard test organism (they used Salmonella typhi). A modified version of this protocol is now the basis for testing modern disinfectants.
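To make the arithmetic of the test concrete, here is a minimal sketch of the calculation. The dilutions in the example are invented for illustration, not Rideal and Walker's actual data.

```python
# Illustrative phenol coefficient calculation; all numbers are made up.

def phenol_coefficient(disinfectant_dilution: float,
                       phenol_dilution: float) -> float:
    """Ratio of the greatest dilution of a disinfectant that kills the
    test organism within the contact time to the greatest dilution of
    phenol that does the same. Dilutions are given as 1/x, e.g. 450
    for a 1:450 dilution."""
    return disinfectant_dilution / phenol_dilution

# If a disinfectant still kills S. typhi at a 1:450 dilution while
# phenol works only up to 1:90, it is five times as potent as phenol.
print(phenol_coefficient(450, 90))  # -> 5.0
```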
Some like it hot
One of the earliest records of heat sterilization comes from 1810, when the Parisian chef Nicolas Appert, seeking to preserve food, sealed it in glass jars and placed them in boiling water. This simple method became widespread and rightfully earned him the title "the father of canning". In 1837 a German physician, Theodor Schwann, used heat sterilization to disprove spontaneous generation: when sterile sugar solutions were exposed to air, the sugars decomposed, but no decomposition occurred when the air was heat-sterilized. Years later, in 1860, Louis Pasteur conclusively proved that heat could destroy microorganisms using his famous swan-neck flask experiment. Subsequently he insisted that surgeons pass their instruments through a flame to destroy adhering microbes, and advocated the use of heat-sterilized bandages on open wounds. Pasteur is also credited with developing pasteurization, a method that prevents the spoiling of milk, beer, and wine by heating them briefly to 50 – 60°C.
Figure 3. Papin's steam digester that served as a model for the modern autoclave. Source
The primary disadvantage of pasteurization is that it does not kill all the microbes; it only reduces the number of viable pathogens. Seeking true sterility, British physicist John Tyndall developed the process of Tyndallization in 1877. Here, the liquid to be sterilized was heated at 100°C for 30 minutes on three successive days. The first round killed the vegetative cells, the second round killed the vegetative cells that had germinated from endospores, and the third round was a precautionary measure. This was a tedious process, and the results were not always reproducible. The problem was solved in 1879 with the development of the autoclave by Charles Chamberland. He modeled the autoclave on the steam digester invented by French physicist Denis Papin in 1679 (Figure 3). Autoclaving subjects materials to high-pressure steam at 121°C for 15 – 20 minutes. This method destroys all bacteria (except some extreme thermophiles), viruses, fungi, and spores, and is still the primary method of sterilization today.
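Kronig and Paul's insight that killing proceeds at a rate, rather than all at once, is why autoclave hold times matter. The sketch below assumes first-order thermal death kinetics and an illustrative decimal reduction time (D-value); neither the function name nor the numbers come from the article.

```python
# A minimal sketch assuming first-order thermal death kinetics: each
# D-value (minutes at a given temperature) cuts the viable population
# tenfold. The D-value below is illustrative, not a measured constant.

def survivors(n0: float, minutes: float, d_value: float) -> float:
    """Viable count after heating: N = N0 * 10^(-t/D)."""
    return n0 * 10 ** (-minutes / d_value)

# Starting from a million spores with a hypothetical D-value of
# 1.5 min at 121°C, a 15-minute hold gives a 10-log reduction:
print(survivors(1e6, 15, 1.5))  # -> 0.0001 expected survivors
```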
Razed by Radiation
The first scientific study on the effect of light on microbes was carried out in 1877 by Downes and Blunt. They showed that exposing solutions of sugar water to sunlight prevented microbial growth. They further demonstrated that this phenomenon depended on the intensity, duration, and wavelength of the light used; the shorter wavelengths of the solar spectrum were the most effective. This result was confirmed a year later by Tyndall, who noticed that cultures exposed to sunlight showed no bacterial growth. However, when these cultures were subsequently moved to warmer temperatures, they did grow. This observation led him to hypothesize that light merely inhibited bacterial growth rather than killing the cells. He did not consider, however, that the high concentration of bacteria in his flask was probably shielding some cells from sunlight damage. In 1885 Duclaux and Arloing demonstrated the killing effect of sunlight using pure cultures of Bacillus anthracis; the spores of these bacteria were unable to germinate after prolonged exposure to sunlight. Even so, the region of the spectrum responsible for the killing action remained unknown.
In 1892 Theodor Geisler demonstrated that UV rays were the most effective at killing Salmonella typhi. Using a prism and a heliostat, he investigated the effect of different regions of the spectrum on bacterial growth; with uranium-glass tubes, which allow UV rays to pass through, he showed that UV caused the most damage. Between 1893 and 1895 Marshall Ward unequivocally proved that the violet-blue and near-UV rays were the most harmful to bacteria. He built on Geisler's experiments and meticulously constructed lens systems that allowed for maximal stability of illumination. He showed that growth was inhibited in the UV-violet-blue region, with a sharp decline in the effect at the border of blue and green light; no other region of the visible or infrared spectrum had this effect. However, none of these studies quantified the intensity of light required to kill bacteria. Such a study was carried out by Hertel in 1904, who showed that the deleterious effects of radiation were inversely proportional to the wavelength of light and directly proportional to the dose (intensity × duration) of radiation. All these studies contributed significantly to public health policies on water sterilization and reduced the incidence of infectious diseases.
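As a rough illustration of Hertel's relationship, the sketch below treats damage as proportional to dose and inversely proportional to wavelength; all values, units, and function names are invented for the example.

```python
# Illustrative sketch of Hertel's 1904 relationship: germicidal effect
# scales with dose (intensity × duration) and inversely with
# wavelength. All numbers are hypothetical.

def uv_dose(intensity_mw_cm2: float, seconds: float) -> float:
    """Dose (mJ/cm^2) = intensity (mW/cm^2) × exposure time (s)."""
    return intensity_mw_cm2 * seconds

def relative_effect(dose: float, wavelength_nm: float) -> float:
    """Relative damage, taken as proportional to dose / wavelength."""
    return dose / wavelength_nm

dose = uv_dose(0.5, 120)           # 60 mJ/cm^2
print(relative_effect(dose, 254))  # shorter wavelength: more damage
print(relative_effect(dose, 405))  # same dose, longer wavelength: less
```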
Filtration: the non-violent line of defense
Figure 4. Original map by John Snow showing the clusters of cholera cases in the London epidemic of 1854. The pump is located at the intersection of Broad Street and Cambridge Street. Source
Filtration was practiced as early as 4000 B.C. Ancient Greek and Sanskrit writings recommended filtering water through charcoal. These attempts were intended to reduce the visible cloudiness (turbidity) of drinking water. The first formal study of the benefits of filtration was carried out by Francis Bacon (published posthumously in 1627), who was trying to desalinate sea water. His experiments laid the foundation for using sand filters to remove particulates from water. In 1804 John Gibb used sand filters to supply filtered water to residential areas in the U.K., thus creating the first treated public water supply in the world. Thereafter the practice of filtration became commonplace. Its virtues were cemented by John Snow during the 1854 Broad Street cholera outbreak. His statistical studies showed a link between the quality of the water source and the incidence of cholera (Figure 4). This convinced the authorities to disable the Broad Street pump, which ended the outbreak. The materials used to make filters came to include cotton wool (Schröder and von Dusch, 1854) and ceramics (Chamberland, 1884). The final demonstration that filtration leads to sterility was carried out by Tyndall in 1876 – 1877. He showed that if dust and microorganisms were removed by filtration, the air would not contaminate sterile solutions left exposed to it, whereas the solutions did become contaminated when left open to air containing particulate matter. The filters used today can remove 99.97% of 0.3 μm particles, which includes most bacterial cells but not viruses. Nanofilters with a pore size of 20 – 50 nm are used to remove viruses.
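A quick way to see what "99.97% of 0.3 μm particles" means is to express it as a log reduction value, as filter makers often do. The snippet below is just an illustration of that conversion.

```python
import math

# Illustrative conversion of a removal efficiency into a log
# reduction value (LRV): LRV = -log10(fraction passing through).

def log_reduction(removal_fraction: float) -> float:
    return -math.log10(1.0 - removal_fraction)

print(round(log_reduction(0.9997), 2))    # HEPA-grade: ~3.52 logs
print(round(log_reduction(0.999999), 1))  # a 6-log filter: 6.0
```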
Current methods of sterilization
Figure 5. Current methods of sterilization.
The methods of sterilization employed nowadays vary with the place of use. In the food and dairy industries, disinfection and sanitization are more important than total sterilization. Conversely, in laboratory studies, equipment and media must be sterile to prevent any contamination. The various methods of sterilization are summarized in Figure 5.
Ananya is a graduate student in the Department of Microbiology at the University of Illinois at Urbana-Champaign. She works in the lab of James A. Imlay. Ananya has recently started a blog of her own called "The History of Science."