The Two Quantitative Steps in the Biology Growth Curve
(Figure: The early tools of the exercise: the first one-step growth curve from the 1940 paper by Max Delbrück, obtained using B. coli and an unspecified phage.)
We are witnessing a highly influential influx of physicists, mathematicians, and engineers into biology. This is not the first time. Over the centuries, biology has been blessed by the involvement of people from other fields. During the last half century or so, we can point to two distinct incoming waves, each bringing in many people with different viewpoints and often greater quantitative skills. The first wave took place after World War II and played a key role in the development of modern biology. The second one is happening now.
In brief, I would like to compare these two episodes not as a historian but, perhaps naively, simply as one who has lived through these periods. I started my graduate work in 1950, around the heyday of the development of molecular biology, and I am still around to observe what is happening now.
In the first wave, one could distinguish two streams of physicists heading into the field. One group retained its traditional roots in physics and used big machines to study biological phenomena. Modern radiation biology came out of it, and although it provided magnificent tools such as isotopes, it was perhaps not as successful in answering basic biological questions as its proponents had envisaged. The other group started afresh, but with the typical physicist's way of thinking. For the object of their study they looked for the smallest units. They zeroed in on bacteriophages, shunning the bacteria that they thought too complicated. Their tools were simple: Petri dishes and test tubes. One paradigm was the development of the phage one-step growth curve by Delbrück and Ellis, which permitted quantitative studies of viral development. Another was Luria and Delbrück's fluctuation test, which provided credible evidence for the random nature of mutations. The only math needed was the Poisson distribution.
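The logic of the fluctuation test can be sketched in a few lines of Python. This is a toy simulation, not Luria and Delbrück's actual calculation; the parameter values and function names are my own illustration. If resistance arises only upon exposure to phage, resistant counts across parallel cultures should be Poisson-distributed (variance ≈ mean). If mutations arise at random during growth, an occasional early mutant founds a "jackpot" clone, and the variance far exceeds the mean.

```python
import random

def luria_delbruck_culture(generations, mu, rng):
    """Grow one culture from a single cell. Mutations arise at random
    during growth, so an early mutant leaves many descendants (a jackpot)."""
    cells, mutants = 1, 0
    for _ in range(generations):
        # each dividing non-mutant cell yields a mutant with probability mu
        new_mutants = sum(rng.random() < mu for _ in range(cells - mutants))
        mutants = 2 * mutants + new_mutants
        cells *= 2
    return mutants

def directed_culture(n_cells, p, rng):
    """Rival 'acquired resistance' hypothesis: each plated cell becomes
    resistant independently with probability p, giving Poisson-like counts."""
    return sum(rng.random() < p for _ in range(n_cells))

def fano(counts):
    """Variance-to-mean ratio: about 1 for a Poisson distribution."""
    m = sum(counts) / len(counts)
    var = sum((c - m) ** 2 for c in counts) / (len(counts) - 1)
    return var / m

rng = random.Random(0)
G, MU, CULTURES = 12, 1e-3, 300   # 2**12 = 4096 cells per culture
N = 2 ** G
ld = [luria_delbruck_culture(G, MU, rng) for _ in range(CULTURES)]
# match the directed model's mean to the random-mutation model's mean
p = (sum(ld) / len(ld)) / N
directed = [directed_culture(N, p, rng) for _ in range(CULTURES)]
print(f"random mutation: Fano = {fano(ld):.1f}")        # far above 1
print(f"directed change: Fano = {fano(directed):.1f}")  # near 1
```

Both hypotheses predict the same average number of resistant cells; only the fluctuation between cultures tells them apart, which is why so little math beyond the Poisson distribution was needed.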
Leaving the historical facts behind for the moment, there is an obvious lesson to be learned here. As much as technology can contribute to a science, even greater is the value of novel approaches arising from people entering from another field. I can attest that working in one area impels one to continue along lines that one is familiar with. One wants to answer the "next question." It is unusual, although not unheard of, to take a serious pause and ask: "What is the most important question in my field?" Not so for newcomers. They have the freedom of starting out with such questions.
(Figure: We can now look at the inside of a virus and see what its parts look like and how they interact. The tail machine of phage P22 at 9.4 Å resolution, as determined by cryoEM. Fifty-one subunits from five different gene products assemble to form the tail machine, and are represented in surface renderings at 1.5 sigma. The image on the right depicts a cutaway view to expose the tail machine interior.)
The current second wave has yielded an avalanche of studies, often started by people with a physical, mathematical, or engineering background. I suppose one could divide these newcomers into two categories as well, but the distinctions are different from those in the first wave. One category is represented by people who deal with technology, but in innovative ways. They may have been encouraged by the great physicist Richard Feynman who, when asked how one should study biology, answered: "Look at the thing." Resulting from these efforts have been stunning developments in optical, electron, and other microscopies. Sacred cows, such as the belief that the resolving power of the microscope is limited by the wavelength of light and the refractive index of the medium, have been left to graze in distant pastures. Likewise, cryotomographic techniques have allowed the electron microscope to be used on nearly undisturbed cells. I have commented on this Age of Imaging before.
The approach of the other category has been to make predictions of the quantitative sort, that is, models. This is their way of expressing the intricacies of a biological phenomenon in quantitative terms. A model is made and then it is tested against reality. It works in a reciprocal way. Sometimes the data used are already available, other times the models suggest experiments designed to test their plausibility, and sometimes both things are in play. For the student wanting to know more about this, I cannot think of a better source than the textbook Physical Biology of the Cell.
We are now witnessing stunning developments in all aspects of biology. Just look at what you see on the lab bench. Test tubes give way to microfluidic apparatuses, optical tweezers leave ordinary micromanipulation in the dust, and so forth. And all this has practical relevance. Clinical diagnoses are beginning to be based on extremely rapid techniques such as magnetic resonance analysis, etc., etc. And new algorithms are piled upon algorithms with the gusto that characterizes bioinformatics. In brief, we are transforming our science from Molecular Biology to Systems Biology (complete with all the ambiguities of this new term).
The transformation of the landscape has indeed been dramatic. I cannot begin to guess where this will lead us, other than to be confident that unforeseen and unforeseeable discoveries will be made by the three-way collaboration of biologists, physicists, and mathematicians. Certain goals, such as being able to observe at atomic resolution individual molecules in their natural cellular environment, may take a while to achieve. But who’s to say, maybe this, too, is just around the corner. And yet, have new laws of Biology emerged? Is this a paradigm shift? In the Kuhnian sense, maybe and maybe not. Are we concerned with familiar questions, albeit with fabulous new tools, or are we entering into a different world of biological understandings?