Two Versions of Simplicity in Science: Structural vs. Functional Genetics

Simplicity is often a virtue. We like to call this virtuous simplicity parsimony, and there are all sorts of reasons to think that parsimony is fundamental to defining useful knowledge. Borges’ “On Exactitude in Science” is a classic here: in a single paragraph, he artfully shows the worthlessness of a perfect map, since the map would be the size of the territory, and thus no aid in navigation. But what does simplicity look like in the sciences? Does it have just one form?

In today’s science studies workshop, we discussed Evelyn Fox Keller’s The Century of the Gene. The book traces the history of 20th century genetics, from the rediscovery of Mendel through the Human Genome Project, and shows how the concept of the gene was initially quite vague, but became much more focused following Watson and Crick’s publication of the structure of DNA, which seemed to have all the complex properties needed to be the material form of genes. Following advances in the 1970s–1990s, however, our notion of the gene has become fuzzy once again, as we learned that not all DNA sequences code for proteins, and that there are hosts of mechanisms both inside DNA sequences and in the supporting cellular infrastructure that regulate DNA activation, detect and correct errors, and even induce mutations. The book is short and entirely free of (science studies) jargon, and seems aimed more at a general reader, or perhaps an undergraduate biology or history student, than at an STS crowd. Because of that, it leaves a lot of material undertheorized, though it is also a much easier, quick afternoon read.

To simplify an already simplified story, one of the tensions in the book is between two notions of what a gene is: a kind of stuff (e.g. a DNA sequence), or a set of functions (the unit of transmission of heredity, a piece of the code for a complete organism). Fox Keller traces the history of these competing notions of the gene back to just before the term itself was coined, in the early 20th century, and then follows them through to the end of the century. The expanded understanding produced by late 20th century developments (debates over gene activation and junk DNA, etc.) burst open the exact linkage of one gene to one protein, and with it the tight connection between structure and function.

All of this is a fascinating refresher and update for someone whose last biology class was a decade ago in high school, around when this book was published. But I think the book also makes a number of points, implicitly, that are relevant to STS scholars in other areas. One concerns the rise and fall in the specificity of a term: “gene” was a useful term in the early 20th century because of its openness (genes are how organisms do heredity, whatever that ends up being), and was useful again in the mid-20th century for its concreteness (DNA = genes), but eventually had to be reopened or abandoned for the field to make more progress.

Another potentially useful generalization from the history of 20th century genetics concerns different understandings of a simple or parsimonious explanation. The debate referred to above, between “structural” or material understandings of what genes are and functional understandings, maps onto two different understandings of a simple explanation or theory, what I’ll call “ontological simplicity” and “mechanistic simplicity”. You can think of these as being tightly related to issues of “representation” vs. “intervention” in an Ian Hacking sense. An ontologically simple theory posits some basic building blocks whose interactions explain the phenomenon of interest. Atomic theory is the obvious parallel here: all of chemistry and biology can be reduced to the interaction of a few basic atomic building blocks. Similarly, one version of genetics holds that organisms (phenotypes) can be explained in terms of the interactions of basic building blocks, genes, that share a common structure, DNA.

On the other hand, a mechanistically simple theory is one that focuses on the least intervention required to produce a given change, given a host of other (unspecified) factors that are held constant. So, mechanistically simple theories tell us that if you swap this bit of DNA with this other bit, you produce a fly with too many eyes, or a human being with Tay-Sachs. If you swap other bits of DNA, or mess with the cell environment in another way, something entirely different happens. These theories (findings, phenomena) allow for a very complex world filled with junk DNA and gene-environment interactions, but one whose complexity can be bracketed for certain purposes. As Fox Keller describes, the success of the cloning of Dolly the sheep had more to do with scientists figuring out little tricks to get a mammalian ovum to behave than with some deep underlying theory of why mammal and lizard ova behaved differently such that the latter were relatively easy to clone. The process worked, and worked repeatably, and thus produced a new phenomenon, a successful intervention.

These two notions of simplicity don’t necessarily work at cross purposes, and may indeed work best in iteration: come up with a new, parsimonious ontology, then tinker with it to produce new phenomena in ways that bust up the ontology, and repeat if possible. But the two are distinct, and it’s worth thinking about the ways that debates over parsimony in other fields (say, economics) may map onto the same or similar categories and thus produce tensions when results are less clear. Parsimony is a virtue, but not always a simple one.
