“Biology is hard.” These underwhelming but wise words were spoken to me two years ago by a sixth-year grad student who had given up the frustrations of biological bench work in favor of computational biology. At the time, I had some inkling of what he meant. I had spent many long evenings in college working in a biochemistry lab, bent over the bench trying to figure out why my most recent experiment had failed. Often, the failure of an experiment meant some mistake on my part: a forgotten reagent, a botched procedure, a badly designed primer. These are lessons that most biological researchers learn the hard way, and they make us better scientists. The rest of the time, there were no answers as to why experiments failed. They just did. A gene wouldn’t clone or express, a protein would aggregate or refuse to crystallize, or an antibody wouldn’t blot the correct protein. These failures were the most frustrating because there was little to learn from them. One had to either try something fundamentally different or give up.
After six years of working in the life sciences, I have come to learn that this experience is widely shared among biological researchers. It is simply the nature of the work. But being a scientist, I had to ask: Why is that? What makes biology so hard to predict, parse, and engineer? The answer, left unspoken but widely acknowledged by biologists, is that living systems are simply too complex to be fully understood.
The failures that absorb so much of a biology grad student’s career are usually ascribed to the complexities within the cells they work with. Even biologists sometimes forget that cells, though stunningly well-tuned and elegantly functional machines, are far more complicated than a microchip and far less predictable. Despite over a century of research effort, cells are still “black boxes” full of mysterious chemical mechanisms and machinery that we are just beginning to understand. The magnitude of the complexity of a single cell is truly overwhelming. Even a relatively simple genetic system, such as that of a bacterial virus, can be so complex as to be beyond any supercomputer’s capacity to model. It’s not hard to understand why: imagine a tiny “bag of chemicals” with a menagerie of millions of molecules shoving around inside it like concertgoers at a rave, each going about highly specific tasks, together maintaining the delicate balance of life. Since we cannot model such a machine, we can rarely predict what removing a gear from the mechanism, or adding one to it, will do. But this is how we study life: by breaking or introducing gears in the machines and observing how they behave. Yet these approaches are crude and often fail, with the reasons why lost in the noise of millions of molecules.
Genome sequencing promised to shine light into the “black boxes” of life. It was hoped that a researcher would be able to read the genetic code like an engineer reads a blueprint. This hope has proven naïve. Even with an annotated genetic code in hand, it is often impossible to predict which gene is expressed when, why, and what it does. Each year, science peels away a layer of complexity in how the cell controls its myriad functions, usually revealing even more complexity beneath it. Companies that bet heavily on genomics revealing the intricacies of disease, such as the recently bankrupt DeCode Genetics of Iceland, are learning the hard way that life and its diseases are far more complex than anyone thought. For grad students like myself who work in the life sciences, the complexity of life, for all the frustration it offers, nonetheless instills a deep sense of awe and respect for nature, as we realize that we are just beginning to understand it.