Should Biologists Be Guided by Beauty?

Lingulodinium polyedrum is a unicellular marine organism which belongs to the dinoflagellate group of algae. Its genome is among the largest found in any species on this planet, estimated to contain around 165 billion DNA base pairs – roughly fifty times the size of the human genome. Encased in magnificent polyhedral shells, these bioluminescent algae became important organisms for the study of biological rhythms. Each Lingulodinium polyedrum cell contains not one but at least two internal clocks which keep track of time by oscillating with a period of approximately 24 hours. Algae maintained in continuous light for weeks continue to emit a bluish-green glow at what they perceive as night-time and swim up to the water surface during day-time hours – despite the absence of any external time cues. When I began studying how nutrients affect the circadian rhythms of these algae as a student at the University of Munich, I marveled at the intricacy and beauty of these complex time-keeping mechanisms that had evolved over hundreds of millions of years.

Lingulodinium polyedrum (scanning electron micrograph) – Credit: FWC Fish and Wildlife Research Institute (via Flickr)

 

I was prompted to revisit the role of Beauty in biology while reading a masterpiece of scientific writing, “Dreams of a Final Theory” by the Nobel laureate Steven Weinberg, in which he describes how the search for Beauty has guided him and many of his fellow theoretical physicists toward an ultimate theory of the fundamental forces of nature. Weinberg explains that it is quite difficult to precisely define what constitutes Beauty in physics, but a physicist would nevertheless recognize it when she sees it.

Over the course of a quarter of a century, I have worked in a variety of biological fields, from those initial experiments in marine algae to how stem cells help build human blood vessels and how mitochondria in a cell fragment and reconnect as cells divide. Each project required its own set of research methods and techniques, and each came with its own failures and successes. But with each project, my sense of awe for the beauty of nature has grown. Evolution has endowed this planet with an amazing diversity of life-forms and biological mechanisms, allowing organisms to cope with the unique challenges they face in their respective habitats. But only recently did I become aware that my sense of biological beauty was a post hoc phenomenon: Beauty was what I perceived after reviewing the experimental findings; I was not guided by a quest for beauty while designing experiments. In fact, I would have been worried that such an approach might bias the design and interpretation of experiments. Might a desire to see Beauty in cell biology lead one to consciously or subconsciously discard results that seem too messy?

One key characteristic of a beautiful scientific theory, according to Weinberg, is the simplicity of the underlying concepts. Einstein’s theory of gravitation is described in fourteen equations whereas Newton’s can be expressed in three. Despite this appearance of greater complexity, Weinberg finds Einstein’s theory more beautiful than Newton’s because the Einsteinian approach rests on one elegant central principle – the equivalence of gravitation and inertia. Weinberg’s second characteristic of beautiful scientific theories is their inevitability: every major aspect of the theory seems so perfect that it cannot be tweaked or improved on. Any attempt to significantly modify Einstein’s theory of general relativity would undermine its fundamental concepts, just as any attempt to move around parts of Raphael’s Holy Family would weaken the whole painting.

Can similar principles be applied to biology? I realized that when I give examples of beauty in biology, I focus on the complexity and diversity of life, not its simplicity or inevitability. Perhaps this is because Weinberg was describing the search for the fundamental laws of physics, laws which would explain the basis of all matter and energy – our universe. As cell biologists, we work several orders of magnitude removed from these fundamental laws. Our building blocks are organic molecules such as proteins and sugars. We find little evidence of inevitability in the molecular pathways we study – cells have an extraordinary ability to adapt. Mutations in genes or derangements in molecular signaling can often be compensated for by alternative cellular pathways.

This also points to a fundamental difference in our approaches to the world. Physicists searching for the fundamental laws of nature balance the development of overarching theories with experimentation, whereas biology in its current form has become a primarily experimental discipline. The latest technological developments in DNA and RNA sequencing, genome editing, optogenetics and high-resolution imaging are allowing us to amass unimaginable quantities of experimental data. In fact, the development of technologies often drives the design of experiments. The availability of a genetically engineered mouse model that allows us to track the fate of individual cells expressing fluorescent proteins, for example, will give rise to numerous experiments studying cell fate in various disease models and organs. Much of current biomedical research funding focuses on studying organisms that offer technical convenience, such as genetically engineered mice, or that fulfill a societal goal, such as curing human disease.

Uncovering fundamental concepts in biology requires comparative studies across the breadth of the living world and substantial investments in research involving a plethora of other species. In 1990, the National Institutes of Health (NIH – the primary government funding source for biomedical research in the United States) designated a handful of species as model organisms to study human disease, including mice, rats, zebrafish and fruit flies. A recent analysis of the species studied in scientific publications showed that in 1960, roughly half the papers studied what would subsequently be classified as model organisms, whereas the other half studied additional species. By 2010, over 80% of scientific papers were being published on model organisms and only 20% were devoted to other species, marking a significant dwindling of broader research goals in biology. More importantly, even among the model organisms there has been a clear culling of research priorities, with a disproportionately large growth in funding and publications for studies using mice. Thousands of scientific papers are published every month on cell signaling pathways and molecular biology in mouse and human cells, whereas only a minuscule fraction of research resources is devoted to studying signaling pathways in algae.

The question of whether or not biologists should be guided by conceptual Beauty leads us to the even more pressing question of whether we need to broaden biological research. If we want to mirror the dizzying success of fundamental physics during the past century and similarly advance fundamental biology, then we need to substantially step up investments in fundamental biological research that is not constrained by medical goals.

 

References

Dietrich, M. R., Ankeny, R. A., & Chen, P. M. (2014). Publication trends in model organism research. Genetics, 198(3), 787–794. doi:10.1534/genetics.114.169714

Weinberg, S. (1992). Dreams of a final theory. Vintage.

Note: An earlier version of this article was first published on the 3Quarksdaily blog.


Murder Your Darling Hypotheses But Do Not Bury Them

“Whenever you feel an impulse to perpetrate a piece of exceptionally fine writing, obey it—whole-heartedly—and delete it before sending your manuscript to press. Murder your darlings.”

Sir Arthur Quiller-Couch (1863–1944). On the Art of Writing. 1916

 

Murder your darlings. The British writer Sir Arthur Quiller-Couch shared this piece of writerly wisdom in his inaugural lecture series at Cambridge, asking writers to consider deleting words, phrases or even paragraphs that are especially dear to them. The minute writers fall in love with what they write, they are bound to lose their objectivity and may no longer be able to judge how their choice of words will be perceived by the reader. But writers aren’t the only ones who can fall prey to the Pygmalion syndrome. Scientists often find themselves in a similar situation when they develop “pet” or “darling” hypotheses.

Hypothesis via Shutterstock

How do scientists decide when it is time to murder their darling hypotheses? The simple answer is that scientists ought to give up a hypothesis once the experimental data is unable to support it, no matter how “darling” it is. The problem, however, is that scientific hypotheses aren’t generated on subjective whims alone. A scientific hypothesis is usually put forward after analyzing substantial amounts of experimental data, and the better a hypothesis is at explaining the existing data, the more “darling” it becomes. Scientists are therefore reluctant to discard a hypothesis because of just one piece of experimental data that contradicts it.

In addition to experimental data, a number of other factors can play a major role in determining whether scientists will discard or uphold their darling hypotheses. Some scientific careers are built on specific hypotheses which set certain scientists apart from competing rival groups. Research grants, which are essential to the survival of a scientific laboratory because they provide salary funds for senior researchers as well as junior trainees and research staff, are written in a hypothesis-focused manner, outlining experiments that will lead to the acceptance or rejection of selected hypotheses. Well-written research grants always consider the possibility that the core hypothesis may be rejected based on future experimental data. But if the hypothesis has to be rejected, then the scientist has to explain the discrepancies between the preferred hypothesis, now falling into disrepute, and all the preliminary data that had led her to formulate it. Such discrepancies could endanger the renewal of the grant funding and the future of the laboratory. Last but not least, it is very difficult to publish a scholarly paper describing a rejected hypothesis without providing an in-depth mechanistic explanation for why the hypothesis was wrong and proposing alternative hypotheses.

For example, it is quite reasonable for a cell biologist to hypothesize that protein A improves the survival of neurons by activating pathway X, based on prior studies showing that protein A is an activator of pathway X in neurons and other studies showing that pathway X improves cell survival in skin cells. If the data supports the hypothesis, publishing the result is fairly straightforward because it conforms to general expectations. However, if the data does not support the hypothesis, then the scientist has to explain why. Is it because protein A did not activate pathway X in her experiments? Is it because pathway X functions differently in neurons than in skin cells? Is it because neurons and skin cells have different thresholds for survival? Experimental results that do not conform to the predictions have the potential to uncover exciting new scientific mechanisms, but chasing down these alternate explanations requires a lot of time and resources, which are becoming increasingly scarce. It therefore shouldn’t come as a surprise that some scientists may consciously or subconsciously ignore selected pieces of experimental data which contradict their darling hypotheses.

Let us move from these hypothetical situations to the real world of laboratories. There is surprisingly little data on how and when scientists reject hypotheses, but John Fugelsang and Kevin Dunbar at Dartmouth conducted a rather unusual study in 2004, “Theory and data interactions of the scientific mind: Evidence from the molecular and the cognitive laboratory”, in which they researched researchers. They sat in on the laboratory meetings of three renowned molecular biology laboratories and carefully recorded how scientists presented their data and how they handled results which contradicted the predictions of their hypotheses and models.

In their final analysis, Fugelsang and Dunbar included 417 scientific results that were presented at the meetings, of which roughly half (223 out of 417) were not consistent with the predictions. Only 12% of these inconsistencies led to a change of the scientific model (and thus a revision of hypotheses). In the vast majority of cases, the laboratories decided to follow up by repeating and modifying the experimental protocols, assuming that the fault lay not with the hypotheses but with the manner in which the experiments had been conducted. In the follow-up experiments, 84 of the inconsistent findings were replicated, and this in turn resulted in a gradual modification of the underlying models and hypotheses in the majority of cases. Even then, only 61% of the models were revised, which means that 39% of the replicated inconsistencies did not lead to any significant changes.

The study did not provide much information on the long-term fate of the hypotheses and models, and we obviously cannot generalize from the laboratory meetings of three molecular biology groups at one university to the whole scientific enterprise. Fugelsang and Dunbar’s study also did not have a large enough sample size to clearly identify why some scientists were willing to revise their models and others were not. Was it because of the varying complexity of the experiments and models? Was it because of the approach of the individuals who conducted the experiments, or of the laboratory heads? I wish there were more studies like this one, because learning how different scientists handle inconsistent results would help us understand the scientific process better and perhaps improve the quality of scientific research.

In my own experience, I have also struggled with results which defied my scientific hypotheses. In 2002, we found that stem cells in human fat tissue could help grow new blood vessels. Yes, you could obtain fat from a liposuction performed by a plastic surgeon, isolate the stem cells it contained and inject these fat-derived stem cells into animal models of low blood flow in the legs. Within a week or two, the injected cells helped restore the blood flow to near-normal levels! The simplest hypothesis was that the stem cells converted into endothelial cells, the cell type which forms the lining of blood vessels. However, after several months of experiments, I found no consistent evidence of fat-derived stem cells transforming into endothelial cells. We ended up publishing a paper which proposed an alternative explanation: the stem cells were releasing growth factors that helped grow blood vessels. But this explanation was not as satisfying as I had hoped, because it did not account for the fact that the stem cells had aligned themselves alongside blood vessel structures and behaved like blood vessel cells.

Even though I “murdered” my darling hypothesis of fat-derived stem cells converting into blood vessel endothelial cells at the time, I did not “bury” it. It kept simmering in the back of my mind until roughly a decade later, when we were again studying how stem cells improve blood vessel growth. The difference was that this time, I had access to a live-imaging confocal laser microscope which allowed us to take images of cells labeled with red and green fluorescent dyes over long periods of time. Below, you can see a video of human bone marrow mesenchymal stem cells (labeled green) and human endothelial cells (labeled red) observed with the microscope overnight. The short movie compresses the images obtained throughout the night and shows that the stem cells indeed do not convert into endothelial cells. Instead, they form a scaffold that guides the endothelial cells, allowing them to move alongside it and construct their network. This work was published in 2013 in the Journal of Molecular and Cellular Cardiology, roughly a decade after I had been forced to give up on the initial hypothesis. Back in 2002, I had assumed that the stem cells were turning into blood vessel endothelial cells because they aligned themselves in blood-vessel-like structures. I had never considered the possibility that they were a scaffold for the endothelial cells.

This and other similar experiences have led me to reformulate the “murder your darlings” commandment to “murder your darling hypotheses but do not bury them”. Instead of repeatedly trying to defend scientific hypotheses that cannot be supported by emerging experimental data, it is better to give up on them. But this does not mean that we should forget and bury those initial hypotheses. With newer technologies, resources or collaborations that were not previously available to us, we may find ways to explain inconsistent results years later. This is why I regularly peruse the cemetery of dead hypotheses on my hard drive, to see whether there are ways of resurrecting them, not in their original form but in a modified form that I am now able to test.

 

Reference:


Fugelsang, J., Stein, C., Green, A., & Dunbar, K. (2004). Theory and data interactions of the scientific mind: Evidence from the molecular and the cognitive laboratory. Canadian Journal of Experimental Psychology, 58(2), 86–95. doi:10.1037/h0085799

 

Note: An earlier version of this article first appeared on 3Quarksdaily.

Inspired By Snake Venom

When I remember the 80s, I think of Nena’s 99 Luftballons, Duran Duran’s Wild Boys and … snake venom. Back in those days, I was a typical high school science nerd. My science nerdiness interfered with my ability to socialize with non-nerds, and it manifested itself in an unnecessary desire to read science books and articles that I did not really understand, just so that I could show off with some fancy science terminology. I did not have much of an audience to impress, because my classmates usually ignored me. My high school biology teacher, Herr Sperr, was the only one who had the patience to listen to me. One of the science books that I purchased was called “Gehirn und Nervensystem” (“Brain and Nervous System”), published by Spektrum der Wissenschaft, the German publisher of Scientific American. It was a collection of Scientific American articles in the field of neuroscience that had been translated into German. I was thumbing through it, looking for some new neurobiology idea or expression that I could use to impress Herr Sperr, when I came across the article “Der Nervenwachstumsfaktor” (originally published in Scientific American in 1979 as “The Nerve-Growth Factor”) by Rita Levi-Montalcini and Pietro Calissano.

The article piqued my curiosity, because I had not realized that nerves had “growth factors” and because one of the authors, Rita Levi-Montalcini, had won the Nobel Prize just the preceding year. I started reading the article and loved it, reading it over and over again. I liked it so much that I did not even try to show off about it and kept the newly discovered inspiration to myself. There are many reasons why I loved the article; I will mention just two of them:

1. Scientific discovery is an exciting journey, starting and ending with unanswered questions

Levi-Montalcini and Calissano started off by describing the state of knowledge and the unanswered questions in the field of developmental neurobiology and neuronal differentiation in the 1940s, when Levi-Montalcini was about to launch her career as a scientist. They commented on how the simple yet brilliant idea of testing whether tumors could influence the growth of nerves sparked a whole new field of investigation. They narrated a beautiful story of scientific discovery, from postulating a “nerve growth factor” to actually isolating and sequencing it. Despite all the advances that Levi-Montalcini and her colleagues had made, the article ended with a new mystery: the role of the nerve growth factor appeared to be much bigger than the researchers had suspected, since it was able to act on cells that were not neurons, and it was unclear why this was the case. By hinting at these yet-to-be-defined roles, the article made it clear that much more work was needed, and I felt that an invitation was being extended to the reader to participate in the future discoveries.

2. Scientific tools can harbor surprises and important clues

The article mentioned one important coincidence that helped shape the discovery of the nerve growth factor’s sequence. To assess whether the putative nerve growth factor contained nucleic acids, Levi-Montalcini and her colleagues exposed the “soup” that was inducing the growth of nerves to snake venom. The rationale was that snake venom (by the way, the German expression “Schlangengift” sounds even more impressive than the English “snake venom”) would degrade nucleic acids; if the growth-enhancing properties disappeared, it would mean that the nerve-growth-inducing factor contained nucleic acids. It turned out that the snake venom unexpectedly magnified the nerve-growth-enhancing effects, because the venom contained large quantities of the nerve growth factor itself. This unexpected finding made it much easier for the researchers to sequence the nerve growth factor, because the snake venom provided access to a large source of it, and it raised a new mystery: Why would snake venom contain a nerve growth factor?

In the subsequent decades, as I embarked on my own career as a scientist, I often thought about this article that I had read back in high school. It inspired me to become a cell biologist, and many of the projects in my laboratory today focus on the effects of growth factors on blood vessels and stem cells. The article also made me think about the importance of continuously re-evaluating the tools that we use. Sometimes our tools are not as neutral or straightforward as we think, and this lesson is just as valid today as it was half a century ago. For example, a recent paper in Cell found that the virus used for reprogramming adult cells into stem cells is not merely a tool that allows entry of the reprogramming factors, as was previously thought. The viral tool can actually activate the stem cell reprogramming itself, reminiscent of how the snake venom “tool” was able to induce nerve growth.

Rita Levi-Montalcini, one of the world’s greatest biologists, passed away on December 30, 2012. In addition to her outstanding scientific work, she was a shining example of an activist scientist with a conscience, one who fought for education and research. I never had the opportunity to meet her in person, but I was inspired by her work and will always see her as a role model.

Image credit: Cover of the book “Gehirn und Nervensystem” by Spektrum der Wissenschaft

Is the Analysis of Gene Expression Based on an Erroneous Assumption?

The MIT-based researcher Rick Young is one of the world’s top molecular biologists. His laboratory at the Whitehead Institute for Biomedical Research has helped define many of the key principles of how gene expression is regulated, especially in stem cells and cancer cells. At a symposium organized by the International Society for Stem Cell Research (ISSCR), Rick presented some very provocative data today, which is bound to spark controversial discussions about how researchers should assess gene expression.

Ptolemy’s world map from Harmonia Macrocosmica

It has become very common for molecular biology laboratories to use global gene expression analyses to understand the molecular signature of a cell. These global analyses can measure the expression of thousands of genes in a single experiment. By comparing the gene expression profiles of different groups of cells, such as cancer cells and their healthy counterparts, researchers have uncovered many important new genes and new roles for known genes. The Gene Expression Omnibus is a public repository for the huge amount of molecular information that is generated; so far, more than 800,000 samples have been analyzed, covering gene expression in a vast array of organisms and disease states.

Rick himself has extensively used such expression analyses to characterize cancer cells and stem cells, but at the ISSCR symposium he showed that most of these analyses rest on the erroneous assumption that the total RNA content of cells remains constant. When the gene expression of cancer cells is compared to that of healthy non-cancer cells, the analysis is routinely performed by normalizing or standardizing the RNA content: the same amount of RNA is obtained from cancer cells and non-cancer cells, and the global analyses detect relative differences in gene expression. However, a problem arises when one cell type generates far more RNA than the cell type it is being compared to.

In a paper published today in the journal Cell entitled “Revisiting Global Gene Expression Analysis”, Rick Young and his colleagues discuss their recent discovery that the cancer-linked gene regulator c-Myc increases total gene expression two- to three-fold. Cells expressing the c-Myc gene therefore contain far more total RNA than cells that do not express it, which means that most genes will be expressed at substantially higher levels in the c-Myc cells. However, a traditional gene expression analysis comparing c-Myc cells with cells lacking c-Myc would “control” for these differences by using the same amount of RNA for both cell types. This traditional standardization makes a lot of sense; after all, how could one compare the gene expression profiles of two samples after loading different amounts of RNA? The problem with this common-sense standardization is that it misses global shifts in gene expression, such as those initiated by potent regulators like c-Myc. According to Rick Young, one answer to the problem is to include an additional control by “spiking” the samples with defined amounts of known RNA. This additional control would allow us to analyze whether there is also an absolute change in gene expression, in addition to the relative changes that current gene analyses can detect.
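To make the normalization problem concrete, here is a minimal sketch in Python. The numbers are entirely made up for illustration (a hypothetical five-gene cell and a 2.5-fold global amplification; this is my own toy example, not data or code from the paper). It shows how scaling both samples to equal total RNA hides a global amplification, while rescaling by a spike-in control recovers it.

```python
import numpy as np

# Hypothetical per-cell transcript counts for five genes in a control cell.
control_rna = np.array([100.0, 200.0, 50.0, 400.0, 250.0])

# A c-Myc-like regulator amplifies the expression of every gene ~2.5-fold.
myc_rna = 2.5 * control_rna

# The same absolute amount of foreign spike-in RNA is added to each sample.
spike = 100.0

def measure(sample_rna, spike_rna, depth=1_000_000):
    """Simulate a sequencing readout: we only observe read proportions."""
    pool = np.append(sample_rna, spike_rna)  # genes first, spike-in last
    return depth * pool / pool.sum()

control_reads = measure(control_rna, spike)
myc_reads = measure(myc_rna, spike)

# Traditional analysis: express each gene as a fraction of the sample's
# total gene reads (the equal-RNA assumption). The 2.5-fold global shift
# cancels out and the two samples look identical.
control_rel = control_reads[:-1] / control_reads[:-1].sum()
myc_rel = myc_reads[:-1] / myc_reads[:-1].sum()
print(np.allclose(control_rel, myc_rel))  # True: the global shift is invisible

# Spike-in analysis: rescale each sample by its spike-in signal instead.
# Because the spike-in reflects a fixed amount per cell, the ratio now
# reveals the absolute, genome-wide amplification.
control_abs = control_reads[:-1] / control_reads[-1]
myc_abs = myc_reads[:-1] / myc_reads[-1]
print(myc_abs / control_abs)  # ~2.5 for every gene
```

The key design point is that the spike-in is added in a fixed amount per cell before the samples are processed, so it serves as an absolute yardstick that equal-RNA loading cannot provide.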

In some ways this seems like a minor technical point, but I think it actually points to a very central problem in how we perform gene expression analysis, as well as many other assays in cell and molecular biology. One is easily tempted to use exciting large-scale analyses to study the genome, epigenome, proteome or phenome of cells. These high-tech analyses generate mountains of data, and we spend an inordinate amount of time trying to make sense of them. However, we sometimes forget to question the very basic assumptions that we have made. My mentor Till Roenneberg taught me how important it is to use the right controls in every experiment. The key word here is “right”, because merely including controls without thinking about their appropriateness is not sufficient. I think Rick Young’s work is an important reminder for all of us to continuously re-evaluate the assumptions we make, because such re-evaluation is a prerequisite for good research practice.
