Murder Your Darling Hypotheses But Do Not Bury Them

“Whenever you feel an impulse to perpetrate a piece of exceptionally fine writing, obey it—whole-heartedly—and delete it before sending your manuscript to press. Murder your darlings.”

— Sir Arthur Quiller-Couch (1863–1944), On the Art of Writing (1916)


Murder your darlings. The British writer Sir Arthur Quiller-Couch shared this piece of writerly wisdom in his inaugural lecture series at Cambridge, asking writers to consider deleting words, phrases or even paragraphs that are especially dear to them. The minute writers fall in love with what they write, they are bound to lose their objectivity and may not be able to judge how their choice of words will be perceived by the reader. But writers aren’t the only ones who can fall prey to the Pygmalion syndrome. Scientists often find themselves in a similar situation when they develop “pet” or “darling” hypotheses.

Hypothesis via Shutterstock

How do scientists decide when it is time to murder their darling hypotheses? The simple answer is that scientists ought to give up scientific hypotheses once the experimental data is unable to support them, no matter how “darling” they are. However, the problem with scientific hypotheses is that they aren’t just generated based on subjective whims. A scientific hypothesis is usually put forward after analyzing substantial amounts of experimental data. The better a hypothesis is at explaining the existing data, the more “darling” it becomes. Therefore, scientists are reluctant to discard a hypothesis because of just one piece of experimental data that contradicts it.

Beyond the experimental data itself, a number of other factors play a major role in determining whether scientists will discard or uphold their darling hypotheses. Some scientific careers are built on specific hypotheses which set certain scientists apart from competing rival groups. Research grants, which are essential to the survival of a scientific laboratory because they provide salary funds for the senior researchers as well as the junior trainees and research staff, are written in a hypothesis-focused manner, outlining experiments that will lead to the acceptance or rejection of selected hypotheses. Well-written research grants always consider the possibility that the core hypothesis may be rejected by future experimental data. But if the hypothesis has to be rejected, then the scientist has to explain the discrepancies between the preferred hypothesis, now falling into disrepute, and all the preliminary data that had led her to formulate it in the first place. Such discrepancies could endanger the renewal of the grant funding and the future of the laboratory. Last but not least, it is very difficult to publish a scholarly paper describing a rejected hypothesis without providing an in-depth mechanistic explanation for why the hypothesis was wrong and proposing alternate hypotheses.

For example, it is quite reasonable for a cell biologist to formulate the hypothesis that protein A improves the survival of neurons by activating pathway X, based on prior studies which have shown that protein A is an activator of pathway X in neurons and other studies which show that pathway X improves cell survival in skin cells. If the data supports the hypothesis, publishing the result is fairly straightforward because it conforms to the general expectations. However, if the data does not support the hypothesis, then the scientist has to explain why. Is it because protein A did not activate pathway X in her experiments? Is it because pathway X functions differently in neurons than in skin cells? Is it because neurons and skin cells have a different threshold for survival? Experimental results that do not conform to the predictions have the potential to uncover exciting new scientific mechanisms, but chasing down these alternate explanations requires a lot of time and resources, which are becoming increasingly scarce. Therefore, it shouldn’t come as a surprise that some scientists may consciously or subconsciously ignore selected pieces of experimental data which contradict their darling hypotheses.

Let us move from these hypothetical situations to the real world of laboratories. There is surprisingly little data on how and when scientists reject hypotheses, but Jonathan Fugelsang and Kevin Dunbar at Dartmouth conducted a rather unique study, “Theory and Data Interactions of the Scientific Mind: Evidence From the Molecular and the Cognitive Laboratory” (2004), in which they researched researchers. They sat in on the laboratory meetings of three renowned molecular biology laboratories and carefully recorded how scientists presented their data and how they handled results which contradicted the predictions of their hypotheses and models.

In their final analysis, Fugelsang and Dunbar included 417 scientific results that were presented at the meetings, of which roughly half (223 out of 417) were not consistent with the predictions. Only 12% of these inconsistencies led to a change of the scientific model (and thus a revision of hypotheses). In the vast majority of cases, the laboratories decided to follow up by repeating and modifying the experimental protocols, reasoning that the fault lay not with the hypotheses but with the manner in which the experiments had been conducted. In the follow-up experiments, 84 of the inconsistent findings could be replicated, and this in turn resulted in a gradual modification of the underlying models and hypotheses in the majority of cases. However, even when the inconsistent results were replicated, only 61% of the models were revised, which means that 39% of the cases did not lead to any significant changes.
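The percentages quoted above can be turned into approximate counts with some quick arithmetic. The short sketch below simply recomputes them from the figures given in the text; the paper itself may report the exact breakdown slightly differently.

```python
# Back-of-the-envelope check of the Fugelsang & Dunbar (2004) numbers
# as summarized above. All figures come from the text of this post.

total_results = 417
inconsistent = 223  # results that did not match the predictions

# ~12% of inconsistent results immediately prompted a model revision
immediate_revisions = round(0.12 * inconsistent)

replicated = 84  # inconsistent findings that replicated on follow-up
# 61% of replicated inconsistencies eventually led to a revised model
revised_after_replication = round(0.61 * replicated)

print(f"Inconsistent: {inconsistent}/{total_results} "
      f"({inconsistent / total_results:.0%})")
print(f"Immediate model revisions: ~{immediate_revisions}")
print(f"Revisions after replication: ~{revised_after_replication} "
      f"of {replicated}")
```

In other words, of 223 results that contradicted predictions, only about two dozen changed a model right away, and even replication of the anomaly left roughly two fifths of the models untouched.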

The study did not provide much information on the long-term fate of the hypotheses and models and we obviously cannot generalize the results of three molecular biology laboratory meetings at one university to the whole scientific enterprise. Also, Fugelsang and Dunbar’s study did not have a large enough sample size to clearly identify the reasons why some scientists were willing to revise their models and others weren’t. Was it because of varying complexity of experiments and models? Was it because of the approach of the individuals who conducted the experiments or the laboratory heads? I wish there were more studies like this because it would help us understand the scientific process better and maybe improve the quality of scientific research if we learned how different scientists handle inconsistent results.

In my own experience, I have also struggled with results which defied my scientific hypotheses. In 2002, we found that stem cells in human fat tissue could help grow new blood vessels. Yes, you could obtain fat from a liposuction performed by a plastic surgeon and inject these fat-derived stem cells into animal models of low blood flow in the legs. Within a week or two, the injected cells helped restore the blood flow to near normal levels! The simplest hypothesis was that the stem cells converted into endothelial cells, the cell type which forms the lining of blood vessels. However, after several months of experiments, I found no consistent evidence of fat-derived stem cells transforming into endothelial cells. We ended up publishing a paper which proposed an alternative explanation that the stem cells were releasing growth factors that helped grow blood vessels. But this explanation was not as satisfying as I had hoped. It did not account for the fact that the stem cells had aligned themselves alongside blood vessel structures and behaved like blood vessel cells.

Even though I “murdered” my darling hypothesis of fat-derived stem cells converting into blood vessel endothelial cells at the time, I did not “bury” the hypothesis. It kept simmering in the back of my mind until roughly one decade later, when we were again studying how stem cells improve blood vessel growth. The difference was that this time, I had access to a live-imaging confocal laser microscope which allowed us to take images of cells labeled with red and green fluorescent dyes over long periods of time. Below, you can see a video of human bone marrow mesenchymal stem cells (labeled green) and human endothelial cells (labeled red) observed with the microscope overnight. The short movie compresses images obtained throughout the night and shows that the stem cells indeed do not convert into endothelial cells. Instead, they form a scaffold and guide the endothelial cells (red), allowing them to move alongside the green scaffold and thus construct their network. This work was published in 2013 in the Journal of Molecular and Cellular Cardiology, roughly a decade after I had been forced to give up on the initial hypothesis. Back in 2002, I had assumed that the stem cells were turning into blood vessel endothelial cells because they aligned themselves in blood-vessel-like structures. I had never considered the possibility that they were a scaffold for the endothelial cells.

This and other similar experiences have led me to reformulate the “murder your darlings” commandment to “murder your darling hypotheses but do not bury them”. Instead of repeatedly trying to defend scientific hypotheses that cannot be supported by emerging experimental data, it is better to give up on them. But this does not mean that we should forget and bury those initial hypotheses. With newer technologies, resources or collaborations, we may years later find ways to explain inconsistent results that eluded us at the time. This is why I regularly peruse the cemetery of dead hypotheses on my hard drive to see if there are ways of resurrecting them, not in their original form but in a modification that I am now able to test.



Fugelsang, J., Stein, C., Green, A., & Dunbar, K. (2004). Theory and Data Interactions of the Scientific Mind: Evidence From the Molecular and the Cognitive Laboratory. Canadian Journal of Experimental Psychology/Revue canadienne de psychologie expérimentale, 58 (2), 86-95 DOI: 10.1037/h0085799


Note: An earlier version of this article first appeared on 3Quarksdaily.

Builders and Blocks – Engineering Blood Vessels with Stem Cells

Back in 2001, when we first began studying how regenerative cells (stem cells or more mature progenitor cells) enhance blood vessel growth, our group as well as many of our colleagues focused on one specific type of blood vessel: arteries. Arteries are responsible for supplying oxygen to all organs and tissues of the body, and they are more likely to develop gradual plaque build-up (atherosclerosis) than veins or networks of smaller blood vessels (capillaries). Once the amount of plaque in an artery reaches a critical threshold, the oxygenation of the supplied tissues and organs becomes compromised. In addition to this build-up of plaque and gradual decline of organ function, arterial plaques can rupture and cause severe sudden damage such as a heart attack. The conventional approach to treating arterial blockages in the heart was to either perform an open-heart bypass surgery in which blocked arteries were manually bypassed or to place a tube-like “stent” in the blocked artery to restore the oxygen supply. The hope was that injections of regenerative cells would ultimately replace the invasive procedures because the stem cells would convert into blood vessel cells, form healthy new arteries and naturally bypass the blockages in the existing arteries.


Image of mouse red blood cells flowing through an engineered human blood vessel implanted in a mouse – Image from Paul and colleagues (2013), Creative Commons license.

As is often the case in biomedical research, this initial approach turned out to be fraught with difficulties. The early animal studies were quite promising and the injected cells appeared to stimulate the growth of blood vessels, but the first clinical trials were less successful. It was very difficult to retain the injected cells in the desired arteries or tissues, and even harder to track the fate of the cells. Which stem cells should be injected? Where should they be injected? How many? Can one obtain enough stem cells from an individual patient so that one could use his or her own cells for the cell therapy? How does one guide the injected cells to the correct location, and then guide the cells to form functional blood vessel structures? Would the stem cells of a patient with chronic diseases such as diabetes or high blood pressure be suitable for therapies, or would such a patient have to rely on stem cells from healthier individuals and thus risk the complication of immune rejection?

The complexity of blood-vessel generation became increasingly apparent, both when studying the biology of stem cells and when designing and conducting clinical trials. A large clinical study published in 2013 examined the impact of bone marrow cell injections in heart attack patients and concluded that these injections did not result in any sustained benefit for heart function. Other studies using injections of patients’ own stem cells into their hearts had led to mild improvements in heart function, but none of these clinical studies came close to fulfilling the expectations of cardiovascular patients, physicians and researchers. The upside to these failed expectations was that they forced researchers in the field of cardiovascular regeneration to rethink their goals and approaches.

One major shift in my own field of interest – the generation of new blood vessels – was to reevaluate the validity of relying on injections of cells. How likely was it that millions of injected cells could organize themselves into functional blood vessels? Injections of cells were convenient for patients because they did not require the surgical implantation of blood vessels, but was this attempt to achieve a convenient therapy undermining its success? An increasing number of laboratories began studying the engineering of blood vessels in the lab by investigating the molecular cues which regulate the assembly of blood vessel networks, identifying molecular scaffolds which would retain stem cells and blood vessel cells, and combining various regenerative cell types to build functional blood vessels. This second wave of regenerative vascular medicine is engineering blood vessels which will have to be surgically implanted into patients. This means that it will be much harder to obtain approval for such invasive implantations than for the straightforward injections conducted in the first wave of studies, but most of us who have now moved towards a blood vessel engineering approach feel that there is a greater likelihood of long-term success, even if it may take a decade or longer until we obtain our first definitive clinical results.

The second conceptual shift which has occurred in this field is the realization that blood vessel engineering is not only important for treating patients with blockages in their arteries. In fact, blood vessel engineering is critical for all forms of tissue and organ engineering. In the US, more than 120,000 people are awaiting an organ transplant but only a quarter of them will receive an organ in any given year. The number of people in need of a transplant will continue to grow but the supply of organs is limited and many patients will unfortunately die while waiting for an organ which they desperately need. The advances in stem cell biology have made it possible to envision creating organs or organoids (functional smaller parts of an organ) which could help alleviate the need for organs. One thing that most organs and tissues need is a network of tiny blood vessels that permeate the whole tissue: small capillary networks. For example, a liver built out of liver cells could never function without a network of tiny blood vessels which supply the liver cells with metabolites and oxygen. From an organ engineering point of view, microvessel engineering is just as important as the building of functional arteries.

In one of our recent projects, we engineered functional human blood vessels by combining bone marrow derived stem cells with endothelial cells (the cells which coat the inside of all blood vessels). It turns out that stem cells do not become endothelial cells but instead release a molecular signal – the protein SLIT3 – which instructs the endothelial cells to assemble into networks. Using a high resolution microscope, we watched this process in real-time over a course of 72 hours in the laboratory and could observe how the endothelial cells began lining up into tube-like structures in the presence of the bone marrow stem cells. The human endothelial cells were like building blocks, and the human bone marrow stem cells were the builders “overseeing” the construction. When we implanted the assembled blood vessel structures into mice, we could see that they were fully functional, allowing mouse blood to travel through them without leaking or causing any other major problems (see image, taken from reference 3).

I am sure that SLIT3 is just one of many molecular cues released by the stem cells to assemble functional networks, and there are many additional mechanisms which still need to be discovered. We still need to learn much more about which “builders” and which “building blocks” are best suited for each type of blood vessel that we want to construct. The fact that human fat tissue can serve as an important resource for obtaining adult stem cells (“builders”) is quite encouraging, but we still know very little about the overall longevity of the engineered vessels, the best way to implant them into patients, and the key molecular and biomechanical mechanisms which will be required to engineer organs with functional blood vessels. It will be quite some time until the first fully engineered organs are implanted in humans, but the dizzying rate of progress suggests that we can be quite optimistic.

(An earlier version of this article was first published on


References and links (all of them are open access, so you can read them for free, including the original paper we published):

1. A recent longform overview article I wrote for “The Scientist” in which I describe the importance of blood vessel engineering for organ engineering:

J Rehman “Building Flesh and Blood“, The Scientist (2014), 28(5):48-53


2. An unusual and abundant source of adult stem cells which promote the formation of blood vessels: Fat tissue obtained from individuals who undergo a liposuction! 

J Rehman “The Power of Fat” Aeon Magazine (2014)


3. The study which describes how adult stem cells release a protein (SLIT3) which organizes blood vessel cells into functional networks (open access – can be read free of charge):

J.D. Paul et al., “SLIT3-ROBO4 activation promotes vascular network formation in human engineered tissue and angiogenesis in vivo” J Mol Cell Cardiol (2013), 64:124-31.


Fasting Improves Recovery of Bone Marrow Stem Cells after Chemotherapy

[Note: This is a guest post by Tauseef (@CellSpell)]

Fasting is defined as either completely abstaining from or minimizing food intake for a defined period of time – ranging from about 12 hours to even a few weeks. Calorie restriction, on the other hand, refers to an overall reduction in the daily calorie intake by about 20%-40% without necessarily reducing meal frequency. Although calorie restriction is well-suited for weight loss and thus also reduces the risk of chronic diseases such as diabetes or heart disease, proponents of fasting claim that it has distinct health benefits which cannot be attributed to weight loss.


Glass of water by Magalos via Shutterstock

Scientific data for the benefits of fasting have been rather limited, but some recent studies have now shown that fasting can enhance cellular resistance to toxins and increase longevity in laboratory animals as well as humans. Fasting has also been proposed as a therapeutic approach in the setting of selected diseases such as neurodegeneration, seizures and rheumatoid arthritis.

The recent study “Prolonged Fasting Reduces IGF-1/PKA to Promote Hematopoietic-Stem-Cell-Based Regeneration and Reverse Immunosuppression”, published in the journal Cell Stem Cell by Cheng and colleagues in 2014, investigated a novel benefit of fasting – enhancing recovery from the side effects of chemotherapy. The researchers examined whether fasting would help mice recover from chemotherapy-induced suppression of their blood cell production. Chemotherapy drugs suppress the growth of malignant cancer cells, but they unfortunately often also affect healthy stem cells and other growing cells needed to maintain our health. The bone marrow contains blood-forming hematopoietic stem cells (HSCs), which churn out billions of healthy blood cells, such as white blood cells (WBCs) and red blood cells (RBCs), every day. When these healthy stem cells are suppressed or even eliminated as a form of collateral damage during chemotherapy, patients can develop severe anemia or immune suppression.

In their study, Cheng and colleagues showed a strong protective role of multiple cycles of fasting (no food for 48 to 120 hours) in mice treated with the chemotherapy drug cyclophosphamide. The fasting was able to partially reverse the suppression of bone marrow stem cells, improve immune function and reduce the death rate of the mice. The researchers found a similar protective effect of fasting in cancer patients (no food for 72 hours) who were treated with anti-cancer drugs as part of a phase I clinical trial, although there was no control group and no details were provided about the overall fluid and calorie intake of the patients.

By utilizing a gene array to screen for the expression levels of thousands of genes, the researchers determined that the benefits of fasting were due to the reduction of insulin-like growth factor-1 (IGF-1) hormone levels in the bone marrow. Suppression of IGF-1 by fasting increased the expansion of bone marrow stem cells (HSCs) and improved the immune function during chemotherapy. Mice in which the IGF-1 gene was deleted showed a similar degree of protection as what was observed in fasting mice.

Although the present study provides interesting new insights into how fasting can improve bone marrow function during chemotherapy, some unanswered questions need to be addressed in future studies. People who are suffering from cancer routinely lose a substantial amount of weight during the progression of their disease, and it is not clear that their physical health would be able to tolerate the additional stress of fasting. Moreover, the researchers did not provide details about the calorie restriction and potential weight loss associated with the fasting. Perhaps the benefits in the mice were not due to fasting per se but instead due to calorie restriction. Furthermore, the patient study only showed that 72 hours of fasting increased lymphocyte counts, but did not describe the nutritional status or any potential weight loss in the patients.

This is one of the first studies to uncover the molecular mechanisms of how fasting can improve the recovery of bone marrow stem cell function after chemotherapy. Despite its limitations, the study also identified the IGF-1 pathway as a potential new target for treatments to enhance bone marrow stem cell recovery. The outcomes of chemotherapy might therefore be improved by pharmacologically suppressing IGF-1 without requiring fasting, but this idea would still need to be tested in humans.


– M. Tauseef (@CellSpell)

Cheng, C., Adams, G., Perin, L., Wei, M., Zhou, X., Lam, B., Da Sacco, S., Mirisola, M., Quinn, D., Dorff, T., Kopchick, J., & Longo, V. (2014). Prolonged Fasting Reduces IGF-1/PKA to Promote Hematopoietic-Stem-Cell-Based Regeneration and Reverse Immunosuppression Cell Stem Cell, 14 (6), 810-823 DOI: 10.1016/j.stem.2014.04.014

The Road to Bad Science Is Paved with Obedience and Secrecy

We often laud the intellectual diversity of a scientific research group because we hope that the multitude of opinions can help point out flaws and improve the quality of research long before it is finalized and written up as a manuscript. The recent events surrounding the research in one of the world’s most famous stem cell research laboratories at Harvard show us the disastrous effects of suppressing diverse and dissenting opinions.

Cultured cells via Shutterstock

The infamous “Orlic paper” was a landmark research article published in the prestigious scientific journal Nature in 2001, which showed that stem cells contained in the bone marrow could be converted into functional heart cells. After a heart attack, injections of bone marrow cells reversed much of the heart attack damage by creating new heart cells and restoring heart function. It was called the “Orlic paper” because the first author of the paper was Donald Orlic, but the lead investigator of the study was Piero Anversa, a professor and highly respected scientist at New York Medical College.

Anversa had established himself as one of the world’s leading experts on the survival and death of heart muscle cells in the 1980s and 1990s, but with the start of the new millennium, Anversa shifted his laboratory’s focus towards the emerging field of stem cell biology and its role in cardiovascular regeneration. The Orlic paper was just one of several highly influential stem cell papers to come out of Anversa’s lab at the onset of the new millennium. A 2002 Anversa paper in the New England Journal of Medicine – the world’s most highly cited academic journal – investigated the hearts of human organ transplant recipients. This study showed that up to 10% of the cells in the transplanted heart were derived from the recipient’s own body. The only conceivable explanation was that after a patient received another person’s heart, the recipient’s own cells began maintaining the health of the transplanted organ. The Orlic paper had shown the regenerative power of bone marrow cells in mouse hearts, but this new paper now offered the more tantalizing suggestion that even human hearts could be regenerated by circulating stem cells in their blood stream.

Woman having a heart attack via Shutterstock

A 2003 publication in Cell by the Anversa group described another ground-breaking discovery, identifying a reservoir of stem cells contained within the heart itself. This latest coup de force reported that the newly uncovered heart stem cell population resembled the bone marrow stem cells because both groups of cells bore the same stem cell protein, called c-kit, and both were able to make new heart muscle cells. According to Anversa, c-kit cells extracted from a heart could be re-injected back into a heart after a heart attack and regenerate more than half of the damaged heart!

These Anversa papers revolutionized cardiovascular research. Prior to 2001, most cardiovascular researchers believed that the cell turnover in the adult mammalian heart was minimal because soon after birth, heart cells stopped dividing. Some organs or tissues such as the skin contained stem cells which could divide and continuously give rise to new cells as needed. When skin is scraped during a fall from a bike, it only takes a few days for new skin cells to coat the area of injury and heal the wound. Unfortunately, the heart was not one of those self-regenerating organs. The number of heart cells was thought to be more or less fixed in adults. If heart cells were damaged by a heart attack, then the affected area was replaced by rigid scar tissue, not new heart muscle cells. If the area of damage was large, then the heart’s pump function was severely compromised and patients developed the chronic and ultimately fatal disease known as “heart failure”.

Anversa’s work challenged this dogma by putting forward a bold new theory: the adult heart was highly regenerative, and its regeneration was driven by c-kit stem cells, which could be isolated and used to treat injured hearts. All one had to do was harness the regenerative potential of c-kit cells in the bone marrow and the heart, and millions of patients all over the world suffering from heart failure might be cured. Not only did Anversa publish a slew of supportive papers in highly prestigious scientific journals to challenge the dogma of the quiescent heart, he also happened to publish them at a unique time in history which maximized their impact.

In the year 2001, there were few innovative treatments available to treat patients with heart failure. The standard approach was to use medications that would delay the progression of heart failure. But even the best medications could not prevent the gradual decline of heart function. Organ transplants were a cure, but transplantable hearts were rare and only a small fraction of heart failure patients would be fortunate enough to receive a new heart. Hopes for a definitive heart failure cure were buoyed when researchers isolated human embryonic stem cells in 1998. This discovery paved the way for using highly pliable embryonic stem cells to create new heart muscle cells, which might one day be used to restore the heart’s pump function without resorting to a heart transplant.


Human heart jigsaw puzzle via Shutterstock
Human heart jigsaw puzzle via Shutterstock

The dreams of using embryonic stem cells to regenerate human hearts were soon squashed when the Bush administration banned the generation of new human embryonic stem cells in 2001, citing ethical concerns. These federal regulations and the lobbying of religious and political groups against human embryonic stem cells were a major blow to research on cardiovascular regeneration. Amidst this looming hiatus in cardiovascular regeneration, Anversa’s papers appeared and showed that one could steer clear of the ethical controversies surrounding embryonic stem cells by using an adult patient’s own stem cells. The Anversa group re-energized the field of cardiovascular stem cell research and cleared the path for the first human stem cell treatments in heart disease.

Instead of having to wait for the US government to reverse its restrictive policy on human embryonic stem cells, one could now initiate clinical trials with adult stem cells, treating heart attack patients with their own cells and without having to worry about an ethical quagmire. Heart failure might soon become a disease of the past. The excitement at all major national and international cardiovascular conferences was palpable whenever the Anversa group, their collaborators or other scientists working on bone marrow and cardiac stem cells presented their dizzyingly successful results. Anversa received numerous accolades for his discoveries and research grants from the NIH (National Institutes of Health) to further develop his research program. He was so successful that some researchers believed Anversa might receive the Nobel Prize for his iconoclastic work which had redefined the regenerative potential of the heart. Many of the world’s top universities were vying to recruit Anversa and his group, and in 2008 he relocated his research group to Harvard Medical School and Brigham and Women’s Hospital.

There were naysayers and skeptics who had resisted the adult stem cell euphoria. Some researchers had spent decades studying the heart and found little to no evidence for regeneration in the adult heart; they had difficulty reconciling their own results with those of the Anversa group. A number of practicing cardiologists who treated heart failure patients were also skeptical because they did not see the near-miraculous regenerative power of the heart in their patients. One Anversa paper went as far as suggesting that the whole heart completely regenerates itself roughly every 8-9 years, a claim at odds with the clinical experience of practicing cardiologists. Other researchers pointed out serious flaws in the Anversa papers. For example, the 2002 paper on stem cells in human heart transplant patients claimed that the hearts were coated with the recipient’s regenerative cells, including cells which contained the stem cell marker Sca-1. Within days of the paper’s publication, many researchers were puzzled by this finding because Sca-1 is a marker of mouse and rat cells – not human cells! If Anversa’s group was finding rat or mouse proteins in human hearts, it was most likely an artifact. And if they had mistakenly identified rodent cells in human hearts, these critics surmised, perhaps other aspects of Anversa’s research were similarly flawed or riddled with artifacts.

At national and international meetings, one could observe heated debates between members of the Anversa camp and their critics. The critics then decided to change tactics: instead of merely debating Anversa and commenting on errors in his papers, they invested substantial funds and effort in replicating his findings. One of the most important and rigorous attempts to assess the validity of the Orlic paper was published in 2004 by the research teams of Chuck Murry and Loren Field. Murry and Field found no evidence of bone marrow cells converting into heart muscle cells. This was a major scientific blow to the burgeoning adult stem cell movement, but even this paper could not deter the bone marrow cell champions.

Even though its refutation was published in 2004, the Orlic paper continues to carry the dubious distinction of being one of the most cited papers in the history of stem cell research. At first, Anversa and his colleagues would shrug off their critics’ findings or publish refutations of refutations, but over time an increasing number of research groups all over the world began to realize that many of the central tenets of Anversa’s work could not be replicated, and the ranks of critics and skeptics grew. As the signs of irreplicability and other concerns about Anversa’s work mounted, Harvard and Brigham and Women’s Hospital were forced to initiate an internal investigation, which resulted in the retraction of one Anversa paper and an expression of concern about another major paper. Finally, in May 2014, a research group published a paper using mice in which c-kit cells were genetically labeled so that their fate could be tracked; they found that c-kit cells make a minimal – if any – contribution to the formation of new heart cells: a fraction of a percent!

The skeptics who had doubted Anversa’s claims all along may now feel vindicated, but this is not the time to gloat. Instead, the discipline of cardiovascular stem cell biology is now undergoing a process of soul-searching. How was it possible that some of the most widely read and cited papers in the field were based on heavily flawed observations and assumptions? Why did it take more than a decade after the first refutation was published in 2004 for scientists to finally accept that the near-magical regenerative power of the heart was a pipe dream?

One reason for this lag time is pretty straightforward: it takes a tremendous amount of time to refute papers. Funding to conduct the experiments is difficult to obtain because grant funding agencies are not easily convinced to invest in studies replicating existing research. For a refutation to be accepted by the scientific community, it has to be at least as rigorous as the original; in practice, refutations are subject to even greater scrutiny. Scientists trying to disprove another group’s claim may be asked to develop even better research tools and technologies so that their results can be seen as more definitive than those of the original group. Instead of relying on antibodies to identify c-kit cells, the authors of the 2014 refutation developed a transgenic mouse in which all c-kit cells could be genetically traced to yield more definitive results – but developing new models and tools can take years.

The scientific peer review process by external researchers is a central pillar of the quality control process in modern scientific research, but one has to be cognizant of its limitations. Peer review of a scientific manuscript is routinely performed by experts for all the major academic journals which publish original scientific results. However, peer review only involves a “review”, i.e. a general evaluation of major strengths and flaws, and peer reviewers do not see the original raw data nor are they provided with the resources to replicate the studies and confirm the veracity of the submitted results. Peer reviewers rely on the honor system, assuming that the scientists are submitting accurate representations of their data and that the data has been thoroughly scrutinized and critiqued by all the involved researchers before it is even submitted to a journal for publication. If peer reviewers were asked to actually wade through all the original data generated by the scientists and even perform confirmatory studies, then the peer review of every single manuscript could take years and one would have to find the money to pay for the replication or confirmation experiments conducted by peer reviewers. Publication of experiments would come to a grinding halt because thousands of manuscripts would be stuck in the purgatory of peer review. Relying on the integrity of the scientists submitting the data and their internal review processes may seem naïve, but it has always been the bedrock of scientific peer review. And it is precisely the internal review process which may have gone awry in the Anversa group.

Just like Pygmalion fell in love with Galatea, researchers fall in love with the hypotheses and theories that they have constructed. To minimize the effects of these personal biases, scientists regularly present their results to colleagues within their own groups at internal lab meetings and seminars or at external institutions and conferences long before they submit their data to a peer-reviewed journal. The preliminary presentations are intended to spark discussions, inviting the audience to challenge the veracity of the hypotheses and the data while the work is still in progress. Sometimes fellow group members are truly skeptical of the results, at other times they take on the devil’s advocate role to see if they can find holes in their group’s own research. The larger a group, the greater the chance that one will find colleagues within a group with dissenting views. This type of feedback is a necessary internal review process which provides valuable insights that can steer the direction of the research.

Considering the size of the Anversa group – consisting of 20, 30 or even more PhD students, postdoctoral fellows and senior scientists – it is puzzling why the discussions among the group members did not already internally challenge their hypotheses and findings, especially in light of the fact that they knew extramural scientists were having difficulties replicating the work.

Retraction Watch is one of the most widely read scientific watchdog blogs, tracking scientific misconduct and retractions of published scientific papers. Recently, Retraction Watch published the account of an anonymous whistleblower who had worked as a research fellow in Anversa’s group and provided some unprecedented insights into the inner workings of the group, which explain why the internal review process had failed:

“I think that most scientists, perhaps with the exception of the most lucky or most dishonest, have personal experience with failure in science—experiments that are unreproducible, hypotheses that are fundamentally incorrect. Generally, we sigh, we alter hypotheses, we develop new methods, we move on. It is the data that should guide the science.

 In the Anversa group, a model with much less intellectual flexibility was applied. The “Hypothesis” was that c-kit (cd117) positive cells in the heart (or bone marrow if you read their earlier studies) were cardiac progenitors that could: 1) repair a scarred heart post-myocardial infarction, and: 2) supply the cells necessary for cardiomyocyte turnover in the normal heart.

 This central theme was that which supplied the lab with upwards of $50 million worth of public funding over a decade, a number which would be much higher if one considers collaborating labs that worked on related subjects.

 In theory, this hypothesis would be elegant in its simplicity and amenable to testing in current model systems. In practice, all data that did not point to the “truth” of the hypothesis were considered wrong, and experiments which would definitively show if this hypothesis was incorrect were never performed (lineage tracing e.g.).”

Discarding data that might have challenged the central hypothesis appears to have been a central principle.


Hood over screen - via Shutterstock

According to the whistleblower, Anversa’s group did not just discard undesirable data, they actually punished group members who would question the group’s hypotheses:

“In essence, to Dr. Anversa all investigators who questioned the hypothesis were “morons,” a word he used frequently at lab meetings. For one within the group to dare question the central hypothesis, or the methods used to support it, was a quick ticket to dismissal from your position.”

The group also created an environment of strict information hierarchy and secrecy which is antithetical to the spirit of science:

“The day to day operation of the lab was conducted under a severe information embargo. The lab had Piero Anversa at the head with group leaders Annarosa Leri, Jan Kajstura and Marcello Rota immediately supervising experimentation. Below that was a group of around 25 instructors, research fellows, graduate students and technicians. Information flowed one way, which was up, and conversation between working groups was generally discouraged and often forbidden.

 Raw data left one’s hands, went to the immediate superior (one of the three named above) and the next time it was seen would be in a manuscript or grant. What happened to that data in the intervening period is unclear.

 A side effect of this information embargo was the limitation of the average worker to determine what was really going on in a research project. It would also effectively limit the ability of an average worker to make allegations regarding specific data/experiments, a requirement for a formal investigation.”

This segregation of information is a powerful method of maintaining authoritarian rule and is more typical of terrorist cells or intelligence agencies than of a scientific lab, but it would definitely explain how the Anversa group was able to mass-produce numerous irreproducible papers without any major dissent from within the group.

In addition to the secrecy and segregation of information, the group also created an atmosphere of fear to ensure obedience:

“Although individually-tailored stated and unstated threats were present for lab members, the plight of many of us who were international fellows was especially harrowing. Many were technically and educationally underqualified compared to what might be considered average research fellows in the United States. Many also originated in Italy where Dr. Anversa continues to wield considerable influence over biomedical research.

 This combination of being undesirable to many other labs should they leave their position due to lack of experience/training, dependent upon employment for U.S. visa status, and under constant threat of career suicide in your home country should you leave, was enough to make many people play along.

 Even so, I witnessed several people question the findings during their time in the lab. These people and working groups were subsequently fired or resigned. I would like to note that this lab is not unique in this type of exploitative practice, but that does not make it ethically sound and certainly does not create an environment for creative, collaborative, or honest science.”

Foreign researchers are particularly dependent on their employment to maintain their visa status, and the prospect of being fired from one’s job can be terrifying for anyone.

This is an anonymous account of a whistleblower, and as such it is problematic. The use of anonymous sources in science journalism could open the door to all sorts of unfounded and malicious accusations, which is why the ethics of using anonymous sources was heavily debated at the recent ScienceOnline conference. But the claims of the whistleblower were not made in a vacuum – they have to be evaluated in the context of known facts. The whistleblower’s claim that the Anversa group and their collaborators received more than $50 million to study bone marrow cell and c-kit cell regeneration of the heart can easily be verified on the NIH’s public RePORTER grant database. The claim that many of the Anversa group’s findings could not be replicated is also a verifiable fact. It may seem unfair to condemn Anversa and his group – for creating an atmosphere of secrecy and obedience which undermined the scientific enterprise, tormented trainees and wasted millions of dollars of taxpayer money – based solely on one whistleblower’s account. However, if one looks at the entire arc of the Anversa group’s amazing rise and decline in cardiac regeneration, the whistleblower’s description of an atmosphere of secrecy and hierarchy seems very plausible.

Harvard’s investigation into the Anversa group is not open to the public, so it is difficult to know whether the university is primarily investigating scientific errors or also looking into claims of egregious scientific misconduct and abuse of scientific trainees. It is unlikely that Anversa’s is the only group that might have engaged in such forms of misconduct. Threatening dissenting junior researchers with a loss of employment or visa status may be far more common than we think. The gravity of the problem requires that the NIH – the major funding agency for biomedical research in the US – look into the prevalence of such practices in research labs and develop safeguards to prevent the abuse of science and scientists.


Note: An earlier version of this article was first published on

To Err Is Human, To Study Errors Is Science

The family of cholesterol lowering drugs known as ‘statins’ are among the most widely prescribed medications for patients with cardiovascular disease. Large-scale clinical studies have repeatedly shown that statins can significantly lower cholesterol levels and the risk of future heart attacks, especially in patients who have already been diagnosed with cardiovascular disease. A more contentious issue is the use of statins in individuals who have no history of heart attacks, strokes or blockages in their blood vessels. Instead of waiting for the first major manifestation of cardiovascular disease, should one start statin therapy early on to prevent cardiovascular disease?

If statins were free of charge and had no side effects whatsoever, the answer would be rather straightforward: Go ahead and use them as soon as possible. However, like all medications, statins come at a price. There is the financial cost to the patient or their insurance to pay for the medications, and there is a health cost to the patients who experience potential side effects. The Guideline Panel of the American College of Cardiology (ACC) and the American Heart Association (AHA) therefore recently recommended that the preventive use of statins in individuals without known cardiovascular disease should be based on personalized risk calculations. If the risk of developing disease within the next 10 years is greater than 7.5%, then the benefits of statin therapy outweigh its risks and the treatment should be initiated. The panel also indicated that if the 10-year risk of cardiovascular disease is greater than 5%, then physicians should consider prescribing statins, but should bear in mind that the scientific evidence for this recommendation was not as strong as that for higher-risk individuals.
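The panel’s threshold logic boils down to a simple decision rule. The sketch below is purely illustrative – the function name, the return strings, and the choice to express risk as a fraction are my own assumptions, not part of any actual guideline software:

```python
def statin_recommendation(ten_year_risk: float) -> str:
    """Illustrative sketch of the ACC/AHA thresholds described above.

    ten_year_risk: estimated 10-year cardiovascular disease risk
    expressed as a fraction (e.g. 0.075 for 7.5%).
    """
    if ten_year_risk > 0.075:
        # benefits judged to outweigh the risks of therapy
        return "statin therapy recommended"
    elif ten_year_risk > 0.05:
        # weaker evidence: physicians should consider statins
        return "consider statin therapy"
    else:
        return "no statin therapy recommended"

print(statin_recommendation(0.08))  # risk above 7.5%
print(statin_recommendation(0.06))  # risk between 5% and 7.5%
```

The interesting part is not the code but the fact that a single number – the estimated 10-year risk – drives the entire recommendation, which is why disputes over how that risk and the side-effect rates are estimated matter so much.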


Oops button - via Shutterstock
Oops button – via Shutterstock

Using statins in low-risk patients

The recommendation that individuals with a comparatively low risk of developing future cardiovascular disease (10-year risk lower than 10%) would benefit from statins was met with skepticism by some medical experts. In October 2013, the British Medical Journal (BMJ) published a paper by John Abramson, a lecturer at Harvard Medical School, and his colleagues which re-evaluated the data from a prior study on statin benefits in patients with less than 10% cardiovascular disease risk over 10 years. Abramson and colleagues concluded that the statin benefits were overstated and that statin therapy should not be expanded to include this group of individuals. To further bolster their case, they also cited a 2013 study by Huabing Zhang and colleagues in the Annals of Internal Medicine which (according to Abramson et al.) had reported that 18% of patients discontinued statins due to side effects. Abramson even highlighted the finding from the Zhang study by including it as one of four bullet points summarizing the key take-home messages of his article.

The problem with this characterization of the Zhang study is that it ignored all the caveats that Zhang and colleagues had mentioned when discussing their findings. The Zhang study was based on the retrospective review of patient charts and did not establish a true cause-and-effect relationship between the discontinuation of the statins and actual side effects of statins. Patients may stop taking medications for many reasons, but this does not necessarily mean that it is due to side effects from the medication. According to the Zhang paper, 17.4% of patients in their observational retrospective study had reported a “statin related incident” and of those only 59% had stopped the medication. The fraction of patients discontinuing statins due to suspected side effects was at most 9-10% instead of the 18% cited by Abramson. But as Zhang pointed out, their study did not include a placebo control group. Trials with placebo groups document similar rates of “side effects” in patients taking statins and those taking placebos, suggesting that only a small minority of perceived side effects are truly caused by the chemical compounds in statin drugs.
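The corrected estimate follows from simple arithmetic on the two percentages quoted above. The short calculation below is a sketch using only those published fractions, not the underlying patient counts:

```python
# Fraction of patients who reported a "statin related incident"
reported_incident = 0.174
# Of those, the fraction who actually stopped the medication
stopped_given_incident = 0.59

# Upper bound on discontinuations attributable to suspected side effects
discontinued = reported_incident * stopped_given_incident
print(f"{discontinued:.1%}")  # → 10.3%, not the 18% cited by Abramson
```

Even this ~10% figure is an upper bound, since – as the missing placebo group implies – an unknown share of those incidents were not actually caused by the drug.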


Admitting errors is only the first step

Whether 18%, 9% or a far smaller proportion of patients experience significant medication side effects is no small matter, because the analysis could affect millions of patients currently being treated with statins. A gross overestimation of statin side effects could prompt physicians to prematurely discontinue medications that have been shown to significantly reduce the risk of heart attacks in a wide range of patients. On the other hand, severely underestimating statin side effects could result in important symptoms being discounted and patients suffering. Abramson’s misinterpretation of the statin side effect data was pointed out by readers of the BMJ soon after the article was published, and it prompted an inquiry by the journal. After re-evaluating the data and discussing the issue with Abramson and colleagues, the journal issued a correction clarifying the misrepresentation of the Zhang paper.

Fiona Godlee, the editor-in-chief of the BMJ, also wrote an editorial explaining the decision to issue a correction on the question of side effects: there was not sufficient cause to retract the whole paper, since the other point made by Abramson and colleagues – the lack of benefit in low-risk patients – might still hold true. At the same time, Godlee recognized the inherent bias of a journal’s editor when it comes to deciding whether or not to retract a paper. Every retraction of a peer-reviewed scholarly paper is somewhat of an embarrassment to the authors as well as the journal, because it suggests that the peer review process failed to identify one or more major flaws. In a commendable move, the journal appointed a multidisciplinary review panel which includes leading cardiovascular epidemiologists. This panel will review the Abramson paper as well as another BMJ paper which had also cited the inaccurately high frequency of statin side effects, investigate the peer review process that failed to identify the erroneous claims, and provide recommendations regarding the ultimate fate of the papers.


Reviewing peer review

Why didn’t the peer reviewers who evaluated Abramson’s article catch the error prior to its publication? We can only speculate as to why such a major error was not identified by the peer reviewers. One has to bear in mind that “peer review” for academic research journals is just that – a review. In most cases, peer reviewers do not have access to the original data and cannot check the veracity or replicability of analyses and experiments. For most journals, peer review is conducted on a voluntary (unpaid) basis by two to four expert reviewers who routinely spend multiple hours analyzing the appropriateness of the experimental design, methods, presentation of results and conclusions of a submitted manuscript. The reviewers operate under the assumption that the authors of the manuscript are professional and honest in terms of how they present the data and describe their scientific methodology.

In the case of Abramson and colleagues, the correction issued by the BMJ refers not to Abramson’s own analysis but to the misreading of another group’s research. Biomedical research papers often cite 30 or 40 studies, and it is unrealistic to expect that peer reviewers read all the cited papers and ensure that they are being properly cited and interpreted. If this were the expectation, few peer reviewers would agree to serve as volunteer reviewers since they would have hardly any time left to conduct their own research. However, in this particular case, most peer reviewers familiar with statins and the controversies surrounding their side effects should have expressed concerns regarding the extraordinarily high figure of 18% cited by Abramson and colleagues. Hopefully, the review panel will identify the reasons for the failure of BMJ’s peer review system and point out ways to improve it.


To err is human, to study errors is science

All researchers make mistakes, simply because they are human. It is impossible to eliminate all errors in any endeavor that involves humans, but we can construct safeguards that help us reduce the occurrence and magnitude of our errors. Overt fraud and misconduct are rare causes of errors in research, but their effects on any given research field can be devastating. One of the most notorious cases of research fraud is that of the Dutch psychologist Diederik Stapel, who published numerous papers based on blatant fabrication of data – showing ‘results’ of experiments on non-existent study subjects. The field of cell therapy in cardiovascular disease recently experienced a major setback when a university review of studies headed by the German cardiologist Bodo Strauer found evidence of scientific misconduct. The significant discrepancies and irregularities in Strauer’s studies have now led to wide-ranging skepticism about the efficacy of using bone marrow cell infusions to treat heart disease.


It is difficult to obtain precise numbers quantifying the actual extent of severe research misconduct and fraud, since much of it may go undetected. Even when such cases are brought to the attention of the academic leadership, the involved committees and administrators may decide to keep their findings confidential and not disclose them to the public. However, most researchers working in academic research environments would probably agree that these are rare occurrences. A far more likely source of errors in research is the cognitive bias of the researchers. Researchers who believe in certain hypotheses and ideas are prone to interpreting data in a manner most likely to support their preconceived notions. For example, a researcher opposed to statin usage will likely interpret data on statin side effects differently than a researcher who supports statin usage. While Abramson may have been biased in his interpretation of the data generated by Zhang and colleagues, the field of cardiovascular regeneration is currently grappling with what appears to be a case of biased interpretation of one’s own data. An institutional review by Harvard Medical School and Brigham and Women’s Hospital recently determined that the work of Piero Anversa, one of the world’s most widely cited stem cell researchers, was significantly compromised and warranted a retraction. His group had reported that the adult human heart exhibits an amazing regenerative potential, suggesting that roughly every 8 to 9 years the adult human heart replaces its entire collective of beating heart cells (a 7%-19% yearly turnover of beating heart cells). These findings were in sharp contrast to a prior study which had found only a minimal turnover of beating heart cells (1% or less per year) in adult humans. Anversa’s finding was also at odds with the observations of clinical cardiologists, who rarely observe a near-miraculous recovery of heart function in patients with severe heart disease.
One possible explanation for the huge discrepancy between the prior research and Anversa’s studies was that Anversa and his colleagues had not taken into account the possibility of contaminations that could have falsely elevated the cell regeneration counts.
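Back-of-the-envelope arithmetic shows how far apart the two camps were. Assuming a constant yearly turnover rate r, replacing the heart’s full complement of muscle cells takes on the order of 1/r years – a deliberate simplification that ignores which individual cells get replaced, but good enough to compare the claims:

```python
# Approximate full-replacement time implied by a constant yearly turnover rate:
# at a rate r per year, replacing the whole population takes about 1/r years.
for rate in (0.07, 0.12, 0.19, 0.01):
    years = 1 / rate
    print(f"{rate:.0%} turnover/year -> whole heart replaced in ~{years:.0f} years")
```

A turnover of about 12% per year corresponds to roughly 8 years, matching Anversa’s 8-9 year claim, whereas the 1% turnover reported by others implies a century – a gap that no small methodological tweak could bridge.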


Improving the quality of research: peer review and more

The fact that researchers are prone to making errors due to inherent biases does not mean we should simply throw our hands up in the air, say “Mistakes happen!” and let matters rest. High quality science is characterized by its willingness to correct itself, and this includes improving methods to detect and correct scientific errors early on so that we can limit their detrimental impact. The realization that lack of reproducibility of peer-reviewed scientific papers is becoming a major problem for many areas of research such as psychology, stem cell research and cancer biology has prompted calls for better ways to track reproducibility and errors in science.

One important new paradigm being discussed to improve the quality of scholarly papers is post-publication peer evaluation. Instead of viewing the publication of a peer-reviewed research paper as an endpoint, post-publication peer evaluation invites fellow scientists to continue commenting on the quality and accuracy of the published research even after its publication, and to engage the authors in this process. Traditional peer review relies on just a handful of reviewers who decide the fate of a manuscript, but post-publication peer evaluation opens up the debate to hundreds or even thousands of readers, who may be able to detect errors that the small number of traditional peer reviewers missed prior to publication. It is also becoming apparent that science journalists and science writers can play an important role in the post-publication evaluation of published research papers by investigating and communicating research flaws identified in research papers. In addition to helping dismantle the Science Mystique, critical science journalism can help ensure that corrections, retractions or other major concerns about the validity of scientific findings are communicated to a broad non-specialist audience.

In addition to these ongoing efforts to reduce errors in science by improving the evaluation of scientific papers, it may also be useful to consider new pro-active initiatives which focus on how researchers perform and design experiments. As the head of a research group at an American university, I have to take mandatory courses (in some cases on an annual basis) informing me about laboratory hazards, the ethics of animal experimentation or the ethics of how to conduct human studies. However, there are no mandatory courses helping us identify our own research biases or minimize their impact on the interpretation of our data. There is an underlying assumption that if you are no longer a trainee, you probably know how to perform and interpret scientific experiments. I would argue that it does not hurt to remind scientists regularly – no matter how junior or senior – that they can become victims of their own biases. We have to learn to continuously re-evaluate how we conduct science and to be humble enough to listen to our colleagues, especially when they disagree with us.


Note: A shorter version of this article was first published at The Conversation with excellent editorial input provided by Jo Adetunji.
Abramson, J., Rosenberg, H., Jewell, N., & Wright, J. (2013). Should people at low risk of cardiovascular disease take a statin? BMJ, 347 (oct22 3) DOI: 10.1136/bmj.f6123

Does Human Fat Contain Stem Cells?

Aeon Magazine recently published my longform essay on our research with human liposuction samples and our attempts to use fat for regenerative and therapeutic purposes. Many research groups, including our own, have been able to isolate stem cells from human fat. However, when it came to using these cells to treat cardiovascular disease, they behaved in a manner we had not anticipated.

Undifferentiated mesenchymal stem cells (left) and their fat neighbors (right) – from our PLOS One paper

We were unable to convert them into heart muscle cells or blood vessel endothelial cells, but we found that they could help build large networks of blood vessels by releasing important growth factors. Within a few years of our initial publication, clinical trials for patients with blocked arteries in the heart or legs were already being planned, and they are currently underway.

We decided to call the cells “adipose stromal cells” because we wanted to emphasize that they were acting as a “stroma” (i.e. supportive environment for blood vessels) and not necessarily as stem cells (i.e. cells that convert from an undifferentiated state into mature cell types). In other contexts, these same cells were indeed able to act like “stem cells”, because they could be converted into bone-forming or cartilage-forming cells, thus showing the enormous versatility and value of the cells that reside within our fat tissues.

The answer to the question “Does Human Fat Contain Stem Cells?” is Yes, but these cells cannot be converted into all desired tissues. Instead, they have important supportive functions that can be used to engineer new blood vessels, which is a critical step in organ engineering.

In addition to describing our scientific work, the essay also mentions the vagaries of research, the frustrations I had as a postdoctoral fellow when my results were not turning out as I had expected, and how some predatory private clinics are already marketing “fat-derived stem cell therapies” to paying customers, even though the clinical results are still rather preliminary.


For the readers who want to dig a bit deeper, here are some references and links:


1. The original paper by Patricia Zuk and colleagues which described the presence of stem cells in human liposuction fat:

Zuk, P et al (2001) “Multilineage Cells from Human Adipose Tissue: Implications for Cell-Based Therapies”


2. Our work on how the cells can help grow blood vessels by releasing proteins:

Rehman, J et al (2004) “Secretion of Angiogenic and Antiapoptotic Factors by Human Adipose Stromal Cells”


3. Preliminary findings from ongoing clinical studies in which heart attack patients receive infusions of fat-derived cells into their hearts to improve heart function and blood flow to the heart:

Houtgraaf, J et al (2012) “First Experience in Humans Using Adipose Tissue–Derived Regenerative Cells in the Treatment of Patients With ST-Segment Elevation Myocardial Infarction”


4. Preliminary results from an ongoing trial using the fat-derived cells in patients with severe blockages of leg arteries:

Bura, A et al (2014) “Phase I trial: the use of autologous cultured adipose-derived stroma/stem cells to treat patients with non-revascularizable critical limb ischemia”


5. Example of how “cell therapies” (in this case bone marrow cells) are sometimes marketed as “stem cells” but hardly contain any stem cells:

The Largest Cell Therapy Trial in Heart Attack Patients Uses Hardly Any Stem Cells


6. The major scientific society devoted to studying the science of fat and its cells as novel therapies is the International Federation for Adipose Therapeutics and Science (IFATS).

I am not kidding, it is I-FATS!

Explore their website if you want to learn about all the exciting new research with fat-derived cells.


7. Some of our newer work on how bone marrow mesenchymal stem cells turn into fat cells and what role their metabolism plays during this process:

Zhang, Y et al (2013) “Mitochondrial Respiration Regulates Adipogenic Differentiation of Human Mesenchymal Stem Cells


Lab Grown Organs and Artistic Computers in Fifty Years?

The Pew Research Center released its 2014 survey of U.S. adults (1,001 participants, surveyed by landline or cell phone interviews) regarding their views on technological advancements in the next 50 years.

Robot – via Shutterstock

Over eighty percent of the participants said that “People in need of an organ transplant will have new organs custom made for them in a lab” and roughly half of the participants felt that “Computers will be as effective as people at creating important works of art such as music, novels, movies, or paintings” within the next 50 years. The vast majority did not think that humans will be able to control the weather during the next few decades.

As someone working in the field of vascular and tissue engineering, I think that the perception of scientists being able to engineer transplantable organs within 50 years is realistic. We have made quite a bit of progress in the past decade when it comes to deriving functional tissues from stem cells, but we still need more research before we will be able to build functional organs. It may take a decade or two before we can reliably generate these organs, and even longer to test and optimize them for therapeutic purposes and to ensure their long-term survival in transplant recipients.

50 year predictions

The reason to be optimistic about engineering organs is that we have already seen examples of engineered tissues and small organoids being implanted into animal models. There are also ongoing early clinical trials with patches of engineered tissues and engineered blood vessels. Scaling up these successes to whole organ engineering in humans will be challenging but sounds feasible.

I am surprised that half of U.S. adults believe computers will be “effective” at creating works of art within the next 50 years. Do we have preliminary evidence – even at a small scale – that computers can currently “create” art? Perhaps this comes down to our definitions of what constitutes “creativity”. One could envision computers generating paintings, music and novels based on existing art created by humans. But is that true creativity? Then again, when humans “create” art, they also base their new product on their experiences and prior art created by other humans. Maybe computer-created art in fifty years isn’t far-fetched after all.


Attitudes towards changes

Not everyone is enthusiastic about new technologies.


When asked whether it would be a change for the better or a change for the worse…


1) “If most people wear implants or other devices that constantly show them information about the world around them”

2) “If lifelike robots become the primary caregivers for the elderly and people in poor health”

3) “If personal and commercial drones are given permission to fly through most U.S. airspace”

4) “If prospective parents can alter the DNA of their children to produce smarter, healthier, or more athletic offspring”


…the majority of participants felt they would be worse off with these changes.

The way the questions were phrased did not leave room for a more nuanced response. For example, would it be acceptable to change DNA to “produce” healthier children (i.e., correcting lethal genetic defects using genome editing) without necessarily “producing” smarter and more athletic children?

Conflating health, intelligence and athleticism into one question makes it difficult to ascertain how the public feels about using genome editing to help children survive versus using it to make kids run faster.

Most participants did not think they would want to eat lab grown meat or use brain implants to improve their mental capacity but roughly half of them seemed fine with using driverless cars.


Lab grown meat


When asked what futuristic invention they would like to own, younger participants seemed most excited about time travel and other travel gadgets (flying cars, bikes and spacecraft), whereas older participants wanted inventions to prolong life or cure diseases.

What do people want

I was a bit surprised that this final question did not elicit responses such as inventions that would help reduce or reverse global warming and pollution, or inventions that could remedy world hunger and the global scarcity of resources. Maybe it also has to do with how the question was phrased. Here is the actual question:

Science fiction writers have always imagined new inventions that change the world of the future. How about you? If there was one futuristic invention that you could own, what would it be?


Here is the actual data (PDF) of the responses people gave:


Improved health and longevity/Cure for diseases: 9%
Time machine/Time travel: 9%
Flying car/Flying bike: 6%
Personal robot/Robot servants: 4%
Personal space craft: 4%
Self-driving car: 3%
Teleporter/Teleportation/Transporter: 3%
World peace/Stop wars/Improved understanding/Better planet: 2%
New energy source/efficient cars/other environment: 2%
Invention to make household tasks easier: 1%
Ability to live forever/Immortality: 1%
Jetpack: 1%
Money/Scheme to get rich/Ability to read future: 1%
Brain implant/Improve memory: 1%
Hovercar/Hoverboard: 1%
Hologram/Holodeck: *
Remote communications (via device or ESP): *
Other: 9%
None/Nothing/Not interested in futuristic inventions: 11%

(* indicates less than 1%)


The science fiction reference in the question may have prompted participants to think of technologies described in sci-fi novels and movies. Perhaps the majority of respondents did not think that world peace or climate-control could be achieved with specific sci-fi style inventions. Or perhaps the participants did not realize that climate change, global scarcity of food or other resources and violent conflicts are some of the biggest threats that humankind has ever faced.

Many of the responses to this final question fall into the category of “how could my life become more convenient”, such as personal robots and flying cars. But will these conveniences even matter if we cannot curb the major threats that our planet faces?