Blissful Ignorance: How Environmental Activists Shut Down Molecular Biology Labs in High Schools

Hearing about the HannoverGEN project made me feel envious and excited. Envious, because I wish my high school had offered the kind of hands-on molecular biology training provided to high school students in Hannover, the capital of the German state of Niedersachsen. Excited, because it reminded me of the joy I felt when I first isolated DNA and ran gels after restriction enzyme digests during my first year of university in Munich. I knew that many of the students at the HannoverGEN high schools would be similarly thrilled by their laboratory experience and perhaps even pursue careers as biologists or biochemists.

What did HannoverGEN entail? It was an optional pilot program initiated and funded by the state government of Niedersachsen at four high schools in the Hannover area. Students enrolled in the HannoverGEN classes learned to use molecular biology tools typically reserved for college-level or graduate courses in order to study plant genetics. Basic experiments involved isolating DNA from cabbage or learning how bacteria transfer genes to plants; more advanced experiments enabled the students to analyze whether the genome of a provided maize sample had been genetically modified. Each experimental unit was accompanied by relevant theoretical instruction on the molecular mechanisms of gene expression and biotechnology, as well as ethical discussions of the benefits and risks of generating genetically modified organisms (“GMOs”). The details of the HannoverGEN program are now only accessible through the Wayback Machine Internet archive, because the award-winning educational program and its associated website were shut down in 2013 at the behest of German anti-GMO activist groups: environmental activists, Greenpeace, the Niedersachsen Green Party and the German organic food industry.

Why did these activists and organic food industry lobbyists oppose a government-funded educational program which improved the molecular biology knowledge and expertise of high school students? Some clues can be found in a 2012 press release entitled “Keine Akzeptanzbeschaffung für Agro-Gentechnik an Schulen!” (“No Manufacturing of Acceptance for Agricultural Gene Technology at Schools!”), issued by an alliance representing “organic” or “natural food” farmers, and in the accompanying critical “study” of the same title (PDF), which was funded by this alliance as well as its anti-GMO partners. They feared that the high school students might become too accepting of biotechnology in agriculture and that the curriculum did not sufficiently highlight the potential dangers of GMOs. Because the ethical discussions addressed not only the risks but also the benefits of genetically modifying crops, students might walk away with the idea that GMOs could be beneficial for humankind. The alliance also believed that taxpayer money should not be used to foster special interests, such as those of an agricultural industry that may want to use GMOs.

A response by the University of Hannover (PDF), which had helped develop the curriculum and coordinated the classes for the high school students, carefully analyzed the complaints of the anti-GMO activists. The author of the anti-HannoverGEN “study” had not visited the HannoverGEN laboratories, nor had he interviewed the biology teachers or students enrolled in the classes. In fact, his critique was based on weblinks that were not even used in the curriculum by the HannoverGEN teachers or students. His analysis ignored the balanced presentation of biotechnology that formed the basis of the HannoverGEN curriculum, as well as the fact that discussing the potential risks of genetic modification was a core topic in all the classes.

Unfortunately, this shoddily prepared “study” had a significant impact, in part because it was widely promoted by partner organizations. Its release in the autumn of 2012 came at an opportune time for political activists, because Niedersachsen was about to hold an election. Campaigning against GMOs seemed like a perfect cause for the Green Party, and a high school program which taught the use of biotechnology became a convenient lightning rod. When the Social Democrats and the Green Party formed a coalition after winning the election in early 2013, nixing the HannoverGEN program was formally included in the so-called coalition contract, a document in which coalition partners outline their key goals for the upcoming four-year period. When one considers how many major issues the government of a large German state has to face, such as healthcare, education, unemployment or immigration, it is mind-boggling that de-funding a program involving only four high schools received so much attention that it needed to be anchored in the coalition contract. In fact, it is a testament to the influence and zeal of the anti-GMO lobby.

Once the cancellation of HannoverGEN was announced, the Hannover branch of Greenpeace also took credit for campaigning against this high school program and celebrated its victory. The Greenpeace anti-GMO activist David Petersen said that the program was too cost-intensive, because equipping high school laboratories with state-of-the-art molecular biology equipment had already cost more than 1 million Euros. The previous center-right government, which had initiated the HannoverGEN project, had been planning to expand the program to additional high schools because of its success and the national recognition it had received for innovative teaching. According to Petersen, this would have wasted even more taxpayer money without adequately conveying the dangers of using GMOs in agriculture.

The scientific community was shaken up by the decision of the new Social Democrat-Green Party coalition government in Niedersachsen. This was an attack on the academic freedom of schools, carried out under the guise of accusing them of promoting special interests, while ignoring that the anti-GMO activists were representing special interests of their own: the “study” attacking HannoverGEN was funded by the lucrative “organic” or “natural food” industry! Scientists and science writers such as Martin Ballaschk or Lars Fischer wrote excellent critical articles stating that squashing high-quality, hands-on science programs could not lead to better decision-making. How could ignorant students have a better grasp of GMO risks and benefits than students who receive relevant formal science education and can thus make truly informed decisions? Sadly, this outcry by scientists and science writers did not make much of a difference; the media did not seem to feel this was much of a cause to fight for. I wonder whether the media response would have been just as lackluster if the government had de-funded a hands-on science lab used to study the effects of climate change.

In 2014, the government of Niedersachsen announced that it would resurrect an advanced biology laboratory program for high schools under the generic and vague title “Life Science Lab”. By removing from the title the word “Gen”, which seems to trigger visceral antipathy among anti-GMO activists, by de-emphasizing genome science and by removing any discussion of GMOs from the curriculum, this new program would leave students in the dark about GMOs. Ignorance is bliss from an anti-GMO activist perspective, because the void of scientific ignorance can be filled with fear.

From the very first day that I could vote in Germany, during the federal election of 1990, I always viewed the Green Party as a party that represented my generation: a party of progressive ideas, concerned about our environment and social causes. However, the HannoverGEN incident is just one example of how the Green Party is caving in to ideology, losing its open-mindedness and progressive nature. In the United States, the anti-science movement, which attacks the teaching of climate change science or evolutionary biology at schools, tends to be rooted in the right wing of the political spectrum. Right-wingers or libertarians are the ones who always complain about taxpayer dollars being wasted and used to promote agendas in schools and universities. But we should not forget that there is also a different anti-science movement, rooted in the leftist and pro-environmental political spectrum – and not just in Germany. As a scientist, I find it increasingly difficult to support the Green Party because of its anti-science stance.

I worry about all anti-science movements, especially those which attack science education. There is nothing wrong with questioning special interests and ensuring that school and university science curricula are truly balanced. But the balance needs to be rooted in scientific principles, not political ideologies. Science education has a natural bias: it is biased towards knowledge that is backed up by scientific evidence. We can hypothetically discuss the dangers of GMOs, but the science behind the claimed dangers of GMO crops is very questionable. Just as environmental activists and leftists agree with us scientists that we do not need to give climate change deniers and creationists “balanced” treatment in our science curricula, they should also accept that much of the “anti-GMO science” is currently based more on ideology than on actual scientific data. Our job is to provide excellent science education so that our students can critically analyze and understand scientific research, independent of whether or not it supports our personal ideologies.


Note: An earlier version of this article was first published on the 3Quarksdaily blog.

Synthetic Biology: Engineering Life To Examine It

Two scientific papers published in the journal Nature in the year 2000 marked the beginning of engineering biological circuits in cells. The paper “Construction of a genetic toggle switch in Escherichia coli” by Timothy Gardner, Charles Cantor and James Collins described a genetic toggle switch created by introducing an artificial DNA plasmid into bacterial cells. This DNA plasmid contained two promoters (DNA sequences which regulate the expression of genes) and two repressors (genes that encode proteins which suppress the expression of other genes), as well as a gene encoding green fluorescent protein that served as a read-out for the system. The repressors used were sensitive to either selected chemicals or temperature. In one of the experiments, the system was turned ON by adding the chemical IPTG (a modified sugar) and nearly all the cells became green fluorescent within five to six hours. Upon raising the temperature to activate the temperature-sensitive repressor, the cells began losing their green fluorescence within an hour and returned to the OFF state. Many labs had previously used chemical or temperature switches to turn on gene expression, but this paper was the first to assemble multiple genes into a circuit that allowed cells to be toggled back and forth between stable ON and OFF states.
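To make the bistability concrete, here is a minimal sketch of the toggle-switch dynamics in Python. This is not the authors’ code; it uses the standard dimensionless two-repressor model associated with the Gardner paper, and the parameter values are illustrative assumptions chosen so that two stable states exist.

```python
# Minimal sketch of a genetic toggle switch: two genes whose protein products
# repress each other. Parameters are illustrative assumptions, not published fits.
import numpy as np
from scipy.integrate import odeint

def toggle(y, t, a1, a2, beta, gamma):
    u, v = y  # concentrations of the two repressor proteins
    du = a1 / (1.0 + v**beta) - u    # synthesis of u is repressed by v; linear decay
    dv = a2 / (1.0 + u**gamma) - v   # synthesis of v is repressed by u; linear decay
    return [du, dv]

t = np.linspace(0, 50, 500)
params = (3.0, 3.0, 2.0, 2.0)  # synthesis rates and cooperativities allowing bistability
state_a = odeint(toggle, [0.1, 2.0], t, args=params)[-1]  # relaxes to the v-high state
state_b = odeint(toggle, [2.0, 0.1], t, args=params)[-1]  # relaxes to the u-high state
print("stable state A (u, v):", state_a)  # v high: GFP reporter OFF
print("stable state B (u, v):", state_b)  # u high: GFP reporter ON
```

A transient inducer pulse (such as IPTG inactivating one of the repressors) pushes the system from one basin of attraction into the other, which corresponds to the stable ON/OFF switching reported in the paper.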


The same issue of Nature contained a second landmark paper which also described the engineering of gene circuits. The researchers Michael Elowitz and Stanislas Leibler described the generation of an engineered gene oscillator in their article “A synthetic oscillatory network of transcriptional regulators“. By introducing three repressor genes which constituted a negative feedback loop, together with a green fluorescent protein as a marker of the oscillation, the researchers created a molecular clock in bacteria with an oscillation period of roughly 150 minutes. The genes and the proteins they encoded were not part of any natural biological clock, and none of them would have oscillated had they been introduced into the bacteria on their own. The beauty of the design lay in the combination of three serially repressing genes; the periodicity of this engineered clock reflected the half-life of the protein encoded by each gene as well as the time it took for each protein to act on the subsequent member of the gene loop.
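The oscillation can be sketched just as compactly. The code below follows the standard dimensionless form of the Elowitz-Leibler repressilator model; the parameter values are commonly used illustrative numbers chosen to produce sustained oscillations, not the fitted values from the paper.

```python
# Sketch of the repressilator: three repressor genes wired in a ring, each
# protein repressing transcription of the next gene. Illustrative parameters.
import numpy as np
from scipy.integrate import odeint

def repressilator(y, t, alpha, alpha0, beta, n):
    m, p = y[:3], y[3:]  # mRNA and protein levels for the three genes
    dm = [-m[i] + alpha / (1.0 + p[(i - 1) % 3]**n) + alpha0 for i in range(3)]
    dp = [-beta * (p[i] - m[i]) for i in range(3)]  # each protein tracks its mRNA
    return dm + dp

t = np.linspace(0, 200, 2000)
y0 = [1.0, 0.0, 0.0, 2.0, 1.0, 3.0]  # asymmetric start so the ring does not lock up
sol = odeint(repressilator, y0, t, args=(216.0, 0.2, 0.2, 2.0))
# Plotting sol[:, 3] (one protein, standing in for the GFP reporter) shows
# sustained oscillations whose period is set by mRNA and protein lifetimes.
```

Because no single gene oscillates on its own, the rhythm is a property of the loop as a whole, which is exactly the point made above.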

Both papers described the introduction of plasmids encoding multiple genes into bacteria, but this in itself was not novel; it had been routine practice in molecular biology laboratories since the 1970s. The panache of the work lay in the construction of functional biological modules consisting of multiple genes which interacted with each other in a controlled and predictable manner. Since the publication of these two articles, hundreds of scientific papers have described even more intricate engineered gene circuits. These newer studies take advantage of the large number of molecular tools that have become available to query the genome, as well as newer DNA plasmids which encode novel biosensors and regulators.

Synthetic biology is an area of science devoted to engineering novel biological circuits, devices, systems, genomes or even whole organisms. This rather broad description of what “synthetic biology” encompasses reflects the multidisciplinary nature of the field, which integrates ideas derived from biology, engineering, chemistry and mathematical modeling, as well as a vast arsenal of experimental tools developed in each of these disciplines. Specific examples of synthetic biology include engineering microbial organisms that can mass-produce fuels or other valuable raw materials, synthesizing large chunks of DNA to replace whole chromosomes or even the complete genome of certain cells, assembling synthetic cells, and introducing groups of genes into cells so that these genes can form functional circuits by interacting with each other. Synthesis in the context of synthetic biology can signify the engineering of artificial genes or biological systems that do not exist in nature (i.e. synthetic = artificial or unnatural), but synthesis can also stand for integration and composition, a meaning which is closer to the Greek origin of the word. It is this latter aspect of synthetic biology which makes it an attractive area for basic scientists who are trying to understand the complexity of biological organisms. Instead of the traditional molecular biology focus on studying a single gene and its function, synthetic biology engineers biological composites consisting of multiple genes and their regulatory elements. This enables scientists to interrogate how these genes, their regulatory elements and the proteins they encode interact with one another. Synthesis serves as a path to analysis.

One goal of synthetic biologists is to create complex circuits in cells to facilitate biocomputing, i.e. building biological computers that are as powerful as, or even more powerful than, traditional computers. While the gene circuits and cells engineered so far have some degree of memory and computing power, they are no match for the comparatively gigantic computing power of even small digital computers. Nevertheless, we have to keep in mind that the field is very young and advancing at a rapid pace.

One of the major recent advances in synthetic biology occurred in 2013, when a research team led by Rahul Sarpeshkar and Timothy Lu at MIT created analog computing circuits in cells. Most synthetic biology groups that engineer gene circuits in cells to create biological computers have taken their cues from contemporary computer technology. Nearly all of the computers we use are digital computers, which process data using discrete values such as 0’s and 1’s. Analog data processing, on the other hand, uses a continuous range of values instead of 0’s and 1’s. Digital computers have supplanted analog computing in nearly all areas of life because they are easy to program, highly efficient and able to handle analog signals by converting them into digital data. Nature, on the other hand, processes data and information using both analog and digital approaches. Some biological states are indeed discrete, such as those of heart cells, which are electrically depolarized and then repolarized at periodic intervals in order to keep the heart beating. Such discrete states of cells (polarized / depolarized) can be modeled using the ON and OFF states of the biological toggle switch described earlier. However, many biological processes, such as inflammation, occur on a continuous scale. Cells do not just exist in uninflamed and inflamed states; instead, there is a continuum of inflammation, from minimal inflammatory activation of cells to massive inflammation. Environmental signals that are critical for cell behavior, such as temperature, tension or shear stress, occur on a continuous scale, and there is little evidence to indicate that cells convert these analog signals into digital data.

Most of the attempts to create synthetic gene circuits and study information processing in cells have been based on a digital computing paradigm. Sarpeshkar and Lu instead wondered whether one could construct analog computation circuits and take advantage of the analog information processing systems that may be intrinsic to cells. The researchers created an analog synthetic gene circuit using only three proteins that regulate gene expression and the fluorescent protein mCherry as a read-out. This synthetic circuit was able to perform additions or ratiometric calculations in which the cumulative fluorescence of the mCherry was either the sum or the ratio of selected chemical input concentrations. Constructing a digital circuit with similar computational power would have required a much larger number of components.
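As a purely conceptual illustration of why graded responses are so economical, consider a toy model in which each inducer drives the reporter through a wide-dynamic-range, roughly logarithmic transfer function. This is an invented sketch, not the circuit topology of Sarpeshkar and Lu; it merely shows how analog signals can be combined arithmetically with very few parts.

```python
# Toy illustration of analog computation with graded gene expression: each
# inducer maps to fluorescence through a roughly logarithmic transfer function,
# so summed fluorescence combines the inputs and a difference encodes a ratio
# in log space. Conceptual sketch only; all values are invented.
import numpy as np

def transfer(inducer, k=1.0, gain=100.0):
    # Graded (analog) response over a wide input range, in arbitrary
    # fluorescence units; contrast with a digital switch that is only OFF or ON.
    return gain * np.log1p(inducer / k)

in1, in2 = 0.5, 2.0  # hypothetical inducer concentrations
sum_readout = transfer(in1) + transfer(in2)    # additive fluorescence readout
ratio_readout = transfer(in1) - transfer(in2)  # log-space difference ~ a ratio
print(f"sum readout: {sum_readout:.1f} a.u., ratio readout: {ratio_readout:.1f} a.u.")
```

A digital implementation of the same arithmetic would need many more regulatory parts to represent and add binary values, which is the component-count contrast drawn above.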

The design of analog gene circuits represents a major turning point in synthetic biology and will likely spark a wave of new research which combines analog and digital computing in efforts to engineer biological computers. In our day-to-day lives, analog computers have become more or less obsolete. However, the recent call for unconventional computing research by the US Defense Advanced Research Projects Agency (DARPA) is seen by some as one indicator of a possible paradigm shift towards re-examining the value of analog computing. If other synthetic biology groups can replicate the work of Sarpeshkar and Lu and construct even more powerful analog or analog-digital hybrid circuits, then the renaissance of analog computing could be driven by biology. It is difficult to make predictions regarding the construction of biological computing machines which rival or surpass the computing power of contemporary digital computers. What we can say is that synthetic biology is becoming one of the most exciting areas of research, one that will provide amazing insights into the complexity of biological systems and may open a path to revolutionizing biotechnology.

Daniel R, Rubens JR, Sarpeshkar R, & Lu TK (2013). Synthetic analog computation in living cells. Nature, 497 (7451), 619-623. PMID: 23676681





An earlier version of this article was first published here on the 3Quarksdaily blog.

Cellular Alchemy: Converting Fibroblasts Into Heart Cells

Medieval alchemists devoted their lives to the pursuit of the infamous Philosopher’s Stone, an elusive substance that was thought to convert base metals into valuable gold. Needless to say, nobody ever discovered the Philosopher’s Stone. Well, perhaps some alchemist did get lucky but was wise enough to keep the discovery secret. Instead of publishing the discovery and receiving the Nobel Prize for Alchemy, the lucky alchemist probably just walked around junkyards, surreptitiously collected scraps of metal and brought them home to create a Scrooge-McDuck-style money bin. Today, we view the Philosopher’s Stone as just a myth that occasionally resurfaces in the titles of popular fantasy novels, but cell biologists have discovered their own version of the Philosopher’s Stone: the conversion of fibroblast cells into precious heart cells (cardiomyocytes) or brain cells (neurons).


Fibroblasts are an abundant cell type, found in many organs such as the heart, liver and skin. One of their main functions is to repair wounds, forming scars in the process. They are fairly easy to grow and expand, both in the body and in a culture dish. This easy access to large quantities of fibroblasts makes them analogous to the alchemist’s “base metals”. Adult cardiomyocytes, on the other hand, are not able to proliferate, which is why a heart attack that causes the death of cardiomyocytes can be so devastating. A tiny fraction of regenerative, stem-cell-like cells in the heart is activated after a heart attack and regenerates some cardiomyocytes, but most of the damaged and dying heart cells are replaced by a scar formed by the fibroblasts in the heart. This scar keeps the heart intact so that the wall of the heart does not rupture, but it is unable to contract or beat, thus weakening the overall pump function of the heart. In a large heart attack, a substantial portion of the cardiomyocytes is replaced with scar tissue, which can result in heart failure.

A few years back, a research group at the Gladstone Institute of Cardiovascular Disease (University of California, San Francisco) headed by Deepak Srivastava pioneered a very interesting new approach to rescuing heart function after a heart attack. In a 2010 paper published in the journal Cell, the researchers showed that plain old fibroblasts from the heart or the tail of a mouse could be converted into beating cardiomyocytes! The key to this cellular alchemy was the introduction of three genes – Gata4, Mef2c and Tbx5, also known as the GMT cocktail – into the fibroblasts. These genes encode developmental cardiac transcription factors, i.e. proteins that regulate the expression of genes which direct the formation of heart cells. The basic idea was that these introduced regulatory factors would act as switches that turn on the whole cardiac gene expression program. Unlike the approach of the Nobel laureate Shinya Yamanaka, who had developed a method to generate stem cells (induced pluripotent stem cells, or iPSCs) from fibroblasts, Srivastava’s group bypassed the stem cell stage and directly created heart cells from fibroblasts. In a follow-up paper published in the journal Nature in 2012, the Srivastava group took this research to the next level by introducing the GMT cocktail directly into the hearts of mice and showing that this substantially improved heart function after a heart attack. Instead of merely forming scars, the fibroblasts in the heart were being converted into functional, beating heart cells – cellular alchemy with great promise for new cardiovascular therapies.

As exciting as these discoveries were, many researchers remained skeptical, because the cardiac stem cell field has so often seen paradigm-shifting discoveries appear on the horizon, only to find out later that they cannot be replicated by other laboratories. Fortunately, Eric Olson’s group at the University of Texas Southwestern Medical Center also published a paper in Nature in 2012, independently confirming that cardiac fibroblasts could indeed be converted into cardiomyocytes. They added a fourth factor to the GMT cocktail because it appeared to increase the efficiency of conversion. Olson’s group was also able to confirm Srivastava’s finding that directly treating mouse hearts with these genes helped convert cardiac fibroblasts into heart cells. They also noticed an interesting oddity: their success in creating heart cells from fibroblasts in the living mouse was far greater than what they would have expected from their experiments in a dish. They attributed this to the special cardiac environment and the presence of other cells in the heart that may have helped the fibroblasts convert to beating heart cells. However, another group of scientists attempted to replicate the findings of the 2010 Cell paper and found that their success rate was far lower than that of the Srivastava group. In the paper entitled “Inefficient Reprogramming of Fibroblasts into Cardiomyocytes Using Gata4, Mef2c, and Tbx5”, published in the journal Circulation Research in 2012, Chen and colleagues found that very few fibroblasts could be converted into cardiomyocytes and that the electrical properties of the newly generated heart cells did not match those of adult heart cells. One of the key differences between this Circulation Research paper and the 2010 paper of the Srivastava group was that Chen and colleagues used fibroblasts from older mice, whereas the Srivastava group had used fibroblasts from newborn mice. Arguably, the use of older cells by Chen and colleagues is a closer approximation of the cells one would use in patients: most patients with heart attacks are older than 40 years, not newborns.

These studies were all performed on mouse fibroblasts, and they did not address the question of whether human fibroblasts would behave the same way. A recent paper in the Proceedings of the National Academy of Sciences from Eric Olson’s laboratory (published online before print on March 4, 2013 by Nam and colleagues) has now attempted to answer this question. Their findings confirm that human fibroblasts can also be converted into beating heart cells; however, the group of genes required to coax the fibroblasts into converting is slightly different and also requires the introduction of microRNAs – tiny RNA molecules that can regulate the expression of whole groups of genes. Their paper also points out an important caveat: the generated heart-like cells were not uniform and showed a broad range of function, with only some of them contracting spontaneously, and with electrical activity patterns that did not match those of adult heart cells.

Where does this whole body of work leave us? One major finding seems fairly solid: fibroblasts can be converted into beating heart cells. The efficiency of conversion and the quality of the generated heart cells – whether from mouse or human fibroblasts – still need to be optimized. And even though the idea of cellular alchemy sounds fascinating, many additional obstacles need to be overcome before such therapies could ever be tested in humans. The genes were introduced into the fibroblasts using viruses which permanently integrate into the DNA of the fibroblasts and could cause genetic anomalies; it is unlikely that such viruses could be used in patients. The heterogeneity in the electrical activity of the generated heart cells could become a major problem for patients, because patches of newly generated heart cells in one portion of the heart might beat at a different rate or rhythm than other patches. Such electrical dyssynchrony can cause life-threatening heart rhythm problems, which means that the electrical properties of the generated cells need to be carefully understood and standardized. We also know little about the long-term survival of these converted cells in the heart and whether they maintain their heart-cell-like activity for months or years. The idea of converting fibroblasts directly in the heart, instead of first obtaining the fibroblasts, converting them in a dish and then implanting the converted cells back into the heart, sounds very convenient. But this convenience comes at a price: it requires human gene therapy, which has its own risks, and it is very difficult to control the cell conversion process in the intact heart of a patient. If, on the other hand, cells are converted in a dish, one can easily test and discard the suboptimal cells and implant only the most mature and functional heart cells.

This process of cellular alchemy is still in its infancy. It is one of the most exciting new areas in the field of regenerative medicine, because it shows how plastic cells are. Hopefully, as more and more labs begin to investigate the direct reprogramming of cells, we will be able to address the obstacles and challenges posed by this emerging field.


Image credit: Painting in 1771 by Joseph Wright of Derby – The Alchymist, In Search of the Philosopher’s Stone via Wikimedia Commons
Nam, Y., Song, K., Luo, X., Daniel, E., Lambeth, K., West, K., Hill, J., DiMaio, J., Baker, L., Bassel-Duby, R., & Olson, E. (2013). Reprogramming of human fibroblasts toward a cardiac fate. Proceedings of the National Academy of Sciences, 110 (14), 5588-5593. DOI: 10.1073/pnas.1301019110

The ENCODE Controversy And Professionalism In Science

The ENCODE (Encyclopedia Of DNA Elements) project received quite a bit of attention when its results were publicized last year. This project involved a very large consortium of scientists with the goal of identifying all the functional elements in the human genome. In September 2012, 30 papers were published in a coordinated release, and their extraordinary claim was that roughly 80% of the human genome was “functional”. This was in direct contrast to the prevailing view among molecular biologists that the bulk of human DNA was just “junk DNA”, i.e. sequences of DNA to which one could not assign any specific function. The ENCODE papers contained huge amounts of data, collating the work of hundreds of scientists over nearly a decade. But what garnered the most attention among scientists, the media and the public was the “80%” claim and the supposed “death of junk DNA“.

Soon after the discovery of DNA, the primary function ascribed to it was its role as a template from which messenger RNA could be transcribed and then translated into functional proteins. Using this definition of “function”, only 1-2% of human DNA would be functional, because only this fraction actually encodes proteins. The term “junk DNA” was coined to describe the 98-99% of non-coding DNA which appeared to primarily represent genetic remnants of our evolutionary past without any specific function in present-day cells.

However, in the past decades, scientists have uncovered more and more functions for the non-coding DNA segments that were previously thought to be merely “junk”. Non-coding DNA can, for example, act as a binding site for regulatory proteins and thus influence the expression of protein-coding DNA. There has also been an increasing awareness of the various types of non-coding RNA molecules, i.e. RNA molecules which are transcribed from DNA but not subsequently translated into proteins. Some of these non-coding RNAs have known regulatory functions; others may have none, or their functions have not yet been established.

Despite these discoveries, most scientists were in agreement that only a small fraction of DNA was “functional”, even when all the non-coding pieces of DNA with known functions were included. The bulk of our genome was still thought to be non-functional. The term “junk DNA” was used less frequently by scientists, because it was becoming apparent that we were probably going to discover even more functional elements in the non-coding DNA.

In September 2012, everyone was talking about “junk DNA” again, because the ENCODE scientists claimed their data showed that 80% of the human genome was “functional”. Most scientists had expected that the ENCODE project would uncover some new functions for non-coding DNA, but the 80% figure was way out of proportion to what anyone had expected. The problem was that the ENCODE project set a very low bar for “function”: protein binding to the DNA or any kind of chemical DNA modification was already counted as a sign of “function”, without proof that these pieces of DNA had any significant impact on the function of a cell.

The media hype with the “death of junk DNA” headlines, and the lack of discussion about what constitutes function, were appropriately criticized by many scientists. But the recent paper by Dan Graur and colleagues, “On the immortality of television sets: “function” in the human genome according to the evolution-free gospel of ENCODE”, has grabbed everyone’s attention, not necessarily because it criticizes the claims made by the ENCODE scientists, but because of the sarcastic tone it uses to ridicule ENCODE.

Many other blog posts and articles have either praised or criticized the Graur paper, so I decided to list some of them here:

1. PZ Myers writes “ENCODE gets a public reaming” and seems to generally agree with Graur and colleagues.

2. Ashutosh Jogalekar says Graur’s paper is a “devastating takedown of ENCODE in which they pick apart ENCODE’s claims with the tenacity and aplomb of a vulture picking apart a wildebeest carcass.”

3. Ryan Gregory highlights some of the “zingers” in the Graur paper.

Other scientists, on the other hand, agree with some of the conclusions of the Graur paper and its criticism of how the ENCODE data was presented, but disagree with the sarcastic tone:

1. OpenHelix reminds us that this kind of “spanking” should not distract from all the valuable data that ENCODE has generated.

2. Mick Watson shows how Graur and colleagues could have presented their key critiques in a non-confrontational manner and fostered a constructive debate.

3. Josh Witten points out the irony of Graur accusing ENCODE of seeking hype, even though Graur and his colleagues seem to use sarcasm and ridicule to also increase the visibility of their work. I think Josh’s blog post is an excellent analysis of the problems with ENCODE and the problems associated with Graur’s tone.

On Twitter, I engaged in a debate with Benoit Bruneau, my fellow Scilogs blogger Malcolm Campbell and Jonathan Eisen, and I thought it would be helpful to share the Storify version here. There was a general consensus that even though some of the points made by Graur and colleagues are indeed correct, their sarcastic tone was uncalled for. Scientists can be critical of each other’s work, but they can and should express that criticism in a respectful and professional manner, without resorting to insults or mockery.

[View the story “ENCODE controversy and professionalism in scientific debates” on Storify]
Graur D, Zheng Y, Price N, Azevedo RB, Zufall RA, & Elhaik E (2013). On the immortality of television sets: “function” in the human genome according to the evolution-free gospel of ENCODE. Genome Biology and Evolution. PMID: 23431001

Flipping the Switch: Using Optogenetics to Treat Seizures

Optogenetics is emerging as one of the most exciting new tools in biomedical research. The method is based on introducing genes that encode light-sensitive proteins into cells; a laser beam can then be used to activate these light-sensitive proteins. Many of the currently used optogenetic proteins respond to laser activation by changing the membrane voltage potential of the cells. This is why neurons and other cells that can be excited by electrical impulses are ideally suited for optogenetic studies.

The recent paper “On-demand optogenetic control of spontaneous seizures in temporal lobe epilepsy” by Esther Krook-Magnuson and colleagues in Nature Communications (published online on January 22, 2013) applies the optogenetic approach to treating seizures in mice. The researchers used mice that had been genetically modified to express the inhibitory light-sensitive protein halorhodopsin (normally found only in single-celled organisms, not in mammals) in neurons. They placed an optical fiber to deliver the laser light to the area of the brain where they had chemically induced a specific type of seizure (temporal lobe epilepsy, or TLE) in the mice.

The results were quite remarkable: activation of the laser light reduced seizure duration by half within just five seconds. Krook-Magnuson and colleagues then chose a second optogenetic approach to treating the seizures. Instead of using mice expressing the inhibitory light-sensitive protein halorhodopsin, they opted for mice with the excitatory (activating) light-sensitive protein channelrhodopsin (ChR2). This may seem counter-intuitive, since the problem in epilepsy is too much activation of neurons; one would not necessarily want to introduce activating light-sensitive proteins into neurons that are already overactive. The key to understanding their strategy is the choice of target: a subset of GABAergic neurons, which can inhibit seizure activity in neighboring neurons. This second approach was just as effective as the first one, which used the halorhodopsin protein.
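As a back-of-the-envelope illustration of the first (inhibitory) approach, here is a toy leaky integrate-and-fire neuron in which a hyperpolarizing, halorhodopsin-like photocurrent is switched on only while a simulated laser is on. All values are invented for illustration and bear no relation to the actual recordings in the paper.

```python
# Toy leaky integrate-and-fire neuron under strong drive ("seizure-like" firing),
# silenced by a hyperpolarizing photocurrent during a laser-on window.
# Inputs are expressed as mV of steady-state drive; all numbers are illustrative.
dt, t_end = 0.1, 400.0                                       # time step, duration (ms)
v_rest, v_thresh, v_reset, tau = -70.0, -50.0, -65.0, 10.0   # mV, mV, mV, ms
drive, opsin = 25.0, -10.0        # depolarizing drive; halorhodopsin-like current

v, spikes = v_rest, []
for step in range(int(t_end / dt)):
    t = step * dt
    laser_on = 200.0 <= t < 300.0                 # light pulse from 200 to 300 ms
    i_total = drive + (opsin if laser_on else 0.0)
    v += dt / tau * (v_rest + i_total - v)        # leaky integration toward rest + input
    if v >= v_thresh:                             # threshold crossing: spike and reset
        spikes.append(t)
        v = v_reset

rate_off = sum(1 for s in spikes if not 200.0 <= s < 300.0) / 0.3  # Hz over 300 ms
rate_on = sum(1 for s in spikes if 200.0 <= s < 300.0) / 0.1       # Hz over 100 ms
print(f"firing rate, light off: {rate_off:.0f} Hz; light on: {rate_on:.0f} Hz")
```

In this toy version the drive alone holds the neuron above its firing threshold, so it fires continuously until the photocurrent pulls its steady-state voltage back below threshold; the second (ChR2-in-GABAergic-neurons) strategy achieves the same suppression indirectly, via synaptic inhibition.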

This means that one can cut down seizure duration by more than half, either by directly inhibiting seizing neurons or by activating inhibitory neurons. This research shows that there is tremendous potential for developing novel optogenetic treatments for epilepsy. Specifically targeting selected neurons involved in seizure activity would be preferable to generalized treatment with medications that affect global neuronal activity and can cause side effects (as is often the case with current epilepsy medications).

It is not yet clear whether this treatment can be easily applied to humans. The linchpin of the experiment was the genetic introduction of light-sensitive proteins into selected neurons of the mice; this type of targeted neuronal gene therapy would be far more difficult in humans. The other obstacle is that light activation in the mice required the implantation of an optical fiber to direct the light into a specific area of the brain. Performing such an invasive procedure in patients carries risks that would need to be carefully weighed against the risks and benefits of simply continuing anti-seizure medications. Hopefully, future improvements in gene therapy methods and light stimulation will help overcome these obstacles and pave the way for a whole new class of optogenetics-based therapies in patients with epilepsy and other neurological disorders.


Image credit: Confocal image of an eGFP filled striatal medium spiny neuron, National Institutes of Health (NIH), Margaret I. Davis
Krook-Magnuson, E., Armstrong, C., Oijala, M., & Soltesz, I. (2013). On-demand optogenetic control of spontaneous seizures in temporal lobe epilepsy Nature Communications, 4 DOI: 10.1038/ncomms2376

Immune Cells Can Remember Past Lives

The generation of induced pluripotent stem cells (iPSCs) is one of the most fascinating discoveries in the history of stem cell biology. John Gurdon and Shinya Yamanaka received the 2012 Nobel Prize for showing that adult cells could be induced to become embryonic-like stem cells (iPSCs). Many stem cell laboratories now routinely convert skin cells or blood cells from an adult patient into iPSCs. The stem cell properties of the generated iPSCs then allow researchers to convert them into a desired cell type, such as heart cells (cardiomyocytes) or brain cells (neurons), which can then be used for cell-based therapies or for the screening of novel drugs. The initial conversion of adult cells to iPSCs is referred to as “reprogramming” and is thought to represent a form of rejuvenation, because the adult cell appears to lose its adult cell identity and reverts to an immature embryonic-like state. However, we know surprisingly little about the specific mechanisms that allow adult cells to become embryonic-like. For example, how does a blood immune cell such as a lymphocyte lose its lymphocyte characteristics during the reprogramming process? Does the lymphocyte that is converted into an immature iPSC state “remember” that it used to be a lymphocyte? If yes, does this memory affect what types of cells the newly generated iPSCs can be converted into, i.e. are iPSCs derived from lymphocytes very different from iPSCs that are derived from skin cells?

There have been a number of recent studies addressing the question of iPSC “memory”, but two papers published in the January 3, 2013 issue of the journal Cell Stem Cell provide some of the most compelling evidence of an iPSC “memory” and also show that this “memory” could be used for therapeutic purposes. In the paper “Regeneration of Human Tumor Antigen-Specific T Cells from iPSCs Derived from Mature CD8+ T Cells“, Vizcardo and colleagues studied the reprogramming of T-lymphocytes derived from the tumor of a melanoma patient. Mature T-lymphocytes are immune cells that can recognize specific targets, depending on the antigens they have been exposed to. The tumor-infiltrating cells used by Vizcardo and colleagues had previously been shown to recognize the melanoma tumor antigen MART-1. The researchers successfully generated iPSCs from the T-lymphocytes and then converted the iPSCs back into T-lymphocytes. What they found was that the newly generated T-lymphocytes expressed a receptor specific for the MART-1 tumor antigen: even though the newly generated T-lymphocytes had not been exposed to the tumor, they had retained their capacity to respond to the melanoma antigen. The most likely explanation is that the generated iPSCs “remembered” their previous exposure to the tumor in their past lives as T-lymphocytes, before they were converted to embryonic-like iPSCs and then “reborn” as new T-lymphocytes. The iPSC reprogramming apparently did not wipe out their “memory”.

This finding has important therapeutic implications. One key problem that the immune system faces when fighting a malignant tumor is that the demand for immune cells outpaces their availability. The new study suggests that one can take activated immune cells from a cancer patient, convert them to the iPSC state, differentiate them back into rejuvenated immune cells, expand them and inject them back into the patient. The expanded and rejuvenated immune cells would retain their prior anti-tumor memory, be primed to fight the tumor and thus significantly augment the ability of the immune system to slow down the tumor growth.

The paper by Vizcardo and colleagues did not actually demonstrate the rejuvenation and anti-tumor efficacy of the iPSC-derived T-lymphocytes; this needs to be addressed in future studies. However, the paper “Generation of Rejuvenated Antigen-Specific T Cells by Reprogramming to Pluripotency and Redifferentiation” by Nishimura and colleagues, in the same issue of Cell Stem Cell, did address the rejuvenation question, albeit in a slightly different context. This group of researchers obtained T-lymphocytes from a patient with HIV, generated iPSCs, and re-differentiated the iPSCs back into T-lymphocytes. Similar to what Vizcardo and colleagues had observed, Nishimura and colleagues found that their iPSC-derived T-lymphocytes retained an immunological memory against HIV antigens. Importantly, the newly derived T-lymphocytes were highly proliferative and had longer telomeres. Telomeres are stretches of DNA at the ends of chromosomes that become shorter as cells age, so the lengthening of the telomeres and the high growth rate of the iPSC-derived T-lymphocytes were both indicators that the reprogramming process had made the cells younger while they retained their “memory”, i.e. their ability to respond to HIV.

Further studies are now needed to test whether adding the rejuvenated cells back into the body actually helps prevent tumor growth or treat HIV infection. There is also a need to ensure that the cells are safe and that the rejuvenation process itself does not cause any harmful genetic changes. Long telomeres have been associated with the formation of tumors, so one has to make sure that the iPSC-derived lymphocytes do not become malignant. These two studies represent an exciting new development in iPSC research. They not only clearly document that iPSCs retain a memory of the original adult cell type they are derived from, but also show that this memory can be put to good use. This is especially true for immune cells, because retaining an immunological memory allows rejuvenated iPSC-derived immune cells to resume the fight against a tumor or a virus.


Image credit: “Surface of HIV infected macrophage” by Sriram Subramaniam at the National Cancer Institute (NCI) via National Institutes of Health Image Bank

Can The Heart Regenerate Itself After A Heart Attack?

Some cardiovascular researchers believe that the heart contains cardiac stem cells or progenitor cells which can become mature cardiomyocytes (beating heart cells) following an injury and regenerate the damaged heart. The paper “Mammalian heart renewal by pre-existing cardiomyocytes”, published in the journal Nature by Senyo and colleagues (online publication on December 5, 2012), suggests, on the other hand, that the endogenous regenerative potential of the adult heart is very limited. The researchers studied the regeneration of cardiomyocytes in mice using a genetic label that marks cardiomyocytes with a green fluorescent protein, and they also used the nonradioactive stable isotope 15N (nitrogen-15) to track the growth of cardiomyocytes. They found that the adult mouse heart has a very low rate of cardiomyocyte regeneration and projected the annual proliferation rate to be only 0.76%. This means that fewer than one in a hundred cardiomyocytes in the adult heart undergoes cell division during a one-year period. Even though this number is derived from studying the turnover of cardiomyocytes in mice, it correlates very well with the proposed rate of annual cardiomyocyte self-renewal (0.5% to 1%) that Bergmann and colleagues estimated for the human heart in a 2009 paper published in Science. The key novelty of the paper by Senyo and colleagues is that they identified the source of these new cardiomyocytes: they do not arise from cardiac stem cells or cardiac progenitor cells but are primarily derived from pre-existing adult cardiomyocytes. Does this low rate of cardiomyocyte turnover increase after an injury? Senyo and colleagues found that eight weeks after a heart attack, only 3.2% of the mouse cardiomyocytes located near the injured areas had undergone cell division.
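To put that rate in perspective, here is a quick back-of-the-envelope calculation, under the simplifying (and unverified) assumption that the 0.76% annual rate stays constant over time and that divisions are independent from year to year:

```python
# Cumulative fraction of cardiomyocytes that have divided at least once,
# assuming the reported 0.76% annual rate stays constant (an assumption).
annual_rate = 0.0076
for years in (1, 10, 50):
    renewed = 1.0 - (1.0 - annual_rate) ** years
    print(f"after {years:2d} years: {renewed:.1%} of cardiomyocytes have divided")
```

Even over decades, under this assumption only about a third of the cells would ever divide, which underlines how limited the heart's intrinsic self-renewal is.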


This low rate of self-renewal in the adult heart sounds like bad news for researchers who had hoped that the adult heart could heal itself after a heart attack. However, Nature also published the paper “Functional screening identifies miRNAs inducing cardiac regeneration” by Eulalio and colleagues on the same day (online publication on December 5, 2012), which indicates that the low level of cardiomyocyte growth can be increased using certain microRNAs. A microRNA is a small RNA molecule that can regulate the expression of hundreds of genes and can play an important role in controlling many cellular processes, such as cell growth, cell metabolism and cell survival. Eulalio and colleagues performed a broad screen using 875 microRNA mimics in newborn rat cardiomyocytes and identified 204 microRNAs that increase the growth of the cells. They then narrowed down the candidates and were able to show that two distinct microRNAs increased the growth of cardiomyocytes after heart attacks in mice. The effect was quite significant: mice treated with these microRNAs had near-normal heart function 60 days after a heart attack.

Based on these two Nature papers, it appears that the cardiomyocytes in the adult heart have a kind of “brake” that prevents them from proliferating. The addition of specific microRNAs seems to release this “brake” and allow the adult heart cells to regenerate the heart after a heart attack. This could lead to potential new therapies for patients who suffer heart attacks, but some important caveats need to be considered. MicroRNAs (and many other cardiovascular therapies) that work in mice or rats do not necessarily have the same beneficial effects in humans. The mice in the study by Eulalio and colleagues also did not receive the medications that patients routinely receive after a heart attack; patients usually show some improvement in heart function after a heart attack if they are treated with the appropriate medications. Since the mice were not treated with these medications, it is difficult to assess whether the microRNAs would provide a benefit beyond what is achieved by conventional post-heart attack medications. Finally, the delivery and dosing of microRNAs is comparatively easy in mice but much more challenging in a heterogeneous group of patients.

The studies represent an important step toward identifying the self-renewal mechanisms of the adult heart and suggest that microRNAs are major regulators of these processes, but many additional studies are necessary before their therapeutic value for patients can be assessed.


Image credit: Wikimedia Commons

Somatic Mosaicism: Genetic Differences Between Individual Cells

The cells in the body of a healthy person all have the same DNA, right? Not really! It has been known for quite some time that there are genetic differences between the cells of a single person. The expression used to describe these between-cell differences is “somatic mosaicism“, because cells can represent a mosaic of genetic profiles, even within a single organ. During embryonic development, all cells are derived from one fertilized egg and ought to be genetically identical. However, errors in DNA replication can occur during every cell division, and this can lead to genetic differences between cells. This process not only occurs during embryonic development but continues after birth.

As we age, our cells are exposed to numerous factors, such as radiation, chemicals or other stressors, which can cause genetic alterations ranging from single nucleotide mutations to duplications and deletions of large chunks of DNA. Some mutations are known to cause cancer by making a single cell grow rapidly, but not all mutations lead to cancer. Many spontaneous mutations either result in the death of a cell or do not impact its function in any significant manner. The term DNA copy number variation (CNV) describes a variable copy number of larger DNA segments, one kilobase (kb) or more in size. Most recent studies on CNVs have compared CNVs between people, i.e. how many CNVs person A carries when compared to person B. It turns out that there may be quite a bit of genetic diversity between people that had previously been overlooked.
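To make the read-out of a CNV concrete, here is a small, entirely simulated sketch: sequencing read depth scales with copy number, so a heterozygous deletion (one copy instead of two) appears as a stretch of genome at roughly half the diploid coverage. The bin counts, depths and threshold below are invented, and real CNV callers are far more sophisticated.

```python
# Toy CNV detection from binned read depth: coverage is proportional to copy
# number, so a one-copy (deleted) region sits near half the diploid median.
# All numbers are simulated for illustration.
import numpy as np

rng = np.random.default_rng(0)
n_bins, diploid_depth = 100, 100.0
depth = rng.poisson(diploid_depth, n_bins).astype(float)   # normal diploid coverage
depth[40:50] = rng.poisson(diploid_depth / 2.0, 10)        # simulated deletion, bins 40-49

copy_number = 2.0 * depth / np.median(depth)  # scale so a typical bin sits at 2 copies
deleted = np.where(copy_number < 1.5)[0]      # crude cutoff halfway between 1 and 2
print("bins flagged as deleted:", deleted)
```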

A new paper published in the journal Nature takes this one step further: it shows that there are significant CNVs not only between people, but even within a single person. In the study “Somatic copy number mosaicism in human skin revealed by induced pluripotent stem cells“, Alexej Abyzov and colleagues found significant CNVs in induced pluripotent stem cells (iPSCs) that they had generated from the adult skin cells of human subjects. Importantly, most of these CNVs were not the result of reprogramming adult skin cells to the stem cell state; they were already present in the skin fibroblasts obtained from the human subjects. Most analyses of CNVs are performed on whole tissues or biopsies, not on single cells, which is why so little is known about between-cell CNV differences. However, when iPSCs are generated from skin fibroblasts, each iPSC line is often derived from a single cell. This enables the evaluation of genetic diversity between cells.

Abyzov and colleagues estimate that 30% of adult skin fibroblasts carry large CNVs. This estimate is based on a very small number of fibroblast samples, and it is not clear whether other cells, such as neurons or heart cells, carry similar CNVs, or whether the 30% estimate would hold up in a larger sample. The work leads to an intriguing question: what percentage of neighboring cells in a single heart, brain or kidney are actually genetically identical? Cell types such as adult heart cells or neurons cannot be clonally expanded, so it may be difficult to determine the genetic diversity within a heart or a brain using the methods employed by Abyzov and colleagues.

What are the implications of this work? On a practical level, this study suggests that it may be important to derive multiple iPSC clones from a subject’s or patient’s skin cells if one wants to use the iPSCs for disease modeling. This would help control for the genetic diversity that exists among the skin cells. A much more profound implication, however, is that we have to think about between-cell diversity within a single organ. We need to develop better tools to analyze genetic diversity between individual cells and, more importantly, we have to understand how this genetic diversity impacts health and disease.

Image Credit: Wikimedia / Alexander Mosaic (Public Domain)

Abyzov A, Mariani J, Palejev D, Zhang Y, Haney MS, Tomasini L, Ferrandino AF, Rosenberg Belmaker LA, Szekely A, Wilson M, Kocabas A, Calixto NE, Grigorenko EL, Huttner A, Chawarska K, Weissman S, Urban AE, Gerstein M, & Vaccarino FM (2012). Somatic copy number mosaicism in human skin revealed by induced pluripotent stem cells. Nature, 492 (7429), 438-442. PMID: 23160490

Is the Analysis of Gene Expression Based on an Erroneous Assumption?

The MIT-based researcher Rick Young is one of the world’s top molecular biologists. His laboratory at the Whitehead Institute for Biomedical Research has helped define many of the key principles of how gene expression is regulated, especially in stem cells and cancer cells. At a symposium organized by the International Society for Stem Cell Research (ISSCR), Rick presented some very provocative data today, which is bound to result in controversial discussions about how researchers should assess gene expression.

Ptolemy’s world map from the Harmonia Macrocosmica

It has become very common for molecular biology laboratories to use global gene expression analyses to understand the molecular signature of a cell. These global analyses can measure the gene expression of thousands of genes in a single experiment. By comparing the gene expression profiles of different groups of cells, such as cancer cells and their healthy counterparts, many important new genes or new roles for known genes have been uncovered. The Gene Expression Omnibus is a public repository for the huge amount of molecular information that is generated. So far, more than 800,000 samples have been analyzed, covering the gene expression in a vast array of organisms and disease states.

Rick himself has used such expression analyses extensively to characterize cancer cells and stem cells, but at the ISSCR symposium he showed that most of these analyses are based on the erroneous assumption that the total RNA content of cells remains constant. When the gene expression of cancer cells is compared to that of healthy non-cancer cells, the analysis is routinely performed by normalizing or standardizing the RNA content: the same amount of RNA is obtained from cancer cells and non-cancer cells, and the global analyses then detect relative differences in gene expression. However, a problem arises when one cell type generates far more RNA than the cell type it is being compared to.

In a paper published today in the journal Cell, entitled “Revisiting Global Gene Expression Analysis”, Rick Young and his colleagues discuss their recent discovery that the cancer-linked gene regulator c-Myc increases total gene expression two- to three-fold. Cells expressing the c-Myc gene therefore contain far more total RNA than cells that do not express it, which means that most genes will be expressed at substantially higher levels in the c-Myc cells. However, if one were to perform a traditional gene expression analysis comparing c-Myc cells with cells lacking c-Myc, one would “control” for these differences in RNA amount by using the same amount of RNA for both cell types. This traditional standardization makes a lot of sense; after all, how would one be able to compare the gene expression profiles of the two samples if different amounts of RNA were loaded? The problem with this common-sense standardization is that it misses global shifts in gene expression, such as those initiated by potent regulators like c-Myc. According to Rick Young, one answer to the problem is to include an additional control by “spiking” the samples with defined amounts of known RNA. This additional control would allow us to detect absolute changes in gene expression, in addition to the relative changes that current gene expression analyses can detect.
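A small simulation makes the pitfall, and the spike-in fix, explicit. In this invented example, every gene in the “c-Myc” sample is amplified 2.5-fold relative to control; comparing equal-depth read counts reports fold changes near 1, while rescaling each sample by a fixed per-cell spike-in recovers the true global shift.

```python
# Simulated demonstration of the normalization pitfall: a global 2.5x increase in
# transcription is invisible after equal-input normalization but is recovered by
# normalizing to a fixed per-cell RNA spike-in. All numbers are invented.
import numpy as np

rng = np.random.default_rng(1)
control = rng.lognormal(3.0, 1.0, 1000)  # per-cell transcript counts, control cells
myc = 2.5 * control                      # c-Myc cells: global amplification of all genes
spike = 1000.0                           # identical absolute spike-in added per cell
depth = 1_000_000                        # both libraries sequenced to the same depth

def sequence(cell_rna, spike_rna, depth):
    # Sequencing reports shares of a fixed read budget, not absolute RNA amounts.
    pool = np.append(cell_rna, spike_rna)
    return depth * pool / pool.sum()

ctrl_reads = sequence(control, spike, depth)
myc_reads = sequence(myc, spike, depth)

# Conventional normalization: compare equal-depth read counts gene by gene.
print("median fold change, equal input:",
      np.median(myc_reads[:-1] / ctrl_reads[:-1]))   # close to 1: shift erased

# Spike-in normalization: rescale each sample by its spike-in reads first.
ctrl_abs = ctrl_reads[:-1] / ctrl_reads[-1]
myc_abs = myc_reads[:-1] / myc_reads[-1]
print("median fold change, spike-in:",
      np.median(myc_abs / ctrl_abs))                 # ~2.5: shift recovered
```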

In some ways, this seems like a minor technical point, but I think it actually points to a very central problem in how we perform gene expression analysis, as well as many other assays in cell and molecular biology. One is easily tempted to use exciting large-scale analyses to study the genome, epigenome, proteome or phenome of cells. These high-tech analyses generate mountains of data, and we spend an inordinate amount of time trying to make sense of them. However, we sometimes forget to question the very basic assumptions that we have made. My mentor Till Roenneberg taught me how important it is to use the right controls in every experiment. The key word here is “right” controls, because merely including controls without considering their appropriateness is not sufficient. I think Rick Young’s work is an important reminder for all of us to continuously re-evaluate the assumptions we make, because such re-evaluation is a prerequisite for good research practice.