The Open Access Debate Continues

The New England Journal of Medicine has just published four articles that comment on the issue of “open access”. I will list these four articles and briefly comment on the two papers that are critical of open access publishing.

1. The Downside of Open-Access Publishing by Charlotte Haug

This article discusses potential problems associated with open access publishing but also conflates the issue of open access with the issue of inadequate peer review, as can be seen in this excerpt:

Of course, the terms “international,” “scientific,” “peer-reviewed,” “journal,” “article,” “editor,” and “publisher” do not have copyrighted or patented definitions and can have varied meanings, especially in the Internet age. Must an article be different from a submitted paper? Isn’t everything published online automatically international? Is there anything wrong with a situation in which the editor and publisher are just one person who has set up a website where researchers can submit their papers and pay a fee to have them laid out in a professional way and made available to all interested parties? Isn’t it a good thing that this vast number of new publishers and journals will make it possible to get all research — whatever its quality level — into the public domain? Perhaps. But describing a simple online-posting service as “an international, scientific, peer-reviewed journal” leads authors and readers to believe that they are submitting to or reading something they aren’t.

One central flaw of this argument is that open access does not necessarily mean lack of peer review, as previously discussed.

2. Open but Not Free — Publishing in the 21st Century by Martin Frank

The article by Martin Frank tries to make the case that “open access” publishing itself costs quite a bit of money and that these funds could be better used for research purposes. He distinguishes between “gold open access”, in which published articles are immediately available to the general public without any fees for readers, and “green open access”, in which the public gains free access only after an initial pay-for-access period. With green open access, the publisher generates some revenue during this initial period, whereas with gold open access, the researchers usually pay a fee that covers the publication costs so that readers do not have to pay anything.

One section of the article especially caught my eye:

…assuming that all articles had to be published with gold open access, Harvard Medical School would have to pay $13.5 million (at $1,350 per article) to publish the 10,000 articles authored by its faculty in 2010 — considerably more than the $3.75 million that was in its serials-acquisition budget that year. Research-intensive institutions will thus bear the burden of funding free access to the research literature, subsidizing access for less-research-intensive institutions, including pharmaceutical companies.

This calculation assumes that current pay-for-access journals do not charge researchers for the publication of their articles. I have previously addressed this issue, citing a specific example which shows that pay-for-access journals often charge the researchers several hundred dollars to publish an article. If researchers use color figures, the charges can run up to $2000 or $3000 per manuscript. These author fees are in addition to the fees that publishers of pay-for-access journals charge the readers. Martin Frank’s calculation ignores the author fees that Harvard researchers might be currently paying to publish in pay-for-access journals.
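To see how much the overlooked author fees matter, here is a back-of-the-envelope version of Frank's calculation. The 10,000 articles, the $1,350 gold open-access charge and the $3.75 million serials budget come from the quoted passage; the average author fee at pay-for-access journals is an illustrative assumption for the sketch, not a figure from either article:

```python
# Back-of-the-envelope version of Martin Frank's gold-OA cost estimate,
# extended to credit the author fees already paid at pay-for-access journals.
# ASSUMED_CURRENT_AUTHOR_FEE is an illustrative assumption, not a figure
# from the NEJM article or from this post.

ARTICLES_PER_YEAR = 10_000        # Harvard Medical School faculty output, 2010
GOLD_OA_FEE = 1_350               # per-article gold open-access charge
SERIALS_BUDGET = 3_750_000        # Harvard serials-acquisition budget, 2010
ASSUMED_CURRENT_AUTHOR_FEE = 600  # hypothetical average page/color charges

gold_total = ARTICLES_PER_YEAR * GOLD_OA_FEE                  # $13.5 million
current_author_fees = ARTICLES_PER_YEAR * ASSUMED_CURRENT_AUTHOR_FEE

# Net additional cost of an all-gold-OA world, once the money already being
# spent (author charges plus the serials budget) is subtracted.
net_extra_cost = gold_total - current_author_fees - SERIALS_BUDGET

print(f"Gold OA total:       ${gold_total:,}")
print(f"Current author fees: ${current_author_fees:,}")
print(f"Net additional cost: ${net_extra_cost:,}")
```

Under these assumed author fees, the net new burden shrinks well below the $13.5 million headline figure, which is the point of the critique: the calculation overstates the new money an all-gold-OA world would require.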

He also mentions the pharmaceutical companies as potential beneficiaries but fails to include other important beneficiaries:

1) Members of the general public, whose taxes paid for most of the biomedical research conducted in the United States and who should thus have a right to access the results of this publicly funded research.

2) Individuals in countries where the fees to read papers published in pay-for-access journals are unaffordable.

3. Creative Commons and the Openness of Open Access by Michael Carroll

4. For the Sake of Inquiry and Knowledge — The Inevitability of Open Access by Ann Wolpert


Are Scientists Divided Over Divining Rods?

When I read a statement which starts with “Scientists are divided over…”, I expect to learn about a scientific controversy involving scientists who offer distinct interpretations or analyses of published scientific data. This is not uncommon in stem cell biology. For example, scientists disagree about the differentiation capacity of adult bone marrow stem cells. Some scientists are convinced that these adult stem cells have a broad differentiation capacity and that a significant proportion can turn into heart cells or brain cells. On the other hand, many stem cell researchers disagree and instead believe that adult bone marrow stem cells are very limited in their differentiation capacity. Both groups of scientists can point to numerous experiments and papers published in peer-reviewed scientific journals which back up their respective points of view. At any given stem cell meeting, the percentage of scientists favoring one view over the other can range from 30% to 70%, depending on who is attending and who is organizing that specific stem cell conference. We still have not reached a consensus in this field, so I think it is reasonable to say “scientists are divided over the differentiation capacity of adult bone marrow stem cells”.

In contrast, when it comes to the issue of global warming, there is a broad consensus in the scientific community. A 2010 study in the Proceedings of the National Academy of Sciences by Anderegg and colleagues reviewed published papers and statements made by climate researchers. The authors found that 97% to 98% of climate researchers were convinced by the scientific evidence for anthropogenic climate change, i.e. that humans are primarily responsible for global warming. When there is such a broad consensus among scientists and such overwhelming scientific data that supports anthropogenic climate change, one cannot really say “scientists are divided” merely because two or three scientists out of one hundred are not convinced.

Today, when I saw the headline “Scientists divided over device that ‘remotely detects hepatitis C’” in the Guardian, I assumed that a major scientific study had been published describing a new way to diagnose hepatitis C and that there was considerable disagreement among hepatitis C experts as to the value of this new device. To my surprise, I found this description in the Guardian:

The device the doctor held in his hand was not a contraption you expect to find in a rural hospital near the banks of the Nile.

 For a start, it was adapted from a bomb detector used by the Egyptian army. Second, it looked like the antenna for a car radio. Third, and most bizarrely, it could – the doctor claimed – remotely detect the presence of liver disease in patients sitting several feet away, within seconds.

 The antenna was a prototype for a device called C-Fast. If its Egyptian developers are to be believed, C-Fast is a revolutionary means of using bomb detection technology to scan for hepatitis C – a strongly contested discovery that, if proven, would contradict received scientific understanding, and potentially change the way many diseases are diagnosed.

This “C-Fast device”, co-developed by the Egyptian liver specialist Gamal Shiha, sounded like magic, and sure enough, even the Guardian referred to it as a “mechanical divining rod”.

Witnessed in various contexts by the Guardian, the prototype operates like a mechanical divining rod – though there are digital versions. It appears to swing towards people who suffer from hepatitis C, remaining motionless in the presence of those who don’t. Shiha claimed the movement of the rod was sparked by the presence of a specific electromagnetic frequency that emanates from a certain strain of hepatitis C.

After I read the remainder of the article, it turned out that there are no published scientific studies confirming that this rod, antenna or wand can detect hepatitis viruses at a distance. The article says it “has been successfully trialled in 1,600 cases across three countries, without ever returning a false negative result”, but this data has not been published in a peer-reviewed journal. As a scientist and a physician, I am of course very skeptical. The physicians using this device claim it has 100% sensitivity without presenting the data in a peer-reviewed forum. But what is even more surprising is the suggestion that electromagnetic signals travel from the virus in the body of a patient to this remote device, without any scientific evidence to back this up.

The Guardian then also quotes a University College London expert:

“If the application can be expanded, it is actually a revolution in medicine,” said Pinzani, head of UCL’s liver institute. “It means that you can detect any problem you want.”

 By way of example, Pinzani said the device could conceivably be used to instantaneously detect certain kinds of cancer symptoms: “You could go into a clinic, and a GP could find out if you had a tumour marker.”

This expert is already fantasizing about cancer diagnostics with this divining rod even though there is no credible published scientific data. The Guardian article also mentions that well-known scientific journals have rejected articles about this new device and that the “scientific basis has been strongly questioned by other scientists”, but the Guardian is compromising its journalistic integrity by presenting this as a legitimate scientific debate and claiming that “scientists are divided” in the title of the article. How can scientists be divided if the data has not been made public and if it has not undergone peer review? For now, this claim of a diagnostic divining rod is pure sensationalism and not an actual scientific controversy. Such sensationalism will attract many readers, but it should not be an excuse for shoddy journalism.


Image Credit: Public domain image of Otto Edler von Graeve in 1913 with a divining rod via Wikimedia Commons

UPDATE: The comment thread of the Guardian article indicates that Pinzani feels misrepresented by the article and cites a letter that Pinzani has purportedly written in response. I am not able to verify whether this letter was indeed written by him or how exactly Pinzani was misrepresented by the Guardian.

UPDATE February 26, 2014: The Guardian has now changed the headline to “Scientists sceptical about device that ‘remotely detects hepatitis C’”. I think this headline is much better than the previous one, which suggested that “scientists were divided”. I still think that newspapers and magazines sometimes unnecessarily portray pseudo-scientific viewpoints as legitimate, equal partners in a scientific debate. This type of even-handedness only makes sense if the competing viewpoints are each backed up by rigorous scientific studies.

The ENCODE Controversy And Professionalism In Science

The ENCODE (Encyclopedia Of DNA Elements) project received quite a bit of attention when its results were publicized last year. This project involved a very large consortium of scientists with the goal of identifying all the functional elements in the human genome. In September 2012, 30 papers were published in a coordinated release, and their extraordinary claim was that roughly 80% of the human genome was “functional”. This was in direct contrast to the prevailing view among molecular biologists that the bulk of human DNA was just “junk DNA”, i.e. sequences of DNA to which one could not assign any specific function. The ENCODE papers contained huge amounts of data, collating the work of hundreds of scientists over nearly a decade. But what garnered the most attention among scientists, the media and the public was the “80%” claim and the supposed “death of junk DNA”.

Soon after the discovery of DNA, the primary function ascribed to DNA was its role as a template from which messenger RNA could be transcribed and then translated into functional proteins. Using this definition of “function”, only 1–2% of human DNA would be functional, because only this small fraction actually encodes proteins. The term “junk DNA” was coined to describe the 98–99% of non-coding DNA which appeared to primarily represent genetic remnants of our evolutionary past without any specific function in present-day cells.

However, in the past decades, scientists have uncovered more and more functions for the non-coding DNA segments that were previously thought to be merely “junk”. Non-coding DNA can, for example, act as a binding site for regulatory proteins and exert an influence on protein-coding DNA. There has also been an increasing awareness of the presence of various types of non-coding RNA molecules, i.e. RNA molecules which are transcribed from the DNA but not subsequently translated into proteins. Some of these non-coding RNAs have known regulatory functions, others may not have any or their functions have not yet been established.

Despite these discoveries, most scientists were in agreement that only a small fraction of DNA was “functional”, even when all the non-coding pieces of DNA with known functions were included. The bulk of our genome was still thought to be non-functional. The term “junk DNA” was used less frequently by scientists, because it was becoming apparent that we were probably going to discover even more functional elements in the non-coding DNA.

In September 2012, everyone was talking about “junk DNA” again, because the ENCODE scientists claimed their data showed that 80% of the human genome was “functional”. Most scientists had expected that the ENCODE project would uncover some new functions for non-coding DNA, but the 80% figure was far beyond what anyone had expected. The problem was that the ENCODE project set a very low bar for “function”: protein binding to the DNA or any kind of chemical DNA modification already counted as a sign of “function”, without proof that these pieces of DNA had any significant impact on the function of a cell.

The media hype over the “death of junk DNA” headlines and the lack of discussion about what constitutes function were appropriately criticized by many scientists, but the recent paper by Dan Graur and colleagues, “On the immortality of television sets: ‘function’ in the human genome according to the evolution-free gospel of ENCODE”, has grabbed everyone’s attention. Not necessarily because it criticizes the claims made by the ENCODE scientists, but because of the sarcastic tone it uses to ridicule ENCODE.

There have been so many other blog posts and articles that either praise or criticize the Graur paper, so I decided to list some of them here:

1. PZ Myers writes “ENCODE gets a public reaming” and seems to generally agree with Graur and colleagues.

2. Ashutosh Jogalekar says Graur’s paper is a “devastating takedown of ENCODE in which they pick apart ENCODE’s claims with the tenacity and aplomb of a vulture picking apart a wildebeest carcass.”

3. Ryan Gregory highlights some of the “zingers” in the Graur paper

Other scientists, on the other hand, agree with some of the conclusions of the Graur paper and its criticism of how the ENCODE data was presented, but disagree with the sarcastic tone:

1. OpenHelix reminds us that this kind of “spanking” should not distract from all the valuable data that ENCODE has generated.

2. Mick Watson shows how Graur and colleagues could have presented their key critiques in a non-confrontational manner and fostered a constructive debate.

3. Josh Witten points out the irony of Graur accusing ENCODE of seeking hype, even though Graur and his colleagues seem to use sarcasm and ridicule to also increase the visibility of their work. I think Josh’s blog post is an excellent analysis of the problems with ENCODE and the problems associated with Graur’s tone.

On Twitter, I engaged in a debate with Benoit Bruneau, my fellow Scilogs blogger Malcolm Campbell and Jonathan Eisen and I thought it would be helpful to share the Storify version here. There was a general consensus that even though some of the points mentioned by Graur and colleagues are indeed correct, their sarcastic tone was uncalled for. Scientists can be critical of each other, but can and should do so in a respectful and professional manner, without necessarily resorting to insults or mockery.

[<a href=”//” target=”_blank”>View the story “ENCODE controversy and professionalism in scientific debates” on Storify</a>]
Graur D, Zheng Y, Price N, Azevedo RB, Zufall RA, & Elhaik E (2013). On the immortality of television sets: “function” in the human genome according to the evolution-free gospel of ENCODE. Genome Biology and Evolution. PMID: 23431001

Breakthrough Prize in Life Sciences: Hopefully Not Just A Nobel Prize in Medicine 2.0

The recent announcement of the “Breakthrough Prize in Life Sciences” and its inaugural 11 recipients is causing quite a bit of buzz in the research community. The Silicon Valley celebrities Art Levinson, Sergey Brin, Anne Wojcicki, Mark Zuckerberg and Priscilla Chan, and Yuri Milner have established the Breakthrough Prize in Life Sciences Foundation, which intends to award five annual prizes in the amount of $3 million each to honor “extraordinary achievements of the outstanding minds in the field of life sciences, enhance medical innovation, and ultimately become a platform for recognizing future discoveries”.


The inaugural recipients are:

1. Cornelia I. Bargmann: For the genetics of neural circuits and behavior, and synaptic guidepost molecules

2. David Botstein: For linkage mapping of Mendelian disease in humans using DNA polymorphisms.

3. Lewis C. Cantley: For the discovery of PI 3-Kinase and its role in cancer metabolism.

4. Hans Clevers: For describing the role of Wnt signaling in tissue stem cells and cancer.

5. Titia de Lange: For research on telomeres, illuminating how they protect chromosome ends and their role in genome instability in cancer.

6. Napoleone Ferrara: For discoveries in the mechanisms of angiogenesis that led to therapies for cancer and eye diseases.

7. Eric S. Lander: For the discovery of general principles for identifying human disease genes, and enabling their application to medicine through the creation and analysis of genetic, physical and sequence maps of the human genome.

8. Charles L. Sawyers: For cancer genes and targeted therapy.

9. Bert Vogelstein: For cancer genomics and tumor suppressor genes.

10. Robert A. Weinberg: For characterization of human cancer genes.

11. Shinya Yamanaka: For induced pluripotent stem cells.


Anyone familiar with cell biology or molecular biology will recognize most, if not all, of these names, because this list consists of many important leaders in these areas. As a stem cell biologist, I am happy to see at least two stem cell researchers on the list: 1) Shinya Yamanaka (who received the 2012 Nobel Prize in Physiology or Medicine) for discovering that adult skin cells can be converted into pluripotent stem cells by introducing just four genes into the cells and 2) Hans Clevers, who is one of the world’s leading researchers in the field of adult stem cell biology and has been instrumental in characterizing stem cells in intestinal tissue and defining the role of the Wnt signaling pathway, which regulates both proliferation and differentiation of adult stem cells.

The amount awarded to the recipients seems staggeringly high: $3 million is nearly triple the size of the Nobel Prize. However, one also needs to keep in mind that the Breakthrough Prize not only honors past achievements, but also has “the aim of providing the recipients with more freedom and opportunity to pursue even greater future accomplishments.” This means that the laureates are expected (but not necessarily required) to use some of the funds to pursue new directions of research. Biomedical research is expensive. A typical NIH R01 grant, which is the lifeblood of most federally funded biomedical research labs in the United States, has a budget of $250,000 per year, or $1,250,000 over the usual five-year funding period for a single project. The annual $250,000 has to cover the salaries of the employees working in the laboratory, employee benefits such as health insurance, and maintenance contracts to keep up existing equipment, leaving very little money to buy the actual materials and equipment needed to conduct the experiments. This relatively small amount of money forces many scientists to be rather conservative in their work. They do not want to invest money in innovative, high-risk projects, because these do not always yield definitive results, and inconclusive results could jeopardize future grant funding and put the jobs of one’s employees or trainees at risk.

The $3 million of the Breakthrough Prize, on the other hand, gives the researchers the freedom to try out exciting, high-risk ideas without having to spend months writing grant proposals. It is enough to fund two high-risk, R01-sized projects for five years, that is, if the laureates choose to use all of their award money for research instead of buying a luxury yacht.
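The arithmetic behind the “two R01-sized projects” claim is simple to verify; here is a minimal sketch using the grant figures quoted above (integer division ignores the leftover $500,000):

```python
# Sanity check of the "two R01-sized projects" arithmetic, using the
# NIH R01 figures quoted in the text.

PRIZE = 3_000_000       # Breakthrough Prize in Life Sciences award
R01_ANNUAL = 250_000    # typical R01 budget per year
R01_YEARS = 5           # typical R01 project period

r01_total = R01_ANNUAL * R01_YEARS        # cost of one five-year project
projects_fundable = PRIZE // r01_total    # whole projects the prize covers

print(f"One five-year R01-sized project: ${r01_total:,}")
print(f"R01-sized projects fundable by one prize: {projects_fundable}")
```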

Even though all the laureates above are established and internationally renowned scientists, they are at different stages in their research career. David Botstein, for example, is 70 years old and was already a molecular biology legend when I was a grad student in the 1990s. On the other hand, Shinya Yamanaka is only 50 years old and is in the prime of his research career, with hopefully many more decades of research ahead of him. I also like the fact that the foundation will accept nominations from the public, and I hope that its selection process will be more transparent than the closed door policy involved in the selection of Nobel prize laureates.

Despite all my enthusiasm for the new Breakthrough Prize and my hope that it will help re-energize research in the life sciences, I am concerned by the medical focus of the Foundation’s aims. The title of the prize is “Breakthrough Prize in Life Sciences”, but the aims are to recognize excellence in research “aimed at curing intractable diseases and extending human life.” Why is there such a focus on human life and disease? The field of “life sciences” comprises much more than just human life. It includes areas as diverse as ecology, evolutionary biology and botany, even if they do not have any direct implications for human disease. All of the announced laureates worked on areas that are more or less directly connected to human diseases such as cancer or human physiology. In this sense, this new prize is not too different from the Nobel Prize in Physiology or Medicine, merely larger in size: a 2.0 version of the current Nobel Prize in Physiology or Medicine. I have previously written about the lack of a Nobel Prize equivalent that honors efforts in the non-medical life sciences. I hope that the Breakthrough Prize foundation reconsiders the medical focus of the prize and that future awards will also go to life scientists who do not work in areas directly related to human life and human disease.


UPDATE: I would like to thank some of the readers for their comments, including those who commented on Twitter and I thought it might be helpful to respond to them in this update. One important point raised by some readers is that it should not be our place to tell philanthropists what to do with their money. It is their money and they get to choose what kind of prizes and charitable foundations they establish. In this particular case, some of the founders of the Breakthrough Prize in Life Sciences may have been influenced by personal experiences of their family members or friends with certain illnesses. This could explain the medical or biomedical focus of the prize.

I completely agree that philanthropists should decide what the goals of an established foundation are, but I still think that it is not wrong to engage in a debate. Especially in the case of the Breakthrough Prize in Life Sciences, I think there are at least three good reasons why this debate is necessary and helpful.

1. The foundation website indicates that it will soon accept online nominations for future awards from the public. This suggests that the philanthropists are open to outside suggestions and perhaps this openness can be extended to engaging in a dialogue about the actual aims of the prize itself. The philanthropists do not have to listen to what scientists say about including awards in the non-medical life sciences, but we scientists should at the very least voice our concerns.

2. The name of the prize is “Breakthrough Prize in Life Sciences”, but the explicit aims are very much focused on human disease and extending human life. This is a bit of a disconnect, because the broadly phrased title “life sciences” encompasses far more than just medical research.

3. There are already numerous honors and prizes available for outstanding achievements in medical research or biological research with direct medical impact. What we lack is a Nobel Prize equivalent in the non-medical life sciences. This is not a big surprise, because the human suffering associated with illness probably motivates many philanthropists. It is thus understandable that many philanthropic foundations might gravitate towards valuing research with medical implications more than non-medical research. However, as scientists, we need to remind philanthropists that in the 21st century, we recognize the importance of biodiversity. We want to understand the biology of plants and the wonderful multitude of animal species. We need to work together to preserve the biodiversity on our planet, even if there is no direct link between this type of research and specific human diseases.

Charles Darwin was one of the most brilliant life scientists in the past two centuries. His work has revolutionized how we think in biology. Would Charles Darwin receive a Breakthrough Prize in Life Sciences? His work was not necessarily directed at extending human life or treating specific human diseases, but the revolution in biological thought that he initiated ultimately did have a major impact on medical sciences, too. I think we should try our best to establish a prize that honors and supports excellence in the life sciences without obvious or direct medical applications. Such prizes should be awarded to the contemporary Charles Darwins in our midst, without requiring them to prove or justify the medical relevance of their work.

Stemming the Flow: Using Stem Cells To Treat Urinary Bladder Dysfunction

Neurogenic bladder is a disorder which occurs in spinal cord diseases such as spina bifida and is characterized by an inability of the nervous system to properly control the urinary bladder and the muscle tissue contained in the bladder wall. This can lead to spasms and a build-up of pressure in the bladder, often resulting in urinary incontinence. Children with spina bifida and neurogenic bladder often feel urges to urinate after drinking comparatively small amounts of liquid and they can also involuntarily leak urine. This is a source of a lot of emotional stress, especially in social settings such as when they are around friends or in school. If untreated, the long-standing and frequent pressure build-up in the bladder can have even more devastating effects such as infections or kidney damage.

Current treatments for neurogenic bladder involve surgeries which reconstruct and increase the size of the bladder using tissue patches obtained from the patient’s bowel. Since such a gastrointestinal patch is derived from the patient’s own body, it is less likely to elicit an immune response, and these intestinal patches tend to be strong enough to withstand the pressures in the bladder. Unfortunately, the incompatibility of intestinal tissue and bladder tissue can lead to long-term complications, such as urinary tract infections, formation of urinary tract stones and, in some rare cases, even cancers. For this reason, researchers have been searching for newer, safer patches which resemble the actual bladder wall.


A team of researchers at Northwestern University recently published a study which used stem cells of children with spina bifida to generate tissue patches that could be used for bladder surgery. In the paper “Cotransplantation with specific populations of spina bifida bone marrow stem/progenitor cells enhances urinary bladder regeneration”, published in the Proceedings of the National Academy of Sciences (online publication on February 19, 2013), Arun Sharma and colleagues isolated two types of cells from the bone marrow of children with spina bifida: mesenchymal stem cells (MSCs) and CD34+ cells (stem and progenitor cells which usually give rise to blood cells). They then coated a special polymer scaffold called POC with the cells and implanted this newly created patch into a rat bladder after performing a bladder augmentation surgery, similar to what is performed in patients with spina bifida. They then assessed the survival of the cells and the formation of human muscle tissue on the implanted patch. When both human cell types (MSCs and CD34+ cells) were combined, more than half of the implanted patch was covered with muscle tissue four weeks after the implantation. When only CD34+ cells were used, only a quarter of the patch was covered with muscle tissue. What is even more remarkable is that, in addition to the newly formed muscle tissue, the implanted patch also showed evidence of some peripheral nerve growth and blood vessel formation, both of which are found in healthy, normal bladder walls. These findings suggest that a patient’s own bone marrow stem cells can be used to help construct a tissue patch for bladder augmentation surgeries. The observation of some nerve growth in the implanted patch is also an exciting finding. One could conceivably try to re-connect the reconstructed bladder tissue with the main nervous system, but success would largely depend on the severity of the neurologic disease.

One has to keep in mind that there are some key limitations to this study. The authors of the paper believe that the newly formed muscle tissue on the implanted patches was all derived from the patients’ bone marrow stem cells; however, no experiments were performed to convincingly demonstrate this. The authors report that in previous studies, merely implanting an empty POC scaffold without any human stem cells resulted in 20% coverage with muscle tissue. This suggests that a substantial fraction of the newly formed muscle tissue is actually derived from the host rat and not from the human stem cells. The authors also did not compare the effectiveness of the new stem cell patch to the currently used intestinal patches, and there was no assessment of whether the newly formed muscle tissue on the reconstructed bladder is less prone to spasms and involuntary contractions. Lastly, all the in vivo testing of the tissue patches was performed in rats without neurogenic bladder, and it is possible that the highly successful formation of muscle tissue would have been diminished if the animals had had a neurologic disease.

A second study published in PLOS One took a different approach. In “Evaluation of Silk Biomaterials in Combination with Extracellular Matrix Coatings for Bladder Tissue Engineering with Primary and Pluripotent Cells” (online publication February 7, 2013), Debra Franck and colleagues describe how they coated a scaffold consisting of silk threads with extracellular matrix proteins such as fibronectin. Instead of using bone marrow stem cells, they converted induced pluripotent stem cells into the smooth muscle cells that are typically found inside the bladder wall and placed these newly differentiated cells on the silk scaffold. The induced pluripotent stem cells (iPSCs) used by Franck and colleagues can be generated from a patient’s own skin cells which reduces the risk of being rejected by a patient’s immune system. The advantage of this approach is that it starts out with a pure and truly pluripotent stem cell population, which is easier to direct and control than bone marrow stem cells. There are also a few important limitations to this second study. Franck and colleagues used mouse pluripotent stem cells and it is not clear that their approach would necessarily work with human pluripotent stem cells. They also did not test the function of these differentiated cells on the silk scaffold to check if they actually behaved like true bladder wall smooth muscle cells. Unlike the first study, Franck and colleagues did not evaluate the newly created patch in an animal model.

Both studies are purely experimental, and much additional work is needed before these approaches can be tested in humans, but both show promising new strategies to help improve bladder dysfunction. It is heartening to see that researchers are developing new cell-based therapies to help children and adults who suffer from neurogenic bladder. The results of these two experimental studies are still too preliminary to predict whether cell-based therapies can be successfully used in patients, but they represent important first steps.


Image credit: Taken from Franck D, Gil ES, Adam RM, Kaplan DL, Chung YG, et al. (2013) Evaluation of Silk Biomaterials in Combination with Extracellular Matrix Coatings for Bladder Tissue Engineering with Primary and Pluripotent Cells. PLoS ONE 8(2): e56237. doi:10.1371/journal.pone.0056237, Figure 6B: Differentiated mouse induced pluripotent stem cells cultured on fibronectin-coated silk matrices show protein markers typically found in bladder smooth muscle cells.
Franck, D., Gil, E., Adam, R., Kaplan, D., Chung, Y., Estrada, C., & Mauney, J. (2013). Evaluation of Silk Biomaterials in Combination with Extracellular Matrix Coatings for Bladder Tissue Engineering with Primary and Pluripotent Cells. PLoS ONE, 8(2). DOI: 10.1371/journal.pone.0056237

Resisting Valentine’s Day

To celebrate Valentine’s Day (as a geeky scientist), I decided to search the “Web of Science” database for published articles with the phrase “Valentine’s Day” in the title. The most cited article was “Market-resistance and Valentine’s Day events”, published in the Journal of Business Research in 2009 by Angeline Close and George Zinkhan. I had never heard of the journal before, but the title sounded rather interesting, so I decided to read the paper.

The authors reported the results of a survey of college students and consumers conducted in 2003-2005 regarding their thoughts about gift-giving on Valentine’s Day:

1) Most males (63%) and some females (31%) feel obligated to give a gift to their partner for this holiday

2) Males in a new relationship (i.e. less than six months) feel most obligated (81%); females in a new relationship are the second most obligated group (50%)

3) Less than half of males (44%) in a more established relationship feel obligated, and this number is even lower for females in more established relationships (13%)
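For readers who like to play with the reported numbers, the obligation rates summarized above can be collected into a small table. This is purely an illustrative sketch: the percentages are those reported in the paper, but the group labels are my own shorthand.

```python
# Obligation rates (percent) as reported by Close & Zinkhan (2009) and
# summarized in the list above; the group labels are my own shorthand.
obligation_rates = {
    "all respondents": {"male": 63, "female": 31},
    "new relationship (<6 months)": {"male": 81, "female": 50},
    "established relationship": {"male": 44, "female": 13},
}

# Compute the male-female gap in percentage points for each group.
gaps = {
    group: rates["male"] - rates["female"]
    for group, rates in obligation_rates.items()
}

for group, gap in gaps.items():
    print(f"{group}: {gap} percentage points")
```

Tabulating the numbers this way makes one pattern easier to see: the gender gap is remarkably stable across all three groups, at roughly 31-32 percentage points.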


The authors also conducted interviews using open-ended questions and reviewed diaries and E-diaries to investigate whether people indicated a “resistance” to giving gifts. They found that people expressed three different types of resistance, either opposing or severely limiting the giving of gifts (gift resistance), resisting the purchase of gifts (retail resistance) or broadly opposing the Valentine’s Day business in general (market resistance). All of these forms of “resistance” appeared to be connected to an anti-consumption attitude, the desire to not be drawn into a culture of excessive consumerism.

Here are a couple of quotes from the participants:

Valentine’s Day is a marketing strategy by the flower and candy companies. It’s a cheesy, overblown, stupid “holiday” to force you to spend money on each other.

Valentine’s Day is a way for retailers to get you to spend money in their stores. People get caught up in the B.S. and I should not have to spend extra to show I care, and my girlfriend agrees. But we both still spent plenty!

The survey results indicating differences between men and women are interesting, but the paper also shows that even though the majority of people in the US might feel obligated to give each other gifts on Valentine’s Day, there is a strong anti-consumption attitude. Many people are unwilling to succumb to the pressure to spend a lot of money that ultimately benefits retailers, and they instead express their affection for each other in ways that do not involve purchasing expensive gifts.

If you forgot to get a Valentine’s Day gift for your partner or spouse, just print out a copy of this paper and give it to them instead, saying that your lack of gift-giving is your expression of anti-consumption resistance. If that person is just as geeky as you are, you might be able to pull it off.

There is one caveat: the Journal of Business Research is not open access, so you may hit a paywall asking for $31.50 to read the article, which is more than the price of a typical box of chocolates. Don’t panic: fortunately, you can read it for free here.


Image credit: Early 20th century Valentine’s Day card, showing woman holding heart shaped decoration and flowers, ca. 1910 – via Wikimedia Commons – Public Domain

Close, A., & Zinkhan, G. (2009). Market-resistance and Valentine’s Day events. Journal of Business Research, 62(2), 200-207. DOI: 10.1016/j.jbusres.2008.01.027

New Directions In Scientific Peer Review

Most scientists have a need-hate relationship with scientific peer review. We know that we need some form of peer review, because it is an important quality control measure that is supposed to help prevent the publication of scientifically invalid results. However, we also tend to hate scientific peer review in its current form, because we have had many frustrating experiences with it.

We recently submitted a manuscript to a journal, where it was stuck for more than one year, undergoing multiple rounds of revisions in response to requests by the editors and the reviewers, after which they finally rejected it. The reviewers did not necessarily question the validity of our results, but they wanted us to test additional cell lines, confirm many of the findings with multiple methods and identify additional mechanisms that might explain our findings, so the paper started ballooning in size. I was frustrated because I felt that there was no end in sight. There are always novel mechanisms that one has not investigated. A scientific paper is not meant to investigate every possible explanation for a phenomenon, because that would turn the paper into a never-ending saga – every new finding usually raises even more questions.

We received a definitive rejection after multiple rounds of revisions (taking more than a year), but I was actually relieved because the demands of the reviewers had become quite excessive. We then submitted the manuscript to a different journal, for which we had to scale it back: the new journal had different size restrictions, and some of the revisions only made sense in the context of the previous reviewers’ specific requests and did not necessarily belong in the manuscript. This new set of reviewers also requested some revisions, but once we had made them, the manuscript was published within a matter of months.

I have also had frustrating experiences as a scientific peer reviewer. Some authors completely disregard suggestions for improving the manuscript, and it is ultimately up to the individual editors to decide with whom they side. Scientific peer review in its current form also does not involve testing for reproducibility. As reviewers, we have to accept the authors’ claims that they have conducted sufficient experiments to test the reproducibility and validity of their data. Reviewers do not check whether their own laboratory or other laboratories can replicate the results described in the manuscript. Scientific peer reviewers have to rely on the scientific integrity of the authors, even if their gut instinct tells them that the results may not be reproducible by other laboratories.

Due to these experiences, many scientists like to say that the current peer review system is “broken”, and we know that we need radical changes to make the peer review process more reliable and fair. There are two new developments in scientific peer review that sound very interesting: Portable peer review and open peer review.

Richard Van Noorden describes the concept of portable peer review, which will soon be offered by a new company called Rubriq. Rubriq will conduct the scientific peer review and provide the results to journal editors for a fee. Interestingly, Rubriq will also pay peer reviewers, something quite unusual in the current peer review system, which relies on scientists volunteering their time. The basic idea is that if a journal rejects a paper after the peer review conducted by Rubriq, the reviewers’ comments could still be used by the editors of the next journal, as long as it also subscribes to the Rubriq service. This would cut down on the review time at the new journal, because the editors could base their decision of acceptance or rejection on the existing reviews instead of sending out the paper for another time-consuming round of review. I like this idea, because it “recycles” the efforts of the first round of review and will likely streamline the review process. My only concern is that reviewers currently use different review criteria, depending on which journal they review for. When reviewing for a “high prestige” journal, reviewers tend to set a high bar for novelty and impact, and their comments likely reflect this. It may not be easy for editors to use these reviews for a very different journal. Furthermore, editors get to know their reviewers over time and pick certain reviewers that they believe will give the most appropriate reviews for a submitted manuscript. I am not sure that journal editors would be pleased to “farm out” this process to a third party.

The second new development is the concept of open peer review, as proposed by the new open access scientific journal PeerJ. I briefly touched on this when discussing a paper on the emotional impact of genetic testing, but I would like to expand on it, because I am very intrigued by the idea of open peer review. In this new peer review system, the scientific peer reviewers can choose to either remain anonymous or disclose their names. One would think that peer reviewers should be able to stand by their honest, constructive peer reviews, so there should be no need for anonymity. On the other hand, some scientists might worry about professional repercussions, because some authors may be offended by the critiques. Therefore, I think it is quite reasonable that PeerJ permits anonymity of the reviewers.

The true novelty of the open review system is that the authors can choose to disclose the peer review correspondence, which includes the initial comments by the reviewers as well as the authors’ own rebuttal and revisions. I think that this is a very important and exciting development in peer review, because it forces the peer reviewers to remain civil and reasonable in their comments. Even if reviewers choose to remain anonymous, they are probably still going to be more thoughtful in their reviews if they realize that potentially hundreds or thousands of other scientists could have a peek at their comments. Open peer review allows the public and the scientific community to look behind the usually closed doors of scientific peer review, and this provides a certain form of public accountability for the editors. They cannot just arbitrarily accept or reject manuscripts without good reasons, because by opening up the review process to the public they may have to justify their decisions based on the reviews they solicited. One good example of the civil tone and reasonable review requests and responses can be found in the review of the BRCA gene testing paper. The reviewers (one of whom chose to remain anonymous) asked many excellent questions, including questions about the demographics and educational status of the participants. The authors’ rebuttal to some of the questions was that they had not collected the data and therefore could not include it in the manuscript, but they also expanded some of the presented data and mentioned caveats of their study in the revised discussion. The openness of the review process now permits the general reader to take advantage of the insights of the reviewers, such as the missing information about the educational status of the participants.

The open review system is one of the most important new advances in scientific peer review and I hope that other journals (even the more conservative, traditional and non-open access journals) will implement a similar open peer review system. This will increase accountability of reviewers and editors, and hopefully improve the efficiency and quality of scientific peer review.

Good Can Come From Bad: Genetic Testing For The BRCA Breast Cancer Genes

Our ability to test for the presence of genetic mutations has become extremely cost-efficient, and private companies such as 23andMe now offer genetic testing to consumers who want to find out about their predisposition for genetic diseases. The results of such tests are sent directly to the consumers, without the involvement of genetic counselors or other healthcare providers. This has led to growing concern about how people will respond to finding out that they are carriers of mutations that predispose them to certain life-threatening diseases. Will the individuals be burdened by excessive anxiety? Will they tell their relatives and their healthcare providers that they carry mutations?

A study published in the new open access journal PeerJ addressed these questions by contacting male and female individuals who had received genetic testing by 23andMe for mutations in the BRCA genes, which are strongly associated with breast cancer. The study “Dealing with the unexpected: consumer responses to direct-access BRCA mutation testing” by Uta Francke and colleagues (who are all employees of 23andMe) surveyed 16 women and 16 men who had received the news that they were carriers of BRCA1 or BRCA2 mutations, as well as control subjects who received the fortunate news that they did not carry any of the common BRCA mutations. Among the 16 women who tested positive (i.e. found out that they had a significant likelihood of developing breast cancer), none were extremely upset and six were either mildly or moderately upset. Surprisingly, nine mutation-positive women reported that they felt “neutral”.

The majority of the mutation-positive participants shared the test results with their spouses/partners or their blood relatives. Importantly, 13 of the 16 mutation-positive women contacted their primary care physician, gynecologist or oncologist for medical advice. Eleven mutation-positive women received this information through 23andMe for the first time (the others had already been diagnosed with breast cancer or had previously undergone testing), and these women indicated that they were planning to either undergo surgery or have further breast cancer work-up and regular exams. The majority of mutation-positive men, on the other hand, did not consult their physicians, but did indicate that they would participate in future cancer screening.

Nearly all the participants said that they would undergo the testing again and felt good about knowing the results, independent of whether they tested positive or negative for the BRCA mutations. Only one of the participants (a mutation-positive man with a family history of breast cancer) said that he would have preferred not to know, because of the “emotional cost”. A significant proportion of the participants who tested positive also had their relatives tested. This led to the identification of 13 additional carriers, many of whom received medical counseling and were planning to take risk-reducing measures.

These findings suggest that the identification of mutations indicating a high risk of developing breast cancer did not lead to severe anxiety or panic, but actually resulted in proactive steps and medical care that help reduce the participants’ risk of developing breast cancer. One has to bear in mind that the sample size was small and that the study and the salaries of the authors were all funded by 23andMe, a genetic testing company that would financially benefit from widespread genetic testing. Nevertheless, the presented data seem solid, and the responses of the participants do suggest that such testing was on the whole very beneficial. Hopefully, we will see more data emerge regarding the psychological impact of genetic testing and whether the findings of this small study hold up in larger cohorts and for other genetic diseases.

On a side note, there is a very intriguing aspect to this paper that will be of benefit to many readers. The PeerJ journal gives the authors of a manuscript the option of disclosing the peer review process to the public. The authors of this paper took advantage of this option and we can all have a close look at the peer reviewer comments as well as the rebuttal of the authors. For anyone who is not used to reviewing scientific manuscripts, this is an excellent opportunity to learn about the inner workings of the peer review process.


Image Credit: Cartoon representation of the molecular structure of BRCA1 by Jawahar Swaminathan and MSD staff at the European Bioinformatics Institute, via Wikimedia Commons

Francke, U., Dijamco, C., Kiefer, A., Eriksson, N., Moiseff, B., Tung, J., & Mountain, J. (2013). Dealing with the unexpected: consumer responses to direct-access BRCA mutation testing PeerJ, 1 DOI: 10.7717/peerj.8

Some Highlights of the Live Chat: “Are We Doing Science the Right Way?”

On February 7, 2013, ScienceNOW organized a Live Chat with the microbiologists Ferric Fang and Arturo Casadevall, moderated by Science staff writer Jennifer Couzin-Frankel, which discussed a very broad range of topics related to how we currently conduct science. For those who could not participate in the Live Chat, I will summarize some key comments made by Fang, Casadevall, Couzin-Frankel and other commenters.


I have grouped the comments into key themes and also added some of my own thoughts.


1. Introduction to the goals of the Live Chat:

Jennifer Couzin-Frankel: …..For several years (at least) researchers have worried about where their profession is heading. As much as most of them love working in the lab, they’re also facing sometimes extreme pressure to land grants and publish hot papers. And surveys have shown that a subset are even bending or breaking the rules to accomplish that.….With us today are two guests who are studying the “science of science” together, and considering how to nurture discovery and reduce misconduct…


Pressure to publish, the difficulty of obtaining grant funding, scientific misconduct – these are all topics that should be of interest to all of us who are actively engaged in science.


2. Science funding:

Ferric Fang: ….the way in which science is funded has a profound effect on how and what science is done. Paula Stephan has recently written an excellent book on this subject called “How Economics Shapes Science.”

Ferric Fang: Many are understandably reluctant to ask for more funding given the global recession and halting recovery. But I believe a persuasive economic case can be made for greater investment in R&D paying off in the long run. Paula Stephan notes that the U.S. spends twice as much on beer as on science each year.


These are great points. I often get the sense that federal funding for science and education is portrayed as an unnecessary luxury, charity or a form of waste. We have to remind people that investments in science and education are a very important investment with long-term returns.


3. Reproducibility and the self-correcting nature of science:

Arturo Casadevall: Is science self-correcting? Yes and No. In areas where there is a lot of interest in a subject experiments will be repeated and bad science will be ferreted out. However, until someone sets out to repeat an experiment we do not know whether it is reproducible. We do not know what percentage of the literature is right because no one has ever done a systematic study to see what fraction is reproducible.


I think that the reproducibility crisis is one of the biggest challenges for contemporary science. Thousands of scientific papers are published every day, and only a tiny fraction of them will ever be tested for reproducibility. There is minimal funding for attempting to replicate published data and also very little incentive for scientists, because even if they are able to replicate the published work, they will have a hard time publishing a confirmatory study. The lack of replication attempts creates a lot of uncertainty, because we do not really know how much of the published data is truly valid.


Comment From David R Van Houten: …The absence of these weekly [lab] meetings was the single biggest factor allowing for the data fabrication and falsification that I observed 20 years ago as a PhD student. I pushed to get these meetings organized, and when they did occur, it made it easier to get the offender to stop, and easier to “salvage” original data…


I agree that regular lab meetings and more supervision by senior researchers and principal investigators can help contain and prevent data fabrication and falsification. However, overt data fabrication and fraud are probably not as common as “data fudging”, where experiments or data points are conveniently ignored because they do not fit the desired model. This kind of “data fudging” is not just a problem of junior scientists, but also occurs with senior scientists.


Ferric Fang: Peer review plays an important role in self-correction of science but as nearly everyone recognizes, it is not perfect. Mechanisms of post-publication review to address the problems are very important– these include errata, retractions, correspondences, follow up publications, and nowadays, public discussion on blogs and other websites.


I am glad that Fang (who is an editor-in-chief of an academic journal) recognizes the importance of post-publication review and mentions blog discussions as one such form of post-publication review.


4. Are salaries of scientists too low?

Comment From Shabbir: When an hedge fund manager makes 100 times more than a theoretical physicist, how can we expect the bright minds to go to science?


I agree that academic salaries for scientists are on the lower side, especially when compared with what one can earn in private industry. However, I do not think that the obscene salaries of hedge fund managers are the correct comparison. If the US wants to attract and retain excellent scientists, raising their salaries is definitely important. Scientists are routinely overworked, balancing their research, teaching, mentoring and administrative duties, and they receive inadequate compensation. I have also observed a near-cynical attitude at many elite universities, which try to portray working as a scientist as an “honor” that should not require much compensation. This kind of abuse really needs to end.


5. Communicating science to the public

Arturo Casadevall: … Many scientists cannot explain their work at a dinner party and keep the other guests interested. We are passionate about what we do but we are often terrible in communicating the excitement that we feel. I think this is one area where perhaps better public communicating skills are needed and maybe some attention should be given to mastering these arts in training.


I could not agree more. Communicating science should be part of every PhD program, postdoctoral training and an ongoing effort when a scientist becomes an independent principal investigator.


6. Are we focusing on quantity rather than quality in science?

Ferric Fang: …. There are now in excess of 50,000,000 scientific publications according to one estimate, and we are in danger of creating a Library of Babel in which it is impossible to find the truth buried amidst poor quality or unimportant publications. This is in part a consequence of the “publish or perish” mentality in academia. A focus on quality rather than quantity in promotion decisions might help.


It is correct that the amount of scientific data being generated is overwhelming, but I am not sure that there is an easy way to find the “truth”. Scientific “truth” is very dynamic, and I think it is becoming more and more difficult to publish in high-impact journals. A typical paper in a high-impact journal now has anywhere between 5 and 20 supplemental figures and tables; that same paper could have been published as two or three separate papers just a few decades ago. We now have many more active scientists all over the world who publish in English, and we all have tools that generate huge amounts of data in a matter of weeks (such as microarrays, proteomics and metabolomics). The number of publications will likely continue to rise in the coming years, and we need to come up with an innovative system to manage scientific information. Hopefully, scientists will realize that managing and evaluating existing scientific information is just as valuable as generating new scientific datasets.


This was a great and inspiring discussion and I look forward to other such Live Chat events.