The Open Access Debate Continues

The New England Journal of Medicine has just published four articles that comment on the issue of “open access”. I will list these four articles and briefly comment on the two papers which are critical of open access publishing.

1. The Downside of Open-Access Publishing by Charlotte Haug

This article discusses potential problems associated with open access publishing but also conflates the issue of open access with the issue of inadequate peer review, as can be seen in this excerpt:

Of course, the terms “international,” “scientific,” “peer-reviewed,” “journal,” “article,” “editor,” and “publisher” do not have copyrighted or patented definitions and can have varied meanings, especially in the Internet age. Must an article be different from a submitted paper? Isn’t everything published online automatically international? Is there anything wrong with a situation in which the editor and publisher are just one person who has set up a website where researchers can submit their papers and pay a fee to have them laid out in a professional way and made available to all interested parties? Isn’t it a good thing that this vast number of new publishers and journals will make it possible to get all research — whatever its quality level — into the public domain? Perhaps. But describing a simple online-posting service as “an international, scientific, peer-reviewed journal” leads authors and readers to believe that they are submitting to or reading something they aren’t.

One central flaw of this argument is that open access does not necessarily mean lack of peer review, as previously discussed.

2. Open but Not Free — Publishing in the 21st Century by Martin Frank

The article by Martin Frank tries to make the case that open access publishing itself costs quite a bit of money and that these funds could be better used for research purposes. He distinguishes between “gold open access”, where published articles are immediately available to the general public upon publication without any fees for the readers, and “green open access”, which gives the public free access after an initial period of pay-for-access. With green open access, the publisher generates some revenue during this initial period, whereas in gold open access publishing, the researchers usually pay a fee that covers the publication charges so that readers do not have to pay anything.

One section of the article especially caught my eye:

…assuming that all articles had to be published with gold open access, Harvard Medical School would have to pay $13.5 million (at $1,350 per article) to publish the 10,000 articles authored by its faculty in 2010 — considerably more than the $3.75 million that was in its serials-acquisition budget that year. Research-intensive institutions will thus bear the burden of funding free access to the research literature, subsidizing access for less-research-intensive institutions, including pharmaceutical companies.

This calculation assumes that current pay-for-access journals do not charge researchers for the publication of their articles. I have previously addressed this issue, citing a specific example which shows that pay-for-access journals often charge researchers several hundred dollars to publish an article. If researchers use color figures, the charges can run up to $2,000 or $3,000 per manuscript. These author fees are in addition to the fees that publishers of pay-for-access journals charge the readers. Martin Frank’s calculation ignores the author fees that Harvard researchers might currently be paying to publish in pay-for-access journals.
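Frank’s back-of-the-envelope figure, and how it changes once existing author fees are accounted for, can be sketched as follows. The article count, gold open-access fee and serials budget are the numbers quoted above; the average author fee charged by pay-for-access journals is an illustrative assumption, not actual Harvard data.

```python
# Figures quoted in the article:
articles_per_year = 10_000   # articles authored by Harvard Medical School faculty in 2010
gold_oa_fee = 1_350          # per-article gold open-access charge used in Frank's calculation
serials_budget = 3_750_000   # Harvard's serials-acquisition budget that year

gold_oa_total = articles_per_year * gold_oa_fee
print(f"Gold OA total: ${gold_oa_total:,}")        # $13,500,000
print(f"Serials budget: ${serials_budget:,}")      # $3,750,000

# Frank's comparison implicitly assumes that pay-for-access journals charge
# authors nothing. If Harvard authors already pay, say, $500 per article on
# average (an assumed figure), the *additional* cost of going gold shrinks:
assumed_current_author_fee = 500
net_additional = articles_per_year * (gold_oa_fee - assumed_current_author_fee)
print(f"Net additional cost of gold OA: ${net_additional:,}")  # $8,500,000
```

The point is not the exact numbers but that any fair comparison has to subtract what authors already pay under the pay-for-access model.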

He also mentions the pharmaceutical companies as potential beneficiaries but fails to include other important beneficiaries:

1) Members of the general public, whose taxes paid for most of the biomedical research conducted in the United States and who should thus have a right to access the results of this publicly funded research.

2) Individuals in countries where people cannot afford the fees to read papers published in pay-for-access journals.

3. Creative Commons and the Openness of Open Access by Michael Carroll

4. For the Sake of Inquiry and Knowledge — The Inevitability of Open Access by Ann Wolpert


Are Scientists Divided Over Divining Rods?

When I read a statement which starts with “Scientists are divided over…”, I expect to learn about a scientific controversy involving scientists who offer distinct interpretations or analyses of published scientific data. This is not uncommon in stem cell biology. For example, scientists disagree about the differentiation capacity of adult bone marrow stem cells. Some scientists are convinced that these adult stem cells have a broad differentiation capacity and that a significant proportion can turn into heart cells or brain cells. On the other hand, there are many stem cell researchers who disagree and instead believe that adult bone marrow stem cells are very limited in their differentiation capacity. Both groups of scientists can point to numerous experiments and papers published in peer-reviewed scientific journals which back up their respective points of view. At any given stem cell meeting, the percentage of scientists favoring one view over the other can range from 30% to 70%, depending on who is attending and who is organizing that specific conference. We still have not reached a consensus in this field, so I think it is reasonable to say “scientists are divided over the differentiation capacity of adult bone marrow stem cells”.

In contrast, when it comes to the issue of global warming, there is a broad consensus in the scientific community. A 2010 study in the Proceedings of the National Academy of Sciences by Anderegg and colleagues reviewed published papers and statements made by climate researchers. The authors found that 97% to 98% of climate researchers were convinced by the scientific evidence for anthropogenic climate change, i.e. that humans are primarily responsible for global warming. When there is such a broad consensus among scientists and such overwhelming scientific data that supports anthropogenic climate change, one cannot really say “scientists are divided” merely because two or three scientists out of one hundred are not convinced.

Today, when I saw the headline “Scientists divided over device that ‘remotely detects hepatitis C’” in the Guardian, I assumed that a major scientific study had been published describing a new way to diagnose hepatitis C and that there was considerable disagreement among hepatitis C experts as to the value of this new device. To my surprise, I found this description in the Guardian:

The device the doctor held in his hand was not a contraption you expect to find in a rural hospital near the banks of the Nile.

 For a start, it was adapted from a bomb detector used by the Egyptian army. Second, it looked like the antenna for a car radio. Third, and most bizarrely, it could – the doctor claimed – remotely detect the presence of liver disease in patients sitting several feet away, within seconds.

 The antenna was a prototype for a device called C-Fast. If its Egyptian developers are to be believed, C-Fast is a revolutionary means of using bomb detection technology to scan for hepatitis C – a strongly contested discovery that, if proven, would contradict received scientific understanding, and potentially change the way many diseases are diagnosed.

This “C-Fast” device, co-developed by the Egyptian liver specialist Gamal Shiha, sounded like magic, and sure enough, even the Guardian referred to it as a “mechanical divining rod”.

Witnessed in various contexts by the Guardian, the prototype operates like a mechanical divining rod – though there are digital versions. It appears to swing towards people who suffer from hepatitis C, remaining motionless in the presence of those who don’t. Shiha claimed the movement of the rod was sparked by the presence of a specific electromagnetic frequency that emanates from a certain strain of hepatitis C.

After reading the remainder of the article, I learned that there are no published scientific studies confirming that this rod, antenna or wand can detect hepatitis viruses at a distance. The article says it “has been successfully trialled in 1,600 cases across three countries, without ever returning a false negative result”, but these data have not been published in a peer-reviewed journal. As a scientist and a physician, I am of course very skeptical. The physicians using this device claim it has 100% sensitivity, yet they have not presented the data in a peer-reviewed forum. Even more surprising is the suggestion that electromagnetic signals travel from the virus in a patient’s body to this remote device, without any scientific evidence to back this up.

The Guardian then also quotes a University College London expert:

“If the application can be expanded, it is actually a revolution in medicine,” said Pinzani, head of UCL’s liver institute. “It means that you can detect any problem you want.”

 By way of example, Pinzani said the device could conceivably be used to instantaneously detect certain kinds of cancer symptoms: “You could go into a clinic, and a GP could find out if you had a tumour marker.”

This expert is already fantasizing about cancer diagnostics with this divining rod even though there is no credible published scientific data. The Guardian article also mentions that well-known scientific journals have rejected articles about this new device and that its “scientific basis has been strongly questioned by other scientists”. Nevertheless, the Guardian compromises its journalistic integrity by presenting this as a legitimate scientific debate and by claiming in the headline that “scientists are divided”. How can scientists be divided if the data has not been made public and has not undergone peer review? For now, this claim of a diagnostic divining rod is pure sensationalism, not an actual scientific controversy. Such sensationalism will attract many readers, but it should not be an excuse for shoddy journalism.


Image Credit: Public domain image of Otto Edler von Graeve in 1913 with a divining rod via Wikimedia Commons

UPDATE: The comment thread of the Guardian article indicates that Pinzani feels misrepresented by the article and cites a letter purportedly written by Pinzani in response to it. I am not able to verify whether this letter was indeed written by him or how exactly Pinzani was misrepresented by the Guardian.

UPDATE February 26, 2014: The Guardian has now changed the headline to “Scientists sceptical about device that ‘remotely detects hepatitis C’”. I think this headline is much better than the previous one, which suggested that “scientists were divided”. I still think that newspapers and magazines sometimes unnecessarily portray pseudo-scientific viewpoints as legitimate, equal partners in a scientific debate. This type of even-handedness only makes sense if all the viewpoints in question are backed up by rigorous scientific studies.

The ENCODE Controversy And Professionalism In Science

The ENCODE (Encyclopedia Of DNA Elements) project received quite a bit of attention when its results were publicized last year. This project involved a very large consortium of scientists with the goal of identifying all the functional elements in the human genome. In September 2012, 30 papers were published in a coordinated release, and their extraordinary claim was that roughly 80% of the human genome was “functional”. This was in direct contrast to the prevailing view among molecular biologists that the bulk of human DNA was just “junk DNA”, i.e. sequences of DNA to which one could not assign any specific function. The ENCODE papers contained huge amounts of data, collating the work of hundreds of scientists over nearly a decade. But what garnered the most attention among scientists, the media and the public was the “80%” claim and the supposed “death of junk DNA”.

Soon after the discovery of DNA, the primary function ascribed to DNA was its role as a template from which messenger RNA could be transcribed and then translated into functional proteins. Using this definition of “function”, only 1-2% of human DNA would be functional, because only these sequences actually encode proteins. The term “junk DNA” was coined to describe the 98-99% of non-coding DNA which appeared to primarily represent genetic remnants of our evolutionary past without any specific function in our present-day cells.

However, in the past decades, scientists have uncovered more and more functions for the non-coding DNA segments that were previously thought to be merely “junk”. Non-coding DNA can, for example, act as a binding site for regulatory proteins and exert an influence on protein-coding DNA. There has also been an increasing awareness of the presence of various types of non-coding RNA molecules, i.e. RNA molecules which are transcribed from the DNA but not subsequently translated into proteins. Some of these non-coding RNAs have known regulatory functions, others may not have any or their functions have not yet been established.

Despite these discoveries, most scientists were in agreement that only a small fraction of DNA was “functional”, even when all the non-coding pieces of DNA with known functions were included. The bulk of our genome was still thought to be non-functional. The term “junk DNA” was used less frequently by scientists, because it was becoming apparent that we were probably going to discover even more functional elements in the non-coding DNA.

In September 2012, everyone was talking about “junk DNA” again, because the ENCODE scientists claimed their data showed that 80% of the human genome was “functional”. Most scientists had expected that the ENCODE project would uncover some new functions for non-coding DNA, but the 80% figure went far beyond what anyone had anticipated. The problem was that the ENCODE project set a very low bar for “function”: binding of proteins to the DNA or any kind of chemical DNA modification was already counted as a sign of “function”, without necessarily proving that these pieces of DNA had any significant impact on the function of a cell.

The media hype with the “death of junk DNA” headlines and the lack of discussion about what constitutes function were appropriately criticized by many scientists, but the recent paper by Dan Graur and colleagues, “On the immortality of television sets: “function” in the human genome according to the evolution-free gospel of ENCODE”, has grabbed everyone’s attention, not so much because it criticizes the claims made by the ENCODE scientists, but because of the sarcastic tone it uses to ridicule ENCODE.

So many other blog posts and articles have praised or criticized the Graur paper that I decided to list some of them here:

1. PZ Myers writes “ENCODE gets a public reaming” and seems to generally agree with Graur and colleagues.

2. Ashutosh Jogalekar says Graur’s paper is a “devastating takedown of ENCODE in which they pick apart ENCODE’s claims with the tenacity and aplomb of a vulture picking apart a wildebeest carcass.”

3. Ryan Gregory highlights some of the “zingers” in the Graur paper.

Other scientists, on the other hand, agree with some of the conclusions of the Graur paper and its criticism of how the ENCODE data was presented, but disagree with the sarcastic tone:

1. OpenHelix reminds us that this kind of “spanking” should not distract from all the valuable data that ENCODE has generated.

2. Mick Watson shows how Graur and colleagues could have presented their key critiques in a very non-confrontational manner and fostered a constructive debate.

3. Josh Witten points out the irony of Graur accusing ENCODE of seeking hype, even though Graur and his colleagues seem to use sarcasm and ridicule to also increase the visibility of their work. I think Josh’s blog post is an excellent analysis of the problems with ENCODE and the problems associated with Graur’s tone.

On Twitter, I engaged in a debate with Benoit Bruneau, my fellow Scilogs blogger Malcolm Campbell, and Jonathan Eisen, and I thought it would be helpful to share the Storify version here. There was a general consensus that even though some of the points made by Graur and colleagues are indeed correct, their sarcastic tone was uncalled for. Scientists can be critical of each other, but they can and should do so in a respectful and professional manner, without resorting to insults or mockery.

Graur D, Zheng Y, Price N, Azevedo RB, Zufall RA, & Elhaik E (2013). On the immortality of television sets: “function” in the human genome according to the evolution-free gospel of ENCODE. Genome Biology and Evolution. PMID: 23431001

Breakthrough Prize in Life Sciences: Hopefully Not Just A Nobel Prize in Medicine 2.0

The recent announcement of the “Breakthrough Prize in Life Sciences” and its inaugural 11 recipients is causing quite a bit of buzz in the research community. The Silicon Valley celebrities Art Levinson, Sergey Brin, Anne Wojcicki, Mark Zuckerberg and Priscilla Chan, and Yuri Milner have established the Breakthrough Prize in Life Sciences Foundation, which intends to award five annual prizes of $3 million each to honor “extraordinary achievements of the outstanding minds in the field of life sciences, enhance medical innovation, and ultimately become a platform for recognizing future discoveries”.


The inaugural recipients are:

1. Cornelia I. Bargmann: For the genetics of neural circuits and behavior, and synaptic guidepost molecules.

2. David Botstein: For linkage mapping of Mendelian disease in humans using DNA polymorphisms.

3. Lewis C. Cantley: For the discovery of PI 3-Kinase and its role in cancer metabolism.

4. Hans Clevers: For describing the role of Wnt signaling in tissue stem cells and cancer.

5. Titia de Lange: For research on telomeres, illuminating how they protect chromosome ends and their role in genome instability in cancer.

6. Napoleone Ferrara: For discoveries in the mechanisms of angiogenesis that led to therapies for cancer and eye diseases.

7. Eric S. Lander: For the discovery of general principles for identifying human disease genes, and enabling their application to medicine through the creation and analysis of genetic, physical and sequence maps of the human genome.

8. Charles L. Sawyers: For cancer genes and targeted therapy.

9. Bert Vogelstein: For cancer genomics and tumor suppressor genes.

10. Robert A. Weinberg: For characterization of human cancer genes.

11. Shinya Yamanaka: For induced pluripotent stem cells.


Anyone familiar with cell biology or molecular biology will recognize most, if not all of these names, because this list consists of many important leaders in these areas. As a stem cell biologist, I am happy to see at least two other stem cell researchers on the list: 1) Shinya Yamanaka (who received the 2012 Nobel Prize in Physiology or Medicine) for discovering that adult skin cells could be converted into pluripotent stem cells by just introducing four genes into the cells and 2) Hans Clevers, who is one of the world’s leading researchers in the field of adult stem cell biology and has been instrumental in characterizing stem cells in the intestinal tissue and defining the role of the Wnt signaling pathway, which regulates both proliferation and differentiation of adult stem cells.

The amount awarded to the recipients seems staggeringly high – $3 million is nearly triple the size of the Nobel Prize. However, one also needs to keep in mind that the Breakthrough Prize not only honors past achievements, but also has “the aim of providing the recipients with more freedom and opportunity to pursue even greater future accomplishments.” This means that the laureates are expected (but not necessarily required) to use some of the funds to pursue new directions of research. Biomedical research is expensive. A typical NIH R01 grant, the lifeblood of most federally funded biomedical research labs in the United States, has a budget of about $250,000 per year, or $1,250,000 over a five-year period, and usually funds a single project. The annual $250,000 has to cover the salaries of the employees working in the laboratory, employee benefits such as health insurance, and maintenance contracts to keep up existing equipment, and thus leaves very little money to buy the actual materials and equipment needed to conduct the experiments. This relatively small amount of money forces many scientists to be rather conservative in their work: they do not want to invest money in innovative, high-risk projects, because these do not always yield definitive results, and inconclusive results could jeopardize future grant funding and put the jobs of one’s employees or trainees at risk.

The $3 million award, on the other hand, gives the researchers the freedom to try out exciting, high-risk ideas without having to spend months writing grant proposals. It is enough to fund two high-risk, R01-sized projects for five years, that is, if the laureates choose to use all their award money for their research instead of buying a luxury yacht.
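The funding arithmetic in the paragraphs above can be checked with a quick sketch, using the figures quoted in the text:

```python
# R01 figures from the text: ~$250,000 per year for a five-year project.
r01_per_year = 250_000
r01_years = 5
r01_total = r01_per_year * r01_years
print(f"One R01-sized project over five years: ${r01_total:,}")  # $1,250,000

# A $3 million Breakthrough Prize can therefore bankroll two such projects.
prize = 3_000_000
fundable_projects = prize // r01_total
print(f"R01-sized projects fundable by one prize: {fundable_projects}")  # 2
```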

Even though all the laureates above are established and internationally renowned scientists, they are at different stages in their research career. David Botstein, for example, is 70 years old and was already a molecular biology legend when I was a grad student in the 1990s. On the other hand, Shinya Yamanaka is only 50 years old and is in the prime of his research career, with hopefully many more decades of research ahead of him. I also like the fact that the foundation will accept nominations from the public, and I hope that its selection process will be more transparent than the closed door policy involved in the selection of Nobel prize laureates.

Despite all my enthusiasm for the new Breakthrough Prize and my hope that it will help re-energize research in the life sciences, I am concerned by the medical focus of the Foundation’s aims. The title of the prize is “Breakthrough Prize in Life Sciences”, but the stated aims are to recognize excellence in research “aimed at curing intractable diseases and extending human life.” Why is there such a focus on human life and disease? The field of “life sciences” comprises much more than just human life. It includes areas as diverse as ecology, evolutionary biology and botany, even if they do not have any direct implications for human disease. All of the announced laureates worked on areas that are more or less directly connected to human diseases such as cancer or human physiology. In this sense, this new prize is not too different from the Nobel Prize in Physiology or Medicine, merely larger in size, a 2.0 version of the current Nobel Prize in Physiology or Medicine. I have previously written about the lack of a Nobel Prize equivalent that honors efforts in non-medical life sciences. I hope that the Breakthrough Prize foundation reconsiders the medical focus of the prize and that future awards will also be made to life scientists who do not work in areas that directly relate to human life and human disease.


UPDATE: I would like to thank some of the readers for their comments, including those who commented on Twitter and I thought it might be helpful to respond to them in this update. One important point raised by some readers is that it should not be our place to tell philanthropists what to do with their money. It is their money and they get to choose what kind of prizes and charitable foundations they establish. In this particular case, some of the founders of the Breakthrough Prize in Life Sciences may have been influenced by personal experiences of their family members or friends with certain illnesses. This could explain the medical or biomedical focus of the prize.

I completely agree that philanthropists should decide what the goals of an established foundation are, but I still think that it is not wrong to engage in a debate. Especially in the case of the Breakthrough Prize in Life Sciences, I think there are at least three good reasons why this debate is necessary and helpful.

1. The foundation website indicates that it will soon accept online nominations for future awards from the public. This suggests that the philanthropists are open to outside suggestions and perhaps this openness can be extended to engaging in a dialogue about the actual aims of the prize itself. The philanthropists do not have to listen to what scientists say about including awards in the non-medical life sciences, but we scientists should at the very least voice our concerns.

2. The name of the prize is “Breakthrough Prize in Life Sciences”, but the explicit aims are very much focused on human disease and extending human life. This is a bit of a disconnect, because the broadly phrased title “life sciences” encompasses far more than just medical research.

3. There are already numerous honors and prizes available for outstanding achievements in medical research or biological research with direct medical impact. What we lack is a Nobel Prize equivalent in the non-medical life sciences. This is not a big surprise, because the human suffering associated with illness probably motivates many philanthropists. It is thus understandable that many philanthropic foundations might gravitate towards valuing research with medical implications more than non-medical research. However, as scientists, we need to remind philanthropists that in the 21st century, we recognize the importance of biodiversity. We want to understand the biology of plants and the wonderful multitude of animal species. We need to work together to preserve the biodiversity on our planet, even if there is no direct link between this type of research and specific human diseases.

Charles Darwin was one of the most brilliant life scientists in the past two centuries. His work has revolutionized how we think in biology. Would Charles Darwin receive a Breakthrough Prize in Life Sciences? His work was not necessarily directed at extending human life or treating specific human diseases, but the revolution in biological thought that he initiated ultimately did have a major impact on medical sciences, too. I think we should try our best to establish a prize that honors and supports excellence in the life sciences without obvious or direct medical applications. Such prizes should be awarded to the contemporary Charles Darwins in our midst, without requiring them to prove or justify the medical relevance of their work.

Stemming the Flow: Using Stem Cells To Treat Urinary Bladder Dysfunction

Neurogenic bladder is a disorder which occurs in spinal cord diseases such as spina bifida and is characterized by an inability of the nervous system to properly control the urinary bladder and the muscle tissue contained in the bladder wall. This can lead to spasms and a build-up of pressure in the bladder, often resulting in urinary incontinence. Children with spina bifida and neurogenic bladder often feel urges to urinate after drinking comparatively small amounts of liquid, and they can also involuntarily leak urine. This is a source of considerable emotional stress, especially in social settings, such as when they are around friends or at school. If untreated, the long-standing and frequent pressure build-up in the bladder can have even more devastating effects, such as infections or kidney damage.

Current treatments for neurogenic bladder involve surgeries which reconstruct and enlarge the bladder using tissue patches obtained from the patient’s bowel. Since such a gastrointestinal patch is derived from the patient’s own body, it is less likely to elicit an immune response, and these intestinal tissue patches tend to be strong enough to withstand the pressures in the bladder. Unfortunately, the incompatibility of intestinal tissue and bladder tissue can lead to long-term complications, such as urinary tract infections, formation of urinary tract stones and, in some rare cases, even cancers. For this reason, researchers have been searching for newer, safer patches which more closely resemble the actual bladder wall.


A team of researchers at Northwestern University recently published a study which used stem cells of children with spina bifida to generate tissue patches that could be used for bladder surgery. In the paper “Cotransplantation with specific populations of spina bifida bone marrow stem/progenitor cells enhances urinary bladder regeneration” published in the Proceedings of the National Academy of Sciences (online publication on February 19, 2013), Arun Sharma and colleagues isolated two types of cells from the bone marrow of children with spina bifida: Mesenchymal stem cells (MSCs) and CD34+ cells (stem and progenitor cells which usually give rise to blood cells). They then coated a special polymer scaffold called POC with the cells and implanted this newly created patch into a rat bladder after performing a bladder augmentation surgery, similar to what is performed in patients with spina bifida. They then assessed the survival and formation of human muscle tissue on the implanted patch. When both human cell types (MSCs and CD34+) were combined, more than half of the implanted patch was covered with muscle tissue, four weeks after the implantation. If they only used CD34+ cells, they found that only a quarter of the patch was covered with muscle tissue. What is even more remarkable is that in addition to the newly formed muscle tissue, the implanted patch also showed evidence of some peripheral nerve growth and of blood vessel formation, both of which are found in healthy, normal bladder walls. These findings suggest that a patient’s own bone marrow stem cells can be used to help construct a tissue patch which could be used for bladder augmentation surgeries. The observation of some nerve growth in the implanted patch is also an exciting finding. One could conceivably try to re-connect the reconstructed bladder tissue with the main nervous system, but its success would largely depend on the severity of the neurologic disease.

One has to keep in mind that there are some key limitations to this study. The authors of the paper believe that the newly formed muscle tissue on the implanted patches was all derived from the patients’ bone marrow stem cells. However, there were no experiments performed to convincingly demonstrate this. The authors report that in previous studies, merely implanting the empty POC scaffold without any human stem cells resulted in 20% coverage with muscle tissue. This suggests that a big chunk of the newly formed muscle tissue is actually derived from the host rat and not from human stem cells. The authors also did not compare the effectiveness of this newly formed stem cell patch to the currently used intestinal patches, and there is no assessment of whether the newly formed muscle tissue on the reconstructed bladder is less prone to spasms and involuntary contractions. Lastly, all the in vivo testing of the tissue patches was performed in rats without neurogenic bladder and it is possible that the highly successful formation of muscle tissue may have been diminished if the animals had a neurologic disease.

A second study published in PLOS One took a different approach. In “Evaluation of Silk Biomaterials in Combination with Extracellular Matrix Coatings for Bladder Tissue Engineering with Primary and Pluripotent Cells” (online publication February 7, 2013), Debra Franck and colleagues describe how they coated a scaffold consisting of silk threads with extracellular matrix proteins such as fibronectin. Instead of using bone marrow stem cells, they converted induced pluripotent stem cells into the smooth muscle cells that are typically found inside the bladder wall and placed these newly differentiated cells on the silk scaffold. The induced pluripotent stem cells (iPSCs) used by Franck and colleagues can be generated from a patient’s own skin cells which reduces the risk of being rejected by a patient’s immune system. The advantage of this approach is that it starts out with a pure and truly pluripotent stem cell population, which is easier to direct and control than bone marrow stem cells. There are also a few important limitations to this second study. Franck and colleagues used mouse pluripotent stem cells and it is not clear that their approach would necessarily work with human pluripotent stem cells. They also did not test the function of these differentiated cells on the silk scaffold to check if they actually behaved like true bladder wall smooth muscle cells. Unlike the first study, Franck and colleagues did not evaluate the newly created patch in an animal model.

Both studies are purely experimental, and much additional work is needed before these approaches can be tested in humans, but both show promising new strategies to help improve bladder dysfunction. It is heartening to see that researchers are developing new cell-based therapies to help children and adults who suffer from neurogenic bladder. The results from these two experimental studies are still too preliminary to predict whether cell-based therapies can be successfully used in patients, but they represent important first steps.


Image credit: Taken from Franck D, Gil ES, Adam RM, Kaplan DL, Chung YG, et al. (2013) Evaluation of Silk Biomaterials in Combination with Extracellular Matrix Coatings for Bladder Tissue Engineering with Primary and Pluripotent Cells. PLoS ONE 8(2): e56237. doi:10.1371/journal.pone.0056237, Figure 6B: Differentiated mouse induced pluripotent stem cells cultured on fibronectin-coated silk matrices show protein markers typically found in bladder smooth muscle cells.
Franck, D., Gil, E., Adam, R., Kaplan, D., Chung, Y., Estrada, C., & Mauney, J. (2013). Evaluation of Silk Biomaterials in Combination with Extracellular Matrix Coatings for Bladder Tissue Engineering with Primary and Pluripotent Cells. PLoS ONE, 8(2). DOI: 10.1371/journal.pone.0056237

Resisting Valentine’s Day

To celebrate Valentine’s Day (as a geeky scientist), I decided to search the “Web of Science” database for published articles with the phrase “Valentine’s Day” in the title. The article with the most citations was “Market-resistance and Valentine’s Day events”, published in the Journal of Business Research in 2009 by Angeline Close and George Zinkhan. I had never heard of the journal before, but the title sounded rather interesting, so I decided to read the paper.

The authors reported the results of a survey of college students and consumers conducted in 2003-2005 regarding their thoughts about gift-giving on Valentine’s Day:

1) Most males (63%) and some females (31%) feel obligated to give a gift to their partner for this holiday

2) Males in a new relationship (i.e. less than six months) feel most obligated (81%), females in a new relationship are the second most obligated group (50%)

3) Less than half of males (44%) in a more established relationship feel obligated, and this number is even lower for females in more established relationships (13%)


The authors also conducted interviews using open-ended questions and reviewed diaries and E-diaries to investigate whether people indicated a “resistance” to giving gifts. They found that people expressed three different types of resistance, either opposing or severely limiting the giving of gifts (gift resistance), resisting the purchase of gifts (retail resistance) or broadly opposing the Valentine’s Day business in general (market resistance). All of these forms of “resistance” appeared to be connected to an anti-consumption attitude, the desire to not be drawn into a culture of excessive consumerism.

Here are a couple of quotes from the participants:

Valentine’s Day is a marketing strategy by the flower and candy companies. It’s a cheesy, overblown, stupid “holiday” to force you to spend money on each other.

Valentine’s Day is a way for retailers to get you to spend money in their stores. People get caught up in the B.S. and I should not have to spend extra to show I care, and my girlfriend agrees. But we both still spent plenty!

The survey results indicating differences between men and women are interesting, but the paper also shows that even though the majority of people in the US may feel obligated to give each other gifts on Valentine’s Day, there is a strong undercurrent of anti-consumption attitudes. Many people are unwilling to succumb to the pressure to spend a lot of money that ultimately benefits retailers, and instead express their affection for each other in ways that do not involve purchasing expensive gifts.

If you forgot to get a Valentine’s Day gift for your partner or spouse, just print out a copy of this paper and give it to them instead, saying that your lack of gift-giving is your expression of anti-consumption resistance. If that person is just as geeky as you are, you might be able to pull it off.

There is one caveat: the Journal of Business Research is not open access, so you may hit a paywall asking for $31.50 to read the article, which is more than the cost of a typical box of chocolates. Don’t panic; fortunately, you can read it for free here.


Image credit: Early 20th century Valentine’s Day card, showing woman holding heart shaped decoration and flowers, ca. 1910 – via Wikimedia Commons – Public Domain

Close, A., & Zinkhan, G. (2009). Market-resistance and Valentine’s Day events. Journal of Business Research, 62(2), 200-207. DOI: 10.1016/j.jbusres.2008.01.027

New Directions In Scientific Peer Review

Most scientists have a need-hate relationship with scientific peer review. We know that we need some form of peer review, because it is an important quality control measure that is supposed to help prevent the publication of scientifically invalid results. However, we also tend to hate scientific peer review in its current form, because we have had many frustrating experiences with it.

We recently submitted a manuscript to a journal, where it was stuck for more than a year, undergoing multiple rounds of revisions in response to requests by the editors and the reviewers, after which it was finally rejected. The reviewers did not necessarily question the validity of our results, but they wanted us to test additional cell lines, confirm many of the findings with multiple methods, and identify additional mechanisms that might explain our findings, so the paper started ballooning in size. I was frustrated because I felt that there was no end in sight. There are always novel mechanisms that one has not investigated. A scientific paper is not meant to investigate every possible explanation for a phenomenon, because that would turn the paper into a never-ending saga: every new finding usually raises even more questions.

When the definitive rejection arrived after those many rounds of revisions, I was actually relieved, because the demands of the reviewers had become quite excessive. We resubmitted the manuscript to a different journal, which required scaling it back: the new journal had different length restrictions, and some of the revisions only made sense in the context of the first journal’s specific reviewer requests and did not necessarily belong in the manuscript. The new set of reviewers also requested some revisions, but once we had made them, the manuscript was published within a matter of months.

I have also had frustrating experiences as a scientific peer reviewer. Some authors completely disregard suggestions for improving the manuscript, and it is ultimately up to the individual editors to decide whom they side with. Scientific peer review in its current form also does not involve testing for reproducibility. As reviewers, we have to accept the authors’ claims that they have conducted sufficient experiments to test the reproducibility and validity of their data. Reviewers do not check whether their own laboratory or other laboratories can replicate the results described in the manuscript. Scientific peer reviewers have to rely on the scientific integrity of the authors, even if their gut instinct tells them that the results may not be reproducible by other laboratories.

Because of these experiences, many scientists like to say that the current peer review system is “broken”, and we know that we need radical changes to make the peer review process more reliable and fair. There are two new developments in scientific peer review that sound very interesting: portable peer review and open peer review.

Richard Van Noorden describes the concept of portable peer review, which will soon be offered by a new company called Rubriq. Rubriq will conduct the scientific peer review and provide the results, for a fee, to the editors of a journal. Interestingly, Rubriq will also pay peer reviewers, something quite unusual in the current peer review system, which relies on scientists volunteering their time as peer reviewers. The basic idea is that if one journal rejects a paper after Rubriq’s peer review, the reviewers’ comments could still be used by the editors of the next journal the authors submit to, as long as that journal also subscribes to the Rubriq service. This would cut down on the review time at the new journal, because the editors could base their acceptance or rejection decision on the existing reviews instead of sending out the paper for another new, time-consuming review. I like this idea, because it “recycles” the efforts of the first round of review and will likely streamline the review process. My only concern is that reviewers currently use different review criteria depending on which journal they are reviewing for. When reviewing for a “high-prestige” journal, reviewers tend to set a high bar for novelty and impact, and their comments likely reflect this. It may not be easy for editors to use these reviews for a very different journal. Furthermore, editors get to know their reviewers over time and pick certain reviewers that they believe will give the most appropriate reviews for a submitted manuscript. I am not sure that journal editors would be pleased about “farming out” this process to a third party.

The second new development is the concept of open peer review, as proposed by the new open access scientific journal PeerJ. I briefly touched on this when discussing a paper on the emotional impact of genetic testing, but I would like to expand on it, because I am very intrigued by the idea of open peer review. In this new peer review system, scientific peer reviewers can choose to either remain anonymous or disclose their names. One would think that peer reviewers should be able to stand by their honest, constructive reviews, so there should be no need for anonymity. On the other hand, some scientists might worry about professional repercussions, because some authors may be offended by the critiques. Therefore, I think it is quite reasonable that PeerJ permits anonymity of the reviewers.

The true novelty of the open review system is that the authors can choose to disclose the peer review correspondence, which includes the initial comments by the reviewers as well as the authors’ own rebuttal and revisions. I think that this is a very important and exciting development in peer review. It forces the peer reviewers to remain civil and reasonable in their comments. Even reviewers who choose to remain anonymous are probably going to be more thoughtful in their assessment of a manuscript if they realize that potentially hundreds or thousands of other scientists could have a peek at their comments. Open peer review allows the public and the scientific community to look behind the usually closed doors of scientific peer review. This also provides a certain form of public accountability for the editors. They cannot just arbitrarily accept or reject manuscripts without good reasons, because by opening up the review process to the public, they may have to justify their decisions based on the reviews they solicited.

One good example of the civil tone and reasonable review requests and responses can be found in the review of the BRCA gene testing paper. The reviewers (one of whom chose to remain anonymous) asked many excellent questions, including questions about the demographics and educational status of the participants. The authors’ rebuttal to some of the questions was that they had not collected the data and could not include it in the manuscript, but they also expanded some of the presented data and mentioned caveats of their study in the revised discussion. The openness of the review process now permits the general reader to take advantage of the insights of the reviewers, such as noticing the missing information about the educational status of the participants.

The open review system is one of the most important new advances in scientific peer review, and I hope that other journals (even more conservative, traditional, non-open-access journals) will implement similar open peer review systems. This would increase the accountability of reviewers and editors, and hopefully improve the efficiency and quality of scientific peer review.