Failure is an indispensable part of science.
Any graduate student in a scientific field has a tale of an unexpected result, an experiment gone wrong, or data that tells no clear story.
"There is this myth that scientific progress is achieved by some sort of a direct march to the truth, but nothing is actually further from the truth than that," said Mario Livio, an astrophysicist at the Space Telescope Science Institute, home of the science program for the Hubble telescope.
"Progress is achieved through the zigzag path with many false starts, many blind alleys," Livio said. "Mistakes and failures are at some level part and parcel of the scientific process."
The kinds of people who first discover that the earth actually revolves around the sun, or that humans are related to apes, have to think far outside the box to reach those groundbreaking discoveries.
"Well guess what?" Livio said. "When you think outside the box, you may make a mistake."
In his book "Brilliant Blunders," Livio wrote about a mistake made by Linus Pauling, a Nobel prize-winning chemist. He was the first to model protein molecules, but when he tried to develop a model for the structure of DNA, he failed miserably.
"He had three strands instead of two, his molecule was built inside out, and it virtually disobeyed some basic rules of chemistry," Livio said. "And this was coming from the greatest chemist to have ever lived."
Pauling's model was wrong, but it turned out that his method of testing it with X-rays was right. Those methods were adopted by Rosalind Franklin, whose data was used by James Watson and Francis Crick to discover DNA's double helix structure.
"Watson and Crick actually adopted precisely the methods and almost the way of thinking that they actually learned from Pauling," Livio said.
This is the norm in science: Each scientist builds on the failures of the one who came before in a messy, zigzag path toward the truth. Because of this, science works best when new results are shared widely, even so-called "negative results" that don't confirm the original hypothesis posed by the researcher.
But increasingly, these negative results are getting stuck in filing cabinet drawers instead of being published or presented at conferences.
The dangerous side of the 'file drawer effect'
British doctor and advocate Ben Goldacre wants to change that, at least in one field. With his "AllTrials" campaign, he is pushing the pharmaceutical industry to publish all of the clinical trials it conducts.
When they don't, he argues, the consequences can be disastrous.
"We saw a very vivid illustration of that in the TGN1412 trial," Goldacre said.
The TGN1412 trial is a notorious British trial that tested a new drug targeting the immune system. When it was first given to humans in a Phase I clinical trial in 2006, all six men who received the drug went into multisystem organ failure and nearly died.
"They ended up on intensive care, their fingers and toes fell off," Goldacre said.
"At the time, everybody thought...this was completely unforeseeable, but in actual fact, in retrospect, it turned out that somebody else had tried a treatment like that."
Goldacre wrote in his 2013 book "Bad Pharma" that an expert panel analyzing the disaster after the fact found that a similar drug had been tested a decade before in a patient who had also gotten sick.
"But nobody knew about it, because the results of these Phase I trials, these first in human trials, are almost never made publicly available," Goldacre said.
Most drug researchers in the U.S. now have to share data in a public registry, but compliance is spotty.
This week, Goldacre brought his "AllTrials" campaign to the U.S., to increase reporting and get older clinical trial data for drugs currently on the market.
Publication bias a problem across disciplines
There are billions of dollars tied to the results of clinical trials in the pharmaceutical industry.
Even though the financial stakes are lower in other fields, this bias toward publishing positive results exists to varying degrees and for different reasons across disciplines, from basic biomedical research to psychology.
Emma Granqvist, a journal publisher at Elsevier, said even if patients' health is not at risk, there are other reasons sharing inconclusive or negative results is important.
"The most obvious one is to avoid wasted resources," said Granqvist. "You don't want another lab at the other end of the world doing exactly (the same thing you did) a week later."
Another reason is to correct the scientific record.
"If something has been published on this topic saying A is true, and you repeatedly find that A is not true, it's important to share that," Granqvist said.
Nearly 9 out of 10 published results are positive
A 2012 study by Stanford research bias expert Daniele Fanelli found that 86 percent of published research results in 2007 were positive. That was up from 70 percent in 1990-1991.
That rise is likely not because scientists have become better at making educated guesses. Fanelli believes negative research results are simply not getting published.
"One of the biggest sources of publication bias, I believe, is the sheer lack of motivation on behalf of authors for writing up results that, in their own eyes, are not as interesting or exciting," Fanelli said.
Granqvist and others argue that lack of motivation comes from an incentive system in science that is out of whack.
Many professors have publication quotas, and researchers have to include lists of articles in grant applications. That all leads to a widely held belief that "funding will come from publishing a lot, and from publishing in well-respected journals," Granqvist said.
And as Fanelli's research illustrates, journals publish very few negative findings.
"So, in essence, you know that if you're going to try to get funding for future grants, it's just a waste of time to publish your negative data," Granqvist said.
As a result, the open data sharing that is good for science may not be good for the careers of individual scientists. Many individuals and institutions are beginning to see this as a problem.
Or even, as Ben Goldacre puts it, "a cultural blind spot for the whole of medicine and academia."
Diverse solutions emerging to a widespread issue
Advocates of open notebook science urge logging experimental data online in near real time, so it can be shared with colleagues without first being written up and published.
The Center for Open Science at the University of Virginia is incentivizing replication studies to verify whether previous research holds up.
At Elsevier, Granqvist is launching a new journal, "New Negatives in Plant Science," this fall, which will join titles like the "Journal of Negative Results in Biomedicine."
The goal, Granqvist said, is to highlight the issue of positive publication biases and provide a platform for plant scientists to publish negative, controversial and inconclusive results.
Even the funding giant National Institutes of Health is getting into the game. Sally Rockey, who hands out most of the institutes' $30 billion in grants as head of the Office of Extramural Research, said this year the agency changed its applicant resume, known as a "biographical sketch."
"We're trying to change the culture in what we've done with the biosketch here, away from that notion of publish or perish," Rockey said.
According to Rockey, the old format required applicants to list 15 publications. The new version asks for five professional accomplishments, which can be annotated with anything from an important data set to a new mouse model.
"There could be a lot of different things that allow you to demonstrate that accomplishment, so not just a publication," Rockey said.
Even reformers say changing the culture of science is going to be hard.
But if it makes researchers a little more likely to try something new without worrying too much about publishing their results, they argue it will be worth it.
After all, embracing failure is essential to the pursuit of science.
"You should not be afraid to think outside the box, even if that will occasionally come at a cost of a blunder," said Mario Livio.