Ivan Oransky: Retraction Watch

These days it’s hard to know whom to trust. In recent years, horse meat was sold as beef; honourable members of Parliament lied about their expenses; banks, the media and the police have all been implicated; the list goes on. Many of our institutions have taken a beating. Science seeks to understand the universe, and our civilisation depends on it for both health and treasure. But scientific research is not immune to error and dishonesty – hundreds of articles are retracted every year as a result. Recently, though, science has started getting help from the outside. New kinds of journalism are contributing to its improvement, thanks to communities on the internet, open-source software, and the hard work of a few individuals – none of whom are scientists.

Ivan Oransky is the vice president of MedPage Today, and teaches medical journalism at New York University. He previously held posts at various prestigious scientific journals. He helps to run two blogs, Retraction Watch and Embargo Watch, that complement the quality assurance done within science itself. The first airs the dirty laundry of sloppy or fraudulent scientific output, while the second shines a light on unreasonable constraints imposed on science journalists. Since they were started in 2010, these blogs have grown into a goldmine of information about many important cases, and continue to gain wider attention.

Oransky is not confined to any one specialism. “I have a BA in Biology, which really doesn’t make me a scientist,” he tells me. “I have an MD, but I did the first year of residency then left to be a journalist, so I’m not really a physician. I’m a practising journalist.”

But how do scientists trust what they know? Scientific results are checked by other scientists, who evaluate the importance of the findings, how they relate to existing work, and the plausibility of the methods used. Should there be the will and resources, other scientists might also attempt to reproduce the findings. Science appears to have the mechanisms to regulate itself, and to self-correct. Oransky is quick to inject realism. “Actually it doesn’t. It’s wonderful to have a rosy picture that science is self-correcting. Too often the refrain we get when we report on a retraction is, ‘Oh, everyone knew that that [article] was crap’. If a lot of people knew that a particular paper, or researcher’s work, was crap, and it’s in the literature, scientists shouldn’t be allowed to talk about self-correction, because they’re clearly not [self-correcting]; they’re clearly far more interested in maintaining the status quo and not rocking the boat than they are in correcting the record. So this veneer – scientists saying, ‘You should trust us, because we peer review, we have self-correction’ – I think they need to work harder to actually earn that.”

Oransky may not be a scientist, but his blog does science a great service. First, it pressures universities and journals to investigate scientific results that the community has called into question. Universities tend to resist. “The vast majority have no interest in transparency, or in correcting the record, because they see this as a black mark on their record. Many universities refuse to talk about investigations, they refuse to investigate, or they often find that their researchers didn’t commit misconduct.”

Second, retractions are often not widely publicised, and Retraction Watch helps to amplify their dissemination. Lack of awareness about retractions can lead to subsequent experiments, and even products, being built on faulty research. “Corporate interests are deeply concerned with the really poor level of science that is coming out of universities.” He cites the example of Bayer, a pharmaceutical company, trying to replicate results from a scientific journal: “Most of the results – compounds that [Bayer] licensed – didn’t do what the [articles] said they did. Companies are very disturbed by this. We often think that companies warp things, but here I think that they can be a check on the process.”

The questions that drive initiatives like Oransky’s offer plenty of food for thought for anybody who wonders how science comes to promote some fantasies into fact. Often the visible outcome of research is a scientific article, and Oransky sees the excessive focus on articles as a problem, advocating a much broader measure of scientific progress. Indeed, he believes this would help science advance. “We need to change the incentives. We need to value and give credit for the entire process of science, not just papers. We need to give credit for peer review, depositing data, writing software, mentorship, post-publication peer review… We need to have a robust framework for all of that.”