Author: Joanna Carpenter
We tend to think of science as a systematic process, steadily revealing truth about the world around us. Yet in 2005, John Ioannidis of the University of Ioannina School of Medicine in Greece and the Tufts University School of Medicine in Massachusetts published a paper in PLoS Medicine with the title ‘Why most published research findings are false’.
Professor Ioannidis suggested that the probability of a research finding being true depends on the study design: at most 0.85 for an “adequately powered randomised controlled trial (RCT)”, and as low as 0.0015 for some “discovery-oriented exploratory research with massive testing”.
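These headline numbers come from the paper’s positive predictive value (PPV) calculation: the post-study probability that a claimed finding is actually true. A minimal sketch of the core relationship, before the paper’s further adjustments for bias and for multiple teams testing the same question (the example inputs below are illustrative):

```latex
% PPV: post-study probability that a claimed finding is true
% (Ioannidis, 2005), before any bias adjustment.
% R        : pre-study odds that the tested relationship is real
% 1 - \beta: the study's statistical power
% \alpha   : the Type I error rate (conventionally 0.05)
\[
  \mathrm{PPV} = \frac{(1 - \beta)\,R}{R + \alpha - \beta R}
\]
% Example: a well-powered study (power 0.80, alpha 0.05) testing an
% even, 1:1 bet (R = 1) gives 0.80 / 0.85, roughly 0.94; the paper's
% bias terms pull such figures down further.
```

When R is tiny, as in exploratory research screening thousands of candidate relationships at once, even a “significant” result is far more likely to be a false positive than a true discovery, which, together with low power and bias, is how figures like 0.0015 arise.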
A simple response to this would be that other researchers should try to reproduce a research finding to test whether it is true.
But it is not that simple. “I hate the term reproducibility,” Dr Shai Silberberg, program director at the US National Institute of Neurological Disorders and Stroke (NINDS), tells me. “There are many valid reasons why research might not be reproduced… It’s better to talk about rigor and transparency.”
Dr Silberberg has been concerned about the quality of research reports in his field and is pushing for changes to enhance rigour and transparency. He’s not alone in seeing the problem. “Across disciplines, it is not difficult to make the case now that reproducibility is important, that transparency is important, and that we can do better,” says Professor Brian Nosek of the Center for Open Science and the University of Virginia.
In October 2012, Dr Silberberg and NINDS colleagues published ‘A call for transparent reporting to optimize the predictive value of preclinical research’, in Nature. They argued that “poorly designed animal studies, obscured by deficient reporting, may, in aggregate, serve erroneously as the scientific rationale for large, expensive and ultimately unsuccessful clinical trials” that “may unnecessarily expose patients to potentially harmful agents”.
They also noted that “evidence that clinical trials can yield biased results if they lack methodological rigor led to the development and implementation of the CONSORT guidelines for randomized controlled trials (among other guidelines)… [which] have improved the transparency of clinical study reporting in journals that have adopted them.”
They proposed that grant applicants and authors of research papers should meet a core set of reporting standards on randomization, blinding, sample-size estimation and data handling. Separately, the NINDS and subsequently the US National Institutes of Health have made plans to improve training of researchers on experimental design, ensure more systematic evaluation of experimental design in grant applications, and make available the raw data that supports published research.
Working with journals
Dr Silberberg and the NIH have also been working with journals. In November last year, Science, Nature Publishing Group, the NIH and editors representing more than 30 basic and preclinical science journals agreed a set of principles and guidelines to support robust and transparent reporting of preclinical research.
The guidelines cover rigorous statistical analysis, transparency in reporting, sharing of data and materials, and consideration of refutations; they also commit journals to consider establishing best-practice guidelines for image-based data and for describing biological material.
TOP guidelines
Professor Nosek at the Center for Open Science has also been working with Science and other leading journals, societies, funders and researchers on guidelines for reporting research across a range of disciplines; signatories are currently being gathered.
Called the TOP (Transparency and Openness Promotion) guidelines, they build on the NIH preclinical life sciences guidelines and also the work of COPDESS (the Coalition for Publishing Data in the Earth and Space Sciences) and DA-RT (Data Access and Research Transparency) for the social sciences.
There are eight standards, each with three levels of progressively greater stringency.
Badges
In addition, the Center for Open Science has established a system of badges for open data, open materials and pre-registration, which participating journals can award to compliant research papers.
The open data badge, for instance, is awarded when the raw data are made available to others; the open materials badge, when the methodology and materials used in the research have been made publicly available in sufficient detail for others to reproduce the work.
“It’s a very simple thing, it’s very low risk, it’s very resource light, and yet it’s effective. The journal Psychological Science adopted badges in January 2014 and over the year their sharing rates of data and materials went from near-zero to about 25% of the articles,” Professor Nosek tells me.
Pre-reg prize
The final badge is for pre-registration of a study. As well as the badge, the Center for Open Science now has a fund of $1 million as an incentive for researchers to pre-register their studies: the first thousand teams to publish an eligible study will each get $1,000. “They have to describe and certify the design of the study and the analysis plan prior to conducting the study.”
This is to stop people from ‘torturing the data’ until some finding emerges, even when they didn’t find what they expected. “Exploratory analysis is not a bad thing. The problem is when exploratory analysis is confused with confirmatory analysis. You can’t generate a hypothesis and test a hypothesis with the same data,” Nosek explains. “You’re enhancing the likelihood of discovering falsely when you do many analyses.”
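Nosek’s point about “discovering falsely” is the familiar multiple-comparisons problem, and it is easy to demonstrate in simulation. The sketch below is purely illustrative, not drawn from the Center’s work: it generates studies in which no real effect exists at all, tries 20 analyses on each, and counts how often at least one comes out “significant” at p < 0.05.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_studies, n_analyses, n_per_group = 10_000, 20, 30
alpha = 0.05

studies_with_a_hit = 0
for _ in range(n_studies):
    # 20 outcome measures per study, all pure noise: no true effect anywhere.
    group_a = rng.normal(size=(n_analyses, n_per_group))
    group_b = rng.normal(size=(n_analyses, n_per_group))
    pvalues = stats.ttest_ind(group_a, group_b, axis=1).pvalue
    if (pvalues < alpha).any():
        studies_with_a_hit += 1

# With 20 independent looks at null data, at least one false positive is
# expected in roughly 1 - 0.95**20 of studies, i.e. about 64%.
print(f"Studies with at least one 'significant' result: "
      f"{studies_with_a_hit / n_studies:.0%}")
```

Pre-registration defuses exactly this: the confirmatory test is named before any searching happens, and anything found along the way is labelled as the exploration it is.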
The goal of pre-registration is to make clear which elements of a design are confirmatory. Other analyses can still be done, but they must be reported as exploratory work.
Professor Nosek explains that pre-registering changes the workflow: “My lab has been pre-registering for the last two years and it’s not been easy to write it all down in advance, because we just didn’t work that way. It took us a few trials each round to get the hang of it.”
He hopes that the prize will be an incentive for other groups to start a new habit. “We hope that when they start it, they won’t want to go back. There are big benefits to doing a little bit more planning upfront. It doesn’t have to take a long time, but just having that process can help improve the design of the studies.”
Another advantage of pre-registration is that it helps to tackle publication bias, where positive research findings are more likely to be published than negative ones.
Open Science Framework
Another initiative of the Center for Open Science is its Open Science Framework, which helps researchers manage their workflow from planning through execution to archiving. The software aims to simplify researchers’ work and keep track of collaborations and projects. It is about to sign up its ten-thousandth user and is being used in a new joint project with the UK-based Open Knowledge Foundation to develop an open, online database of clinical trials.
All Trials
The All Trials project aims to aggregate information from a variety of sources to build up a comprehensive, global picture of the data on all trials conducted on medical treatments.
Dr Ben Goldacre, Senior Clinical Research Fellow in the Centre for Evidence Based Medicine at the University of Oxford and well known for advocating exactly this kind of transparency and data sharing, will direct the project.
True or false?
John Ioannidis, now at Stanford University, published a further paper last October in PLoS Medicine, this time called ‘How to make more published research true’. In it, he argued for changes in how research organisations promote and reward researchers. He also proposed registration of studies; sharing of data, materials and protocols; improvements to standards of study design, peer review and reporting; and better training.
But perhaps science is still, after all, a means to uncover truth.