Nature Editorial: Checklists work to improve science

Indigophoton

A report on an assessment of a policy Nature implemented to improve the quality of submissions and the reproducibility of published research.
Five years ago, after extended discussions with the scientific community, Nature announced that authors submitting manuscripts to Nature journals would need to complete a checklist addressing key factors underlying irreproducibility for reviewers and editors to assess during peer review. The original checklist focused on the life sciences. More recently, we have included criteria relevant to other disciplines.

To learn authors’ thoughts about reproducibility and the role of checklists, Nature sent surveys to 5,375 researchers who had published in a Nature journal between July 2016 and March 2017 (see Supplementary information and https://doi.org/10.6084/m9.figshare.6139937 for the raw data).

Of the 480 who responded, 49% thought that the checklist had improved the quality of research published in Nature (15% disagreed); 37% thought the checklist had improved quality in their field overall (20% disagreed).

Respondents overwhelmingly thought that poor reproducibility is a problem: 86% acknowledged it as a crisis in their field, a rate similar to that found in an earlier survey (Nature 533, 452–454; 2016). Two-thirds of respondents cited selective reporting of results as a contributing factor.
I was quite astonished by this:
Two studies have compared the quality of reporting in Nature journals before and after the checklist was implemented, and with journals that had not implemented checklists. Authors of papers in Nature journals are now several times more likely to state explicitly whether they have carried out blinding, randomization and sample-size calculations (S. Han et al. PLoS ONE 12, e0183591; 2017 and M. R. Macleod et al. Preprint at bioRxiv https://doi.org/10.1101/187245; 2017). Journals without checklists showed no or minimal improvement over the same period. Even after implementation of the checklist, however, only 16% of papers reported the status of all of the crucial ‘Landis 4’ criteria (blinding, randomization, sample-size calculation and exclusion) for in vivo studies — although reporting on individual criteria was significantly higher. (Emphasis added)
I'm astonished that it's not considered basic information, and that papers get through peer review without it.
Progress is slow, but a commitment to enforcement is crucial. That is why we make the checklist and the reporting of specific items mandatory, and monitor compliance. The road to full reproducibility is long and will require perseverance, but we hope that the checklist approach will gain wider uptake in the community.
https://www.nature.com/articles/d41586-018-04590-7
 