Nature Immunology | Editorial
Raising standards
Nature Immunology 14, 415 (2013) | doi:10.1038/ni.2603
Nature journals' updated editorial policies aim to improve transparency and reproducibility
Beginning in May, Nature and the Nature research journals are adopting editorial measures to improve the consistency and quality of reporting in the life-sciences articles they publish. To facilitate the interpretation and improve the reliability of published results, we will more systematically ensure the reporting of key methodological details, give more space to Methods sections, examine the statistics more closely and offer more ways for authors to be transparent about these matters.
Central to this initiative is a checklist intended to prompt authors to disclose technical and statistical information in their submissions and to encourage referees to consider aspects important for research reproducibility. It was developed on the basis of community discussions, including workshops held last year by the US National Institute of Neurological Disorders and Stroke (NINDS) and the National Cancer Institute (NCI) to address the problems underlying irreproducibility. We were also inspired by published studies and guidelines about reporting standards (or the lack thereof) and by the collective experience of editors at Nature journals. The resulting checklist is by no means exhaustive; instead, it focuses on a small number of experimental and analytical design elements critical for the interpretation of research results that are often reported incompletely. For example, authors will need to describe methodological parameters that may introduce bias or influence robustness and to provide precise characterization of key reagents, such as cell lines and antibodies, that may be subject to biological variability. The checklist also consolidates several existing policies about data deposition and data presentation.
Specifically, we will require more precise descriptions of statistics. To help improve the statistical robustness of papers, the Nature journals will now employ statisticians as consultants on certain papers, at the editors' discretion or at the referees' suggestion.
We also recognize that there is no single prescribed way of conducting an experimental study. Exploratory investigations are often not amenable to the same degree of statistical rigor as hypothesis-testing studies. Indeed, most academic laboratories do not have the means to carry out the level of validation that would be required, for example, to translate a finding from the laboratory to the clinic. However, there is no justification for failing to report, with full transparency, how a study was designed, conducted and analyzed, so that reviewers and readers can adequately interpret and build on the results.
To allow authors to describe their experimental designs and methods in enough detail for others to interpret and replicate them, the participating journals are removing length restrictions on Methods sections.
To further increase transparency, we now also encourage authors to provide, in tabular form, the data underlying the graphical representations used in figures. This is in addition to our well-established data-deposition policy for specific types of experiments and large datasets. The source data will be made accessible directly from the figure legend for readers interested in seeing it for themselves. We also continue to encourage authors to make use of resources for sharing detailed methods and reagent descriptions, by providing direct online linking between primary research articles and Protocol Exchange, an open resource into which authors can deposit the detailed experimental protocols used in their study.
Ensuring systematic attention to reporting and transparency is only a small step toward solving the issues of reproducibility that have been highlighted across the life sciences and particularly in biomedical research. Much bigger underlying issues contribute to the problem. Too many biologists still do not receive adequate training in statistics and other quantitative aspects of their subject. Mentoring of young scientists on matters of rigor and transparency is inconsistent at best. In academia, the ever-increasing pressures to publish and obtain the next level of funding provide little incentive to pursue and publish studies that contradict or confirm previously published results. Those who would put effort into documenting the validity or irreproducibility of a published piece of work have little prospect of seeing their efforts valued by journals and funders; meanwhile, funding and efforts are wasted on false assumptions.
Tackling these issues is a long-term endeavor that will require the commitment of funders, institutions, researchers and publishers. It is encouraging that funding agencies such as the NINDS and the NCI have led community discussions and are considering recommendations for researchers and themselves. We hope these efforts will expand further and translate into noticeable improvements. Meanwhile, our effort is a small step in improving how science is reported. We trust that our authors will grasp the significance of this step, and we hope that other publishers will adopt similar initiatives, because what is ultimately at stake is public trust in science.