Recommendations for transparency and research integrity on preprint servers: a cross-sectional study across disciplines

Authors:
  • Mario Malički1
  • Ana Jerončić2
  • Gerben ter Riet3,4
  • Lex Bouter5,6
  • John P.A. Ioannidis1,7,8,9,10
  • Steve Goodman1,7,8
  • IJsbrand Jan Aalbersberg11
1 Meta-Research Innovation Center at Stanford (METRICS), Stanford University, Stanford, CA, USA
2 Department of Research in Biomedicine and Health, University of Split School of Medicine, Split, Croatia
3 Urban Vitality Centre of Expertise, Amsterdam University of Applied Sciences, Amsterdam, The Netherlands
4 Amsterdam UMC, University of Amsterdam, Department of Cardiology, Amsterdam, The Netherlands
5 Department of Philosophy, Faculty of Humanities, Vrije Universiteit, Amsterdam, The Netherlands
6 Amsterdam UMC, Vrije Universiteit, Department of Epidemiology and Statistics, Amsterdam, The Netherlands
7 Department of Medicine, Stanford University School of Medicine, Stanford, California, USA
8 Department of Epidemiology and Population Health, Stanford University School of Medicine, Stanford, California, USA
9 Department of Biomedical Data Science, Stanford University School of Medicine, Stanford, California, USA
10 Department of Statistics, Stanford University School of Humanities and Sciences, Stanford, California, USA
11 Elsevier, Amsterdam, The Netherlands

Aim

While peer review has been a cornerstone of scholarly communication for more than a century,1 it has been criticized for its lack of transparency and the delays it introduces between submission and publication of studies.2 Experiments to accelerate the dissemination of research began in the 1960s, and in the 1990s the first preprint servers emerged and became widely used in the Physical Sciences and Economics.3 In the last decade, more than 30 new preprint servers appeared,4 but research on preprints is still scarce. In line with our recent exploration of journals’ instructions to authors,5 and with reported associations between the transparency and completeness of reporting of studies and the policies of the journals in which they are published,6 we sought to explore what preprint servers recommend or require in their instructions to authors, especially regarding transparency and research integrity.

Methods

We conducted a cross-sectional analysis of all preprint servers that neither require specific study funding or institutional affiliation nor actively seek out peer reviewers for all submitted preprints. Between 25 January 2020 and 31 March 2020, we analysed server webpages that resembled Instructions to Authors (ItAs), as well as About, Policy, and Frequently Asked Questions pages. Finally, for each server, we created a user account and went through the preprint submission process (without submitting a preprint) to check whether additional information was available on the online submission platform (except for ChinaXiv, which required an email address associated with a Chinese institution). We extracted data on seven topics related to preprint policies, six related to submission requirements, and 18 related to transparency in reporting and research integrity.

Results

We analysed 57 preprint servers covering approximately 3 million preprints (Table 1). While most servers (n=41, 72%) accepted only research from specific (sub)disciplines, ten (18%) accepted research from all disciplines, and six (11%) limited submissions to researchers from a specific region or country. Almost half of the servers (n=27, 47%) had webpages whose titles could be classified as Instructions to Authors or Submission Guidelines. The policies and submission requirements addressed most often were scholarly scope (n=57, 100%) and moderation conducted before or after preprints are made public (n=47, 82%). However, only two servers, Preprints.org and Research Square, used a moderation checklist; the latter also provided a “badge” for passed checks. Of the 18 transparency in reporting and research integrity topics we analysed (Table 2), preprint servers addressed a median of 1 topic (range 0 to 11), whereas the median for the three servers with strong Health Sciences communities (bioRxiv, medRxiv, and SSRN) was 7 (range 4 to 8). The most commonly addressed topics across all servers were data sharing (n=22, 39%), plagiarism (n=15, 26%), and use of ORCID (n=14, 25%).

Discussion

Even though most servers employ some form of moderation of preprints, they provide very little guidance on issues that are important for transparency and research integrity.

Conclusion

The recent increase in preprints and preprint servers brings an opportunity to encourage or require transparent reporting of research, adherence to research integrity standards, and the use of detailed checklists before preprints are made public. In doing so, servers could improve the quality of, and trust in, scholarly information exchange.

References

  1. Moxham N, Fyfe A. The Royal Society and the prehistory of peer review, 1665–1965. The Historical Journal. 2017:1-27.
  2. Johnson R, Watkinson A, Mabe M. The STM Report: An overview of scientific and scholarly journal publishing. The Hague, The Netherlands: International Association of Scientific, Technical and Medical Publishers; 2018.
  3. Cobb M. The prehistory of biology preprints: A forgotten experiment from the 1960s. PLOS Biology. 2017;15(11):e2003995.
  4. Rittman M. Preprint Servers. 2018. Available from: http://researchpreprints.com/.
  5. Malički M, Aalbersberg IJJ, Bouter L, ter Riet G. Journals’ instructions to authors: A cross-sectional study across scientific disciplines. PLOS ONE. 2019;14(9):e0222157.
  6. Stevens A, Shamseer L, Weinstein E, Yazdi F, Turner L, Thielman J, et al. Relation of completeness of reporting of health research to journals’ endorsement of reporting guidelines: systematic review. BMJ. 2014;348:g3804.