Among the ecosystem of academic journals, clinical journals have a particularly diverse readership. Clinician-scientists, epidemiologists, and statisticians all strategically sift through clinical manuscripts to explore conclusions of interest. Very rarely are all three groups satisfied. Due to size limitations, journals regularly relegate ‘obscure’ methodological details to an unformatted supplementary document. The underlying code and data used to generate manuscript figures are also rarely shared in computer-readable forms that would promote reproducibility of study findings. Lastly, navigating and manually extracting data from formatted tables, rather than computer-readable ones, makes secondary analyses of the clinical literature more labour-intensive than necessary. These problems emerge from a system where novelty is prioritized above relevance, ease of submission is prioritized over replicability, and expedience is prioritized over accessibility to the diverse end-users of clinical evidence. By moving away from a manuscript submission process designed for typeset print towards one that promotes interactive engagement with the manuscript, research evidence can be made relevant, replicable, and accessible to its diverse readership.
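To illustrate why computer-readable tables matter for secondary analysis, the short Python sketch below parses a summary table shared as CSV and recomputes a pooled estimate in a few lines. The trial names, column headings, and counts are invented for illustration, not drawn from any actual submission.

```python
import csv
import io

# Hypothetical summary table, as it might appear in a machine-readable
# supplementary file (invented trial names and event counts).
summary_csv = """trial,events,total
TrialA,12,100
TrialB,30,250
TrialC,9,80
"""

rows = list(csv.DictReader(io.StringIO(summary_csv)))

# Pooling event counts across trials is trivially scriptable because the
# table is structured data, not a typeset PDF table to be retyped by hand.
events = sum(int(r["events"]) for r in rows)
total = sum(int(r["total"]) for r in rows)
pooled_rate = events / total
print(f"{events}/{total} = {pooled_rate:.3f}")  # prints: 51/430 = 0.119
```

The same few lines would work unchanged against any table shared in this format, which is precisely what a typeset table cannot offer.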
The objective of this project is to draft an RMarkdown and a Jupyter notebook template for the submission of interactive manuscripts. To encourage clinical researchers to familiarize themselves with the process of generating an interactive manuscript, we further seek to develop openly available educational videos. Such objectives are not novel and are inspired by numerous examples outside of clinical publishing. For example, in 2014 Nature piloted a feature permitting readers to view interactive Jupyter notebooks alongside the online versions of its articles. Authorea, a collaborative publishing platform, has a feature permitting users to dynamically change views between the underlying dataset, the code, and the visualizations of publication figures. Distill, an open-source machine learning journal, has a reactive diagram feature similar to Authorea's, and allows readers to experiment with the presented algorithms using their own input parameters. Some groups, such as the Reproducible Document Stack, are directly aimed at enabling researchers to publish reproducible manuscripts in online journals (1).
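As a sketch of what such a submission template might contain, an RMarkdown document pairs narrative text with the code chunks that generate each figure from the shared dataset. The YAML fields, section heading, and file path below are illustrative assumptions, not the actual template:

````markdown
---
title: "Example interactive manuscript (illustrative template)"
author: "A. Author"
output: html_document
---

## Methods

Narrative text lives alongside the analysis. The chunk below draws
Figure 1 directly from the shared dataset, so readers can rerun it.

```{r figure-1}
# Hypothetical file name; the real submission would ship this CSV.
outcomes <- read.csv("data/outcomes.csv")
plot(outcomes$dose, outcomes$response)
```
````

Because the figure is regenerated at render time, the manuscript, its data, and its code cannot silently drift apart.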
These features are examples of how interactive research publishing has evolved to meet the needs of a diverse readership (1). To address the problems of poor methodological transparency, lack of code and data sharing, and limited ability to conduct secondary analyses of the literature, we seek to draft RMarkdown and Jupyter templates for clinical manuscript submission. If selected for presentation at the EBMLive conference, we hope to work with academic and publishing partners to pilot a submission call using such reproducible templates.
We expect that authors will provide both raw and summarized data tables in computer-readable formats, documented analytical code in research repositories, and manuscript drafts based on the relevant RMarkdown or Jupyter template. This pilot process will inform the further design of manuscript submission workflows that are more relevant to the broader research readership, more replicable from a computational perspective, and more accessible to readers across the world.

REFERENCES

1. Reproducible Document Stack – supporting the next-generation research article [Internet]. eLife. 2017 [cited 2020 Mar 1].
Available from: https://elifesciences.org/labs/7dbeb390/reproducible-document-stack-supporting-the-next-generation-research-article