Poor methodological quality and incomplete reporting of published research affect clinical decision-making1–3 and contribute to research waste.4 Reporting guidelines (RGs) offer one solution by promoting transparency and ensuring that key methodological safeguards are fully reported.5 In oncology, recent studies have shown evidence of poor reporting quality in phase II and III trials.6 7 Moreover, a survey of members of the European Organization for Research and Treatment of Cancer found that the frequency of adverse event reporting fell short of members’ expectations.7 Correct implementation of the Consolidated Standards of Reporting Trials (CONSORT) checklist by clinical trialists would address such shortfalls by ensuring that all harms and unexpected effects encountered by the treatment group are reported in oncology trials (CONSORT item 19).8 Indeed, the CONSORT statement, like other RGs, has been shown to improve the quality of research when incorporated into study design and reporting.9 10
The registration of clinical trials is another mechanism intended to promote transparency and improve methodological standards. Two recent studies demonstrated small improvements over time in the quality of trial registration, but concluded that more improvement is necessary with respect to important items such as clearly defined primary outcomes.11 12 Discrepancies between outcomes listed in the trial registry record and those reported in the published trial have been noted across many medical specialties, providing indirect evidence for selective reporting bias. This bias occurs when researchers preferentially include (or exclude) outcomes based on statistical significance (or a lack thereof).13 14 The International Committee of Medical Journal Editors (ICMJE) and the WHO have instituted policies to improve clinical trial reporting and registration. The US government has made prospective clinical trial registration a legal mandate,15 and similar regulations have been implemented in Europe.16 In January 2017, the US National Institutes of Health (NIH) began requiring registration of all NIH-funded randomised trials in ClinicalTrials.gov (the US clinical trial registry) prior to patient enrolment and reporting of summary results after trial completion.17
In this study, we first evaluated the published guidance (eg, instructions for authors) provided by a cohort of highly ranked oncology journals to authors regarding the use of RGs for common study types. We also examined these journals’ policies on clinical trial registration. We then evaluated whether this guidance has led to improvements in reporting and registration.
Our primary objective was to examine adherence to RGs and trial registration policies across 21 oncology journals. Our secondary objective was to investigate whether adherence to the CONSORT statement and ICMJE trial registration policies affects reporting practices in oncology. Our exploratory objective was to describe rates of adherence to oncology-specific RGs (eg, REporting recommendations for tumour MARKer prognostic studies (REMARK)).
We surveyed Google Scholar and identified the top 20 oncology journals, sorted by h5-index. We also included JAMA Oncology because its impact factor places it in the top 20 oncology journals, but Google rankings do not yet reflect that status. We conducted a cross-sectional review of the oncology journals’ policies and instructions for authors concerning guideline adherence and trial registration requirements. This study did not meet the regulatory definition of human subject research, so it was not subject to Institutional Review Board oversight. We applied relevant Statistical Analyses and Methods in the Published Literature (SAMPL) guidelines for reporting descriptive statistics.18
Before initiating the study, all authors met to outline the study design. A study protocol was developed based on our previous investigations.19 20 A pilot test was conducted on the first five journals to identify any flaws in the protocol and to establish uniformity in data extraction. Follow-up meetings were held periodically throughout the data extraction process to resolve discrepancies. CW, MH and CC performed web-based searches for each journal and searched the Instructions for Authors page for relevant information. Another author (GM) validated all data. Each author was blinded to the ratings of the others. The methodology for the primary and secondary objectives is depicted in figure 1.
For each journal, we determined whether it adhered to ICMJE Uniform Requirements for Manuscripts (URM), Animal Research: Reporting of In Vivo Experiments (ARRIVE), Case Report (CARE), CONSORT, Meta-Analysis of Observational Studies in Epidemiology (MOOSE), Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA), Quality of Reporting of Meta-analyses (QUOROM), Strengthening the Reporting of Observational Studies in Epidemiology (STROBE), or Standards of Reporting Diagnostic Accuracy Studies (STARD). Additionally, we extracted whether or not a journal mentioned ClinicalTrials.gov, WHO trial registries or both. If a journal mentioned trial registration without naming a specific registry, we coded that journal as ‘generic trial registration’.
Definitions were constructed a priori by CW and MV for the coding process based on previous literature definitions.19 20 For each of the RGs and registries, adherence by each journal was classified as ‘compulsory/required’, ‘recommended’ or ‘not mentioned’. For cases in which it was unclear whether the journal followed a specific guideline or registry, adherence was rated as ‘unclear’. Keywords such as ‘must’, ‘need’ or ‘manuscripts will not be considered for publication unless’ were categorised as compulsory/required. Similarly, keywords such as ‘should’, ‘encouraged’ and ‘prefer’ were categorised as recommended.
After data extraction, MH and CC reviewed each journal’s website to determine which of the common study types relating to extracted RGs (systematic reviews/meta-analyses, clinical trials, diagnostic accuracy studies, case reports, epidemiological studies and animal studies) were accepted. Next, MH and CC emailed the editors in chief of the included journals for confirmation regarding the extracted study type. We sent two reminder emails at 1-week intervals to ensure best practices in eliciting email response.21 We cross-referenced the journals’ accepted article types to the data that we extracted from journal websites. If a journal did not publish a particular type of study, then it was not considered in comparing accepted study types and RG adherence. For example, ARRIVE guidelines were not considered relevant to a journal if it did not publish preclinical animal studies.
Next, CW performed a PubMed search using publication type ‘randomized controlled trial’ (RCT) for all included journals over a 5-year period (1 January 2012 to 31 December 2016). This search strategy has been shown to have over 93% sensitivity and specificity for retrieving RCTs.22 All RCTs were divided into groups based on whether the publishing journal adhered to CONSORT guidelines and whether it endorsed ICMJE trial registration policies. CW then randomly sampled 30 RCTs from each journal. If a journal did not publish at least 40 RCTs during the 5-year study period, it was excluded. Odds ratios (ORs) and 95% confidence intervals (CIs) were calculated from the extracted data using Stata V.13.1.
A single author (CW) surveyed each journal’s website to descriptively analyse the rate of adherence to oncology-specific RGs. A list of these guidelines can be found on the Enhancing the QUAlity and Transparency Of Health Research (EQUATOR) Network’s website.23
All authors met after completing data extraction and analysis to resolve any final discrepancies in the scoring of the journal data.
Our study comprised 21 oncology journals. Table 1 shows all extracted data. Only one (4.8%) journal was found not to adhere to any RG, while five (23.8%) did not adhere to any trial registration policies. The ICMJE-URM was mentioned by 15 (71.4%) journals, and the EQUATOR Network was mentioned by three (14.3%) journals. We recorded an editor response rate of 52.4% (11 of 21).
The CONSORT statement was mentioned by 16 journals: 11/21 (52.4%) required adherence and 5/21 (23.8%) recommended adherence. ARRIVE was mentioned by 11 journals: 1/20 (5.0%) required adherence and 11/20 (55.0%) recommended adherence. STARD was mentioned by eight journals: 4/19 (21.1%) required adherence and 4/19 (21.1%) recommended adherence. PRISMA was mentioned by eight journals: 3/21 (14.3%) required adherence and 5/21 (23.8%) recommended adherence. STROBE was mentioned by seven journals: 5/21 (23.8%) required adherence and 2/21 (9.5%) recommended adherence. MOOSE was mentioned twice and was recommended both times. QUOROM and CARE were not mentioned in any journal’s Instructions for Authors page.
Five (23.8%) journals did not mention trial registration at all. Ten journals mentioned trial registration through ClinicalTrials.gov: 3/21 (14.3%) required registration and 7/21 (33.3%) recommended it. Six journals mentioned WHO trial registration: 4/21 (19.0%) required registration and 2/21 (9.5%) recommended it. Generic trial registration was mentioned by 11 journals: 8/21 (38.1%) required generic registration and 3/21 (14.3%) recommended it.
Our PubMed search yielded 2614 results and 13 eligible journals, defined as those publishing at least 40 RCTs in the 5-year period studied. Journal-specific publication rates of a CONSORT flow diagram and trial registry number are available via the Open Science Framework (osf.io/7g6td).
Of the 13 journals, 10 adhere to CONSORT guidelines and three do not (figure 2). The RCTs published in the 10 journals that adhere to CONSORT included a flow diagram 70.3% (211/300) of the time. The RCTs published in the three journals that do not adhere to CONSORT included a flow diagram 57.8% (52/90) of the time. This finding suggests that journal adherence to CONSORT increases the likelihood of authors reporting this key item (OR=1.73, 95% CI 1.03 to 2.89).
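As a rough check, the reported effect size can be reproduced from the published 2×2 counts. The sketch below is plain Python rather than the Stata used in the study, and it applies the Woolf (log-normal) approximation for the confidence interval; the bounds it yields (1.07 to 2.82) differ slightly from the published interval (1.03 to 2.89), which was presumably computed by a different (eg, exact) method in Stata.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR with a Woolf (log-normal) 95% CI for a 2x2 table:
    a, b = with / without the item in the exposed group,
    c, d = with / without the item in the unexposed group."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of ln(OR)
    log_or = math.log(or_)
    lo = math.exp(log_or - z * se)
    hi = math.exp(log_or + z * se)
    return or_, lo, hi

# Flow-diagram counts from the study:
# CONSORT-adhering journals: 211 of 300 RCTs included a flow diagram;
# non-adhering journals: 52 of 90 RCTs did.
or_, lo, hi = odds_ratio_ci(211, 300 - 211, 52, 90 - 52)
print(f"OR = {or_:.2f}, 95% CI {lo:.2f} to {hi:.2f}")
# → OR = 1.73, 95% CI 1.07 to 2.82
```

The same function applied to the registry-number counts (182/270 vs 81/120) reproduces the null result reported below (OR≈1.00).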
ICMJE trial registration policies
Nine of the 13 journals endorsed ICMJE trial registration policies and four did not (figure 3). The RCTs published in the nine journals that endorsed ICMJE registration policies included a trial registration number 67.4% (182/270) of the time. The RCTs published in the other four journals included a trial registration number 67.5% (81/120) of the time. No association existed between endorsement of ICMJE and reporting of a trial registry number (OR=1.00, 95% CI 0.61 to 1.61).
Six of the included oncology journals adhered to REMARK guidelines (data available online: osf.io/7g6td). No other mention of oncology-specific guidelines was found.
Oncology journals support the use of RGs more often than journals in other medical specialties. Only one journal in our sample did not adhere to any RGs. Recent investigations have found no adherence to RGs in 48% (32/67) of haematology journals,19 41% (11/27) of emergency medicine journals20 and 41% (15/37) of critical care journals.24
Specific guidelines such as CONSORT and PRISMA still need greater endorsement, since several journals do not mention them. Evidence suggests that adherence to CONSORT and PRISMA improves some aspects of study methodology in oncology, although these studies also identified important items that remain under-reported.25–27 Additional studies corroborate these findings in other medical specialties.28 29 Specifically, key items such as funding source, proper adherence to study protocol, sample size calculation, adverse events and description of the trial’s design have been found to be under-reported.7 25 30 31 The same trends have been observed in oncology systematic reviews; these studies have also noted that risk of bias evaluations are infrequently reported.26
Calls have been made in the past for increased transparency in clinical trial reporting for the sake of patient outcomes and research integrity.32 RGs were designed to increase scientific transparency and integrity. For example, CONSORT requires trial registration and the reporting of the registration number, which may prevent the selective reporting of outcomes upon publication. A beneficial aspect of RGs for peer reviewers and editors includes the ease by which the methodological rigour of a trial may be evaluated. This is particularly beneficial for junior authors and reviewers who wish to familiarise themselves with aspects of high-quality study designs. The submission of a guideline checklist along with a manuscript may also decrease the time burden for reviewers and editors who choose to investigate the methodological quality of a manuscript.
Our secondary objective was to determine whether journal endorsement of a guideline or policy affected the design and reporting of an RCT. Our results demonstrate that journal adherence to CONSORT guidelines increased the likelihood of authors publishing a CONSORT flow diagram, indicating that oncology journal adherence to CONSORT has a positive effect on reporting practices within oncology trials. While publication of a participant flow diagram may fail to predict adherence to other CONSORT items, our finding nonetheless demonstrates that good reporting practices are more likely to occur in CONSORT-endorsing oncology journals.
With regard to trial registration, our results demonstrate no association between oncology journal endorsement of ICMJE clinical trial registration policies and author publication of a registry number. The rates of publication of a registry number were similar in ICMJE-endorsing and non-ICMJE-endorsing journals (67.4% and 67.5%, respectively). We recognise a need for improvement because one-third of the analysed trials did not direct the readership to the registration page, and because journals that endorsed the ICMJE trial registration policy failed to distinguish themselves from non-endorsing journals. Here, oncology journal policy can help steer clinical trials in a more transparent direction by enforcing the registration of clinical trials and the reporting of the registry number in published manuscripts.
Some areas of medicine, such as oncology, have unique study designs that require their own specific guidance with regard to methods and reporting. Therefore, our exploratory objective was to determine the rates of adherence to oncology-specific RGs. We found that six journals adhered to REMARK guidelines for tumour marker prognostic studies, making it the most popular and only oncology-specific RG within our journal cohort. The rate of adherence to REMARK is encouraging, given the unique study design for which it was created, and the fact that not all journals within our sample accept tumour marker prognostic studies. This finding reflects the overall theme that oncology journals adhere to RGs at a higher rate than journals in other specialties.
For journals that do not currently adhere to RGs or registration policies, a first step may be to simply refer authors to the EQUATOR Network. The EQUATOR Network is the premier clearing-house for RGs; it was established to aid authors and reviewers in the reporting and evaluation of scientific research, and it is committed to strengthening the integrity of scientific research.33 The network has produced algorithms to aid researchers who are unfamiliar with RGs to determine which one is most suited for their study design. The network also displays RGs for popular study designs on its homepage to help mitigate the time burden of choosing the correct guideline. Only three journals in our sample referred authors to the EQUATOR Network’s website.
The EQUATOR Network is currently publicising and organising its first project dedicated to a single medical specialty. The EQUATOR Network Oncology Project is designed to increase awareness, address barriers to adherence and augment the use of RGs in the oncology literature.34 The future goal of the project is to establish an expert advisory group composed of multiple stakeholders in oncology that share the common goal of using RGs, in part, to increase the quality of oncology research. The EQUATOR Network Oncology Project is currently in the early stages of development, and its present and future plans can be found on the EQUATOR Network’s website.
Other methods beyond RGs that are designed to increase research transparency have also been implemented. The BMJ and BMJ Open both require a declaration of transparency on behalf of all primary authors of clinical trials, with the aim of reducing the incidence of selective reporting bias, which is frequently found in both oncology and haematology journals.35 36 Additionally, most journals require a declaration of any conflict of interest that the authors may have, with the aim of increasing the integrity and objectivity of published research.
One limitation of our analysis with respect to our secondary objective is that the inclusion of a flow diagram may fail to predict a trial’s adherence to other CONSORT statement items. Recent studies have demonstrated variable adherence to CONSORT statement items.37 38 Therefore, our finding that manuscripts in CONSORT-adhering journals more often publish a participant flow diagram may not be generalisable to all CONSORT statement items.
To conclude, RG adherence in oncology journals is better overall than in the other medical specialties that have been investigated, but adherence to individual RGs still needs improvement. We have demonstrated that a journal's mention of CONSORT is associated with greater author adherence to key CONSORT items. The benefits of RG adherence have been demonstrated, as have some solutions to potential barriers to uptake and adherence. Ongoing efforts are being made to improve the quality of oncology research, and we encourage support of these efforts, which may begin with a reference to RGs in the Instructions for Authors page on oncology journal websites.
Contributors CW and MV conceptualised the investigation. CW, GM, MH and CC extracted and validated all data. All authors contributed to data analysis and manuscript writing and approved the final version of this manuscript.
Competing interests None declared.
Patient consent Not required.
Provenance and peer review Not commissioned; externally peer reviewed.
Data sharing statement Data extracted for the primary, secondary and exploratory outcomes from this investigation are publicly available via the Open Science Framework (osf.io/7g6td).