41 Validation of crowdsourcing for citation screening in systematic reviews
  1. Nassr Nama1,2,
  2. Margaret Sampson3,
  3. Nick Barrowman4,5,
  4. Katie O’Hearn3,
  5. Ryan Sandarage6,
  6. Kusum Menon4,3,
  7. Gail Macartney4,3,
  8. Kimmo Murto4,3,
  9. Jean-Philippe Vaccani4,3,
  10. Sherri Katz4,3,
  11. Roger Zemek4,3,
  12. Ahmed Nasr4,3,
  13. James Dayre McNally4,3
  1. University of British Columbia, Faculty of Medicine, Vancouver, Canada
  2. BC Children’s Hospital, Vancouver, Canada
  3. Children’s Hospital of Eastern Ontario, Ottawa, Canada
  4. University of Ottawa, Faculty of Medicine, Ottawa, Canada
  5. Clinical Research Unit, CHEO Research Institute, Ottawa, Canada
  6. University of British Columbia, Faculty of Medicine, Vancouver, Canada

Abstract

Objectives Systematic reviews (SRs) are often cited as the highest level of evidence available because they involve the identification and synthesis of published studies on a topic. Unfortunately, given the exponential rise in the volume of primary literature, it is increasingly challenging for small teams to complete SR procedures in a reasonable time period. Crowdsourcing has been postulated as a potential solution. The feasibility objective of this study was to determine whether an online crowd would be willing to perform and complete abstract and full text screening. The validation objective was to assess the quality of the crowd’s work, including retention of eligible citations (sensitivity) and work performed for the investigative team, defined as the percentage of citations excluded by the crowd.

Method We performed a prospective study evaluating the feasibility and validity of crowdsourcing essential components of an SR, including abstract screening, document retrieval, and full text assessment. Using the CrowdScreenSR citation screening software, we made 2323 articles from 6 SRs available to an online crowd. Citations excluded by less than or equal to 75% of the crowd were moved forward for full text assessment. For the validation component, the performance of the crowd was compared with the accepted gold standard: citation review by trained experts.
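
For readers who want a concrete picture of the abstract-level exclusion rule described above, the sketch below shows one way it could be expressed. It is illustrative only and rests on assumptions not taken from the study: each citation carries a list of crowd votes in which True means “exclude”, and the function name advances_to_full_text and the 0.75 default are ours, not CrowdScreenSR’s.

```python
# Illustrative sketch of the abstract-level screening rule described above.
# Assumptions (not from the study's software): each citation has a list of
# crowd votes, where True means "exclude"; the 75% threshold is applied to
# the proportion of exclude votes.

def advances_to_full_text(votes, exclusion_threshold=0.75):
    """Return True if the citation should move forward to full text review.

    A citation advances unless more than `exclusion_threshold` of the crowd
    assessments voted to exclude it.
    """
    if not votes:
        return True  # no assessments yet; cannot exclude
    exclude_fraction = sum(votes) / len(votes)
    return exclude_fraction <= exclusion_threshold


# Example: 6 of 8 assessors voted to exclude (exactly 75%), so the citation
# still advances under the rule "excluded by <=75% of the crowd moves forward".
print(advances_to_full_text([True] * 6 + [False] * 2))  # True
```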

Results Of 312 potential crowd members, 117 (37.5%) commenced abstract screening and 71 (22.8%) completed the minimum requirement of 50 citation assessments. The majority of participants were students (192/312, 61.5%). The crowd screened 16,988 abstracts (median: 8 per citation; IQR 7-8), and all citations achieved the minimum of 4 assessments after a median of 42 days (IQR 26-67). Crowd members retrieved 83.5% (774/927) of the articles that progressed to the full text phase. A total of 7604 full text assessments were completed (median: 7 per citation; IQR 3-11). Citations from all but 1 review achieved the minimum of 4 assessments after a median of 36 days (IQR 24-70). When complete crowd member agreement at both levels was required for exclusion, sensitivity was 100% (95% CI 97.9-100) and work performed was 68.3% (95% CI 66.4-70.1). Using the predefined alternative 75% exclusion threshold, sensitivity remained 100% and work performed increased to 72.9% (95% CI 71.0-74.6; P<.001).
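
As a rough, hypothetical illustration of how the two reported metrics can be computed, the sketch below defines sensitivity (eligible citations retained by the crowd, divided by all citations judged eligible by the gold-standard expert review) and work performed (citations excluded by the crowd, divided by all citations screened). The counts are invented for the example, and a Wilson score interval is used only as a stand-in because the abstract does not state which confidence interval method the authors used.

```python
# Hypothetical illustration of the metrics reported above; counts are made up.
from math import sqrt

def wilson_ci(successes, n, z=1.96):
    """Approximate 95% Wilson score interval for a proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    margin = (z / denom) * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - margin, centre + margin

# Sensitivity: eligible citations (per the gold-standard expert review)
# retained by the crowd, divided by all eligible citations.
eligible_retained, eligible_total = 170, 170        # hypothetical counts
sensitivity = eligible_retained / eligible_total

# Work performed: citations the crowd excluded, divided by all citations screened.
excluded_by_crowd, citations_screened = 1600, 2323  # hypothetical counts
work_performed = excluded_by_crowd / citations_screened

lo, hi = wilson_ci(eligible_retained, eligible_total)
print(f"Sensitivity: {sensitivity:.1%} (95% CI {lo:.1%} to {hi:.1%})")
lo, hi = wilson_ci(excluded_by_crowd, citations_screened)
print(f"Work performed: {work_performed:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```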

Conclusions Crowdsourcing of citation screening for SRs is feasible and has reasonable sensitivity and specificity. By expediting the screening process, crowdsourcing could permit the investigative team to focus on more complex SR tasks. This requires a user-friendly online platform that allows research teams to crowdsource their reviews. Future work should focus on assessing the application of this methodology to real-life projects and determining its potential for rapid completion of SRs.
