The problem
Systematic reviews and related evidence synthesis products (eg, rapid reviews, evidence maps and scoping reviews) are foundational to evidence-based medicine, informing all levels of healthcare, from patient–provider decisions to national policy-making.1 However, an ongoing challenge for systematic reviews is the exponential growth in the number of journal articles and other related reports of medical research (eg, clinical trial records, conference abstracts and preprint articles). Since 2017, over a million new records have been added annually to PubMed alone.2 Thus, there is a compelling need to streamline the information retrieval process for systematic reviews.
The current process and its limitations
The current approach to information retrieval for systematic reviews involves two discrete and successive steps: (1) formulating a search strategy, which normally consists of a set of Boolean queries combining both indexed (eg, PubMed MeSH terms) and free-text terms, and (2) manually screening the citations returned by the search, preferably by at least two independent reviewers.3 These two steps influence each other, inasmuch as a more comprehensive search strategy will return a larger number of citations that must be manually screened. To limit the time and resources required for screening, search strategies are often artificially narrowed to return only the number of citations considered feasible for the team to screen manually. However, this may lead to the inadvertent exclusion of relevant studies, resulting in less robust and possibly even biased conclusions.
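To make step (1) concrete, the sketch below shows how a Boolean search strategy might be assembled programmatically: synonyms within a concept are joined with OR, and the concept blocks are joined with AND. The field tags follow PubMed conventions; the example terms are illustrative only, not a recommended strategy.

```python
def build_query(mesh_terms, free_text_terms):
    """Combine synonyms with OR, then join the concept blocks with AND."""
    mesh_block = " OR ".join(f'"{t}"[MeSH Terms]' for t in mesh_terms)
    text_block = " OR ".join(f'"{t}"[Title/Abstract]' for t in free_text_terms)
    return f"({mesh_block}) AND ({text_block})"

# Illustrative terms only; a real strategy would be far more comprehensive.
query = build_query(
    ["Diabetes Mellitus, Type 2"],
    ["type 2 diabetes", "T2DM"],
)
print(query)
```

The trade-off described above is visible here: adding more free-text synonyms to the OR block increases sensitivity but also increases the number of citations returned for manual screening.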
Enter technology
Tools that leverage text mining are increasingly available for use in systematic review search strategy design.4 They improve the identification of relevant articles (referred to as sensitivity) and reduce screening burden by semiautomating aspects of the search design process. Text mining software typically analyses a ‘seed’ set of bibliographic citations or documents to assess the frequency of terms or phrases. The goal is …
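The frequency analysis described above can be sketched in a few lines: tokenise each seed citation, discard common stopwords, and count the remaining terms. Frequent terms then suggest candidate search terms. The seed titles and stopword list below are invented placeholders, not output from any real tool.

```python
from collections import Counter
import re

# Invented placeholder titles standing in for a 'seed' set of citations.
seed_titles = [
    "Metformin therapy for type 2 diabetes",
    "Glycaemic control with metformin in type 2 diabetes",
    "Lifestyle intervention and glycaemic control",
]

# A toy stopword list; real tools use much larger ones.
STOPWORDS = {"for", "with", "in", "and", "the", "of", "a", "to"}

def term_frequencies(documents):
    """Tokenise each document and count how often each term appears."""
    counts = Counter()
    for doc in documents:
        tokens = re.findall(r"[a-z0-9]+", doc.lower())
        counts.update(t for t in tokens if t not in STOPWORDS)
    return counts

# High-frequency terms (eg, 'metformin', 'diabetes') emerge as candidates
# for inclusion in the search strategy.
for term, n in term_frequencies(seed_titles).most_common(5):
    print(term, n)
```

Production tools layer more sophistication on top of this idea (phrase extraction, weighting schemes such as TF-IDF, MeSH term suggestion), but simple frequency counting over a seed set is the common starting point.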
Footnotes
Contributors GPA wrote the content of this piece. RP edited it and made suggestions for improvement.
Funding The authors have not declared a specific grant for this research from any funding agency in the public, commercial or not-for-profit sectors.
Competing interests None declared.
Provenance and peer review Commissioned; externally peer reviewed.