Objectives Systematic literature reviews collate what is known on a topic to date and provide the current status of the available evidence. Practitioners in a specific medical discipline may request a new search to verify their knowledge or to learn of new findings. A urology surgeon specialising in oncology requested that a cognitive computing model called EvidenceEngine™ complete a literature search, collection of studies, analysis, and interpretation of evidence on the research question, ‘Is adjuvant radiotherapy or salvage radiotherapy superior for patients who have undergone prostatectomy for prostate cancer?’ To validate the findings of this novel machine-assisted search engine, a clinical nurse researcher conducted the same literature search and interpretation of findings via a manual, systematic review of the literature. The objective of this case study was to compare the results of these two search methods in answering the research question.
Method A literature search, data collection, analysis and interpretation of digitally accessible full-text articles and abstracts published between 2011 and 2017 were performed by a machine-assisted tool called EvidenceEngine™. Each study was evaluated using a quantitative scoring methodology to determine its level of merit, based on five factors: study design, population size, potential conflict of interest, publication date and peer review status. An analysis of the results with regard to patient outcomes for post-radical-prostatectomy patients undergoing adjuvant radiotherapy (ART) versus salvage radiotherapy (SRT) was then performed to determine the overall direction of the evidence. The clinical nurse researcher was blinded to these results until the manual search was completed. The manual search consisted of the traditional steps of a systematic literature review: searching Medline, PubMed and Scopus with multiple keywords; recording, categorising, reading and analysing pertinent articles within the date range; and summarising the evidence.
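The quantitative scoring step described above could be sketched as a simple weighted rubric over the five named factors. Everything in the sketch below — the factor weights, per-category scores and the `Study`/`merit_score` names — is a hypothetical illustration under stated assumptions, not EvidenceEngine™'s actual methodology.

```python
# Illustrative merit-scoring rubric over the five factors named in the
# abstract: study design, population size, conflict of interest,
# publication date and peer review status. All weights and category
# scores are hypothetical assumptions.
from dataclasses import dataclass

DESIGN_SCORES = {"rct": 10, "cohort": 7, "case-control": 5, "case-series": 3}
WEIGHTS = {"design": 0.35, "size": 0.25, "conflict": 0.15,
           "recency": 0.15, "peer_review": 0.10}  # sum to 1.0

@dataclass
class Study:
    design: str          # e.g. "rct", "cohort"
    n: int               # enrolled population size
    conflict: bool       # declared conflict of interest
    year: int            # publication year
    peer_reviewed: bool

def merit_score(s: Study, search_end_year: int = 2017) -> float:
    """Weighted 0-10 merit score for one study (illustrative rubric)."""
    size_score = min(10.0, s.n / 100)                    # caps at n >= 1000
    recency = max(0.0, 10 - 2 * (search_end_year - s.year))
    parts = {
        "design": DESIGN_SCORES.get(s.design, 1),
        "size": size_score,
        "conflict": 0 if s.conflict else 10,
        "recency": recency,
        "peer_review": 10 if s.peer_reviewed else 4,
    }
    return round(sum(WEIGHTS[k] * v for k, v in parts.items()), 1)

# Example: a large, recent, peer-reviewed RCT with no conflicts scores highly.
trial = Study(design="rct", n=1200, conflict=False, year=2016, peer_reviewed=True)
print(merit_score(trial))  # → 9.7
```

A rubric of this shape makes each study's score reproducible and auditable, which is the property the abstract contrasts with the manual researcher's qualitative level-of-evidence assignment.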
Results The results and process for completion of these independent literature reviews were compared. The machine-assisted EvidenceEngine™ approach reached a quantitative score of 2.3/10 in favour of ART, indicating that the aggregate evidence very weakly supports ART over SRT. The manual approach qualitatively concluded that neither ART nor SRT was definitively superior based on the evidence. Both the machine-assisted and manual approaches uncovered critical study limitations in the evidence, including design flaws in the randomised controlled trials (e.g. comparison arms that were essentially uncontrolled observation arms; variable inclusion criteria for PSA levels before randomisation), although the machine-assisted approach required additional manual analysis of the report results. The manual evidence review was completed within 42 hours over eight working days, while the machine-assisted approach consisted of a one-week waiting period for report generation, followed by a few hours of manual analysis.
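A summary score such as the 2.3/10 above could, in principle, come from weighting each study's favoured direction by its merit score and normalising to a 0-10 scale. The combination rule and `direction_score` function below are a hypothetical sketch of one such aggregation, not the tool's documented algorithm.

```python
# Illustrative aggregation of per-study findings into one direction score.
# Each study contributes (merit 0-10, favoured arm). The merit-weighted
# averaging rule here is a hypothetical assumption.

def direction_score(studies: list[tuple[float, str]]) -> float:
    """Return a 0-10 strength score for the net favoured direction.
    studies: (merit, arm) pairs with arm in {"ART", "SRT", "neutral"}."""
    signed = [m if arm == "ART" else -m if arm == "SRT" else 0.0
              for m, arm in studies]
    total_merit = sum(m for m, _ in studies) or 1.0
    return round(abs(sum(signed)) / total_merit * 10, 1)

# Example: mixed evidence leaning only weakly toward one arm.
print(direction_score([(8.0, "ART"), (7.0, "SRT"), (6.0, "ART"),
                       (5.0, "neutral")]))  # → 2.7
```

Under this rule, near-evenly split evidence yields a low score regardless of how strong the individual studies are — consistent with a weak aggregate signal coexisting with high-quality but conflicting trials.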
Conclusions While both methods reached the same overall conclusion, each has advantages and disadvantages. EvidenceEngine™ quantifies the direction of the evidence and the strength of each study, while the manual method describes the evidence direction qualitatively, with the researcher assigning a level-of-evidence score to each study. EvidenceEngine™ may be the more time-efficient method, as the researcher is presented with the relevant studies and their evidence quality scores, leaving more time for analysis and interpretation of results and limitations. Used together, the two methods enhance the systematic use of existing evidence.