As an introduction to evidence-based practice, we as a group of evidence-based researchers, clinicians and editors have collated the top 10 papers we consider most helpful when starting on the journey of evidence-based medicine (EBM). We have based our selection on our experience of teaching a wide range of individuals and describe why we consider each paper to be important.
EBM: what it is and what it is not
In 1996, the BMJ published an editorial by Sackett and others, in which they defined EBM as ‘the conscientious, explicit and judicious use of current best evidence in making decisions about the care of individual patients’.1 By emphasising the essential components of EBM, they made it clear that evidence, values and expertise play similar roles in clinical decision making.
EBM: a new approach to teaching the practice of medicine
Although the 1996 paper by Sackett et al clarified what EBM is and what it is not, the term and concept had already been introduced in a 1992 paper by the Evidence-Based Medicine Working Group.2 A paradigm shift in medical practice was proposed, where the examination of evidence from clinical research is given equal place in clinical decision making as that of biological reasoning, clinical experience and intuition. As a result, physicians needed to develop new skills, including effective searching skills and the application of evidence rules to evaluate the clinical literature. Thus, EBM was born.
The scandal of poor medical research
Over 20 years ago, Altman published his now seminal paper in the BMJ, arguing that, in his view, most medical research was of poor quality and probably wrong.3 Altman wrote that much research was ‘seriously flawed through the use of inappropriate designs, unrepresentative samples, small samples, incorrect methods of analysis and faulty interpretation’.
Echoing the realisation by Sackett and colleagues that much of medical practice lacked evidence of effectiveness and that much research was inadequate, Altman concluded: ‘We need less research, better research, and research done for the right reasons’. Now, 22 years later, the call to arms has been repeated in the EBM Manifesto for Better Healthcare.4
Assessing the quality of research
A cornerstone of evidence-based practice is the ability to assess the quality of the evidence and the research that underpins it—often easier said than done. Glasziou and colleagues published an editorial in the BMJ, aimed at helping clinicians and researchers to assess research.5 They suggested five general principles:
Different types of research are needed to answer different types of clinical questions.
Irrespective of the type of research, systematic reviews are necessary.
Adequate grading of the quality of evidence goes beyond the categorisation of research design.
Assessment of the benefit to harm balance should draw on a variety of types of research.
Clinicians need efficient search strategies for identifying reliable clinical research.
Empirical evidence of bias. Dimensions of methodological quality associated with estimates of treatment effects in controlled trials
The relevance of the methods used in clinical trials to the results that emerge was highlighted in this seminal paper by Schulz and colleagues.6 They found that inadequately concealed treatment allocation inflated the estimated effect of an intervention by an average of 41%, and that lack of double blinding inflated it by 17%. This paper was influential in defining the effects of systematic bias on research outcomes and in showing why critical appraisal matters.
What is the evidence that postgraduate teaching in EBM changes anything? A systematic review
With growing acceptance of the importance of evidence-based practice, it soon followed that doctors need skills to appraise, interpret and apply research findings to their clinical practice. Most medical schools worldwide now include some element of EBM teaching. But what is the evidence that teaching EBM affects anything?
This systematic review of 23 studies showed that stand-alone teaching improved knowledge, but not skills, attitude, or behaviour.7 Clinically integrated teaching improved all four of these domains. The authors proposed a hierarchy of evidence-based healthcare teaching and learning activities:
Level 1—interactive and clinically integrated activities.
Level 2(a)—interactive but classroom-based activities.
Level 2(b)—didactic but clinically integrated activities.
Level 3—didactic, classroom or standalone teaching.
Clarke and colleagues have since published an overview of systematic reviews, which supports these findings and highlights the need to implement effective teaching strategies.
EBM manifesto for better healthcare
The manifesto was a response to systematic bias, wastage, error and fraud in research relevant to patient care. Jointly published in the BMJ4 and Evidence-Based Medicine, the manifesto is an invitation to contribute towards better evidence by creating a list of priorities and sharing the lessons from achievements already made. The manifesto steps necessary to develop trusted evidence were refined through consultation and require the evidence-based community to focus attention on strategies that could most improve the quality of healthcare.
EBM: a commentary on common criticisms
This was the first systematic appraisal of some common criticisms of EBM.8 Following a systematic database search and feedback from seminars (delivered by David Sackett), Straus and McAlister identified three limitations unique to EBM: limited time and resources, the need to develop new skills and a paucity of evidence that EBM is effective.
They said that many of what they called ‘pseudolimitations’ and criticisms of EBM often stem from misperceptions or misrepresentations, for example, that EBM is an ivory tower concept and that only randomised trials or systematic reviews constitute evidence. They cited evidence from frontline clinicians that refuted the first claim and showed that the question determines the best type of evidence to answer it, thus disproving the second criticism.
They concluded that clinicians should have better access to evidence in practice and that the way evidence was described and shared with patients needed to be improved. They also pointed out that the lack of evidence of the impact of EBM on healthcare and patient outcomes needs to be addressed.
General practitioners’ perceptions of the route to EBM: a questionnaire survey
This paper demonstrated rapid adoption of the ethos and philosophy of EBM by primary care doctors but also identified some of the early barriers to its implementation.9 These included awareness of the available resources and lack of time. Access to available technologies was also a major problem; for example, only around 20% of general practitioners at that time had access to the Internet and key bibliographic databases such as Medline.
Most of those surveyed had some understanding of the technical terms used in EBM; however, more than two-thirds felt unable to explain the meaning of these terms. Respondents thought that the best way to move towards EBM was by using evidence-based guidelines or protocols developed by colleagues.
Evidence-based guidelines or collectively constructed ‘mindlines?’ Ethnographic study of knowledge management in primary care
This paper was one of the first empirical assessments of how general practitioners use findings from scientific research in their daily practice and decision making.10 Using a mixed methods approach, Gabbay and le May found that few practitioners went through the steps associated with the traditional model of evidence-based healthcare (eg, the 5 A’s), including accessing newly published knowledge.
They observed that practitioners preferred shortcuts and relied on ‘mindlines’ (‘collectively reinforced, internalised guidelines’). These mindlines were developed over time not through reading of literature, but predominantly by their experiences and interactions with colleagues, opinion leaders, pharmaceutical representatives, patients and other sources of knowledge.
However, they stressed that practitioners were professionally responsible for ensuring that mindlines are underpinned by research evidence, and that ‘knowledge of key opinion leaders, from medical or nursing school onwards, is based on research and experiential evidence and wherever appropriate follow the evidence-based healthcare model’.
Our list is not designed to be exhaustive; you may disagree with our top 10, as we certainly did. We, however, found it useful to discuss the papers that we think matter and are essential to developing and understanding the use of evidence in healthcare. We hope that this list will evolve and would welcome suggestions to enhance it.
Contributors DN drafted the manuscript. All authors made edits and agreed on the final manuscript.
Funding DN has received expenses and fees for his media work. He holds grant funding from the NIHR School of Primary Care Research and the Royal College of General Practitioners. On occasion, he receives expenses for teaching EBM. CH has received expenses and fees for his media work. He holds grant funding from the NIHR, the NIHR School of Primary Care Research, The Wellcome Trust and the WHO. On occasion, he receives expenses for teaching EBM and is also paid for his GP work in NHS out of hours.
Disclaimer The views expressed in this commentary represent the views of the authors and not necessarily those of their host institution, the NHS, the NIHR or the Department of Health.
Competing interests None declared.
Provenance and peer review Commissioned; internally peer reviewed.