
Primary care
Successes, lessons and opportunities: 15-year follow-up of an integrated evidence-based medicine curriculum
Christina S Korownyk1, G Michael Allan1,2, James McCormack3, Adrienne J Lindblad1,2, Samantha Horvey1, Michael R Kolber1

1 Family Medicine, Faculty of Medicine and Dentistry, University of Alberta, Edmonton, Alberta, Canada
2 College of Family Physicians of Canada, Mississauga, Ontario, Canada
3 Faculty of Pharmaceutical Sciences, University of British Columbia, Vancouver, British Columbia, Canada

Correspondence to Dr Christina S Korownyk, Family Medicine, University of Alberta Faculty of Medicine and Dentistry, Edmonton, AB T6G 2R3, Canada; cpoag{at}ualberta.ca

Abstract

In 2005, the Department of Family Medicine at the University of Alberta introduced an evidence-based practice curriculum into the 2-year Family Medicine Residency Program. The curriculum was based on best available evidence, had multiple components and was comprehensive in its approach. It prioritised preappraised summary evidence over in-depth evidence appraisal. This paper describes the lessons learnt over the past 15 years, including components that were eventually discontinued. We also discuss additions to the programme, including the development of accessible, preappraised, summarised resources. We review the difficulties associated with evaluation and with incorporating evidence-based practice into all aspects of residency training. Future directions are discussed, including the incorporation of shared decision-making at the point of care.

  • general practice
  • evidence-based practice

This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited, appropriate credit is given, any changes made indicated, and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/4.0/.


In 2005, the Evidence-Based Medicine (EBM) programme in the Department of Family Medicine at the University of Alberta incorporated a new evidence-based practice (EBP) curriculum into the ongoing Family Medicine Residency Program. The curriculum included didactic, small group and point-of-care teaching. The two primary goals of the EBP curriculum were to (1) train competent, self-directed, life-long learners with the skills to efficiently keep up to date and (2) cultivate residents' EBP skills to enable them to solve problems encountered in daily practice.1 These aligned with the overarching family medicine residency goals of ensuring graduates are competent to provide comprehensive care in any Canadian community, are prepared for the evolving needs of society and are taught based on the best available evidence on patient care and medical education.2

The Family Medicine Residency Program at the University of Alberta is a 2-year programme, which employs a blended model of block-based and longitudinal integrated clinical learning experiences. The academic portion of the residency programme consists of monthly full-day academic days, with workshops on communication skills, procedural skills and EBP interspersed throughout the 2 years. The EBP curriculum originally included five main components integrated throughout the 2 years of family medicine residency (table 1): a 2-day EBP workshop, quarterly 1-hour didactic lectures, monthly journal clubs, a computerised EBP desktop and one-page Brief Evidence-based Assessments of Research (BEARs).1

Table 1

Past and current evidence-based practice curriculum

The curriculum has been well described previously1 and addressed all five steps of EBP described in the 2005 Sicily statement.3 It was deliberately designed to address the most significant barrier to incorporating evidence-based practice into primary care, namely time.4–9 Evidence appraisal skills remain the most frequently taught component of EBP educational interventions,10 despite evidence that practising clinicians rarely appraise evidence themselves,11 12 dedicate very limited time to answering their own clinical questions11–13 and have difficulty accessing timely and reliable resources.4 6 7 14

Our programme encouraged residents to become more familiar with accessing and using preappraised evidence rather than performing in-depth appraisals of single papers. As EBP teaching has evolved, newer EBP competencies highlight the benefits of preappraised resources yet continue to prioritise evidence appraisal.15 The objective of this paper is to highlight the successes, lessons and evolution of 15 years of a family medicine residency EBP programme in the context of contemporary research.

Successes

The 2-day EBP workshop and the quarterly 1-hour didactic lectures are the most consistently highly rated components of the programme. The goal is to teach principles for better understanding health information (and how it can be misleading) in an engaging and entertaining manner. The general philosophy is to 'light a fire, not fill the bucket.'16 The workshop consists of lectures on therapeutic studies, diagnostic studies and systematic reviews, each followed by a small group session in which residents practise these skills by appraising a representative paper. The goal is not for residents to become experts in evidence appraisal, but rather to become familiar with the process and the many ways we can be misled. We have replaced traditional database searching techniques with sessions in which residents are taught how to find and use preappraised, high-quality summary resources. In this context, preappraised resources are defined as evidence that has been filtered, so that only the most relevant data are reviewed, and that has been subjected to a rigorous evidence appraisal process by a third party not involved in the development of the evidence. While we do not provide specific tools to assess preappraised resources, residents are given a number of examples of high-quality resources and encouraged to approach newly available resources with scepticism and common sense. In particular, preappraised resources should complete the same process of evidence appraisal that residents have been introduced to, saving residents substantial time. They should also be free of financial and other potential conflicts of interest.

The quarterly 1-hour didactic lectures, delivered during regularly scheduled family medicine academic days, allow for a brief review of a number of relevant research articles in the context of EBP principles. Papers are generally chosen to highlight well-done randomised controlled trials with patient-oriented outcomes that are potentially practice changing. In addition, papers with significant limitations are highlighted as examples of how research can be misleading. These lectures provide examples of what residents should expect from preappraised resources.

BEARs are one-page worksheets that promote the translation of clinical uncertainties into questions and rapid searches.2 Residents complete and present at least four BEARs during their scheduled family medicine clinical rotation. They are to be completed using preappraised evidence at the point of care. BEARs give learners the opportunity to incorporate EBP concepts into practice and provide material for discussion in site-specific rounds. The BEAR template has previously been published,17 and our research found that BEARs facilitate the use of a variety of resources in answering clinical questions, with 69% of residents reporting a change in their practice after completing a BEAR.17

Following the initiation of this curriculum, the EBM team noted that primary care clinicians were generally not involved in their own continuing medical education,18 were overwhelmed with information, and that industry-funded resources seemed common. We determined that if we were training residents to use reliable, preappraised resources, we needed to ensure that such resources were available to them during residency training and after graduation. We also wanted to demonstrate that incorporating high-quality evidence was feasible, even in a busy community practice.

In 2008, we initiated evidence summaries on primary care questions. These 'Tools for Practice' (TFP) are one-page evidence syntheses focusing on the highest-quality evidence and patient-oriented outcomes (available at: toolsforpractice.ca). All family medicine residents are encouraged to sign up for these one-page evidence summaries at the EBP workshop, which occurs at the beginning of residency. TFPs are free of cost and of financial conflicts of interest. They are distributed every 2 weeks to residents in the programme, in addition to almost 40 000 clinicians worldwide. Residents are also provided with the opportunity to complete an EBP elective during their residency, during which they are mentored in evidence appraisal while contributing to the development of a future TFP.

Lessons learned and ongoing debates

The EBP desktop, a site that provided residents with a number of evidence-based resources, was largely unused and eventually removed. Similarly, as residents became increasingly competent at searching for medical information, the librarian session during the workshop (which taught residents how to develop search strategies) was replaced with a small group session emphasising how to quickly find reliable answers to clinical questions.

Balancing the ability to perform an in-depth evidence appraisal of a single paper against quickly finding preappraised, synthesised resources pertaining to one's question continues to be debated. In 2005, we replaced the more comprehensive and time-consuming 'critical appraisal topics' with BEARs. This change was well received. However, the introduction of primarily summarised evidence resources at the monthly journal club is a source of ongoing conversation. Some feel that residents should be using the journal clubs to learn more in-depth evidence appraisal of individual articles. However, we feel that in-depth assessment of a single article does not emulate future practice, and in an era of ever-increasing medical literature and time pressures, most clinicians favour prefiltered resources that summarise the evidence.19 20 Systematic review publications increased 2728% between 1991 and 2014, with 185 different meta-analyses of antidepressants published between 2007 and 2014.21 Some will argue that this reinforces the need for clinicians to be able to discern misinformation, with a greater focus on evidence appraisal.22 Others suggest we need to effectively triage the information.23 We would argue that prioritisation of time is key in primary care, and time expended in one area results in opportunity costs in another.

Our department has been very supportive of the EBM curriculum, dedicating 2 days to the EBM workshop and allowing for a regular schedule of lectures and journal clubs. The predominantly clinical setting of the residency programme means that integration of EBP into all elements of education is largely dependent on the clinical practice of the hundreds of clinicians who contribute to the residency programme. Academic faculty development sessions have not been well attended. In 2012, in collaboration with the university and the provincial family medicine college, we initiated an annual medical education conference on evidence-based practice for community clinicians. New high-quality evidence that can be readily incorporated into primary care is reviewed in a fun and engaging manner. The conference is highly rated and has grown in size every year since its initiation. Many of the attendees are clinical preceptors for the family medicine residents, and it is our hope that the conference provides teachers with the tools and enthusiasm to model the regular incorporation of evidence into their daily practice.

Evaluation

Evaluation is one of the biggest limitations of our programme. Evaluation of EBM knowledge or EBP is fraught with difficulties. Shaneyfelt identified 104 unique instruments for evaluating EBP education,24 and Oude et al identified 160 instruments for assessing EBP behaviours.25 The proliferation of assessment tools continues,26 27 suggesting that evaluation is complex and that there is no clear consensus on how EBP programmes should be evaluated overall.15

In the original description of the curriculum, we reported improvement in resident attitudes regarding EBP and in their comfort with practical application.2 For the first 3 years, the workshop improved knowledge scores (on a 15-question quiz) from 50% (SD 18%) to 75% (SD 20%) (table 2). As this was consistent from year to year, we did not continue data collection. These findings are consistent with systematic reviews that report an increase in EBP knowledge and attitudes following EBP interventions.28 Although self-reported knowledge may improve, this does not necessarily translate into EBP behaviours.28

Table 2

Two-day evidence-based practice workshop knowledge scores

Individual components of our curriculum have traditionally been evaluated independently. Evaluations have generally used simple rating scales that have been modified over the years and are thus not easily comparable. In addition, there is no clear national standard for EBP evaluation. Canadian national family medicine examinations do not specifically identify EBM as part of the examination blueprint (personal communication, Brent Kvern, Director of Certification and Examinations, College of Family Physicians of Canada). It is not clear whether a national standard would improve outcomes, as we are not interested in collecting data on knowledge acquisition that may not result in behaviour change after graduation. We consider the ultimate measure of success to be graduating residents who regularly practise medicine based on the best available evidence. Unfortunately, measuring this is complex, and we did not have sufficient resources to measure it in residents who had graduated from the programme.

The difficulty is compounded because the primary outcome of EBP remains debated: is EBP intended to improve health at a population level, or to encourage shared informed decision-making, which may allow patients to make decisions that are not consistent with 'best evidence'?29 30 If the latter, assessment of EBP may be as simple as asking, 'were reliable tools used for shared decision-making with the patient?'

Moving forward

Recently, experts in evidence appraisal published the 21st chapter of the GRADE guidelines, tools intended to assist with the assessment of results and the certainty of evidence.S31 The need for multiple publications suggests that evidence appraisal has become increasingly complex and, in many ways, a moving target. Based on 15 years' experience with EBM education, we would advocate for familiarity with, not mastery of, evidence appraisal skills in an EBP curriculum for primary care. Even if one were familiar with basic evidence appraisal skills, there are numerous other factors to consider that require further time and investigation (eg, how does this paper fit with others on the topic? How many other trials were registered but not published?). In our experience, comprehensive and timely appraisal of health information requires years of regular practice. Thus, we continue to encourage our learners to learn the basics of EBM and health literacy, but we strongly advocate the use of prefiltered evidence summaries (ideally written by experienced authors without financial conflicts of interest) to answer daily clinical questions.

Over the past 15 years, we have seen a gradual evolution from 'point of care' tools to 'shared informed decision-making'.S32 Improving learner familiarity with shared decision-making tools is essential to moving medicine towards a more inclusive decision-making setting. Our team, along with numerous other groups, has developed tools to assist with shared decision-making in clinical practice.S33–37 Increasing learner familiarity with these resources may be an opportunity to improve the translation of evidence into practice, and we are working to incorporate them into our curriculum as examples of practical evidence application. For example, we are increasingly incorporating them into the 1-hour journal club to allow for discussion of their practical implementation.

In order to engage faculty, we continue to grow our annual medical education conference on evidence-based practice for both community and academic clinicians. This medical education event is organised outside of the university setting and is designed to promote EBP and to build enthusiasm for teaching and for incorporating EBP into daily practice.

In summary, the current curriculum (table 1) is the result of modifications in response to ongoing feedback and best evidence. The originally stated goals of the EBP curriculum remain relevant, encouraging familiarity with basic EBM concepts yet emphasising the use of preappraised evidence summaries. Fifteen years later, we would add a third goal: to train residents to optimise patient-oriented outcomes in the context of shared decision-making and individual patient preferences and values.

Supplemental material

Ethics statements

Patient consent for publication

Acknowledgments

The authors would like to acknowledge the Family Medicine Department at the University of Alberta, the Alberta College of Family Physicians and the PEER team for their assistance and support.

References

Supplementary materials

  • Supplementary Data

    This web only file has been produced by the BMJ Publishing Group from an electronic file supplied by the author(s) and has not been edited for content.

Footnotes

  • Contributors CK, MK and AL conceived of the original idea. MA, SH and MK assisted with data collection. CK wrote the first draft. Draft review and critical revision were completed by JM, MA, SH, MK and AL. All authors gave final approval of the version to be published.

  • Funding The authors have not declared a specific grant for this research from any funding agency in the public, commercial or not-for-profit sectors.

  • Competing interests None declared.