Abstract
Background and Objectives As teaching technology advances, medical education is increasingly using digital media and exploring instructional models such as the flipped classroom and blended learning, where in-class sessions focus on group work based on content delivered before class. Early evidence suggests that lectures and foundational material can be delivered equally well online, but the supporting research is of low quality. We aim to develop and test an online evidence-based teaching resource that seeks to improve the availability and scalability of evidence-based medicine (EBM) learning tools. We evaluate the feasibility of a study design that could test for changes in academic performance in EBM skills using an online supplement.
Methods Mixed-methods feasibility study of a randomised controlled trial (RCT) in an undergraduate medical student cohort.
Results Of a small cohort (n=34), eight participants agreed to randomisation and completed the study. No study participant completed the EBM supplementary course in full. Students reported time management as a significant barrier to participation, and all aspects of the study and its communications should be delivered with efficiency as a key consideration.
Conclusion Randomising students to an online EBM supplement within a medical school programme presents challenges of recruitment and student motivation, but the study design is potentially feasible.
- medical education & training
- evidence-based practice
This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited, appropriate credit is given, any changes made indicated, and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/4.0/.
Key messages
What is already known about this subject?
Essential training in evidence-based medicine improves critical thinking and statistical reasoning.
Medical school programmes are constrained by the availability of teachers, resources, and time to provide evidence-based medicine (EBM) training.
Foundational material in EBM could be delivered online to improve scalability and uptake.
What are the new findings?
Randomising students to an online EBM supplement within a medical school programme presents challenges of recruitment and student motivation, but is potentially feasible.
Qualitative and quantitative indicators suggest that students will access online material for revision before an examination.
Baseline knowledge assessment of the student cohort could guide more appropriate content of an online learning supplement in EBM.
How might it impact on clinical practice in the foreseeable future?
This article provides insight into how to develop, implement and pursue evidence-based education of EBM.
Background
Established in the mid-1990s as an approach to achieve better healthcare outcomes, evidence-based medicine (EBM) is the conscientious, explicit and judicious use of current best evidence in making decisions about the care of individual patients.1 EBM has become essential to the training of young clinicians by stressing critical thinking, the importance of statistical reasoning and the continuous evaluation of medical practice.2 Current research explores efficacy measures of EBM teaching and how to improve it with different types of educational interventions.3–7 Review authors suggest ‘teaching methods for optimising EBP among health professionals could become a robust standardised procedure of the medical education curricula and lifelong learning of healthcare professionals’.8 Still, some medical school curricula are constrained by the availability of teachers and supporting materials to provide adequate EBM training.9
As teaching technology advances, medical education is increasingly using digital media and exploring instructional models such as the flipped classroom and blended learning, where in-class sessions focus on group work based on content delivered before class.10 Evidence suggests lectures and foundational material can be delivered equally well online.11 12 A recent systematic review shows blended learning (lectures with an online supplement) is helpful for examination preparation and concept clarification, and is one strategy for addressing problems in medical student performance.13 However, one review cites insufficient evidence regarding the effectiveness of e-learning on healthcare professional behaviour or patient outcomes and calls for more research in this area.14 Here, we aim to develop and test an online evidence-based teaching resource that seeks to improve the availability and scalability of EBM learning tools. We evaluate the feasibility of a study design that could test for changes in academic performance in EBM skills using an online supplement.
Research objectives
This research explores two related questions: first, could an introductory online course in EBM be administered and evaluated for change in learning outcomes in the context of a prospective randomised controlled trial (RCT) in medical school training; and second, could delivering a 100% online short course as a learning supplement to standard in-person teaching increase EBM knowledge and skill acquisition in medical students?
Methods
Type of study
A mixed-methods feasibility study of an RCT in an undergraduate medical student cohort.
Participants
Undergraduate medical students (n=34) in the first year of the graduate entry programme at the University of Oxford in academic year 2018–2019.
Recruitment and blinding
Students were recruited via e-mail and with two in-class oral reminders from the course programme director (DMcC) over a 21-day period. Consenting participants were assigned a study ID number (DY) and randomly allocated in equal numbers, using www.graphpad.com, to the intervention or control group (TRF). The outcome assessor (TRF) was blinded to participant identities. The other researchers (MCMcC, DN and CH) were additionally blinded to group allocation and data collection.
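The allocation itself was generated with www.graphpad.com. For illustration only, the sketch below reproduces the same 1:1 equal-allocation scheme in Python; the study IDs shown are hypothetical, not those used in the study.

```python
# Illustrative sketch only: the study used www.graphpad.com to generate the
# allocation. This reproduces a 1:1 equal-allocation randomisation in Python.
import random

def allocate_equal(study_ids, seed=None):
    """Shuffle the study IDs and split them equally between the two arms."""
    rng = random.Random(seed)
    shuffled = list(study_ids)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"intervention": shuffled[:half], "control": shuffled[half:]}

# Hypothetical IDs for the eight consenting participants (four per arm)
print(allocate_equal([f"S{i:02d}" for i in range(1, 9)], seed=2019))
```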
Intervention
The intervention group received the standard curriculum plus automated registration and delivery of the online EBM Primer. Students signed into the course on the university’s password-protected virtual learning environment (Canvas). These students were provided unlimited access to the EBM Primer from the point of randomisation until their final examination, 12 weeks later.
Characteristics of the intervention
The evidence-based healthcare (EBHC) programme at the University of Oxford sponsored the development of a 100% online EBM primer. The intervention was designed to meet the flexible learning needs of students and piloted in a healthcare education programme (Oxford’s MSc in Evidence-Based Healthcare). The online course is fully self-paced, untutored and requires 10–15 hours to complete. For a course sample and syllabus, please see content overview (online supplementary appendix A) and view https://www.youtube.com/watch?v=Eg_a3twU0cU.
Control
The control group received Oxford’s Year 4 graduate entry standard teaching curriculum until after the initial assessment (practice examination). The control group then received access to the EBM Primer 8 weeks before the final examination in June.
Outcome measures
Primary outcome measures
Numerical data were collected on the participant recruitment rate, protocol adherence after the point of randomisation, and individual and group metrics of course activity to ascertain the feasibility of the study design.
In addition, we asked students about barriers and enabling factors for participating in an RCT and completing an online course supplement. For the set of qualitative questions, see online supplementary appendix B.
Secondary outcome measures
The standard formative assessment (approximately 2 hours, computer-based examination) was completed by all study and non-study participants. Study participants completed an integrated online assessment tool of approximately 20 min in addition to the formative assessment (see online supplementary appendix C for the Assessing Competency in EBM (ACE) tool).15 The intervention group also completed a customised precourse and postcourse multiple choice questionnaire (MCQ) within the course. Then, all study participants completed the standard final examination (approximately 2 hours, computer-based examination).
Data analysis
Quantitative outcomes are reported as absolute values, percentages, and means with SDs; differences between groups were analysed using a t-test and reported as the mean difference with 95% CIs. Usage metrics and statistical analyses are limited owing to the small sample size. Questionnaire responses were collected and analysed by frequency of comments and by recurring themes (TRF and MCMcC).
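As a minimal sketch of this between-group comparison, the snippet below applies a Welch (unequal-variance) t-test to hypothetical scores; the paper states only that a t-test was used, so the Welch variant is an assumption, and the data shown are not study data.

```python
# Minimal sketch of the between-group analysis on hypothetical scores
# (not the study data): Welch t-test with mean difference and 95% CI.
import numpy as np
from scipy import stats

intervention = np.array([9.0, 12.5, 8.0, 10.5])  # hypothetical scores, n=4
control = np.array([11.0, 10.0, 12.0, 9.5])      # hypothetical scores, n=4

res = stats.ttest_ind(intervention, control, equal_var=False)  # Welch t-test
diff = intervention.mean() - control.mean()
ci = res.confidence_interval(confidence_level=0.95)  # requires SciPy >= 1.11
print(f"mean difference {diff:.1f}, 95% CI {ci.low:.1f} to {ci.high:.1f}, p={res.pvalue:.2f}")
```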
Results
Feasibility data
The study was conducted over a 24-week period from recruitment to data analysis. For the participant flow and study timeline, please refer to figure 1.
Recruitment and student participation
Of 34 eligible students, eight (24%) chose to participate and were randomly allocated to the intervention and control conditions (four in each group). Five participants were recruited in week 1, two additional participants in week 2 and one participant in week 3.
Adherence to protocol
No participant withdrew from the study. Two of the four participants allocated to the intervention group completed the MCQ at the start of the study; two intervention group participants did not access the EBM Primer at any point in the study period. No participants completed the MCQ at the end of the intervention phase. All eight participants completed the ACE tool and the summative course assessments. Three of the eight study participants completed the qualitative data portion of the study, and three of the 26 non-study participants responded to our inquiry about reasons for not participating.
Changes to protocol
We amended our protocol after the randomisation of participants to include qualitative data collection on students’ preferences in learning about EBM, and to collect information on barriers to and motivations for participating in the study. The short questionnaires were emailed to the class cohort (n=34) following their final examination in June 2019.
Accessing the EBM course
Figure 2 shows the frequency of access to the EBM Primer throughout the study period. Between 1 April and 30 April 2019, the EBM Primer was available to intervention group participants. One participant viewed the majority of pages during this period. Ahead of the final examination, the EBM Primer was made available to all study participants. Four control group participants viewed most pages. Patterns of behaviour suggested that access to the online course occurred on a single day rather than via repeat visits. Three participants accessed the course in the days just before the final examination.
Barriers and enabling factors
Three of the eight study participants and three of the 26 non-participants returned responses to the questionnaires.
Of the study participants, two highlighted the advantage of having access to an extra resource as a reason to participate, and one suggested that it could be a more enjoyable way of learning EBM than lectures. Responses to the teaching materials in the EBM Primer were mixed: one participant found it more engaging than lectures, and one reported it to be ‘a good revision aide, but not so good as a primer’. The third found the information clear but providing no new information beyond the lectures. Two participants commented that the exercises or videos were long or slow. Two participants expressed concerns over how the course’s objectives or key EBM concepts were tested, and one participant highlighted unwelcome differences between the EBM Primer-based evaluations and the students’ final examination. One participant said that having access to an online supplement might give those students an advantage ahead of the final examination and ‘felt that this was unfair’. One respondent felt the EBM course material was better positioned as a revision tool. Another participant felt the course was too long in duration.
Of three non-participant respondents, two gave insufficient time as a reason for not participating, and the other mentioned ‘no desire to engage’ in supplementary EBM teaching.
Differences in EBM skills and knowledge
Quantitative outcome results for each participant were measured using the ACE tool (online supplementary appendix C). Data are provided for preliminary analysis only.
The two intervention group participants who completed the MCQ at the start of the study scored 6.8 and 7.7 out of 10, respectively, but as neither of these completed it at the end of the intervention phase, this measure could not be analysed.
The mean (SD) of the ACE measure was 10.0 (2.4) in the intervention group and 10.5 (1.9) in the control group (mean difference −0.5, 95% CI −4.4 to 3.4) (figure 3).
For the final performance metric (examination paper 3), the mean (SD) was 66.0 (16.3) in the intervention group and 77.0 (9.8) in the control group (mean difference −11.0, 95% CI −35.5 to 13.5).
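As a check on the arithmetic, these intervals can be approximately reproduced from the reported summary statistics alone. The sketch below assumes an unequal-variance (Welch) t-interval, which closely matches the published CIs; the small residual discrepancies reflect rounding of the reported means and SDs.

```python
# Recompute mean differences and 95% CIs from the reported summary statistics
# (mean, SD, n=4 per group). A Welch (unequal-variance) interval is assumed;
# small discrepancies from the published CIs reflect rounding of the SDs.
import numpy as np
from scipy import stats

def mean_diff_ci(m1, s1, n1, m2, s2, n2, level=0.95):
    """Welch CI for the difference in means from summary statistics."""
    v1, v2 = s1**2 / n1, s2**2 / n2
    se = np.sqrt(v1 + v2)
    df = (v1 + v2) ** 2 / (v1**2 / (n1 - 1) + v2**2 / (n2 - 1))  # Welch-Satterthwaite
    diff = m1 - m2
    margin = stats.t.ppf((1 + level) / 2, df) * se
    return diff, diff - margin, diff + margin

print(mean_diff_ci(10.0, 2.4, 4, 10.5, 1.9, 4))   # ACE tool: reported -0.5 (-4.4 to 3.4)
print(mean_diff_ci(66.0, 16.3, 4, 77.0, 9.8, 4))  # Paper 3: reported -11.0 (-35.5 to 13.5)
```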
Discussion
As a feasibility RCT, this study indicates implementation challenges for medical student recruitment and adherence to an online training supplement in EBM. Our preliminary data suggest that students access online learning material in advance of an examination, and it is unclear whether supplementary training in EBM would otherwise be a priority for students.
Threats of bias
Student participants who volunteered for this study may have had systematic differences in their baseline knowledge of and interest in EBM. A potential selection bias was further indicated in the qualitative data, where one student described feeling unmotivated to participate in the study due to a lack of interest in EBM.
Our study design and its intervention were developed by members of the same research and teaching team (MCMcC, DY, CH and DN). To control for researcher or interpreter bias, the EBM Primer course director (MCMcC) and contributors (CH and DN) were not involved in recruitment, data collection or preliminary analysis, and were blinded from the point of recruitment to the end of the study. DY was not involved in analysis of the data. DMcC, as director of the programme, was involved in developing the study design and conducting recruitment.
Recommendations for improved study design
Recruitment, participation and retention
In future iterations, we recommend a longer and more aggressive (in-person) recruitment phase to encourage students to participate. Our retention rate was 100%, which we suspect was attributable to the short study duration and motivated study participants. No study participant completed the EBM supplementary course in full. Students reported time management as a significant barrier to participation, and all aspects of the study and its communications should be delivered with efficiency as a key consideration. As an alternative to individual randomisation within a medical school cohort, different class groups could be randomly allocated to receive the online EBM course versus the standard curriculum. In addition, increasing the required amount of EBM content in the medical school curriculum could increase student motivation and participation.16
Characteristics of the intervention
Implementation of an online supplement requires minimal input from a teaching or administrative standpoint. An online intervention carries the advantage of standardising content delivery, making it amenable to measurement within an RCT framework. Additional information is required to explore how this online course should be modified to meet students’ preferences in terms of content delivery and duration. Patterns of behaviour indicate the content was more important to study participants in advance of an examination.
Pre-existing knowledge of EBM
Earlier validation studies for the ACE tool report that ‘EBM-novice’ participants averaged 8.6 (SD 2.4), ‘EBM-intermediate’ averaged 9.5 (SD 1.8) and ‘EBM-advanced’ averaged 10.4 (SD 2.2).15 Our test results indicate that seven of eight study participants from Oxford’s graduate entry year 4 medical student cohort had an existing intermediate/advanced knowledge base in EBM. Other medical schools, or students earlier in their training, may present with lower baseline knowledge and would therefore present an opportunity for a greater increase in knowledge acquisition.
Conclusion
The conduct of an RCT examining the effect of an online supplement is potentially feasible. Adequate recruitment and adherence of study participants will need consideration in each context, focusing on strategies that will increase perceived benefits and encourage student interest but not threaten validity or reproducibility of results. We support the development of evidence-based teaching for EBM and online education interventions. Medical educators should pursue this research agenda and seek data to understand what works in terms of quantitative metrics for effective EBM knowledge acquisition, as well as in terms of qualitative data for student preferences.
Data availability statement
Data are available upon reasonable request. Contact medical statistician Thomas Fanshawe for deidentified participant data at: thomas.fanshawe@phc.ax.ac.uk
Ethics statements
Patient consent for publication
Ethics approval
The study was approved by Oxford’s Central University Research Ethics Committee (CUREC) as lower-risk research involving human participants and/or their data (R62369).
Acknowledgments
We are grateful to the student volunteers and faculty of the University of Oxford’s EBHC programme, whose support of and interest in evidence-based practice made this study possible.
Footnotes
Twitter @dnunan79, @carlheneghan
Contributors Authors made the following valuable contributions to the research and manuscript: MCMcC: protocol development, intervention design plus implementation, data analysis, draft and submit manuscript. TRF: protocol development, data analysis, draft manuscript. DMcC: protocol development, recruitment, communications with study participants, manuscript review. DY: protocol development, communications with study participants, data collection, manuscript review. DN: protocol advice, manuscript review. CH: protocol advice, ethics application and manuscript review.
Funding The intervention was developed using University of Oxford’s EBHC programme funds and volunteer efforts. Research was self-funded. Authors have not received funding or compensation beyond their usual University roles. No competitive industry inputs or commercial interests are known.
Competing interests The online course intervention design and its implementation were a joint initiative of MCMcC and CH. DMcC and DY work within the medical sciences division to administer teaching in the medical school programme. DN and MCMcC have presented this course as part of a sample curriculum in teaching evidence-based medicine. CH, TRF and DN are paid faculty who teach on the evidence-based health care (EBHC) programme, Oxford University. MCMcC is also part-time lecturer on the EBHC programme.
Provenance and peer review Not commissioned; externally peer reviewed.