Expanding the evidence within evidence-based healthcare: thinking about the context, acceptability and feasibility of interventions
  1. Rachel L Shaw1,
  2. Michael Larkin2,
  3. Paul Flowers3
  1. School of Life and Health Sciences, Aston University, Birmingham, West Midlands, UK
  2. School of Psychology, University of Birmingham, Edgbaston, Birmingham, West Midlands, UK
  3. Institute for Applied Health Research, Glasgow Caledonian University, Glasgow, UK
  Correspondence to: Dr Rachel L Shaw, School of Life and Health Sciences, Aston University, Birmingham, WM B4 7ET, UK; r.l.shaw@aston.ac.uk


Delivering evidence-based healthcare is a complex interpersonal process

Evidence-based healthcare (EBHC) depends on collating research evidence, communicating findings and translating them into best-practice guidance that can be implemented in real-world practice. Cochrane's definition of EBHC highlights the centrality of the clinician–evidence relationship in bridging the gap between research and practice (see: http://www.cochrane.org/about-us/evidence-based-health-care#REF1). How clinicians feel about evidence can affect the fidelity with which healthcare interventions are implemented.1 As such, there are critical differences between evidence of efficacy (what works under controlled conditions) and evidence of effectiveness (what works in routine practice). All evidence-based interventions, whether biomedical, social or structural, involve interpersonal processes and are “delivered in the context of an encounter between a health professional and a patient, making healthcare professional clinical behaviours an important proximal determinant of the quality of care that patients receive.”2

Thus, interpersonal relationships and communication are fundamental to implementation science. This jars with most understandings of the role of evidence in guideline formation, which disproportionately privilege large-scale population-based studies. Such studies are vital tests of efficacy and cost-effectiveness, yet they tell us little about implementation. The current evidence hierarchy used to shape guidance production (eg, by the National Institute for Health and Care Excellence (NICE) and the Scottish Intercollegiate Guidelines Network (SIGN)) struggles to incorporate qualitative and mixed methods research, and thus lacks systematic analyses of the context and experience of implementing interventions. Clinicians and commissioners1 need to understand the interactions, relationships and sociocultural contexts that shape the acceptability and meaningfulness of healthcare interventions. To complete the cycle of translating findings into ‘practice-ready’ guidance, it is necessary to consider the human systems within which an intervention is to be implemented. This is an iterative process, as described in the Medical Research Council's (MRC) framework for developing complex behavioural interventions (see: http://www.mrc.ac.uk/Utilities/Documentrecord/index.htm?d=MRC004871).

A complete evidence base must inform healthcare guidance. This should develop from a pluralistic model of research; address translational and implementation questions; draw upon different methods for different questions; and provide diverse evidence on which to base best-practice guidance.

What is missing from conventional accounts of how ‘best evidence’ informs practice?

We know that within the hierarchy of evidence, systematic reviews of randomised controlled trials (RCTs) sit at the top as the ‘gold standard’ and are key to informing guidance.3 This is unsurprising because they report on the efficacy and/or cost-effectiveness of interventions in terms which provide clear messages for policy-makers. However, as Sackett's discussion of evidence-based medicine made clear, an RCT may not provide coherent advice for a clinician faced with an individual patient.4 Instead, Sackett emphasised the need for both clinical expertise and current best evidence, because without both, practice suffers to the detriment of patients. The best course is to balance insights from cumulative knowledge with a negotiated, tailored approach to each patient.

While the traditional hierarchy of evidence is vital in understanding cumulative knowledge and healthcare delivery, it is not without its shortcomings. Its necessary focus on objective, measurable and controllable ways of looking at healthcare at a population level means that social context and individual experience are discounted; the patient and the clinician are both reduced to cyphers. To be useful, best-practice guidance must be adaptable to real-world practice scenarios, that is, it must take account of the cultural and psychosocial context. This is the task of translational research, which emphasises working within real-world environments, partnerships and stakeholder consultations, and often involves qualitative and/or mixed methods designs.5,6

Incorporating experience and context into healthcare evidence

There has been a substantial rise in the use of qualitative and mixed methods research in medical and healthcare settings over the past 20 years, which has provided great insight into the value and accessibility of healthcare services.7,8 Furthermore, it has shown us how patients make sense of their own health, whether the health information provided is clear and appropriate, and whether healthcare professionals feel competent when explaining the complex risks associated with diagnosis or treatment regimens.9,10

Some ground has been gained in recent moves by funding bodies, particularly the UK government funding programme, the National Institute for Health Research (NIHR), to foreground patient perspectives in the design and conduct of research through patient, carer and public involvement (PCPI; eg, http://www.crncc.nihr.ac.uk/about_us/stroke_research_network/in_your_area/south_west/Patients_carers). A review of patient involvement has demonstrated that it can be cost-effective and that it has been adopted in a range of research designs, including RCTs, naturalistic studies, and participatory and political action research.11,12 As such, PCPI has affirmed the value of patient and carer perspectives. Nevertheless, it remains an initiative to involve patients in the research process rather than one that expands the types of methods used in designing services. Patient perspectives remain absent from the process by which evidence becomes guidance. Although patients increasingly help set research agendas, unless their perspectives are turned into high-quality peer-reviewed papers, they will not shape guidance production.

The added value of qualitative and mixed methods research

To incorporate experience and context into the evidence base alongside efficacy and cost-effectiveness, and to provide useful best-practice guidance that can deal with multiple formulations at the population, systemic and individual levels, we require a wider range of research questions and methods and a more balanced view of their value. Mixed methods designs and qualitative research offer systematic ways of exploring the sociocultural context within which healthcare services must be designed and delivered; they provide means of gathering and analysing patients’ perspectives, features of interactions between patients and practitioners, and the language used in health information, as well as tools to understand patient expectations and satisfaction; and they can help us understand key relationships at the level of family, community, health and social care organisations and government. NICE's conceptual framework puts the patients’ lifeworld at the centre.13 The lifeworld refers to all of the socioeconomic, cultural and historical aspects of our lives; it includes our relationships, our geography and our ability to take control, or have agency, in our interactions with healthcare providers and policy-makers. The centrality of the lifeworld requires that we be open to a broad range of research questions and designs and that we use the best-fit methods for the questions we ask. This may involve multiple qualitative methods, qualitative and quantitative methods, or more than one quantitative method.14 The outcomes of such research can help us understand not only the role of experience and context in the implementation of interventions but also how to develop future interventions and future evaluations of their effectiveness. They are, therefore, likely to be important to the users of research.15 Critically, then, the centrality of the lifeworld means that we must be open to including the outcomes of such research in syntheses of evidence and in the development of guidance to inform best practice.

Systematic reviews of diverse evidence

The value of qualitative research is clear, but there remains a gap between high-quality primary qualitative research and its systematic review for inclusion in good-practice guidance.16 That is not to say that methods for systematically reviewing qualitative evidence do not exist.17 Furthermore, the UK bodies NICE and SIGN have declared their commitment to qualitative evidence as essential, alongside more ‘traditional’ (ie, quantitative) sources of evidence, in understanding healthcare and subsequently in producing guidelines for best clinical practice. For this work to be translated into the policy and practice world, we need a clear methodology. The Cochrane Qualitative and Implementation Methods Group in the UK (see: http://cqim.cochrane.org/) and the Patient-Centered Outcomes Research Institute in the USA (see: http://www.pcori.org/) have begun this work. Significant advances have been made in appraisal tools for diverse evidence18 but focused efforts are still required to establish rigorous methods for synthesising quantitative and qualitative data. Integrative and aggregative synthesis methods have been proposed.17 The EPPI (Evidence for Policy and Practice Information and Co-ordinating Centre at the Institute of Education, London) approach involves the parallel synthesis of, for example, intervention studies (quantitative methods) and perspectives studies (qualitative methods). This is not an integrative approach because findings from studies with different designs are dealt with separately before being brought together in a ‘mosaic’ to answer the research question.19 A recent proposal for ‘best-fit’ framework synthesis generates an a priori framework from literature describing conceptual models or theories; deductive coding against that framework and inductive, data-driven coding, following the principles of thematic analysis, are then used to perform the synthesis.20 While currently proposed for the synthesis of qualitative evidence alone, this integrative method has the potential to be adapted for use with diverse evidence.
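
To make the deductive and inductive steps of ‘best-fit’ framework synthesis concrete, the sketch below expresses the logic as a minimal, purely illustrative procedure. It is not software described in the work cited here, and real coding is an interpretive, analytic judgement rather than a lookup; the framework themes, study findings and the new theme label are hypothetical examples chosen only to show how findings are coded against an a priori framework and how data that do not fit generate new, data-driven themes.

# Illustrative sketch only: 'best-fit' framework synthesis is a qualitative
# method, not a software procedure. All theme labels and findings below are
# hypothetical, chosen to make the deductive/inductive coding logic concrete.

# A priori framework derived from existing conceptual models or theories.
a_priori_framework = ["acceptability", "feasibility", "sociocultural context"]

# Hypothetical findings extracted from primary studies, each paired with the
# framework theme it was coded to (None = does not fit the framework).
study_findings = [
    ("Patients valued continuity of care", "acceptability"),
    ("Staff lacked time to deliver the intervention", "feasibility"),
    ("Family expectations shaped uptake", "sociocultural context"),
    ("Clinicians distrusted the underlying evidence", None),
]

# Deductive step: code findings against the a priori framework.
synthesis = {theme: [] for theme in a_priori_framework}
unassigned = []
for finding, theme in study_findings:
    if theme in synthesis:
        synthesis[theme].append(finding)
    else:
        unassigned.append(finding)

# Inductive step: findings that do not fit generate new, data-driven themes
# (a single hypothetical theme stands in here for that analytic work).
if unassigned:
    synthesis["clinician trust in evidence (new theme)"] = unassigned

for theme, findings in synthesis.items():
    print(f"{theme}: {findings}")

The point of the sketch is simply that the a priori framework is retained where it fits and extended where it does not, which is what makes the method a candidate for integrating diverse forms of evidence.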

Once a rigorous methodology for the systematic review and synthesis of diverse evidence is established, the next challenge is determining how such evidence should be represented in best-practice guidance. As indicated above, qualitative and mixed methods evidence will guide practitioners in their everyday encounters with patients in individual or group settings. It will also inform strategies for designing interventions that are feasible in different contexts and acceptable to patients across a range of socioeconomic and geographical groupings. In short, incorporating diverse evidence into EBHC will produce guidance encompassing best practice at the population, systemic and individual levels.

References

Footnotes

  • Competing interests None.

  • 1 UK healthcare is devolved to member states. In England healthcare services are commissioned by the NHS Commissioning Board whose role is to allocate funds to deliver the best possible care to patients. This is supported by regional Clinical Commissioning Groups comprising local practitioners whose responsibility it is to ensure that local services meet local needs (see: http://webarchive.nationalarchives.gov.uk/20130805112926/http://healthandcare.dh.gov.uk/system/).