Empirical evidence of bias. Dimensions of methodological quality associated with estimates of treatment effects in controlled trials

JAMA. 1995 Feb 1;273(5):408-12. doi: 10.1001/jama.273.5.408.

Abstract

Objective: To determine whether inadequate approaches to randomized controlled trial design and execution are associated with evidence of bias in estimating treatment effects.

Design: An observational study in which we assessed the methodological quality of 250 controlled trials from 33 meta-analyses and then analyzed, using multiple logistic regression models, the associations between those assessments and estimated treatment effects.
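
The following is a minimal sketch, not the authors' analysis, of the kind of logistic regression described above: a model with a treatment-by-concealment interaction, where the exponentiated interaction coefficient is the ratio of odds ratios (ROR) comparing inadequately with adequately concealed trials. The column names and counts are invented for illustration; the actual study additionally stratified by trial and meta-analysis and adjusted for the other quality dimensions.

    # Hedged sketch: logistic regression with a treatment-by-concealment interaction.
    # exp(interaction coefficient) is the ratio of odds ratios (ROR); an ROR below 1
    # means the inadequately concealed trials show exaggerated treatment effects.
    # All data below are hypothetical.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rows = []
    for concealment, treatment, event, n in [
        (0, 1, 1, 4), (0, 1, 0, 6),   # adequately concealed, treated arm
        (0, 0, 1, 6), (0, 0, 0, 4),   # adequately concealed, control arm
        (1, 1, 1, 2), (1, 1, 0, 8),   # inadequately concealed, treated arm
        (1, 0, 1, 7), (1, 0, 0, 3),   # inadequately concealed, control arm
    ]:
        rows += [{"inadequate_concealment": concealment,
                  "treatment": treatment, "event": event}] * n
    df = pd.DataFrame(rows)

    fit = smf.logit("event ~ treatment * inadequate_concealment", data=df).fit(disp=0)
    ror = np.exp(fit.params["treatment:inadequate_concealment"])
    print(f"ratio of odds ratios: {ror:.2f}")  # < 1 here: exaggerated apparent benefit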

Data sources: Meta-analyses from the Cochrane Pregnancy and Childbirth Database.

Main outcome measures: The associations between estimates of treatment effects and inadequate allocation concealment, exclusions after randomization, and lack of double-blinding.

Results: Compared with trials in which authors reported adequately concealed treatment allocation, trials in which concealment was either inadequate or unclear (did not report or incompletely reported a concealment approach) yielded larger estimates of treatment effects (P < .001). Odds ratios were exaggerated by 41% for inadequately concealed trials and by 30% for unclearly concealed trials (adjusted for other aspects of quality). Trials in which participants had been excluded after randomization did not yield larger estimates of effects, but that lack of association may be due to incomplete reporting. Trials that were not double-blind also yielded larger estimates of effects (P = .01), with odds ratios being exaggerated by 17%.
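
As a rough reading of those percentages (an interpretation, not stated explicitly in the abstract), "exaggerated by X%" corresponds to a ratio of odds ratios of about 1 - X/100, so the same underlying effect appears more favorable when trial quality is poor. A small sketch using a hypothetical "unbiased" odds ratio of 0.80:

    # Hypothetical illustration of the reported exaggeration percentages, assuming
    # "exaggerated by X%" means the odds ratio is multiplied by (1 - X/100).
    def exaggerated_or(true_or: float, pct: float) -> float:
        """Odds ratio expected from a flawed trial under a multiplicative bias model."""
        return true_or * (1 - pct / 100)

    true_or = 0.80  # invented 'unbiased' odds ratio favoring treatment
    for label, pct in [("inadequate concealment", 41),
                       ("unclear concealment", 30),
                       ("no double-blinding", 17)]:
        print(f"{label}: {exaggerated_or(true_or, pct):.2f}")
    # inadequate concealment: 0.47, unclear concealment: 0.56, no double-blinding: 0.66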

Conclusions: This study provides empirical evidence that inadequate methodological approaches in controlled trials, particularly those representing poor allocation concealment, are associated with bias. Readers of trial reports should be wary of these pitfalls, and investigators must improve their design, execution, and reporting of trials.

Publication types

  • Meta-Analysis
  • Research Support, Non-U.S. Gov't
  • Research Support, U.S. Gov't, P.H.S.

MeSH terms

  • Bias*
  • Clinical Protocols
  • Models, Statistical
  • Quality Control
  • Randomized Controlled Trials as Topic / standards*
  • Randomized Controlled Trials as Topic / statistics & numerical data
  • Research Design