Background and objective: Randomized clinical trials (RCTs) provide the most reliable estimates of treatment effectiveness for therapeutic interventions. However, flaws in their design and conduct may bias treatment effect estimates, leading to overestimation or underestimation of the true intervention effect. This is especially relevant for complex interventions, such as those in rehabilitation, which are multifaceted and tailored to individual patients or providers, leading to variation in delivery and in treatment effects. Our objective was to assess whether poor intervention fidelity (i.e., the faithfulness of the intervention delivered in an RCT to what was intended in the trial protocol) influences (biases) estimates of treatment effects derived from meta-analyses of rehabilitation RCTs.
Methods: In this meta-epidemiological study of 19 meta-analyses comprising 204 RCTs published between 2010 and 2020, we evaluated the difference in intervention effects between RCTs in which intervention fidelity was monitored and those in which it was not. We also conducted random-effects meta-regression to measure associations between intervention fidelity, risk of bias, study sample size, and treatment effect estimates.
Results: There was a linear relationship between fidelity and treatment effect sizes across RCTs, even after adjusting for risk of bias and study sample size. Higher degrees of fidelity were associated with smaller but more precise treatment effect estimates (d = -0.23; 95% CI: -0.38, -0.74). Lower or absent fidelity was associated with larger, less precise estimates. Adjusting for fidelity reduced pooled treatment effect estimates in 4 meta-analyses from moderate to small, or from small to negligible or no effect, highlighting how poor fidelity can bias the results of meta-analyses.
Conclusion: Poor or absent intervention fidelity in RCTs may lead to overestimation of observed treatment effects, skewing the conclusions of individual studies and of systematic reviews with meta-analyses when results are pooled. Caution is needed when interpreting the results of complex intervention RCTs when fidelity is not monitored, or is monitored but not reported.
Plain language summary: Patients, the public, and health-care providers rely on clinical trials for information about how effective treatments are when making decisions about health care. However, the way that clinical trials are conducted may alter the evidence they provide about how effective interventions truly are. In this study, we investigated whether the results of clinical studies of rehabilitation treatments are influenced by how closely health-care providers monitor the delivery of those treatments, and by how closely the treatments delivered match what the researchers had planned. We found that when researchers or health-care providers do not closely monitor how treatments are delivered during a study, that study may provide exaggerated estimates of the effectiveness of the treatments studied. This is important because some health-care providers and patients may opt for treatments that are less effective than they appeared in clinical studies, or may overlook treatments that are more effective than they appeared in other studies.
Keywords: Adherence; Bias; Complex interventions; Compliance; Integrity; Intervention fidelity; Treatment effect estimates.
Copyright © 2024 The Authors. Published by Elsevier Inc. All rights reserved.