Objective: To assess agreement between two panels, one composed of academic researchers (Delphi) and the other of academic and clinical neonatologists (clinician), when selecting quality measures for inclusion in a composite index of neonatal intensive care quality (Baby-MONITOR).
Study design: In a modified Delphi process, a panel rated 28 quality measures. We assessed clinician agreement with the Delphi panel by surveying a sample of 48 neonatal intensive care practitioners. We asked the clinician group to indicate, for each measure, their level of agreement with the Delphi panel's rating using a five-point scale (much too high, slightly too high, reasonable, slightly too low, and much too low). In addition, we asked clinicians to vote yes or no on including each measure in the Baby-MONITOR, with inclusion requiring a pre-specified two-thirds majority.
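For concreteness, the short Python sketch below illustrates how such a pre-specified two-thirds majority rule could be applied to yes/no vote tallies; the measure names and vote counts are hypothetical and are not drawn from the study.

# Minimal sketch (illustrative only) of a two-thirds majority inclusion rule.
# Measure names and vote counts below are hypothetical, not study data.

THRESHOLD = 2 / 3  # pre-specified two-thirds majority required for inclusion

def include_measure(yes_votes: int, no_votes: int) -> bool:
    """Return True when the share of 'yes' votes meets the two-thirds threshold."""
    total = yes_votes + no_votes
    return total > 0 and yes_votes / total >= THRESHOLD

# Hypothetical tallies for two of the rated measures.
example_tallies = {
    "measure_A": (20, 3),   # 20 yes, 3 no  -> included
    "measure_B": (12, 11),  # 12 yes, 11 no -> excluded
}

for name, (yes, no) in example_tallies.items():
    decision = "include" if include_measure(yes, no) else "exclude"
    print(f"{name}: {yes}/{yes + no} yes votes -> {decision}")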
Result: In all, 23 of the 48 clinicians (47.9%) responded to the survey. We found high levels of agreement between the Delphi and clinician panels, particularly for the measures selected for the Baby-MONITOR. Clinicians selected the same nine measures as the Delphi panel for inclusion in the composite. For these nine measures, 74% of clinicians indicated that the Delphi panel's rating was 'reasonable'.
Conclusion: Practicing clinicians agree with an expert panel on the measures that should be included in the Baby-MONITOR, which enhances the face validity of the composite.