Optimizing physician skill development for medical students: The four-part assessment

Am J Surg. 2017 May;213(5):906-909. doi: 10.1016/j.amjsurg.2017.03.026. Epub 2017 Apr 5.

Abstract

Background: Medical student performance correlates poorly with subsequent residency performance and warrants further investigation. We propose a novel surgical assessment tool and examine its correlation with clinical aptitude.

Methods: Retrospective review of medical student assessments from 2013 to 2015. Faculty rated student performance in four domains: 1) case presentation, 2) problem definition, 3) question response, and 4) use of literature; these ratings were correlated with final examination scores. Interrater reliability of the Likert-scale ratings was evaluated.

Results: Sixty student presentations were scored (mean 4.8 assessors per presentation). A student's case presentation, problem definition, and question response were correlated with performance (r = 0.49 to 0.61, p ≤ 0.003). Moderate correlations were demonstrated for question response and use of literature (r = 0.3 and 0.26, p < 0.05).
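The abstract does not publish the underlying analysis code or name the exact reliability statistic used; as a rough illustration only, the sketch below shows how a sub-score/examination correlation and a simple interrater agreement measure might be computed on invented Likert-scale data. All variable names, the simulated scores, and the choice of mean pairwise correlation as the agreement measure are assumptions for illustration, not the authors' method.

```python
# Hypothetical illustration only: data and statistics below are invented.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)

# Simulated Likert-style sub-scores (1-5) for 60 presentations, averaged across raters.
n = 60
case_presentation = rng.integers(1, 6, n).astype(float)
exam_grade = 70 + 5 * case_presentation + rng.normal(0, 5, n)  # fabricated exam scores

# Correlation between one sub-score and examination performance.
r, p = pearsonr(case_presentation, exam_grade)
print(f"case presentation vs. exam: r = {r:.2f}, p = {p:.3g}")

# Rough interrater agreement: mean pairwise Pearson correlation across three
# hypothetical raters scoring the same presentations.
raters = np.clip(case_presentation + rng.normal(0, 0.5, (3, n)), 1, 5)
pairwise = [pearsonr(raters[i], raters[j])[0]
            for i in range(3) for j in range(i + 1, 3)]
print(f"mean pairwise interrater r = {np.mean(pairwise):.2f}")
```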

Conclusion: Our four-part assessment tool identified correlations with course and examination grades for medical students. As surgical education evolves, validated, reliable performance measures are required.

Publication types

  • Evaluation Study

MeSH terms

  • Aptitude Tests*
  • Aptitude*
  • Clinical Competence
  • Education, Medical, Undergraduate*
  • Educational Measurement / methods*
  • General Surgery / education*
  • Humans
  • Oregon
  • Retrospective Studies
  • Single-Blind Method
  • Students, Medical / psychology*