Results of Objective Structured Clinical Examinations (OSCEs) for Assessment of Clinical Competence

Nancy Krusen, Debra Rollins

The presentation reports outcomes of a first-trial objective structured clinical examination (OSCE) used to assess clinical competence. An OSCE comprises multiple brief stations, each assessing a different clinical practice skill. The presentation analyzes the educational value of the OSCE as a performance-based tool and supports a culture of learning by assuring skill prior to clinical practice placement. We describe task-specific checklist and global scores, descriptive statistics for seventeen OSCE stations, descriptive statistics for learner performance, phenomenological analysis of learner and rater feedback, and plans for additional research. Through formal presentation, small group discussion, and large group sharing, learners will be able to differentiate between skill-specific and overall rating scales, weigh the reliability and validity of OSCE use, and identify additional resources for OSCE implementation.

Harden and Gleeson (1979) first outlined OSCEs for use in medical school assessment to supplement traditional didactic and clinical examinations. OSCEs are used frequently across the health professions to demonstrate competence, conduct program evaluation, and indicate compliance with educational standards. We grounded the OSCE in transformative learning (Mezirow, 1981), through which students transform old knowledge by reflecting on new experiences, and in situated learning (Lave & Wenger, 1991), through which faculty design the just-right challenge at the just-right time.

Faculty from a School of Occupational Therapy unanimously identified the need for a performance-based measure of clinical competence, beyond traditional didactic or clinical examinations, prior to clinical placement. Faculty members preferred a measure that would be formative for student learning and summative for program evaluation. (Development of the OSCE is reported elsewhere.) Each student in a cohort (n = 40) completed a rotation of seventeen OSCE stations in competence areas matching those of the national Fieldwork Performance Evaluation. The OSCE supports a culture of learning across the curriculum, with long-term impact in assuring quality for the public.

The authors will present a quantitative analysis of learner performance comparing item-specific task checklists with global scores for each OSCE station, an analysis of self-completed student survey data, and an analysis of station-specific difficulty data. The authors will also present a qualitative textual analysis of learner and rater feedback. To inform educational practice, the authors recommend and plan further research on the psychometric properties of the OSCE as a measure of competence, including rater reliability; correlation with other measures of clinical competence; correlation of performance with curricular courses; predictive value for clinical performance; and the relationship between OSCE performance, classroom performance, and clinical training performance.