Tools for direct observation and assessment of clinical skills of medical trainees: a systematic review

JAMA. 2009 Sep 23;302(12):1316-26. doi: 10.1001/jama.2009.1365.

Abstract

Context: Direct observation of medical trainees with actual patients is important for performance-based clinical skills assessment. Multiple tools for direct observation are available, but their characteristics and outcomes have not been compared systematically.

Objectives: To identify observation tools used to assess medical trainees' clinical skills with actual patients and to summarize the evidence of their validity and outcomes.

Data sources: Electronic literature search of PubMed, ERIC, CINAHL, and Web of Science for English-language articles published between 1965 and March 2009, and review of references from article bibliographies.

Study selection: Included studies described a tool designed for educational supervisors' direct observation of medical trainees' clinical skills with actual patients. Tools used only in simulated settings or designed to assess surgical or procedural skills were excluded. Of 10 672 citations, 199 articles were reviewed, and 85 met inclusion criteria.

Data extraction: Two authors independently abstracted studies using a modified Best Evidence Medical Education coding form to inform judgment of key psychometric characteristics. Differences were reconciled by consensus.

Results: A total of 55 tools were identified. Twenty-one tools were studied with students and 32 with residents or fellows. Two were used across the educational continuum. Most (n = 32) were developed for formative assessment. Rater training was described for 26 tools. Only 11 tools had validity evidence based on internal structure and relationships to other variables. Trainee or observer attitudes about the tool were the most commonly measured outcomes. Self-assessed changes in trainee knowledge, skills, or attitudes (n = 9) or objectively measured changes in knowledge or skills (n = 5) were infrequently reported. The strongest validity evidence has been established for the Mini-Clinical Evaluation Exercise (Mini-CEX).

Conclusion: Although many tools are available for the direct observation of clinical skills, validity evidence and descriptions of educational outcomes are scarce.

Publication types

  • Research Support, Non-U.S. Gov't
  • Review
  • Systematic Review

MeSH terms

  • Clinical Competence* / standards
  • Education, Medical, Graduate / methods*
  • Humans
  • Internship and Residency / methods*
  • Observation*
  • Patient Simulation*
  • Quality Assurance, Health Care*
  • Reproducibility of Results
  • Students, Medical* / statistics & numerical data