Real Time Evaluation in Emergency Medicine Training
1
Real Time Evaluation in Emergency Medicine Training
  • Philip Shayne, MD
  • Program Director
  • Emory University
  • CORD-EM Best Practices Conference
  • February 23, 2003

2
Goal
  • Explore the potential for using direct
    observation (real-time) evaluations in assessing
    the competency of emergency medicine learners.

3
Objectives
  • 1. List the ACGME core competencies properly
    assessed by direct observation.
  • 2. Describe the potential, advantages, and
    pitfalls of real-time evaluations.
  • 3. Describe the differences between checklist
    evaluations and global ratings of live
    performance.
  • 4. Practice the use of a direct observation
    assessment tool.
  • 5. Critique the practicality and validity of
    real-time instruments.

4
A framework for authenticity of clinical
assessment (Miller's pyramid, highest to lowest)
  • Does
  • Shows how
  • Knows how
  • Knows
Miller GE. The assessment of clinical
skills/competence/performance. Academic Medicine
(Supplement). 1990;65:S63-S67.
5
Hierarchy of clinical assessment
  • Knows (knowledge base): MCQ
6
Hierarchy of clinical assessment
  • Knows how (competence): Oral exams
  • Knows (knowledge base)
7
Hierarchy of clinical assessment
  • Shows how (performance): Simulations, procedure
    labs, OSCEs/SPs
  • Knows how (competence)
  • Knows (knowledge base)
8
Hierarchy of clinical assessment
  • Does (action): Checklist evaluation, global
    ratings, 360-degree evaluation
  • Shows how (performance)
  • Knows how (competence)
  • Knows (knowledge base)
10
Practice Assessment Methods
  • Indirect methods
    - Chart audit (record review or CSR)
    - Clinical work samples (practice audits)
    - Portfolios
  • Direct methods
    - Clinical supervisor ratings (direct observation)
    - Video assessment
    - Patient surveys
    - 360-degree evaluations

11
Assessment Tool Benchmarks
  • Directly observe the candidate interacting in a
    clinical situation
  • Allow frequent sampling (across clinical
    situations and examiners)
  • Global/generic scales (comparable)
  • Reliable
  • Valid
  • Cost and time efficient
  • Not interfere with patient care
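This benchmark list is reapplied to each tool discussed later (the ABIM
mini-CEX, MetroHealth direct observation, and the Emory CEE). As a minimal
Python sketch of that comparison framework, the scorecard below leaves every
judgment as None rather than guessing which benchmarks each tool actually
satisfies.

    # The seven benchmarks from this slide, as data.
    BENCHMARKS = [
        "Directly observe the candidate in a clinical situation",
        "Allow frequent sampling (across situations and examiners)",
        "Global/generic scales (comparable)",
        "Reliable",
        "Valid",
        "Cost and time efficient",
        "Not interfere with patient care",
    ]

    TOOLS = ("ABIM mini-CEX", "MetroHealth DO", "Emory CEE")

    # Empty scorecard: tool -> benchmark -> met? Judgments stay None
    # until someone appraises the tool against each benchmark.
    scorecard = {tool: {b: None for b in BENCHMARKS} for tool in TOOLS}

    def rate(tool: str, benchmark: str, met: bool) -> None:
        """Record a judgment about whether a tool meets one benchmark."""
        scorecard[tool][benchmark] = met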

13
Barriers to practice assessment
  • Patients are not standardized
  • Observers are not standardized
  • Hawthorne effect

14
Common Rating Errors
  • Leniency / Severity error (hawks and doves)
  • Range restriction error (central tendency)
  • Halo effect
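These errors become detectable with simple arithmetic once ratings
accumulate. A minimal Python sketch follows, using hypothetical ratings
invented for illustration: a rater's mean relative to the pool flags leniency
or severity, and a small standard deviation flags range restriction.
(Detecting a halo effect would require correlating one rater's scores across
separate dimensions, which this sketch omits.)

    from statistics import mean, stdev

    # Hypothetical 1-10 global ratings from three faculty raters (invented data).
    ratings = {
        "rater_hawk": [3, 4, 2, 3, 4, 3],    # severity: consistently low
        "rater_dove": [9, 8, 9, 10, 9, 9],   # leniency: consistently high
        "rater_mid":  [5, 5, 5, 6, 5, 5],    # range restriction: hugs the middle
    }

    overall = mean(s for scores in ratings.values() for s in scores)

    for rater, scores in ratings.items():
        bias = mean(scores) - overall   # positive = dove (lenient), negative = hawk
        spread = stdev(scores)          # near-zero spread suggests central tendency
        print(f"{rater}: bias={bias:+.2f}, spread={spread:.2f}")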

15
ACGME Toolbox of Assessment Methods: Direct
Practice Assessment
  • Global Rating of Live or Record Performance
  • Checklist Evaluation of Live or Recorded
    Performance

16
CORD EM Consensus
  • Global Ratings are the primary assessment
    method for
    - Systems-Based Practice
  • Global Ratings are a secondary assessment
    method for
    - Patient Care
    - Medical Knowledge

17
CORD EM Consensus
  • Checklist evaluations are the primary
    assessment method for
    - Patient Care
    - Medical Knowledge
    - Interpersonal and Communication Skills
    - Professionalism
    - Systems-Based Practice
  • Checklist evaluations are partially helpful in
    assessing Practice-Based Learning

18
Global Rating of Live Performance
  • Described by the ACGME as
    - Resident is rated in general categories of
      ability
    - Completed retrospectively
    - Contains scales to rate performance (e.g., 1-10)
  • Not heavily favored by the ACGME as a best
    method for evaluation

19
Checklist Evaluation of Live Performance
  • Checklist of essential or desired behaviors,
    activities, or steps that make up a more complex
    competency
  • Completed in real time
  • Judged instead of scaled (e.g., pass/fail)
  • Highly favored method of assessment by the
    ACGME and CORD EM Consensus (see the sketch
    below contrasting the two formats)
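To make the checklist-versus-global-rating contrast concrete, here is a
minimal Python sketch; the class, item, and category names are illustrative
assumptions, not the actual ACGME, CORD-EM, or ABIM instruments.

    from dataclasses import dataclass, field

    @dataclass
    class ChecklistEvaluation:
        """Real-time checklist: each essential behavior is judged pass/fail."""
        items: dict = field(default_factory=dict)

        def record(self, behavior: str, passed: bool) -> None:
            self.items[behavior] = passed

    @dataclass
    class GlobalRating:
        """Retrospective rating: general categories scored on a scale (e.g., 1-10)."""
        categories: dict = field(default_factory=dict)

        def rate(self, category: str, score: int) -> None:
            if not 1 <= score <= 10:
                raise ValueError("score must be on the 1-10 scale")
            self.categories[category] = score

    # Usage: the checklist is filled in during the encounter...
    checklist = ChecklistEvaluation()
    checklist.record("elicited chief complaint", True)   # hypothetical item
    checklist.record("performed focused exam", False)    # hypothetical item

    # ...while the global rating is completed afterward.
    overall = GlobalRating()
    overall.rate("clinical judgment", 7)                 # hypothetical category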

20
Tools We Can Use
21
ABIM mini-CEX
  • 15-20 minute snapshot of clinical performance
  • Performed in real time
  • Scales correlate easily with ACGME competencies
  • Validated based on 4 performances per year

American Board of Internal Medicine. Mini-CEX
Pilot Project. http://www.abim.org/MiniCEX/default.htm, 2001.
22
Mini-CEX scales and the ACGME competencies
  • Mini-CEX scales
    - Medical Interviewing
    - Physical Exam
    - Humanistic Qualities
    - Clinical Judgement
    - Counseling Skills
  • ACGME competencies
    - Patient Care
    - Professionalism
    - Medical Knowledge
    - Communication
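The slide lists the mini-CEX scales beside the ACGME competencies without
pairing them item by item, so the specific mapping below is an assumption for
illustration only. A rollup like this is one way scale scores could be
aggregated into competency-level averages.

    # ASSUMPTION: one plausible scale-to-competency alignment; the slide
    # does not pair the items explicitly.
    MINI_CEX_TO_ACGME = {
        "Medical Interviewing": ["Patient Care", "Communication"],
        "Physical Exam": ["Patient Care"],
        "Humanistic Qualities": ["Professionalism"],
        "Clinical Judgement": ["Medical Knowledge", "Patient Care"],
        "Counseling Skills": ["Communication"],
    }

    def competency_averages(scale_scores):
        """Average mini-CEX scale scores up to ACGME competency level."""
        buckets = {}
        for scale, score in scale_scores.items():
            for competency in MINI_CEX_TO_ACGME.get(scale, []):
                buckets.setdefault(competency, []).append(score)
        return {c: sum(v) / len(v) for c, v in buckets.items()}

    # Hypothetical scores from one observed encounter.
    print(competency_averages({"Medical Interviewing": 7, "Physical Exam": 6,
                               "Humanistic Qualities": 8}))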

23
Mini-CEX: Benchmarks
  • Directly observe the candidate interacting in a
    clinical situation
  • Allow frequent sampling (across clinical
    situations and examiners)
  • Global/generic scales (comparable)
  • Reliable
  • Valid
  • Cost and time efficient
  • Not interfere with patient care

24
Mini-CEX: Problems
  • Does little to address any of the common rating
    errors (reliability?)
  • Not specific to EM
  • Does not address all core competencies

25
Literature on Checklist Evaluations
  • Technical skills in a surgery program
  • Evaluating videos of internal medicine residents
    performing physicals using a structured form
  • MetroHealth direct observation
  • Emory clinical evaluation exercise

26
MetroHealth Experience
  • Begun in 1993; published in 1996 and 2002
  • Four-hour shadowed observation period by a
    non-clinical attending
  • Structured assessment form, specific to year of
    training
  • Originally rated on a 5-point scale, changed to a
    3-point scale (below expected, at expected, and
    above expected)

Cydulka RK, Emerman CL, Jouriles NJ. Evaluation
of Resident Performance and Intensive Bedside
Teaching during Direct Observation. Academic
Emergency Medicine. 1996;3:345-351.
27
MetroHealth DO: Benchmarks
  • Directly observe the candidate interacting in a
    clinical situation
  • Allow frequent sampling (across clinical
    situations and examiners)
  • Global/generic scales (comparable)
  • Reliable
  • Valid
  • Cost and time efficient
  • Not interfere with patient care

28
MetroHealth Direct Observation
  • Best-described program in EM
  • Specific to our specialty
  • Has some validity / reliability data
  • Can cover most of the ACGME core competencies
  • Costly

29
Emory CEE
  • 15 minute snapshot of clinical performance
  • Checklist of expected behaviors
  • Specific to EM
  • Judged on a 3-point scale (at, below, or above
    expected)

Shayne P, Heilpern K, Ander D, Palmer-Smith V.
Protected Clinical Teaching Time and a Bedside
Clinical Evaluation Instrument in an Emergency
Medicine Training Program. Academic Emergency
Medicine. 2002;9:1342-1349.
30
Emory CEE: Benchmarks
  • Directly observe the candidate interacting in a
    clinical situation
  • Allow frequent sampling (across clinical
    situations and examiners)
  • Global/generic scales (comparable)
  • Reliable
  • Valid
  • Cost and time efficient
  • Not interfere with patient care

31
Emory CEE
  • Easy to perform
  • Specific to EM
  • Opportunity to repeat multiple times in a year
  • Will cover most of the ACGME core competencies

32
Emory CEE
  • Not validated
  • Reliability unproven
  • Not mapped to the ACGME core competencies

33
Exercise
36
Objectives
  • 1. List the ACGME core competencies properly
    assessed by direct observation.
  • 2. Describe the potential, advantages, and
    pitfalls of real-time evaluations.
  • 3. Describe the differences between checklist
    evaluations and global ratings of live
    performance.
  • 4. Practice the use of a direct observation
    assessment tool.
  • 5. Critique the practicality and validity of
    real-time instruments.

37
Summary
  • Real-time observation is the most authentic
    assessment of clinical competency
  • Real-time observation is what we do in EM
  • There are possibilities for doing it efficiently
    and inexpensively

38
Summary
  • Supported by ACGME and CORD EM consensus
  • Serious issues exist with establishing
    reliability and validity

39
Summary
  • Global rating assessments are easy to map to
    specific ACGME competencies, but probably have
    low reliability
  • Checklist evaluations probably will be more
    reliable, but less easy to generalize to core
    competencies

40
Summary
  • Direct observation with a structured checklist
    evaluation has the potential to cover multiple
    competencies in EM
    - Patient Care
    - Medical Knowledge
    - Interpersonal Skills
    - Professionalism
    - Systems-Based Practice

42
Where to go?
  • Develop a consensus checklist evaluation
  • Map the checklist to the core competencies
  • Develop a multi-site, multi-rater trial to work
    on issues of reliability, validity, and ease of
    use
