Title: Real Time Evaluation in Emergency Medicine Training
1. Real Time Evaluation in Emergency Medicine Training
- Philip Shayne, MD
- Program Director
- Emory University
- CORD-EM Best Practices Conference, February 23, 2003
2. Goal
- Explore the potential for using direct
observation (real time) evaluations in assessing
the competency of emergency medicine learners.
3. Objectives
- 1. List the ACGME core competencies properly assessed by direct observation.
- 2. Describe the potential, advantages, and pitfalls of real time evaluations.
- 3. Describe the differences between checklist evaluations and global ratings of live performance.
- 4. Practice the use of a direct observation assessment tool.
- 5. Critique the practicality and validity of the real time instruments.
4. A framework for authenticity of clinical assessment
Does
Shows how
Knows how
Knows
Miller GE. The assessment of clinical skills/competence/performance. Academic Medicine (Supplement) 1990; 65: S63-S67.
5. Hierarchy of clinical assessment
Knows (knowledge base)
MCQ
6. Hierarchy of clinical assessment
Knows how (competence)
Oral Exams
Knows (knowledge base)
7. Hierarchy of clinical assessment
- Simulations
- Procedure labs
- OSCEs/SP
Shows How (Performance)
Knows how (competence)
Knows (knowledge base)
8. Hierarchy of clinical assessment
- Checklist evaluation
- Global Ratings
- 360-degree evaluation
DOES (Action)
Shows How (Performance)
Knows how (competence)
Knows (knowledge base)
10. Practice Assessment Methods
- Indirect methods
- Chart audit (record review or CSR)
- Clinical work samples (practice audits)
- Portfolios
- Direct methods
- Clinical supervisor ratings (direct observation)
- Video assessment
- Patient surveys
- 360 evaluations
11. Assessment Tool Benchmarks
- Directly observe the candidate interacting in a clinical situation
- Allow frequent sampling (across clinical situations and examiners)
- Global/generic scales (comparable)
- Reliable
- Valid
- Cost and time efficient
- Not interfere with patient care
13. Barriers to practice assessment
- Patients are not standardized
- Observers are not standardized
- Hawthorne effect
14. Common Rating Errors
- Leniency / Severity error (hawks and doves)
- Range restriction error (central tendency)
- Halo effect
15. ACGME Toolbox of Assessment Methods: Direct Practice Assessment
- Global Rating of Live or Recorded Performance
- Checklist Evaluation of Live or Recorded Performance
16. CORD-EM Consensus
- Global Ratings are the primary assessment method for
  - Systems-Based Practice
- Global Ratings are a secondary assessment method for
  - Patient Care
  - Medical Knowledge
17. CORD-EM Consensus
- Checklist evaluations are the primary assessment method for
  - Patient Care
  - Medical Knowledge
  - Interpersonal and Communication Skills
  - Professionalism
  - Systems-Based Practice
- Checklist evaluations are partially helpful in assessing Practice-Based Learning
18. Global Rating of Live Performance
- Described by the ACGME as follows:
  - Resident is rated in general categories of ability
  - Completed retrospectively
  - Contains scales to rate performance (e.g., 1-10)
- Not heavily favored by the ACGME as a best method for evaluation
19. Checklist Evaluation of Live Performance
- Checklist of essential or desired behaviors, activities, or steps that make up a more complex competency
- Completed in real time
- Judged instead of scaled (e.g., pass/fail)
- Highly favored method of assessment by the ACGME and CORD-EM consensus
20. Tools We Can Use
21. ABIM Mini-CEX
- 15-20 minute snapshot of clinical performance
- Performed in real time
- Scales correlate easily with ACGME competencies
- Validated based on 4 performances per year
American Board of Internal Medicine, Mini-CEX Pilot Project, http://www.abim.org/MiniCEX/default.htm, 2001.
22. Mini-CEX and ACGME
- Mini-CEX categories
  - Medical Interviewing
  - Physical Exam
  - Humanistic Qualities
  - Clinical Judgement
  - Counseling Skills
- ACGME competencies
  - Patient Care
  - Professionalism
  - Medical Knowledge
  - Communication
23. Mini-CEX Benchmarks
- Directly observe the candidate interacting in a clinical situation
- Allow frequent sampling (across clinical situations and examiners)
- Global/generic scales (comparable)
- Reliable
- Valid
- Cost and time efficient
- Not interfere with patient care
24. Mini-CEX Problems
- Does little to address any of the common rating errors (reliability?)
- Not specific to EM
- Does not address all core competencies
25. Literature on Checklist Evaluations
- Technical skills in a surgery program
- Evaluating videos of internal medicine residents performing physicals using a structured form
- MetroHealth direct observation
- Emory clinical evaluation exercise
26. MetroHealth Experience
- Begun in 1993; published in 1996 and 2002
- Four-hour shadowed observation period by a non-clinical attending
- Structured assessment form, specific to year of training
- Originally rated on a 5-point scale, changed to a 3-point scale (below expected, at expected, and above expected)
Cydulka RK, Emerman CL, Jouriles NJ. Evaluation of Resident Performance and Intensive Bedside Teaching during Direct Observation. Academic Emergency Medicine, April 1996; 3: 345-351.
27. MetroHealth DO Benchmarks
- Directly observe the candidate interacting in a clinical situation
- Allow frequent sampling (across clinical situations and examiners)
- Global/generic scales (comparable)
- Reliable
- Valid
- Cost and time efficient
- Not interfere with patient care
28. MetroHealth Direct Observation
- Best described program in EM
- Specific to our specialty
- Has some validity / reliability data
- Can cover most of the ACGME core competencies
- Costly
29. Emory CEE
- 15-minute snapshot of clinical performance
- Checklist of expected behaviors
- Specific to EM
- Judged on a 3-point scale (at, below, or above expected)
Shayne P, Heilpern K, Ander D, Palmer-Smith V. Protected Clinical Teaching Time and a Bedside Clinical Evaluation Instrument in an Emergency Medicine Training Program. Academic Emergency Medicine 2002; 9: 1342-1349.
30. Emory CEE Benchmarks
- Directly observe the candidate interacting in a clinical situation
- Allow frequent sampling (across clinical situations and examiners)
- Global/generic scales (comparable)
- Reliable
- Valid
- Cost and time efficient
- Not interfere with patient care
31. Emory CEE
- Easy to perform
- Specific to EM
- Opportunity to repeat multiple times in a year
- Will cover most of the ACGME core competencies
32. Emory CEE
- Not validated
- Reliability unproven
- Not mapped to the ACGME core competencies
33. Exercise
36. Objectives
- 1. List the ACGME core competencies properly assessed by direct observation.
- 2. Describe the potential, advantages, and pitfalls of real time evaluations.
- 3. Describe the differences between checklist evaluations and global ratings of live performance.
- 4. Practice the use of a direct observation assessment tool.
- 5. Critique the practicality and validity of the real time instruments.
37. Summary
- Real time observation is the most authentic assessment of clinical competency
- Real time observation is what we do in EM
- There are possibilities for doing it efficiently and inexpensively
38. Summary
- Supported by ACGME and CORD-EM consensus
- Serious issues exist with establishing reliability and validity
39. Summary
- Global rating assessments are easy to map to specific ACGME competencies, but probably have low reliability
- Checklist evaluations will probably be more reliable, but are less easy to generalize to core competencies
40. Summary
- Direct observation with a structured checklist evaluation has the potential to cover multiple competencies in EM
  - Patient care
  - Medical knowledge
  - Interpersonal skills
  - Professionalism
  - Systems-based Practice
42. Where to go?
- Develop a consensus checklist evaluation
- Map checklist to core competencies
- Develop a multi-site, multi-rater trial to work on issues of reliability, validity, and ease of use