Title: ACGME/IHI Conference
1. ACGME/IHI Conference
- Medical Knowledge
- Building and Testing
Scott A. Schartel, D.O., Associate Professor of Anesthesiology, Director of Resident Education, Department of Anesthesiology, Temple University
Albert J. Varon, M.D., Professor of Anesthesiology and Surgery, Vice Chairman for Education, Department of Anesthesiology, University of Miami School of Medicine
2. Goal
- To identify key characteristics of appropriate
assessment tools to determine competence in
medical knowledge.
3. Objectives
- Medical Knowledge
- Define competency
- Develop appropriate assessment tools
- Review how standard-setting occurs
- Demonstrate learning objectives as the basis for a competency-based curriculum
- Discuss the feasibility, reliability, and validity of different evaluation tools
4. Definition
- Residents must demonstrate knowledge about established and evolving biomedical, clinical, and cognate sciences and the application of this knowledge to patient care
- www.acgme.org
5. Definition
- Residents are expected to
- demonstrate an investigatory and analytic thinking approach to clinical situations
- know and apply the basic and clinically supportive sciences which are appropriate to their discipline
- www.acgme.org
6. Assessing Medical Knowledge
- Establish goals and objectives
- Set the curriculum
- Develop an assessment system
- pass/fail standards
- types of examinations
- multiple choice
- essay (short/long answer)
- oral
7. Taxonomy of Educational Objectives: Cognitive Domain
- Knowledge
- Comprehension
- Application
- Analysis
- Synthesis
- Evaluation
(ordered from low to high)
Benjamin Bloom, et al., 1956
8. Examination Construction
- Exam content should match the objectives of the educational program
- Important topics should be weighted more heavily than those less important
- Sample size needs to be large enough for the examination to be reliable (reproducible) and accurate (valid); see the sketch after this slide
- Examination should discriminate between novices and experts
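The reliability and discrimination checks named above can be computed directly from a scored answer sheet. Below is a minimal Python sketch, not from the presentation, using an invented 0/1 response matrix: KR-20 is one standard reliability coefficient for dichotomously scored items, and the upper/lower-group index is one common way to quantify how well an item separates strong from weak examinees.

```python
# Minimal sketch: KR-20 reliability and an upper/lower-group discrimination
# index, computed from an invented 0/1 response matrix
# (rows = examinees, columns = items).

def kr20(responses):
    """Kuder-Richardson 20 reliability for dichotomously scored items."""
    k = len(responses[0])                      # number of items
    n = len(responses)                         # number of examinees
    totals = [sum(row) for row in responses]   # total score per examinee
    mean = sum(totals) / n
    var_total = sum((t - mean) ** 2 for t in totals) / n
    pq = 0.0
    for i in range(k):
        p = sum(row[i] for row in responses) / n   # proportion correct on item i
        pq += p * (1 - p)
    return (k / (k - 1)) * (1 - pq / var_total)

def discrimination(responses, item):
    """Upper-minus-lower index: proportion correct on one item in the
    top-scoring half minus the bottom half. Positive values mean the
    item separates strong examinees from weak ones."""
    ranked = sorted(responses, key=sum)
    half = len(ranked) // 2
    lower, upper = ranked[:half], ranked[-half:]
    return (sum(r[item] for r in upper) - sum(r[item] for r in lower)) / half

# Invented toy data: 6 examinees x 4 items
data = [[1, 1, 1, 0],
        [1, 1, 0, 0],
        [1, 0, 1, 1],
        [0, 1, 0, 0],
        [1, 0, 0, 0],
        [0, 0, 0, 0]]
print(f"KR-20 reliability: {kr20(data):.2f}")
print(f"Item 0 discrimination: {discrimination(data, 0):.2f}")
```

In practice a reliable exam needs many more items and examinees than this toy matrix; the point of the sketch is only the shape of the computation.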
9. Setting Pass/Fail Standards
- Norm-referenced
- a certain percentage of the group taking an examination pass or fail; the pass/fail point is set based on the performance of the group as a whole
- Criterion-referenced
- a specific pass/fail point is established; the performance of the group as a whole does not influence the pass/fail decisions (the sketch below contrasts the two approaches)
- www.acgme.org
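The practical difference between the two approaches fits in a few lines of code. The following sketch uses invented scores, an assumed 15% failure fraction, and an assumed 70% mastery cutoff; none of these numbers come from the slides.

```python
# Contrast of norm-referenced vs. criterion-referenced cut scores
# (all numbers invented for illustration).

scores = [52, 58, 61, 64, 66, 70, 73, 75, 81, 88]  # percent-correct scores

# Norm-referenced: the cut score floats with the group; here the
# bottom 15% fail regardless of how much they actually know.
fail_fraction = 0.15
norm_cut = sorted(scores)[int(len(scores) * fail_fraction)]

# Criterion-referenced: the cut score is fixed in advance by judges;
# everyone, or no one, may pass.
criterion_cut = 70  # hypothetical judged mastery standard

print(f"norm-referenced cut: {norm_cut}, "
      f"failures: {sum(s < norm_cut for s in scores)}")
print(f"criterion-referenced cut: {criterion_cut}, "
      f"failures: {sum(s < criterion_cut for s in scores)}")
```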
10. Guidelines for Setting Pass/Fail Standards
- Regardless of the procedure used, setting standards requires judgment. Setting standards will always be arbitrary, but need not be capricious.
- Susan M. Case, MD, and David B. Swanson, PhD
- National Board of Medical Examiners
11. Guidelines for Setting Pass/Fail Standards
- Unless there is a specific reason to fail a given number of examinees, a standard based on examinees' mastery of exam content is preferred over a standard based on a particular failure rate.
- Susan M. Case, MD, and David B. Swanson, PhD
- National Board of Medical Examiners
12. Guidelines for Setting Pass/Fail Standards
- It is wise to involve multiple informed judges in the standard-setting process. Differences of opinion will occur, and use of multiple judges will reduce hawk/dove effects.
- Susan M. Case, MD, and David B. Swanson, PhD
- National Board of Medical Examiners
13. Guidelines for Setting Pass/Fail Standards
- Judges should be provided with data on examinee performance at some point in setting standards. Setting standards without such data may lead to non-uniform standards and unreasonable results.
- Susan M. Case, MD, and David B. Swanson, PhD
- National Board of Medical Examiners
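One widely used criterion-referenced procedure that embodies both guidelines (multiple judges, data-informed judgment) is the modified Angoff method, which the slides do not name explicitly. The sketch below, with invented ratings, shows the core computation: each judge estimates the probability that a borderline examinee answers each item correctly, and the judge-averaged estimates are summed to give the cut score.

```python
# Sketch of the (modified) Angoff standard-setting computation.
# Averaging across judges damps hawk/dove effects; in practice judges
# may revise their estimates after seeing actual item difficulty data.
# All numbers are invented for illustration.

# judge_ratings[j][i] = judge j's estimate for item i (0.0 to 1.0)
judge_ratings = [
    [0.6, 0.8, 0.4, 0.7, 0.9],   # a relatively hawkish judge
    [0.5, 0.7, 0.3, 0.6, 0.8],
    [0.4, 0.6, 0.3, 0.5, 0.7],   # a relatively dovish judge
]

n_items = len(judge_ratings[0])

# Average over judges per item, then sum over items: the expected raw
# score of a borderline examinee becomes the pass/fail cut score.
item_means = [sum(j[i] for j in judge_ratings) / len(judge_ratings)
              for i in range(n_items)]
cut_score = sum(item_means)

print(f"cut score: {cut_score:.1f} of {n_items} items "
      f"({100 * cut_score / n_items:.0f}% correct)")
```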
14. Alternative View
- The fundamental problem is that norm-referencing embraces the possibility of failure, and in a mistaken effort not to hurt anyone's feelings we have rejected that. But, in the absence of mechanisms to prolong periods of study until students can meet realistic criteria, we are stuck with a trade-off between failure and lowering standards. The nature of the system is that criterion-referencing inevitably leads to the latter.
- JS Atherton, Heterodoxy: Against criterion-referenced assessment. http://www.doceo.co.uk/heterodoxy/criterion.htm
15. Multiple Choice Questions: True/False Items
- C-type
- A, B, both, or neither response is true
- K-type
- complex true/false items
- X-type
- simple true/false items, single statement
- Simulations
- patient management problems
16. Multiple Choice Questions: One Best Answer
- A-type
- 4 or more options, single items or sets
- B-type
- 4-5 option matching items in sets of 2-5
- R-type
- extended matching items in sets of 2-20 items
17. Basic Rules for the One-Best-Answer Family
- The question should
- focus on an important concept
- assess application of knowledge, not recall of isolated facts
- pose a clear question in the stem
- have distractors that are homogeneous
- avoid technical flaws that provide benefit to test-wise students
18. A-Type Questions: Appropriate Shape
- long stem
- short options (A through E)
19. A-Type Questions: Poorly Shaped
- short stem
- long options (A through E)
20. R-Type: Extended Matching Items
- Theme
- Options
- Lead-ins
- Stems
21. Use of Oral Examinations
- Good for testing higher levels of the cognitive domain (analysis, synthesis, evaluation)
- More difficult to standardize
- More difficult to grade objectively
22. Chart-Stimulated Recall
- Uses actual patient records
- The examiner reviews the records and constructs examination questions
- Focus is on evaluation and management decision making
- Very labor intensive for both examinee and examiners
23. Chart-Stimulated Recall Examination (ABEM)
- Candidate must
- Collect the first 6 charts, during a 5-week period, for which the candidate had primary responsibility for diagnosis and management and where the patient was admitted, transferred, or expired in the ED
- Copy charts and supporting information
- Transcribe all hand-written notes into type-written form
- Organize in prescribed order
- Complete CSR chart outline, checklist, and inventory forms
24. Best Methods
- Written examination (MCQ)
- Standardized oral examination
- Chart-stimulated recall oral examination
- Toolbox of assessment methods. ACGME/ABMS, Sept 2000
25. In-Training Examinations
- Cognitive examinations given annually and sponsored by specialty boards or societies
- Tool to evaluate knowledge
- fundamental sciences
- management of specialty-related clinical problems
26-27. In-Training Examinations
- [tables omitted]
- American Board of Medical Specialties
28. In-Training Examinations
- Objectives
- enable residents to assess strengths and weaknesses in specialty knowledge at the time of the examination
- assess residents' progress year to year
- compare residents' performance with national peer groups
29. In-Training Examinations
- Used by some programs as a tool to
- assess whether residents are eligible for advancement
- develop specialized educational programs and goals
- assess effectiveness of training program
- develop curricula
- validate need for planned curricular changes
30. In-Training Examinations
- Individual reports include
- total percent-correct score
- percentile rank (computed as in the sketch after this slide)
- a list of test areas or keywords answered incorrectly
- norm table with guidelines for interpreting the data (for use in comparing scores with different resident groups)
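A brief sketch of how the first two report numbers are derived, with invented data; note that testing organizations differ in exactly how they define percentile rank (for example, how ties are counted), so this is one plausible convention, not a specification of any board's report.

```python
# Sketch: percent-correct score and percentile rank against a norm group
# (all values invented for illustration).

def percentile_rank(score, norm_scores):
    """Percent of the norm group scoring strictly below this examinee."""
    below = sum(s < score for s in norm_scores)
    return 100 * below / len(norm_scores)

norm_group = [48, 55, 60, 62, 65, 68, 70, 74, 79, 85]  # invented peer scores

n_items, n_correct = 200, 136
percent_correct = 100 * n_correct / n_items
print(f"percent correct: {percent_correct:.0f}%")
print(f"percentile rank: {percentile_rank(percent_correct, norm_group):.0f}")
```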
31. In-Training Examinations
- Program director reports include
- score report and list of test areas missed for each resident
- overall program performance report
- norm table with guidelines for interpreting the data
- examinee and program performance graphs
32. In-Training Examinations
- Often designed as an abbreviated version of the exam used for certifying purposes
- Residents familiarize themselves with the specialty board examination
- Predicts performance on the subsequent certifying examination
33. In-Training Examinations
- Predictive of performance on
- American Board of Radiology written examination
- Baumgartner BR, et al. Acad Radiol 1996;3:873-8
- Royal College of Physicians and Surgeons of Canada certification examination
- Kearney RA, et al. Can J Anaesth 2000;47:914-8
- American Board of Psychiatry and Neurology examination
- Goodman JC, et al. Neurology 2002;58:1144-6
- Juel VC, et al. Neurology 2003;60:1385-7
34. Oral Practice Examinations
35-36. Oral Board Examinations
- [tables omitted]
- American Board of Medical Specialties
37. Oral Practice Examinations
- Elicit response to a standardized, real-life case scenario
- Evaluate clinical reasoning and judgment
- Reasonably valid tool for assessing
- resident performance
- progress toward independent practice and certification
38. Oral Practice Examinations
- Consistency, inter-rater reliability, and validity of 441 consecutive mock oral examinations in anesthesiology
- good internal consistency (r = 0.82)
- moderate to good inter-rater reliability (r = 0.68)
- moderate correlation with ITE scores (r = 0.47) and faculty evaluations (r = 0.41)
- performance significantly associated with training duration
Schubert A, et al. Anesthesiology 1999;91:288-98
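For context on statistics like those above, the sketch below computes a Pearson correlation between two examiners' scores as a simple inter-rater reliability estimate. The scores are invented and the sketch is an illustration of the statistic, not a reproduction of the Schubert study's methods.

```python
# Sketch: Pearson correlation between two examiners' scores for the same
# candidates, as a simple inter-rater reliability estimate (invented data).
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Invented scores from two examiners rating the same ten candidates
examiner_1 = [70, 65, 80, 75, 60, 85, 72, 68, 78, 90]
examiner_2 = [72, 63, 78, 77, 62, 88, 70, 70, 74, 86]
print(f"inter-rater r = {pearson_r(examiner_1, examiner_2):.2f}")
```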
39. Conceptual Framework of Available Evaluation Methods
- [figure omitted]
Schubert A, et al. Anesthesiology 1999;91:288-98
40. Items For Discussion
- How should written examination results be used to
improve an educational program?
41. Items For Discussion
- How should we use in-training exams? Do they have a role in making promotion/dismissal decisions?
- (Why? Why not?)
42. Items For Discussion
- What pass/fail criteria should be used in written examinations? In other types of assessments?
- (criterion-referenced vs norm-referenced)
43. Items For Discussion
- Should oral practice examinations be used as formative or as high-stakes evaluations?
44. ACGME/IHI Conference
- Medical Knowledge
- Building and Testing
Scott A. Schartel, D.O., Associate Professor of Anesthesiology, Director of Resident Education, Department of Anesthesiology, Temple University
Albert J. Varon, M.D., Professor of Anesthesiology and Surgery, Vice Chairman for Education, Department of Anesthesiology, University of Miami School of Medicine