Implementing the ACGME Outcome Project: Questions, Answers, Tips, and Traps


1
Implementing the ACGME Outcome Project: Questions, Answers, Tips, and Traps
  • Doris A. Stoll, PhD
  • RRC Executive Director

Accreditation Council for Graduate Medical
Education
2
Learning Objectives
  • Recognize the value of outcomes-based education
    as a viable tool for improving GME
  • Develop learning objectives and assessment tools
    for four of the general competencies
  • Identify key challenges in implementing Outcome
    Project-related changes in their programs
  • Identify resources available for assistance in
    implementing Outcome Project-related changes

3
What is the ACGME?
  • The Accreditation Council for Graduate Medical
    Education is an independent, private sector,
    voluntary, not-for-profit organization
    responsible for evaluating and accrediting
    residency programs in the United States.

4
The ACGME Mission
  • To improve the quality of health care in the
    United States by ensuring and improving the
    quality of graduate medical educational
    experiences for physicians in training.

5
The current problem
  • Increasing public concerns with quality and
    safety.
  • Variable patterns of care that are not based on
    medical science.
  • Poor quality of interpersonal service.
  • The public encounters difficulty in assessing
    competence (initial and continuing) and judging
    quality.

6
The current mindset
  • The SIMPLE questions
  • Does the program comply with the written
    Requirements?
  • Does the program have established goals and
    objectives and an organized curriculum?
  • Does the program have a process to evaluate its
    residents and itself?

7
A new way of thinking
  • The COMPLEX questions
  • Do the residents achieve the learning objectives
    set by the program?
  • What evidence can the program provide to
    demonstrate that they do so?
  • How does the program demonstrate continuous
    improvement in its educational processes?

8
In other words
  • How well do we learn what is being taught?

9
  • How well do we practice what we learn?

10
A new way of thinking
  • Never confuse activity with productivity.
  • Dee Hock

11
A new way of thinking
How to change the educational and accreditation
system from a structure- and process-based model
to an outcomes-based model
12
A new way of thinking
  • How to make this transition to a competency-based
    program for a community of
  • 724 sponsoring institutions
  • 7,800 accredited programs
  • 99,000 residents
  • 27 Residency Review Committees covering 26 core
    specialties and 115 subspecialties

13
  • What is the value of a competency-based model of
    education?
  • How can we implement the competencies?
  • How can we assess the competencies?

14
Project Principles
  • Focus on general competencies that apply to all
    specialties
  • Tendency to improve what we measure
  • More flexibility to be creative
  • More credibility for public accountability
  • Use of the continuum of lifelong learning, e.g.,
    ABMS maintenance of certification
  • Focus on improvements in lieu of minimal
    thresholds

15
WHAT?
  • A long-term initiative
  • To enhance residency education
  • Through educational outcome assessment

16
HOW? Project Activities
  • Identifying what to measure
  • Developing measurement tools
  • Collaborating to find the answers

17
Patient Care
  • Compassionate
  • Appropriate
  • Effective
  • For treatment of health problems
  • For the promotion of health

18
Medical Knowledge
  • About established and evolving science
  • Biomedical
  • Clinical
  • Cognate (epidemiological, social-behavioral)
  • About application of this knowledge to patient
    care
  • Ability to critically assess new knowledge

19
Practice-based Learning and Improvement
  • Investigation and evaluation of their own patient
    care
  • Appraisal and assimilation of scientific evidence
  • Improvements in patient care

20
Interpersonal and Communication Skills
  • Results in effective information exchange and
    teaming with
  • Patients
  • Their families
  • Other health professionals
  • Enhances therapeutic relationship

21
Professionalism
  • Manifested through
  • A commitment to carrying out professional
    responsibilities
  • Adherence to ethical principles
  • Sensitivity to diverse patient populations

22
Systems-Based Practice
  • Manifested by
  • Actions that demonstrate awareness of and
    responsiveness to larger context and system of
    health care
  • Ability to effectively call on system resources
    to provide care that is of optimal value

23
WHEN?
24
Building a Support Network
  • Shared experiences/interest groups
  • National, regional and local conferences
  • Poster sessions at Mastery Workshops
  • RFP process (70 examples)
  • Web-based resource center for the competencies
    and their assessment

26
Assessment Tools (The Toolbox)
  • 360-Degree Evaluation Instrument
  • Chart Stimulated Recall Oral Exam (CSR)
  • Checklist Evaluation of Live or Recorded
    Performance
  • Objective Structured Clinical Exam (OSCE)
  • Procedure, Operative or Case Logs

27
The Toolbox (continued)
  • Patient Surveys
  • Portfolios
  • Record Review
  • Simulations and Models
  • Standardized Oral Exams
  • Standardized Patients (SP)
  • Written Exams (MCQ)

30
A New Way of Thinking
Do the residents achieve the learning objectives
set by the program? What evidence can the program
provide that they do so? How does the program
demonstrate continuous improvement in its
educational processes?
31
Education 101
  • Problem identification and general needs
    assessment
  • Needs assessment of targeted learners
  • Goals and specific measurable objectives
  • Educational strategies
  • Implementation
  • Evaluation and feedback (individual, program)
Kern, Thomas, Howard, Bass, 1998
32
Curriculum
34
The Nuts and Bolts
  • Learning Outcomes
  • (educational goals)
  • Learning Objectives

35
Outcomes and Objectives
  • Outcomes
  • Can be described under a small number of headings
  • Emphasize broad overview
  • Knowledge and metacompetencies are embedded
  • Objectives
  • Are extensive and detailed
  • Emphasize instructional intent at a lower and
    more detailed level
  • Classified into discrete areas (knowledge,
    skills, attitudes)

LEARNER-CENTERED, MEASURABLE!
adapted from Harden R. Learning outcomes and
instructional objectives: is there a difference?
Medical Teacher. 2002;24(2):151-155.
36
The Learning Outcome
  • VII.C.4. Residents must be able to demonstrate
    interpersonal and communication skills that
    result in effective information exchange and
    teaming with other health care providers,
    patients, and patients' families.

Small no. of headings, broad overview,
metacompetencies
37
The Learning Objective
  • Residents are expected to use effective
    listening skills and elicit and provide
    information using effective nonverbal,
    explanatory, questioning and writing skills.

Detailed, emphasize instructional intent at a lower
level, KSA
38
The Learning Objective
  • Who?
  • Will do
  • How much?
  • Of what?
  • By when?

Kern, Thomas, Howard, Bass 1998
39
A Competency-based Objective
  • At the completion of the PG-1 year, the resident
    will be able to diagnose and manage common
    ambulatory medical disorders, e.g., hypertension,
    diabetes, angina, and COPD, with minimal
    supervision.

40
Levels of Cognition: Taxonomy of Knowledge
  • Levels, from least to most complex: knowledge,
    comprehension, application, analysis, synthesis,
    evaluation
  • (Pyramid diagram; complexity and difficulty shown
    as separate dimensions)
taken from How the Brain Learns, David A. Sousa
41
Skill Development Models
42
Prototype Requirement
Residency Programs must have an effective plan
for assessing resident performance, or develop a
plan and demonstrate progress toward implementing
it. The plan should include use of dependable
measures.
43
Types of Evaluation
  • Formative: improve performance
  • Summative: note achievement

Both types of evaluation can be used to evaluate
either an individual or a program.
44
Characteristics of good assessment
  • Systematic
  • Dependable
  • Comprehensive
  • Congruent
  • Practical

45
Characteristics of good assessment (continued)
  • Makes professional practice more transparent
  • Deconstructs the role of physician
  • Clarifies levels of expertise by distinguishing
    functional levels

46
Characteristics of good assessment (continued)
  • Measures actual performance
  • Identifies areas for improvement, i.e., self,
    others
  • Satisfies reasonable requests for accountability

47
Resources
  • Where do we find them?

48
How do we select the experiences to evaluate?
  • Organize
  • Categorize experiences broadly, from the simple
    to the complex
  • Level required experiences

49
How do we select the experiences to evaluate?
(continued)
  • Identify
  • Diagnoses with high incidence/prevalence
  • Cases with significant morbidity and mortality
  • Diagnoses that are treatable and preventable
  • Incidents where questionable management occurs
  • Situations where improvement is needed and can be
    accomplished through improved education

50
Assessment Tools (The Toolbox)
  • 360-Degree Evaluation Instrument
  • Chart Stimulated Recall Oral Exam (CSR)
  • Checklist Evaluation of Live or Recorded
    Performance
  • Objective Structured Clinical Exam (OSCE)
  • Procedure, Operative or Case Logs

51
Assessment Tools (The Toolbox) (continued)
  • Patient Surveys
  • Portfolios
  • Record Review
  • Simulations and Models
  • Standardized Oral Exams
  • Standardized Patients (SP)
  • Written Exams (MCQ)

52
Why focus on Outcomes?
  • Directly improves resident learning
  • Relates to the real world of work
  • Defines, focuses, and prioritizes content,
    methods, and assessment

53
Why focus on Outcomes? (continued)
  • Keeps faculty close to the curriculum and the
    assessment methods
  • Supports communication among and involvement of
    the faculty

54
Why focus on Outcomes? (continued)
  • Capitalizes on the need for accountability (and
    opportunities)
  • Links the parts to the whole, i.e., sections,
    divisions, and departments
  • Develops multiple methods of assessment

55
Goal-based Evaluation
  • determining to what extent educational
    objectives are realized by the program of
    curriculum and instruction, i.e., the degree to
    which behavior changes take place.
  • Ralph Tyler
  • 1949

56
The Link with Assessment (4)
  • Objective
  • Residents should become proficient in the
    cost-effective diagnosis and management of common
    clinical problems.
  • How to measure?
  • Report on a patient management issue that
    incorporates principles of epidemiology and
    cost-effectiveness.
  • Targeted reading by the resident that directly
    relates to cost-effectiveness.

57
Use Multiple Evaluators
  • Reduce bias
  • Increase accuracy
  • Enhance fairness

58
Evaluate on Multiple Occasions
  • Resident behaviors
  • Differences in patients
  • Variable clinical situations

59
  • He that judges, without informing himself to the
    utmost he is capable, cannot acquit himself of
    judging amiss.
  • John Locke
  • 1690