Evaluating Training Using Medical Education Metrics (MEMS)

1
Evaluating Training Using Medical Education
Metrics (MEMS)
  • Valerie Smothers
  • Director of Communications
  • MedBiquitous Consortium
  • http://www.medbiq.org
  • valerie.smothers@medbiq.org

2
Joy's Story
  • Created a pediatric neurology course with a
    third-party developer
  • Implemented
  • 300 students have taken the course

Great!
Or Is It?
3
A Few Questions
  • What did learners think of the course?
  • How does this course compare to others delivered
    by Joy's group?
  • Are there other organizations that would like to
    know how many students took Joy's course and what
    they thought?

How Do We Know That What We're Doing Is Effective?
4
Evaluation
  • Any systematic method for gathering information
    about the impact and effectiveness of a learning
    offering. (learningcircuits.org)
  • MEMS helps you:
    • Measure effectiveness
    • Communicate results

5
What is MEMS?
  • Medical Education Metrics
  • XML draft standard for communicating core
    Kirkpatrick level 1 evaluation data (learner
    reaction)
  • Developed by the MedBiquitous Metrics Working
    Group

6
Metrics Working Group
  • American Academy of Family Physicians
  • American Medical Association
  • Baylor College of Medicine
  • CDC
  • Department of Veterans Affairs
  • Johns Hopkins
  • Johnson & Johnson
  • MedSn
  • Pfizer
  • University of Cincinnati
  • Consulted with ACCME

7
Why do we need a standard for evaluation data?
  • Educators want best practices, ability to compare
  • Funders want to measure reach and efficacy
  • Accreditors want to measure success of activity
    and provider

8
MEMS Data
  • Activity Description
    • What's being evaluated
  • Learner Demographics
    • Who took the course
  • Participant Activity Evaluation
    • What did participants think
  • Participation Metrics
    • How many people participated
  • Provider Profile (future use)
  • Extensible

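For illustration, a minimal sketch of a report carrying these sections, parsed with Python's standard library. The XML element names and values below are hypothetical placeholders that mirror the slide, not the normative MEMS schema.

# Hypothetical level 1 evaluation report; element names and counts are
# illustrative only, not taken from the MEMS specification.
import xml.etree.ElementTree as ET

report_xml = """
<EvaluationReport>
  <ActivityDescription>
    <Title>Pediatric Neurology Online Course</Title>
    <Modality>Web-based self-study</Modality>
  </ActivityDescription>
  <LearnerDemographics>
    <AudienceCategory>Physician</AudienceCategory>
  </LearnerDemographics>
  <ParticipationMetrics>
    <RegisteredParticipants>300</RegisteredParticipants>
    <ParticipantsCompletingActivity>265</ParticipantsCompletingActivity>
  </ParticipationMetrics>
</EvaluationReport>
"""

root = ET.fromstring(report_xml)
print(root.findtext("ActivityDescription/Title"))
print(root.findtext("ParticipationMetrics/RegisteredParticipants"))
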
9
Activity Description
  • Learning Object Metadata
  • Modality
  • Reporting period
10
Learner Demographics
  • Audience Category
  • Profession
  • Specialty
  • Reading level
11
Participant Activity Evaluation
  • Educational objectives achieved
  • Relevant to learning needs
  • Environment conducive to learning
  • Method conducive to learning
  • Plan to change practice
  • Validated current practice
  • Evidence base presented
  • Free of commercial bias
  • Balanced view of therapeutic options
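As a sketch of how these level 1 items might be summarized once responses are collected, the snippet below averages agreement ratings per question. The 1-5 scale and the question keys are assumptions that mirror the slide; they are not MEMS element names.

# Summarize learner reaction data per question (mean rating and count).
from statistics import mean

responses = [
    {"objectives_achieved": 5, "relevant_to_needs": 4, "free_of_commercial_bias": 5},
    {"objectives_achieved": 4, "relevant_to_needs": 4, "free_of_commercial_bias": 4},
    {"objectives_achieved": 3, "relevant_to_needs": 5, "free_of_commercial_bias": 5},
]

for question in sorted({key for response in responses for key in response}):
    ratings = [r[question] for r in responses if question in r]
    print(f"{question}: mean {mean(ratings):.2f} (n={len(ratings)})")
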
12
Participation Metrics
  • Targeted audience
  • Registered participants
  • Participants receiving credit
  • Credits awarded
  • Distinct hosts or visitors
  • Page requests
  • Participants completing activity
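A small sketch of the summary figures these metrics support; the counts are made up for illustration.

# Derive simple summary figures from hypothetical participation metrics.
registered_participants = 300
participants_completing_activity = 265
credits_awarded = 530.0  # total credits, hypothetical value

completion_rate = participants_completing_activity / registered_participants
credits_per_completer = credits_awarded / participants_completing_activity
print(f"Completion rate: {completion_rate:.0%}")
print(f"Credits per completing participant: {credits_per_completer:.1f}")
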
13
Looking Ahead
  • Higher levels of evaluation
    • Learning (based on pretest and posttest)
    • Behavior
    • Results (clinical outcomes)

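For the learning level, one common approach is to compare pretest and posttest scores. The sketch below computes a normalized gain with illustrative scores; the formula is a general convention, not something specified by MEMS.

# Learning (Kirkpatrick level 2) measure from pretest and posttest scores
# using the normalized gain (post - pre) / (max - pre). Scores are
# illustrative percentages, not data from the MEMS specification.
def normalized_gain(pre: float, post: float, max_score: float = 100.0) -> float:
    return (post - pre) / (max_score - pre)

pre_scores = [55.0, 62.0, 70.0]
post_scores = [80.0, 78.0, 88.0]
gains = [normalized_gain(pre, post) for pre, post in zip(pre_scores, post_scores)]
print(f"Mean normalized gain: {sum(gains) / len(gains):.2f}")
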
14
What You Can Do
  • Review your evaluations
    • Do they ask core questions?
    • Are they fairly consistent?
  • Think about how you might use evaluation data
  • Participate in shaping the standard!