Developing Measures of Human Performance: An Approach and Initial Reactions
1
Developing Measures of Human Performance: An
Approach and Initial Reactions
  • Dana Costar, David P. Baker, and Amy Holtzman
  • American Institutes for Research
  • Kimberly Smith-Jentsch
  • University of Central Florida
  • Paul Radtke
  • NAVAIR Orlando TSD

2
Overview
  • Challenge
  • Available Resources
  • Objectives
  • Description
  • Reactions
  • Next Steps

3
Challenge
  • Current performance measurement issues facing the
    Navy
  • Develop consistency across Fleet
  • Improve training
  • Focus on objectives/measures that clearly predict
    outcomes
  • DoD Directive 7730.65 Department of Defense
    Readiness Reporting System (DRRS)


5
Challenge
  • Requirement for the development of reliable and
    valid measures of individual, team and multi-team
    performance
  • Typically a collateral duty in the Navy
  • Instructors have limited experience in
    performance measurement
  • Instructors tend to develop
  • Measures that are familiar to them (e.g.,
    checklists)
  • Measures that are easy to use
  • Accurately measuring human performance is
    important for effective training

6
Human Performance Umbrella
  • At each level (individual, team, and multi-team):
  • Procedural taskwork processes
  • Non-procedural taskwork processes
  • Teamwork processes


7
Performance Measurement
  • Purpose
  • Diagnosis and feedback
  • Readiness
  • Metrics
  • Checklists
  • Frequency Counts
  • Distance/Discrepancy
  • Rating Scales

8
Available Resources
  • Much of the past research has focused on
    performance appraisal
  • Psychometric properties of rating scales
  • More recently, rater training
  • There is no clear guidance in the literature on
  • Which rating format is best for measuring
    different aspects of human performance
  • How this guidance varies by training purpose

9
Objectives
  • Develop and conduct a workshop on human
    performance measurement for training
  • At the end of the workshop, attendees would be
    able to
  • Identify and craft good training objectives
  • Understand the importance of collecting different
    types of performance data (i.e., process and
    outcome data)
  • Understand the advantages and disadvantages of
    different kinds of measures
  • Develop performance measures for training

10
Objectives
  • Workshop was an interim approach for addressing
    Navy needs
  • Long-term goal: develop an automated tool for
    developing measures of human performance
  • Performance Measurement Authoring Tool (PMAT)
  • Workshop allowed us to test the preliminary
    business logic for PMAT

11
Description
  • Morning Session
  • Briefing on individual, team, and multi-team
    performance assessment
  • Presented a 7-step framework for developing
    performance measures
  • Afternoon Session
  • Hands-on practice
  • Workshop 1 (Own Examples)
  • Workshop 2 (NMETLs)

12
Description
  • Defining What To Measure
  • Developing an Appropriate Metric
  • Using Performance Measures Effectively
13
7-Step Framework
  • What to measure?
  • Step 1: Consider level of analysis
  • Step 2: Identify measurement objectives
  • Step 3: Clarify purpose for measuring performance
  • Step 4: Decide whether you need to assess process,
    outcomes, or both
  • Step 5: Ensure that objectives are measurable

14
What to Measure
  • Mission-essential task: Conduct fire support
  • Task: Eliminate mobile hostile targets
  • Condition: In the vicinity of friendly forces
  • Standard: 100%
15
What to Measure
Measurement objective: Eliminate 100% of hostile
mobile targets in the vicinity of friendly forces
16
What to Measure
  • Step 5: Is our objective measurable?
  • Observable or audible? ✓
  • Only measuring one task? ✓
  • Does it include a condition and standard? ✓
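The task/condition/standard decomposition and the Step 5 measurability checks above can be sketched as a small data structure. This is a minimal illustrative sketch, not part of the workshop materials: the names `MeasurementObjective` and `is_measurable` are assumptions, and the observable-or-audible and single-task judgments are supplied by the measure developer rather than computed.

```python
from dataclasses import dataclass

@dataclass
class MeasurementObjective:
    """A measurement objective decomposed as on the preceding slides:
    mission-essential task, condition, and standard."""
    task: str        # e.g. "Eliminate mobile hostile targets"
    condition: str   # e.g. "in the vicinity of friendly forces"
    standard: str    # e.g. "100%"

def is_measurable(obj: MeasurementObjective,
                  observable_or_audible: bool,
                  single_task: bool) -> bool:
    """Step 5 checks: the behavior is observable or audible, the
    objective covers only one task, and it includes both a condition
    and a standard (checked here as non-empty fields)."""
    return (observable_or_audible
            and single_task
            and bool(obj.condition)
            and bool(obj.standard))
```

For example, the fire-support objective passes all three checks, while an objective lacking a standard would fail.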

17
7-Step Framework
  • Develop metrics
  • Step 6: Select method (dictated by earlier
    research)
  • Step 7: Develop measures

18
Developing Appropriate Metrics: Guidelines for
Selecting Measures
  • Matrix pairing six outcome dimensions (accuracy,
    timeliness, productivity, efficiency, safety,
    effects) with suitable metric types
    (distance/discrepancy, frequency counts,
    checklists, rating scales)
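The selection matrix on this slide could be held as a simple lookup table. In the sketch below, the outcome dimensions and metric types come from the slide, but the specific pairings are illustrative placeholders only; the workshop's actual guidelines determine the real entries.

```python
# Sketch of the metric-selection guidelines as a lookup table.
# NOTE: row and column labels are from the slide; the pairings
# themselves are assumed for demonstration, not the workshop's
# actual recommendations.
METRIC_GUIDELINES: dict[str, set[str]] = {
    "accuracy":     {"distance/discrepancy", "frequency", "checklist"},
    "timeliness":   {"distance/discrepancy", "checklist"},
    "productivity": {"frequency"},
    "efficiency":   {"frequency", "rating scale"},
    "safety":       {"checklist", "rating scale"},
    "effects":      {"distance/discrepancy", "rating scale"},
}

def candidate_metrics(outcome: str) -> set[str]:
    """Return the metric types suggested for one outcome dimension;
    unknown dimensions return an empty set."""
    return METRIC_GUIDELINES.get(outcome.strip().lower(), set())
```

A measure developer would look up the outcome dimension identified in Step 4 and then choose among the returned metric types in Step 6.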
19
Workshop Evaluation
  • Assessed participant reactions
  • Morning session
  • Afternoon session
  • Did the workshop prepare you to
  • Develop good measurement objectives
  • Distinguish between process and outcomes
  • Select an appropriate measurement method
  • Develop effective measures
  • Open-ended questions

20
Participants
  • Workshop 1
  • Morning Session: 44 individuals (35 respondents)
  • Afternoon Session: 22 individuals (11 respondents)
  • Workshop 2
  • Morning Session: 49 individuals (31 respondents)
  • Afternoon Session: 24 individuals (13 respondents)

21
Morning Session
22
Afternoon Session
  • Workshop 1
  • Majority agreed the workshop prepared them to
  • Develop training/measurement objectives
  • Select appropriate measurement method
  • Distinguish processes from outcomes
  • Develop performance measures for training
  • Workshop 2
  • Similar results to Workshop 1

23
Summary
  • Overall reactions to the workshops were positive
  • The workshops met their stated objectives
  • Provided useful information about performance
    assessment during training
  • Provided a specific process for developing
    measures
  • Provided participants practice with the process

24
Next Steps
  • This study was an interim step
  • Currently developing an automated tool, PMAT
  • Workshops provided information for refining the
    7-step process
  • Conducting research to develop the business logic
    for PMAT
  • PMAT will be available in Summer 2004
  • Part of a larger project on debriefing
    distributed scenario-based training exercises

25
Contact Information
  • David P. Baker, Ph.D.
  • American Institutes for Research
  • 1000 Thomas Jefferson St., NW
  • Washington, DC 20007-3835
  • 202-342-5036
  • dbaker@air.org