1
LESSONS TO BE LEARNED: Measurement of Unit
Performance
  • Paper to the NTSA DoD Training Transformation
    Technologies Meeting
  • JAEC ADVISORY GROUP
  • Jack Hiller, Chief Scientist, Mission Systems,
    ITSD, Northrop Grumman Corp.
  • September 5, 2003

2
Background
  • Useful References
  • 1. Determinants of Effective Unit Performance,
    ARI, 1994.
  • 2. Assessing and Measuring Training Performance,
    ARI Technical Report 1116, 2000.

3
Evaluation vs. Assessment
  • Evaluation
  • Standards or criteria permitting objective
    performance scoring.
  • Mission or task outcomes that are observable.
  • Assessment
  • Rating criteria indefinite/fuzzy.
  • Too many variables and chance factors.

4
Evaluation vs. Assessment (Cont.)
  • Increasing Role for Assessment Approaches
  • FCS Units of Action may command far-ranging
    support, similar to brigade and higher.
  • Terrorist options for inflicting damage are
    virtually infinite, so predefined mission task
    standards for decision making might lack
    relevance.
  • Joint operations command decisions are
    far-ranging, encompassing complex multiple
    dimensions: Diplomatic, Information, Military,
    Economic (DIME).

5
Lessons To Be Learned: The Primacy of Defined
Purpose
  • Examples of problems in defining measurement
    purpose:
  • 1. "We will collect all the data and get it to
    you." (NTC Commander)
  • 2. "We are building a ten-thousand-item database
    which will answer all questions about the
    determinants of unit effectiveness."

6
Lessons To Be Learned: The Primacy of Defined
Purpose (Cont.)
  • 3. Data collected in BOS categories to support
    NTC AARs were problematic for use in:
  • a. Take Home Packages.
  • b. Systematically analyzing for Lessons
    Learned in DOTMLPF.
  • c. Systematically analyzing for trends in
    DOTMLPF.

7
Unknown Measurement Reliability, and Thus
Uncertain Validity
  • Four developers of infantry battle drills were
    asked to independently rate the performances of a
    number of trained squads.
  • 1. An infantry LTC scored the performances
    uniformly NOGO.
  • 2. A platoon SGT rated the performances
    uniformly GO.

8
Unknown Measurement Reliability, and Thus
Uncertain Validity (Cont.)
  • 3. A PhD researcher rated about half GO and half
    NOGO.
  • 4. A highly experienced infantry researcher
    refused to rate, because conditions did not
    follow the prescribed directions.
  • The evaluation test turned into an assessment
    exercise.
  • Be wary of scores from untrained raters (a
    simple agreement check is sketched below).
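
As a minimal sketch of the reliability problem this anecdote raises, the snippet below computes pairwise agreement among raters scoring the same squads. The rater labels and scores are illustrative assumptions loosely mirroring the anecdote, not the study's data.

```python
from itertools import combinations

# Illustrative ratings loosely mirroring the anecdote above; the fourth
# rater declined to rate, so only three raters appear. All values are
# assumptions for demonstration, not the study's data.
ratings = {
    "Infantry LTC":   ["NOGO", "NOGO", "NOGO", "NOGO"],
    "Platoon SGT":    ["GO", "GO", "GO", "GO"],
    "PhD researcher": ["GO", "NOGO", "GO", "NOGO"],
}

def pairwise_agreement(a, b):
    """Fraction of squads on which two raters gave the same score."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

for (r1, s1), (r2, s2) in combinations(ratings.items(), 2):
    print(f"{r1} vs {r2}: {pairwise_agreement(s1, s2):.2f}")
# Near-zero agreement means the scores cannot support valid conclusions.
```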

9
Scope of the Measurement Domain: Implications
for the JNTC
  • The domain is large: the 4 DIME dimensions cross
    the 7 DOTMLPF categories, forming a matrix with
    28 cells, many of them relevant for rating (see
    the sketch after this list).
  • Automation technology should be enlisted to
    support measurement personnel (O/Cs).
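
A minimal sketch of that 28-cell matrix in Python follows; the per-cell fields (a relevance flag plus a ratings list) are illustrative assumptions.

```python
# The 4 DIME dimensions crossed with the 7 DOTMLPF categories yield a
# 28-cell matrix; the per-cell fields are illustrative assumptions.
DIME = ["Diplomatic", "Information", "Military", "Economic"]
DOTMLPF = ["Doctrine", "Organization", "Training", "Materiel",
           "Leadership", "Personnel", "Facilities"]

matrix = {(dim, cat): {"relevant": False, "ratings": []}
          for dim in DIME for cat in DOTMLPF}

assert len(matrix) == 28  # 4 x 7
# An O/C tool would flag the cells relevant to a given exercise and
# collect ratings only there.
matrix[("Military", "Training")]["relevant"] = True
```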

10
Scope of the Measurement Domain: Implications
for the JNTC (Cont.)
  • Wireless PDA computers can facilitate (a data
    record sketch follows this list):
  • Prompting what is to be observed against defined
    performance standards.
  • Recording ratings and comments.
  • Recording the audio-visual, time-tagged context.
  • Rapid, easy retrieval of data for analysis and
    for preparation of AARs.
  • Cumulative storage of data to support lessons
    learned and performance trend analysis.
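
A minimal sketch of the kind of time-tagged observation record such a PDA tool might store; all field names here are illustrative assumptions.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Observation:
    standard: str        # defined performance standard being prompted
    rating: str          # e.g., "GO" / "NOGO", or a scale value
    comment: str = ""    # the O/C's free-text note
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))
    media_ref: str = ""  # pointer into time-tagged audio/visual context

# Cumulative storage (here a plain list) supports rapid retrieval for
# AARs, lessons-learned analysis, and performance trend analysis.
log: list[Observation] = []
log.append(Observation(standard="React to contact", rating="GO",
                       comment="Base of fire established promptly."))
```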

11
An Approach for Measuring Adaptable and Creative
Performance
  • Provide rating directions and calibration
    training for assessment personnel? Premature.
  • Instead (a sketch follows this list):
  • Provide a three- or five-point rating scale for
    creativity.
  • Direct assessors to explain their rating.
  • After experience has been accumulated, it may be
    possible to revisit rating directions and
    calibration training.
  • Initial assessors need to be acknowledged
    experts.
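
A minimal sketch of the proposed approach: a small ordinal creativity scale plus a mandatory explanation. The three-point scale and its labels are illustrative assumptions.

```python
# Three-point creativity scale; the labels are illustrative assumptions.
CREATIVITY_SCALE = {1: "Conventional", 2: "Adaptive", 3: "Highly creative"}

def record_rating(score: int, explanation: str) -> dict:
    """Accept a creativity rating only if it is on-scale and explained."""
    if score not in CREATIVITY_SCALE:
        raise ValueError(f"Score must be one of {sorted(CREATIVITY_SCALE)}")
    if not explanation.strip():
        raise ValueError("Assessors must explain their rating.")
    return {"score": score, "label": CREATIVITY_SCALE[score],
            "explanation": explanation}

rating = record_rating(3, "Improvised a novel flanking route under fire.")
```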

12
Conclusions
  1. Establish the multiple purposes for measurement.
  2. Map each articulated purpose to the measures to
    be collected for it (sketched below).
  3. Establish a mechanism to implement the lessons to
    be learned.
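
A minimal sketch of conclusions 1 and 2: each articulated measurement purpose maps explicitly to the measures collected for it. The example purposes and measures are illustrative assumptions.

```python
# Each measurement purpose maps to the measures collected for it; the
# entries below are illustrative assumptions, not a prescribed set.
purpose_to_measures = {
    "AAR preparation": ["task outcome scores", "O/C comments"],
    "Lessons learned": ["DOTMLPF category ratings", "assessor notes"],
    "Trend analysis":  ["cumulative ratings across rotations"],
}

# A measure that maps to no purpose is a candidate for elimination.
for purpose, measures in purpose_to_measures.items():
    print(purpose, "->", ", ".join(measures))
```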