Transcript and Presenter's Notes

Title: Drill


1
Drill & Exercise Evaluation: The Big Stuff
Matters Most
  • Ben Blue
  • Savannah River Site
  • May 7

2
Background / Drivers
  • During the DOE Type I Assessment in January 2007,
    the following finding was issued:
  • EM used checklists to evaluate emergency
    exercises that did not contain formal, objective
    evaluation and acceptance criteria.
  • Grading process identified as an area for
    improvement in the 2007 Evaluated Exercise.
  • Also noted by the DNFSB as an area of concern.

3
Improvement Plan
  • Developed and implemented an improvement plan after
    the 2007 exercise
  • Implemented the plan as a project, subject to the
    same controls, management attention, and reporting
    requirements
  • The plan focused on
  • Increasing management involvement and focus
  • Achieving consistent Incident Scene Command &
    Control
  • Improving Radcon emergency response performance
  • Improving Drill and Exercise Conduct Evaluation
  • Improving the Emergency Management self-assessment
    program
  • Improving corrective action effectiveness

4
Corrective Action
  • Initially, we identified the objectives we
    considered to be risk significant,
    specifically:
  • Safety
  • Protective Actions
  • Mitigation
  • Emergency Classification
  • Notifications & Communications

5
Next Step
  • The next step was to use those Risk Significant
    Objectives as critical performance items when
    grading drills and exercises.
  • These objectives had to have more effect on the
    final grade than other objectives
  • Simple numeric weighting was not giving enough of
    an effect because of averaging.

6
Next Step, continued
  • New process developed
  • Based on a similar process used by the military
  • Objectives and supporting criteria are classified
    as
  • Critical
  • Major
  • Minor
  • Wording changed to make evaluation more objective
    (ongoing effort)

7
How it Works
  • Failure of a critical item immediately fails the
    next higher level
  • Failing a critical criterion fails the objective
  • Failing a critical objective fails the drill or
    exercise

8
How it Works, continued
  • Failure of major items lowers the grade, but does
    not immediately fail the next higher level
  • Failure of one major item results in next higher
    level being no better than Partially Met
  • Failure of two or more major items results in
    failure of the next higher level

9
How it Works, continued
  • Failure of minor items is used to drive the
    numeric grade down, but cannot cause automatic
    failure
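As an illustration, the cascade rules on the last three slides can be sketched in a few lines of Python. This is a minimal sketch of the logic as described, not the site's actual software; the function name and structure are invented here.

```python
def automatic_result(critical_failures, major_failures):
    """Roll-up rule for the next higher level (criteria -> objective,
    or objectives -> overall drill/exercise grade)."""
    if critical_failures >= 1:
        return "Not Met"        # any critical failure fails the next level
    if major_failures >= 2:
        return "Not Met"        # two or more major failures also fail it
    if major_failures == 1:
        return "Partially Met"  # one major failure: no better than Partially Met
    return None                 # minor failures only lower the numeric grade
```

Minor-item failures never appear here because, per the slide above, they can only pull the numeric grade down, not trigger an automatic result.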

10
Input
  • Evaluators input grades at the Criteria level

11
Input, continued
  • Critical and Major items are either Met or Not
    Met
  • Input is based on the wording of the objective or
    criterion, supporting lines of inquiry, AND the
    evaluator's judgment

12
Input, continued
  • Minor items are either Met, Not Met, or
    Partially Met

13
Input Converted to Numeric
  • Evaluator input is converted to a numeric value:
  • Met = 100
  • Partially Met = 75
  • Not Met = 50
  • Numeric grades are averaged to determine an
    overall numeric grade
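The conversion and averaging step might look like this (an illustrative Python sketch; the actual number crunching is done in an Excel spreadsheet, and the names below are invented):

```python
# Numeric values assigned to evaluator input, per the slide above.
SCORES = {"Met": 100, "Partially Met": 75, "Not Met": 50}

def numeric_grade(grades):
    """Average a list of evaluator grades into one overall numeric grade."""
    return sum(SCORES[g] for g in grades) / len(grades)
```

For example, one Met and one Not Met criterion average to 75 — exactly the kind of rolled-up number that, as a later slide notes, can hide a serious problem if used alone.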

14
Numeric Values
15
Grading and Evaluation
  • Numeric grades are applied to three tests to
    determine final rating
  • Tests use the criteria grades first to determine
    the objective grades
  • Objective grades are then applied to the same
    tests to determine the overall grade for the
    drill or exercise

16
Grading and Evaluation, continued
  • First test is the Critical items
  • Failure of one automatically fails the next level
  • Second test is the Major items
  • Failure of one means the next level is no better
    than Partially Met
  • Failure of two or more automatically fails the
    next level

17
Grading and Evaluation, continued
  • Third test is the numeric grade
  • > 85: Met (Pass)
  • 70 to 85: Partially Met
  • < 70: Not Met (Fail)
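Combining all three tests, the final-rating logic could be sketched as follows (hypothetical Python using the thresholds above; the one-major-failure rule is treated as a cap, i.e. "no better than Partially Met"):

```python
def rating(critical_failures, major_failures, numeric):
    """Apply the three tests in order: critical items, major items,
    then the averaged numeric grade."""
    if critical_failures >= 1 or major_failures >= 2:
        return "Not Met"              # automatic failure
    if numeric > 85:
        result = "Met"
    elif numeric >= 70:
        result = "Partially Met"
    else:
        result = "Not Met"
    if major_failures == 1 and result == "Met":
        result = "Partially Met"      # one major failure caps the rating
    return result
```

The same function would be applied twice: first to each objective using its criteria grades, then to the overall drill or exercise using the objective grades.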

18
Objective Grading
19
Overall Grading
20
Implementation
  • Briefed evaluators and implemented the process in
    January 2008
  • Beta-tested the evaluation tools (software),
    looking for bugs
  • Got a feel for how the process was received by
    management and players

21
Implementation, continued
  • Also looked to make sure we didn't overcorrect
  • Previous system (numeric only) made it very
    difficult to get the software to fail a drill
    when it should
  • Wanted to be sure we didn't make it too hard to
    PASS a drill that should pass
  • Did not see any indication of this

22
Implementation, continued
  • After one month of testing, the process was fully
    implemented and used during the remedial exercise
    in February 2008

23
Feedback and Adjustment
  • Early feedback from users (evaluators) was
    positive
  • Some problems noted, but they were minor
  • Took time to get used to new way of doing things
  • Management is enthusiastic about logic and
    philosophy of new process

24
Benefits and Successes
  • Process has provided more focus on the more
    important items relative to drill and exercise
    performance
  • Previous system tended to hide problem areas by
    rolling them up into averages
  • New system makes problem areas more prominent by
    directly tying poor performance in more important
    areas to lower grades

25
Mechanics and Software
  • Evaluator input captured in Microsoft Access
    database
  • Grades input via web browser interface
  • Number crunching and evaluation performed in an
    Excel spreadsheet
  • Draws data from Access, then performs
    calculations to apply three evaluation tests and
    produce evaluation report

26
Starting the Evaluation
27
Grading Report
28
Starting from Scratch
  • If you have nothing in place, you will need some
    form of data collection and storage
  • Off-the-shelf database programs will do this
  • Most programs can also apply the three evaluation
    tests and produce the output
  • We use Access and Excel because we had some
    pieces in place that we didn't want to abandon

29
Taking it Home
  • The software packages are unimportant
  • Almost any software can be modified to use this
    concept
  • The process is the thing
  • By applying the general idea, problems in the
    'big ticket' areas become more visible
  • Fixing these areas first strengthens the
    foundation, making your entire program stronger

30
Summary
  • Make the most important items have the most
    effect on the grade
  • Fix these items first
  • Then concentrate on smaller items
  • Consider changing the process before your
    customers suggest that you do so
  • Happy customers = Happy EM staff