1
Session 412: Why Don't We Weigh Them?
  • Gloria Gery
  • Gery Associates
  • www.gloriagery.com
  • ggery@attglobal.net
  • March 2, 2004
  • Training and Online Learning
  • Atlanta

2
A True Story circa 1978
  • Director, IT and End User Training at Aetna
  • Monthly metrics (the arithmetic is sketched in
    code after this list)
  • Number of Student Days
  • Cost per Student Day
  • No-shows
  • Classroom Utilization
  • Completions vs. dropouts
  • Average satisfaction levels
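
A minimal Python sketch of the arithmetic behind these metrics, using hypothetical figures (the presentation gives no actual Aetna numbers):

    # Hypothetical monthly figures (illustrative only)
    student_days = 480             # student days delivered
    total_cost = 96_000.0          # fully loaded monthly cost, USD
    enrolled = 520                 # seats booked
    attended = 480                 # seats actually filled
    seats_available = 600          # classroom capacity for the month

    cost_per_student_day = total_cost / student_days    # 200.0
    no_show_rate = (enrolled - attended) / enrolled     # ~0.077
    classroom_utilization = attended / seats_available  # 0.80
    print(cost_per_student_day, no_show_rate, classroom_utilization)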

3
My Boss, Mr. Numbers
  • Activity and results plotted to the 4th decimal
    point
  • Graphed and charted
  • Every nit discussed.
  • I said:
  • Why don't we weigh them?
  • Let's add cost per pound to the metrics.

4
Article
  • http://www.gloriagery.com/articles/0003.asp
  • My boss told me not to be so smart.
  • I am still asking the question.

5
The Performance/Learning Cycle
  • A cycle of Learning, Doing, Collaborating, and
    Referencing
  • Examples
  • Instruction
  • Demonstrations
  • Illustrations
  • Process Support
  • Wizards
  • Templates
  • Variable Manipulators
  • Task automation tools
  • Collaborating draws on peers and experts
  • Referencing draws on content resources, people,
    and data

Courtesy of Ariel Performance Centered Systems,
Inc., Cincinnati, OH (www.arielpcs.com)
6
What Are the Issues in Evaluation?
  • Real and Perceived Relevance
  • To the business
  • To the individual
  • What can really be assessed for both participants
    and management
  • Reaction
  • Emotion
  • Cognitive response
  • Attitude
  • Behavior

7
More Issues
  • What can be measured
  • Content
  • Context
  • Duration and Timing
  • Instructor performance
  • Attributes and structure of the course
  • Activity levels
  • Activity appropriateness
  • Skills and behaviors
  • Knowledge and skill transfer
  • Meaningfulness of what is transferred

8
More Issues
  • Commitment to really evaluate
  • Low to moderate
  • More driven by the Training function than by
    management or participants
  • Why?
  • De facto acceptance of the intervention
  • Lack of alternatives
  • Collective collusion
  • Fear of what will be determined

9
Traditional Kirkpatrick Model
  • Evaluation of the course against standards
  • Outcomes
  • Levels 1 through 4, from happiness through
    business impact
  • Rare that people go beyond learner satisfaction
    or remote assessments of materiality

10
Consider Another Point of View: An Oblique Angle
  • Evaluate instructional events against alternative
    performance development mechanisms
  • Stop
  • the self-referencing model
  • the "we're no better or worse off than anyone
    else" perspective
  • wheel spinning

11
What to Consider
  • Comparative outcomes on all the dimensions
    mentioned earlier
  • Look at relative effectiveness of the kind of
    intervention for specific types of content,
    skills, competencies and behaviors
  • Compare against
  • Job aids
  • Coaching
  • On-line reference
  • Integrated performance support
  • Performance-centered software

12
Possible Outcomes
  • Relative performance or value is more or less
    than we thought
  • Question about where money is spent
  • Illusions shattered or beliefs reaffirmed
  • Nothing at all.

13
Comparing to Reference
  • Current form vs. better form
  • Large, searchable (sometimes) objects vs. small,
    granular, tagged content that can be assembled
    ad hoc (sketched in code after this list)
  • The point of view of the reference
  • Provider perspective
  • Typically not task oriented
  • Technical vs. performance or goal oriented
  • Separating accessibility from utility or
    usefulness
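
A minimal Python sketch of that contrast, assuming a hypothetical tagging scheme: small, tagged chunks can be filtered and assembled per task, which one large searchable object cannot.

    # Hypothetical granular, tagged content store (names are illustrative)
    chunks = [
        {"id": "c1", "tags": {"task:refund", "role:agent"},
         "body": "How an agent issues a refund..."},
        {"id": "c2", "tags": {"task:refund", "role:supervisor"},
         "body": "Refund approval limits..."},
        {"id": "c3", "tags": {"task:exchange", "role:agent"},
         "body": "How an agent processes an exchange..."},
    ]

    def assemble(required_tags):
        """Build an ad hoc 'document' from chunks carrying every required tag."""
        return [c["body"] for c in chunks if required_tags <= c["tags"]]

    # Only what a front-line agent needs for refunds, assembled on demand:
    print(assemble({"task:refund", "role:agent"}))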

14
Comparing to Knowledge and Content Mgt (however
that is defined)
  • Much training occurred because performers could
    not find what they needed
  • Just in case
  • Structured, granular, tagged content will be the
    foundation of in-context learning
  • Capturing new knowledge assets is different, but
    essential to increasing the quality and depth of
    synthesis
  • More powerful because it's experience-based

15
Comparing to Tools
  • Tools offer higher leverage
  • Decrease requirements for knowledge and skill
  • Embody complex rules and relationships (see the
    sketch after this list)
  • Institutionalize best practice
  • Can rapidly integrate and disseminate changed
    processes, rules, and relationships without
    changing the performers
  • Can (rather, must) include content and knowledge
    and enable learning
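
A minimal Python sketch of that leverage, with a hypothetical discount rule: the rule lives in the tool, so performers never have to learn it, and changing it updates everyone at once.

    # Hypothetical business rule embodied in a tool (illustrative only)
    def approved_discount(order_total: float, customer_years: int) -> float:
        """Return the discount a performer may offer, per current policy."""
        if customer_years >= 5 and order_total >= 1000:
            return 0.15
        if customer_years >= 2:
            return 0.05
        return 0.0

    # Editing this function changes the institutionalized practice for
    # every performer at once, without retraining anyone.
    print(approved_discount(1500.0, 6))   # 0.15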

16
Compare Non-Integrated to Integrated Environments
  • Requires demonstration or prototype projects or
    examples of the integrated or fused design
  • May require makeovers of existing resources to
    illustrate
  • Evaluation frequently rests on face validity,
    but it must begin

17
What Are the New Metrics?
  • Time to understanding
  • Time to retrieval
  • Time to performance
  • Pre-post evaluations (if it matters) of learning
    outcomes
  • Pre-post evaluations of performance (a minimal
    arithmetic sketch follows this list)
  • Demonstrations of business impact
  • Efficiency
  • Effectiveness
  • Value Added and Strategy
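
A minimal Python sketch of two of these metrics on hypothetical data: the average pre/post gain and the average time to retrieval.

    # Hypothetical evaluation data (illustrative only)
    pre_scores  = [55, 60, 48, 70]         # assessment before the intervention
    post_scores = [78, 82, 65, 88]         # same assessment afterward
    minutes_to_retrieve = [4.0, 2.5, 3.0]  # time to find the needed content

    avg_gain = sum(post - pre for pre, post in zip(pre_scores, post_scores)) / len(pre_scores)
    avg_retrieval = sum(minutes_to_retrieve) / len(minutes_to_retrieve)
    print(f"average pre/post gain: {avg_gain:.1f} points")        # 20.0
    print(f"average time to retrieval: {avg_retrieval:.1f} min")  # 3.2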

18
Sample Value Propositions
  • Return on invested capital (a worked sketch
    follows this list)
  • Offsets of costs for duplicate, non-integrated
    resources
  • Flexibility in work assignment
  • People
  • Organizations
  • Pushing work to customers
  • Time to implementation, change or integration
  • Resources required for success
  • Quality
  • Etc.
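
A minimal worked sketch of the first proposition, with hypothetical figures (the presentation gives none): return on invested capital is after-tax operating profit divided by the capital invested.

    # Hypothetical figures for a performance-support initiative
    nopat = 1_200_000.0             # net operating profit after tax, USD
    invested_capital = 8_000_000.0  # capital invested, USD

    roic = nopat / invested_capital
    print(f"ROIC: {roic:.1%}")      # 15.0%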