Title: Evaluating HRD Programs
1. Evaluating HRD Programs
2. Effectiveness
- The degree to which a training or other HRD program achieves its intended purpose.
- Measures are relative to some starting point.
- Measures how well the desired goal is achieved.
3. HRD Evaluation
- Textbook definition:
- The systematic collection of descriptive and
judgmental information necessary to make
effective training decisions related to the
selection, adoption, value, and modification of
various instructional activities.
4. In Other Words
- Are we training
- the right people
- the right stuff
- the right way
- with the right materials
- at the right time?
5. Evaluation Needs
- Descriptive and judgmental information needed.
- Objective and subjective data
- Information gathered according to a plan and in a desired format.
- Gathered to provide decision-making information.
6. Purposes of Evaluation
- Determine whether the program is meeting the intended objectives.
- Identify strengths and weaknesses.
- Determine cost-benefit ratio.
- Identify who benefited most or least.
- Determine future participants.
- Provide information for improving HRD programs.
7. Purposes of Evaluation (continued)
- Reinforce major points to be made.
- Gather marketing information.
- Determine if training program is appropriate.
- Establish management database.
8. Evaluation Bottom Line
- Is HRD a revenue contributor or a revenue user?
- Is HRD credible to line and upper-level managers?
- Are benefits of HRD readily evident to all?
9. How Often Are HRD Evaluations Conducted?
- Not often enough!!!
- Frequently, only end-of-course participant reactions are collected.
- Transfer to the workplace is evaluated less frequently.
10. Why HRD Evaluations Are Rare
- Reluctance to have HRD programs evaluated.
- Evaluation needs expertise and resources.
- Factors other than HRD cause performance improvements, e.g.:
- Economy
- Equipment
- Policies, etc.
11. Need for HRD Evaluation
- Shows the value of HRD.
- Provides metrics for HRD efficiency.
- Demonstrates value-added approach for HRD.
- Demonstrates accountability for HRD activities.
- Everyone else has it; why not HRD?
12. Make-or-Buy Evaluation
- I bought it, therefore it is good.
- Since it's good, I don't need to post-test.
- Who says it's
- Appropriate?
- Effective?
- Timely?
- Transferable to the workplace?
13. Evolution of Evaluation Efforts
- Anecdotal approach: talk to other users.
- Try before buy: borrow and use samples.
- Analytical approach: match research data to training needs.
- Holistic approach: look at the overall HRD process, as well as individual training.
14. Models and Frameworks of Evaluation
- Table 7-1 lists nine frameworks for evaluation.
- The most popular is D. Kirkpatrick's:
- Reaction
- Learning
- Job Behavior
- Results
15. Kirkpatrick's Four Levels
- Reaction
- Focus on trainees' reactions
- Learning
- Did they learn what they were supposed to?
- Job Behavior
- Was it used on the job?
- Results
- Did it improve the organization's effectiveness?
16. Issues Concerning Kirkpatrick's Framework
- Most organizations don't evaluate at all four levels.
- Focuses only on post-training.
- Doesn't treat inter-stage improvements.
- WHAT ARE YOUR THOUGHTS?
17. Other Frameworks/Models (1)
- CIPP: Context, Input, Process, Product
- CIRO: Context, Input, Reaction, Outcome
- Brinkerhoff
- Goal setting
- Program design
- Program implementation
- Immediate outcomes
- Usage outcomes
- Impacts and worth
18. Other Frameworks/Models (2)
- Kraiger, Ford, Salas
- Cognitive outcomes
- Skill-based outcomes
- Affective outcomes
- Phillips
- Reaction
- Learning
- Applied learning on the job
- Business results
- ROI
19. A Suggested Framework (1)
- Reaction
- Did trainees like the training?
- Did the training seem useful?
- Learning
- How much did they learn?
- Behavior
- What behavior change occurred?
20. A Suggested Framework (2)
- Results
- What were the tangible outcomes?
- What was the return on investment (ROI)?
- What was the contribution to the organization?
21. Data Collection for HRD Evaluation
- Possible methods
- Interviews
- Questionnaires
- Direct observation
- Written tests
- Simulation/Performance tests
- Archival performance information
22. Interviews
- Advantages
- Flexible
- Opportunity for clarification
- Depth possible
- Personal contact
- Limitations
- High reactive effects
- High cost
- Face-to-face threat potential
- Labor intensive
- Trained observers needed
23. Questionnaires
- Advantages
- Low cost to administer
- Honesty increased
- Anonymity possible
- Respondent sets the pace
- Variety of options
- Limitations
- Possible inaccurate data
- Response conditions not controlled
- Respondents set varying paces
- Uncontrolled return rate
24. Direct Observation
- Advantages
- Non-threatening
- Excellent way to measure behavior change
- Limitations
- Possibly disruptive
- Reactive effects are possible
- May be unreliable
- Need trained observers
25. Written Tests
- Advantages
- Low purchase cost
- Readily scored
- Quickly processed
- Easily administered
- Wide sampling possible
- Limitations
- May be threatening
- Possibly no relation to job performance
- Measures only cognitive learning
- Relies on norms
- Concern for racial/ethnic bias
26. Simulation/Performance Tests
- Advantages
- Reliable
- Objective
- Close relation to job performance
- Includes cognitive, psychomotor and affective
domains
- Limitations
- Time consuming
- Simulations often difficult to create
- High cost to develop and use
27. Archival Performance Data
- Advantages
- Reliable
- Objective
- Job-based
- Easy to review
- Minimal reactive effects
- Limitations
- Criteria for keeping/discarding records
- Information system discrepancies
- Indirect
- Not always usable
- Records prepared for other purposes
28. Choosing Data Collection Methods
- Reliability (see the sketch below)
- Consistency of results, and freedom from collection-method bias and error.
- Validity
- Does the device measure what we want to measure?
- Practicality
- Does it make sense in terms of the resources used to get the data?
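One concrete way to check reliability is a test-retest correlation: administer the same instrument twice and correlate the two sets of scores. The sketch below is a minimal, hypothetical illustration of that idea (not a procedure from the text); it assumes Python 3.10+ for statistics.correlation, and every score is invented.

```python
# Minimal test-retest reliability sketch; all scores are hypothetical.
from statistics import correlation  # Python 3.10+

first_administration = [72, 85, 64, 90, 78, 69, 88]
second_administration = [70, 88, 61, 93, 75, 72, 85]

# Pearson r between the two administrations; values near 1.0
# suggest the instrument yields consistent results.
r = correlation(first_administration, second_administration)
print(f"Test-retest reliability (Pearson r): {r:.2f}")
```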
29. Types of Data Used/Needed
- Individual performance
- System-wide performance
- Economic
30. Individual Performance Data
- Individual knowledge
- Individual behaviors
- Examples
- Test scores
- Performance quantity, quality, and timeliness
- Attendance records
- Attitudes
31. System-Wide Performance Data
- Productivity
- Scrap/rework rates
- Customer satisfaction levels
- On-time performance levels
- Quality rates and improvement rates
32. Economic Data
- Profits
- Product liability claims
- Avoidance of penalties
- Market share
- Competitive position
- Return on Investment (ROI)
- Financial utility calculations
33. Use of Self-Report Data
- Most common method
- Pre-training and post-training data
- Problems
- Mono-method bias
- Desire to be consistent between tests
- Socially desirable responses
- Response Shift Bias
- Trainees adjust expectations to training
34. Research Design
- Specifies in advance
- the expected results of the study.
- the methods of data collection to be used.
- how the data will be analyzed.
35. Research Design Issues
- Pretest and Posttest
- Shows the trainee what the training has accomplished.
- Helps eliminate pretest knowledge bias.
- Control Group
- Compares performance of group with training
against the performance of a similar group
without training.
36. Recommended Research Design
- Pretest and posttest with control group (a minimal analysis sketch follows below).
- Whenever possible:
- Randomly assign individuals to the test group and the control group to minimize bias.
- Use a time-series approach to data collection to verify that performance improvement is due to training.
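A minimal sketch of how pretest/posttest control-group data might be analyzed: compute each group's mean gain and treat the difference as the training effect. All scores are hypothetical, and a real evaluation would add random assignment and a significance test.

```python
# Pretest/posttest with control group; all scores are hypothetical.
from statistics import mean

trained_pre, trained_post = [55, 60, 48, 62, 58], [78, 82, 70, 85, 80]
control_pre, control_post = [57, 59, 50, 61, 56], [60, 63, 52, 64, 59]

trained_gain = mean(trained_post) - mean(trained_pre)
control_gain = mean(control_post) - mean(control_pre)

# The estimated training effect is the trained group's gain
# beyond the gain the control group showed without training.
print(f"Trained gain: {trained_gain:.1f}")
print(f"Control gain: {control_gain:.1f}")
print(f"Estimated training effect: {trained_gain - control_gain:.1f}")
```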
37. Ethical Issues Concerning Evaluation Research
- Confidentiality
- Informed consent
- Withholding training from control groups
- Use of deception
- Pressure to produce positive results
38. Assessing the Impact of HRD
- Money is the language of business.
- You MUST talk dollars, not HRD jargon.
- No one (except maybe you) cares about the effectiveness of training interventions as measured by an analysis of formal pretest/posttest control-group data.
39. HRD Program Assessment
- HRD programs and training are investments.
- Line managers often see HR and HRD as costs, i.e., revenue users, not revenue producers.
- You must prove your worth to the organization
- Or you'll have to find another organization.
40. Two Basic Methods for Assessing Financial Impact
- Evaluation of training costs
- Utility analysis
41. Evaluation of Training Costs
- Cost-benefit analysis
- Compares the cost of training to benefits gained, such as improved attitudes, reduction in accidents, reduction in employee sick days, etc.
- Cost-effectiveness analysis
- Focuses on increases in quality, reduction in scrap/rework, productivity, etc.
42. Return on Investment
- Return on investment = Results / Costs
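- A hypothetical worked example: results valued at $150,000 against training costs of $60,000 give ROI = 150,000 / 60,000 = 2.5.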
43. Types of Training Costs
- Direct costs
- Indirect costs
- Development costs
- Overhead costs
- Compensation for participants
44. Direct Costs
- Instructor
- Base pay
- Fringe benefits
- Travel and per diem
- Materials
- Classroom and audiovisual equipment
- Travel
- Food and refreshments
45. Indirect Costs
- Training management
- Clerical/Administrative
- Postal/shipping, telephone, computers, etc.
- Pre- and post-learning materials
- Other overhead costs
46. Development Costs
- Fee to purchase program
- Costs to tailor program to organization
- Instructor training costs
47. Overhead Costs
- General organization support
- Top management participation
- Utilities, facilities
- General and administrative costs, such as HRM
48. Compensation for Participants
- Participants' salary and benefits for time away from the job
- Travel, lodging, and per-diem costs
- A sketch totaling all five cost categories follows below.
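A minimal sketch that totals the five cost categories from the preceding slides; every dollar figure is a hypothetical placeholder, not data from the text.

```python
# Totaling the five training cost categories; figures are hypothetical.
training_costs = {
    "direct": 18_000,        # instructor pay, materials, travel, food
    "indirect": 6_500,       # training management, clerical, shipping
    "development": 12_000,   # program purchase fee and tailoring
    "overhead": 4_000,       # facilities, utilities, G&A support
    "compensation": 22_500,  # participants' salary/benefits and travel
}

total_cost = sum(training_costs.values())
print(f"Total training cost: ${total_cost:,}")  # $63,000
```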
49. Measuring Benefits
- Change in quality per unit, measured in dollars
- Reduction in scrap/rework, measured in the dollar cost of labor and materials
- Reduction in preventable accidents, measured in dollars
- ROI = Benefits / Training costs (sketched in code below)
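The same ROI formula in code: sum the dollar-valued benefits and divide by total training costs. The benefit categories follow the slide, but the amounts (and the $63,000 total carried over from the cost sketch above) are hypothetical.

```python
# ROI = dollar-valued benefits / training costs; figures are hypothetical.
benefits = {
    "quality improvement": 45_000,
    "scrap/rework reduction": 30_000,
    "accidents avoided": 15_000,
}
training_cost = 63_000  # total from the cost sketch above

roi = sum(benefits.values()) / training_cost
print(f"ROI = {sum(benefits.values()):,} / {training_cost:,} = {roi:.2f}")
# ROI = 90,000 / 63,000 = 1.43
```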
50. Utility Analysis
- Uses a statistical approach to support claims of training effectiveness.
- N = number of trainees
- T = length of time benefits are expected to last
- d_t = true performance difference resulting from training
- SD_y = dollar value of untrained job performance (in standard deviation units)
- C = cost of training
- ΔU = (N)(T)(d_t)(SD_y) - C
51. Critical Information for Utility Analysis
- d_t = the difference in units produced between trained and untrained workers, divided by the standard deviation in units produced by trained workers.
- SD_y = the standard deviation, in dollars, of the organization's overall productivity.
- The full calculation is sketched below.
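A minimal sketch of the ΔU formula from the previous slide, with every parameter value invented for illustration.

```python
# Utility analysis sketch: Delta-U = (N)(T)(d_t)(SD_y) - C.
def utility_gain(n_trainees: int, years: float, d_t: float,
                 sd_y: float, cost: float) -> float:
    """Dollar value added by training over its expected life."""
    return n_trainees * years * d_t * sd_y - cost

# Hypothetical inputs: 50 trainees, benefits lasting 2 years, a true
# performance difference of 0.4 SD, SD_y of $10,000, $60,000 in costs.
delta_u = utility_gain(n_trainees=50, years=2, d_t=0.4,
                       sd_y=10_000, cost=60_000)
print(f"Estimated utility gain: ${delta_u:,.0f}")  # $340,000
```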
52. Ways to Improve HRD Assessment
- Walk the walk and talk the talk: MONEY.
- Involve HRD in strategic planning.
- Involve management in HRD planning and estimation efforts.
- Gain mutual ownership.
- Use credible and conservative estimates.
- Share credit for successes and blame for failures.
53. HRD Evaluation Steps
- Analyze needs.
- Determine an explicit evaluation strategy.
- Insist on specific and measurable training objectives.
- Obtain participant reactions.
- Develop criterion measures/instruments to measure results.
- Plan and execute the evaluation strategy.
54. Summary
- Training results must be measured against costs.
- Training must contribute to the bottom line.
- HRD must justify itself repeatedly as a revenue enhancer, not a revenue waster.