Online Assessment and Evaluation Techniques in Corporate Settings

Transcript and Presenter's Notes

1
Online Assessment and Evaluation Techniques in
Corporate Settings
  • Dr. Curtis J. Bonk
  • President, CourseShare.com
  • Associate Professor, Indiana University
  • http://php.indiana.edu/~cjbonk
  • cjbonk@indiana.edu

2
Workshop Overview
  • Part I. The State of Online Learning
  • Part II. Evaluation Purposes, Approaches, and
    Frameworks
  • Part III. Applying Kirkpatrick's 4 Levels
  • Part IV. ROI and Online Learning
  • Part V. Collecting Evaluation Data: Online
    Evaluation Tools

3
Sevilla & Wells (July 2001), e-learning
  • "We could be very productive by ignoring
    assessment altogether and assuming competence if
    the learner simply gets through the course."

4
Why Evaluate?
  • Cost-savings
  • Becoming a less important reason to evaluate as
    more people recognize that the initial expense is
    balanced by long-term financial benefits
  • Performance improvement
  • A clear place to see the impact of online learning
  • Competency advancement

5
16 Evaluation Methods
  • 1. Formative Evaluation
  • 2. Summative Evaluation
  • 3. CIPP Model Evaluation
  • 4. Objectives-Oriented Evaluation
  • 5. Marshall & Shriver's 5 Levels of Evaluation
  • 6. Bonk's 8-Part Evaluation Plan
  • (the "Ridiculous" Model)
  • 7. Kirkpatrick's 4 Levels
  • 8. Return on Investment (ROI)
  • 9. K-Level 6: budget and stability of the
    e-learning team
  • 10. K-Level 7: whether e-learning champion(s) are
    promoted
  • 11. Cost/Benefit Analysis (CBA)
  • 12. Time to Competency
  • 13. Time to Market
  • 14. Return on Expectation
  • 15. AEIOU: Accountability, Effectiveness, Impact,
    Organizational Context, Unintended
    Consequences
  • 16. Consumer-Oriented Evaluation

6
Part I. The State of Online Learning
7
Survey of 201 Trainers, Instructors, Managers,
Instructional Designers, CEOs, CLOs, etc.
8
Survey Limitations
  • Sample pool: e-PostDirect
  • The Web is changing rapidly
  • Lengthy survey, low response rate
  • No password or keycode
  • Many backgrounds; hard to generalize
  • Does not address all issues (e.g., ROI
    calculations, how trainees are supported, specific
    assessments)

9
(No Transcript)
10
(No Transcript)
11
(No Transcript)
12
Why Interested in E-Learning?
  • Mainly cost savings
  • Reduced travel time
  • Greater flexibility in delivery
  • Timeliness of training
  • Better allocation of resources, speed of
    delivery, convenience, course customization,
    lifelong learning options, personal growth,
    greater distribution of materials

13
(No Transcript)
14
A Few Assessment Comments
15
Level 1 Comments: Reactions
  • "We assess our courses based on participation
    levels and online surveys after course
    completion. All of our courses are
    asynchronous."
  • "I conduct a post-course survey of course
    material, delivery methods and mode, and
    instructor effectiveness. I look for suggestions
    and modify each course based on the results of
    the survey."
  • "We use the Halo Survey process of asking them
    when the course is concluding."

16
Level 2 Comments: Learning
  • "We use online testing and simulation frequently
    for testing student knowledge."
  • "Do multiple choice exams after each section of
    the course."
  • "We use online exams and use level 2 evaluation
    forms."

17
Level 3 Comment: Job Performance
  • "I feel strongly there is a need to measure the
    success of any training in terms of the
    implementation of the new behaviors on the job.
    Having said that, I find there is very limited
    [interest] by our clients in spending the dollars
    required."

18
More Assessment Comments: Multiple-Level Evaluation
  • "Using Level One Evaluations for each session
    followed by a summary evaluation. Thirty days
    post-training, conversations occur with learners'
    managers to assess Level 2 (actually Level 3)."
  • "We do Level 1 measurements to gauge student
    reactions to online training using an online
    evaluation form. We do Level 2 measurements to
    determine whether or not learning has occurred."
  • "Currently, we are using online teaching and
    following up with manager assessments that the
    instructional material is being put to use on the
    job."

19
Who is Evaluating Online Learning?
  • 59% of respondents said they did not have a
    formal evaluation program
  • At Reaction level: 79%
  • At Learning level: 61%
  • At Behavior/Job Performance level: 47%
  • At Results or Return on Investment level: 30%

20
(No Transcript)
21
Assessment Lacking or Too Early
  • "We are just beginning to use Web-based
    technology for education of both associates and
    customers, and do not have the metrics to measure
    our success. However, we are putting together a
    focus group to determine what to measure (and)
    how."
  • "We have no online evaluation for students at
    this time."
  • "We lack useful tools in this area."

22
Limitations with Current System
  • "I feel strongly there is a need to measure the
    success of any training in terms of the
    implementation of the new behaviors on the job.
    Having said that, I find there is very limited
    [interest] by our clients in spending the dollars
    required."
  • "We are looking for better ways to track learner
    progress, learner satisfaction, and retention of
    material."
  • "Have had fairly poor ratings on reliability,
    customer support, and interactivity."

23
Pause: How and What Do You Evaluate?
24
Readiness Checklist
  • 1. ___ Is your organization undergoing
    significant change, in part related to
    e-learning?
  • 2. ___ Is there pressure from senior
    management to measure the results of e-learning?
  • 3. ___ Has your company experienced one or
    more training/learning disasters in the past?
  • 4. ___ Is the image of the training/learning
    function lower than you want?

25
Part II: Evaluation Purposes, Approaches, and
Frameworks
26
What is Evaluation???
  • Simply put, "an evaluation is concerned with
    judging the worth of a program and is essentially
    conducted to aid in the making of decisions by
    stakeholders" (e.g., does it work as
    effectively as the standard instructional
    approach?).
  • (Champagne & Wisher, in press)

27
What is assessment?
  • "Assessment refers to...efforts to obtain info
    about how and what students are learning in order
    to improve...teaching efforts and/or to demonstrate
    to others the degree to which students have
    accomplished the learning goals for a course"
    (Millar, 2001, p. 11).
  • It is a way of using info obtained through
    various types of measurement to determine a
    learner's performance or skill on some task or
    situation (Rosenkrans, 2000).

28
Who are you evaluating for?
  • The level of evaluation will depend on
    articulation of the stakeholders. Stakeholders
    of evaluation in corporate settings may range
    from???

29
Evaluation Purposes
  • Determine learner progress
  • What did they learn?
  • Document learning impact
  • How well do learners use what they learned?
  • How much do learners use what they learn?

30
Evaluation Purposes
  • Efficiency
  • Was online learning more effective than another
    medium?
  • Was online learning more cost-effective than
    another medium/what was the return on investment
    (ROI)?
  • Improvement
  • How do we do this better?

31
Evaluation Purposes
  • "An evaluation plan can evaluate the delivery of
    e-learning, identify ways to improve the online
    delivery of it, and justify the investment in the
    online training package, program, or initiative."
  • (Champagne & Wisher, in press)

32
Evaluation Plans
  • Does your company have a training evaluation plan?

33
Steps to Developing an OL Evaluation Program
  • Select a purpose and framework
  • Develop benchmarks
  • Develop online survey instruments
  • For learner reactions
  • For learner post-training performance
  • For manager post-training reactions
  • Develop data analysis and management plan

34
1. Formative Evaluation
  • Formative evaluations focus on improving the
    online learning experience.
  • A formative focus will try to find out what
    worked or did not work.
  • Formative evaluation is particularly useful for
    examining instructional design and instructor
    performance.

35
Formative Questions
  • How can we improve our OL program?
  • How can we make our OL program more efficient?
  • More effective?
  • More accessible?

36
2. Summative Evaluation
  • Summative evaluations focus on the overall
    success of the OL experience (should it be
    continued?).
  • A summative focus will look at whether or not
    objectives are met, the training is
    cost-effective, etc.

37
Course Completion
  • Jeanne Meister, Corporate University Xchange,
    found a 70 percent dropout rate compared to
    classroom rates of 15 percent.
  • Perhaps we need new metrics. Need to see if
    learners can test out.
  • "Almost any measure would be better than course
    completion, which is not a predictor of
    anything." Tom Kelly, Cisco, March 2002,
    e-Learning.

38
What Can OL Evaluation Measure?
  • Categories of Evaluation Info (Woodley and
    Kirkwood, 1986)
  • Measures of activity
  • Measures of efficiency
  • Measures of outcomes
  • Measures of program aims
  • Measures of policy
  • Measures of organizations

39
Typical Evaluation Frameworks for OL
  • Commonly used frameworks include
  • CIPP Model
  • Objectives-oriented
  • Marshall & Shriver's 5 levels
  • Kirkpatrick's 4 levels
  • Plus a 5th level
  • AEIOU
  • Consumer-oriented

40
3. CIPP Model Evaluation
  • CIPP is a management-oriented model
  • C = Context
  • I = Input
  • P = Process
  • P = Product
  • Examines the OL within its larger system/context

41
CIPP: OL Context
  • Context: Addresses the environment in which OL
    takes place.
  • How does the real environment compare to the
    ideal?
  • Uncovers systemic problems that may dampen OL
    success.
  • Technology breakdowns
  • Inadequate computer systems

42
CIPP: OL Input
  • Input: Examines what resources are put into OL.
  • Is the content right?
  • Have we used the right combination of media?
  • Uncovers instructional design issues.

43
CIPP: OL Process
  • Process: Examines how well the implementation
    works.
  • Did the course run smoothly?
  • Were there technology problems?
  • Were the facilitation and participation as
    planned?
  • Uncovers implementation issues.

44
CIPP: OL Product
  • Product: Addresses outcomes of the learning.
  • Did the learners learn? How do you know?
  • Does the online training have an effect on
    workflow or productivity?
  • Uncovers systemic problems.

45
4. Objectives-Oriented Evaluation
  • Examines OL training objectives as compared to
    training results
  • Helps determine if objectives are being met
  • Helps determine if objectives, as formally
    stated, are appropriate
  • Objectives can be used as a comparative benchmark
    between online and other training methods

46
Evaluating Objectives & OL
  • An objectives-oriented approach can examine two
    levels of objectives
  • Instructional objectives for learners (did the
    learners learn?)
  • Systemic objectives for training (did the
    training solve the problem?)

47
Objectives & OL
  • Requires
  • A clear sense of what the objectives are (always
    a good idea anyway)
  • The ability to measure whether or not objectives
    are met
  • Some objectives may be implicit and hard to state
  • Some objectives are not easy to measure

48
5. Marshall & Shriver's Five Levels of Evaluation
  • Performance-based evaluation framework
  • Each level examines a different area of
    performance
  • Requires demonstration of learning

49
Marshall & Shriver's 5 Levels
  • Level I: Self (instructor)
  • Level II: Course Materials
  • Level III: Course Curriculum
  • Level IV: Course Modules
  • Level V: Learning Transfer

50
6. Bonk's Evaluation Plan
51
What to Evaluate?
  1. Learner: attitudes, learning, use, performance.
  2. Instructor: popularity, course enrollments.
  3. Training: internal and external components.
  4. Task: relevance, interactivity, collaborative.
  5. Tool: usable, learner-centered, friendly,
    supportive.
  6. Course: interactivity, participation, completion.
  7. Program: growth, long-range plans.
  8. Organization: cost-benefit, policies, vision.

52
RIDIC5-ULO3US Model of Technology Use
  • 4. Tasks (RIDIC)
  • Relevance
  • Individualization
  • Depth of Discussion
  • Interactivity
  • Collaboration-Control-Choice-Constructivistic-Community

53
RIDIC5-ULO3US Model of Technology Use
  • 5. Tech Tools (ULO3US)
  • Utility/Usable
  • Learner-Centeredness
  • Opportunities with Outsiders Online
  • Ultra Friendly
  • Supportive

54
7. Kirkpatrick's 4 Levels
  • A common training framework.
  • Examines training on 4 levels.
  • Not all 4 levels have to be included in a given
    evaluation.

55
The 4 Levels
  • Reaction
  • Learning
  • Behavior
  • Results

56
8. Return on Investment (ROI): A 5th Level
  • Return on Investment is a 5th level
  • It is related to results, but is more clearly
    stated as a financial calculation
  • How to calculate ROI is the big issue here

57
Is ROI the answer?
  • Elise Olding of CLK Strategies suggests that we
    shift from looking at ROI to looking at time to
    competency.
  • ROI may be easier to calculate since concrete
    dollars are involved, but time to competency may
    be more meaningful in terms of actual impact.

58
Example: Call Center Training
  • Traditional call center training can take 3
    months to complete
  • Call center employees typically quit within one
    year
  • When OL was implemented, the time to train (time
    to competency) was reduced
  • Benchmarks for success: time per call; number of
    transfers

59
Example: Circuit City
  • Circuit City provided online product/sales
    training
  • What is more useful to know?
  • The overall ROI or break-even point?
  • How much employees liked the training?
  • How many employees completed the training?
  • That employees who completed 80% of the training
    saw an average increase of 10% in sales?

60
Matching Evaluation Levels with Objectives: Pretest
  • Instructions: For each statement below, indicate
    the level of evaluation at which the objective is
    aimed.
  • 1. ___ Show a 15 percent decrease in errors
    made on tax returns by staff accountants
    participating in the e-learning certificate
    program.
  • 2. ___ Increase use of conflict resolution
    skills, when warranted, by 80 percent of
    employees who had completed the first eight
    modules of the online training. (see handout for
    more)

61
9. A 6th Level? Clark Aldrich (2002)
  • Adding Level 6, which relates to the budget and
    stability of the e-learning team.
  • Just how respected and successful is the
    e-learning team?
  • Have they won approval from senior management for
    their initiatives?
  • Aldrich, C. (2002). Measuring success: In a
    post-Maslow/Kirkpatrick world, which metrics
    matter? Online Learning, 6(2), 30-32.

62
10. And Even a 7th Level? Clark Aldrich (2002)
  • At Level 7: whether the e-learning sponsor(s) or
    champion(s) are promoted in the organization.
  • While both of these additional levels address the
    people involved in the e-learning initiative or
    plan, such recognitions will likely hinge on the
    results of evaluation at the other five levels.

63
11. ROI Alternative: Cost/Benefit Analysis (CBA)
  • ROI may be ill-advised since not all impacts hit
    the bottom line, and those that do take time.
  • Shifts the attention from more long-term results
    to quantifying impacts with numeric values, such
    as
  • increased revenue streams,
  • increased employee retention, or
  • reduction in calls to a support center.
  • Reddy, A. (2002, January). E-learning ROI
    calculations: Is a cost/benefit analysis a better
    approach? e-learning, 3(1), 30-32.

64
Cost/Benefit Analysis (CBA)
  • Attends to both qualitative and quantitative
    measures:
  • job satisfaction ratings,
  • new uses of technology,
  • reduction in processing errors,
  • quicker reactions to customer requests,
  • reduction in customer call rerouting,
  • increased customer satisfaction,
  • enhanced employee perceptions of training,
  • global post-test availability.
  • Reddy, A. (2002, January). E-learning ROI
    calculations: Is a cost/benefit analysis a better
    approach? e-learning, 3(1), 30-32.

65
Cost/Benefit Analysis (CBA)
  • In effect, CBA asks how the sum of the
    benefits compares to the sum of the costs.
  • Yet, it often leads to or supports ROI and other
    more quantitatively-oriented calculations.
  • Reddy, A. (2002, January). E-learning ROI
    calculations: Is a cost/benefit analysis a better
    approach? e-learning, 3(1), 30-32.

66
Other ROI Alternatives
  • 12. Time to competency (need benchmarks)
  • online databases of frequently asked questions
    can help employees in call centers learn skills
    more quickly and without requiring temporary
    leaves from their position for such training
  • 13. Time to market
  • might be measured by how e-learning speeds up the
    training of sales and technical support
    personnel, thereby expediting the delivery of a
    software product to the market
  • Raths, D. (2001, May). Measure of success.
    Online Learning, 5(5), 20-22, 24.

67
Still Other ROI Alternatives
  • 14. Return on Expectation
  • Asks employees a series of questions related to
    how training met expectations of their job
    performance.
  • When questioning is complete, they place a
    figure on that.
  • Correlate or compare such reaction data with
    business results, or supplement Level 1 data to
    include more pertinent info about the
    applicability of learning to the employee's
    present job situation.
  • Raths, D. (2001, May). Measure of success.
    Online Learning, 5(5), 20-22, 24.

68
15. AEIOU
  • Provides a framework for looking at different
    aspects of an online learning program
  • (Fortune & Keith, 1992; Sweeney, 1995; Sorensen,
    1996)

69
A = Accountability
  • Did the training do what it set out to do?
  • Data can be collected through
  • Administrative records
  • Counts of training programs (# of attendees, # of
    offerings)
  • Interviews or surveys of training staff

70
E = Effectiveness
  • Is everyone satisfied?
  • Learners
  • Instructors
  • Managers
  • Were the learning objectives met?

71
I = Impact
  • Did the training make a difference?
  • Like Kirkpatrick's level 4 (Results)

72
O = Organizational Context
  • Did the organization's structures and policies
    support or hinder the training?
  • Does the training meet the organization's needs?
  • OC evaluation can help find when there is a
    mismatch between the training design and the
    organization
  • Important when using third-party training or
    content

73
U = Unintended Consequences
  • Unintended consequences are often overlooked in
    training evaluation
  • May give you an opportunity to brag about
    something wonderful that happened
  • Typically discovered via qualitative data
    (anecdotes, interviews, open-ended survey
    responses)

74
16. Consumer-Oriented Evaluation
  • Uses a consumer point-of-view
  • Can be a part of vendor selection process
  • Can be a learner-satisfaction issue
  • Relies on benchmarks for comparison of different
    products or different learning media

75
Part III
  • Applying Kirkpatrick's 4 Levels to Online
    Learning Evaluation; Evaluation Design

76
Why Use the 4 Levels?
  • They are familiar and understood
  • Highly referenced in the training literature
  • Can be used with 2 delivery media for comparative
    results

77
Conducting 4-Level Evaluation
  • You need not use every level
  • Choose the level that is most appropriate to your
    need and budget
  • Higher levels will be more costly and difficult
    to evaluate
  • Higher levels will yield more valuable information

78
Kirkpatrick Level 1: Reaction
  • Typically involves "smile sheets" or
    end-of-training evaluation forms.
  • Easy to collect, but not always very useful.
  • Reaction-level data on online courses has been
    found to correlate with ability to apply learning
    to the job.
  • The survey ideally should be Web-based, keeping
    the medium the same as the course.

79
Kirkpatrick Level 1: Reaction
  • Types of questions:
  • Enjoyable?
  • Easy to use?
  • How was the instructor?
  • How was the technology?
  • Was it fast or slow enough?

80
Kirkpatrick Level 2: Learning
  • Typically involves testing learners immediately
    following the training
  • Not difficult to do, but online testing has its
    own challenges
  • Did the learner take the test on his/her own?

81
Kirkpatrick Level 2: Learning
  • Higher-order thinking skills (problem solving,
    analysis, synthesis)
  • Basic skills (articulate ideas in writing)
  • Company perspectives and values (teamwork,
    commitment to quality, etc.)
  • Personal development

82
Kirkpatrick Level 2: Learning
  • Might include
  • Essay tests.
  • Problem solving exercises.
  • Interviews.
  • Written or verbal tests to assess cognitive
    skills.
  • Shepard, C. (1999b, July). Evaluating online
    learning. TACTIX from Fastrak Consulting.
    Retrieved February 10, 2002, from
    http://fastrak-consulting.co.uk/tactix/Features/evaluate/eval01.htm

83
Kirkpatrick Level 3: Behavior
  • More difficult to evaluate than Levels 1 2
  • Looks at whether learners can apply what they
    learned (does the training change their
    behavior?)
  • Requires post-training follow-up to determine
  • Less common than levels 1 2 in practice

84
Kirkpatrick Level 3: Behavior
  • Might include
  • Direct observation by supervisors or coaches
    (Wisher, Curnow, & Drenth, 2001).
  • Questionnaires completed by peers, supervisors,
    and subordinates related to work performance.
  • On the job behaviors, automatically logged
    performances, or self-report data.
  • Shepard, C. (1999b, July). Evaluating online
    learning. TACTIX from Fastrak Consulting.
    Retrieved February 10, 2002, from
    http://fastrak-consulting.co.uk/tactix/Features/evaluate/eval01.htm

85
Kirkpatrick Level 4: Results
  • Often compared to return on investment (ROI)
  • In e-learning, it is believed that the increased
    cost of course development ultimately is offset
    by the lower cost of training implementation
  • A new way of training may require a new way of
    measuring impact

86
Kirkpatrick Level 4: Results
  • Might Include
  • Labor savings (e.g., reduced duplication of
    effort or faster access to needed information).
  • Production increases (faster turnover of
    inventory, forms processed, accounts opened,
    etc.).
  • Direct cost savings (e.g., reduced cost per
    project, lowered overhead costs, reduction of bad
    debts, etc.).
  • Quality improvements (e.g., fewer accidents,
    fewer defects, etc.).
  • Horton, W. (2001). Evaluating e-learning.
    Alexandria, VA: American Society for Training &
    Development.

87
Kirkpatrick Evaluation Design
  • Kirkpatrick's 4 Levels may be achieved via
    various evaluation designs
  • Different designs help answer different questions

88
Pre/Post Control Groups
  • One group receives OL training and one does not
  • As a variation, try 3 groups:
  • No training (control)
  • Traditional training
  • OL training
  • Recommended because it may help neutralize
    contextual factors
  • Relies on random assignment as much as possible
    (a gain-comparison sketch follows below)
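A minimal sketch of how data from such a three-group design might be compared, using mean pre/post gains. All scores here are invented for illustration; a real evaluation would use actual test data and a significance test (e.g., ANOVA) rather than raw means alone.

```python
# Sketch: comparing mean pre/post gains across the three groups above.
# Scores are hypothetical; a real evaluation would use actual test data
# and a significance test (e.g., ANOVA), not raw means alone.

groups = {
    "control":     [(55, 58), (60, 61), (52, 55)],  # (pre, post) pairs
    "traditional": [(54, 70), (58, 72), (50, 66)],
    "online":      [(56, 74), (59, 75), (53, 70)],
}

for name, scores in groups.items():
    gains = [post - pre for pre, post in scores]
    mean_gain = sum(gains) / len(gains)
    print(f"{name}: mean gain = {mean_gain:.1f}")
```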

89
Multiple Baselines
  • Can be used for a program that is rolling out
  • Each group serves as a control group for the
    previous group
  • Look for improvement in subsequent groups
  • Eliminates need for tight control of control group

90
Time Series
  • Looks at benchmarks before and after training
  • Practical and cost-effective
  • Not considered as rigorous as other designs
    because it doesn't control for contextual factors

91
Single Group Pre/Post
  • Easy and inexpensive
  • Criticized for lack of rigor (absence of control)
  • Needs to be pushed into Kirkpatrick levels 3 and
    4 to see if there has been impact

92
Case Study
  • A rigorous design in academic practice, but often
    after-the-fact in corporate settings
  • Useful when no preliminary or baseline data have
    been collected

93
Matching Evaluation Levels with Objectives:
Posttest
  • Instructions: For each statement below, indicate
    the level of evaluation at which the objective is
    aimed.
  • 1. Union Pacific Railroad reported an increase in
    bottom-line performance--on-time delivery of
    goods--of over 35%, which equated to millions of
    dollars in increased revenues and savings.
  • 2. They also reported that learners showed a 40%
    increase in learning retention and improved
    attitudes about management and jobs.
  • (see handout for more)

94
Part IV
  • ROI and Online Learning

95
The Importance of ROI
  • OL requires a great amount of money and other
    resources up front
  • It offers the promise of financial rewards later
    on
  • ROI is of great interest because of the
    investment and the wait period before the return

96
Calculating ROI
  • Look at
  • Hard cost savings
  • Hard revenue impact
  • Soft competitive benefits
  • Soft benefits to individuals
  • See Calculating the Return on Your eLearning
    Investment (2000) by Docent, Inc.

97
Possible ROI Objectives
  • Better Efficiencies
  • Greater Profitability
  • Increased Sales
  • Fewer Injuries on the Job
  • Less Time off Work
  • Faster Time to Competency

98
Hard Cost Savings
  • Travel
  • Facilities
  • Printed material costs (printing, distribution,
    storage)
  • Reduction of costs of business through increased
    efficiency
  • Instructor fees (sometimes)

99
The Cost of E-learning
  • Brandon-hall.com estimates that an LMS for
    8,000 learners costs $550,000
  • This price doesn't include the cost of buying or
    developing content
  • Bottom line: getting started in e-learning isn't
    cheap

100
Hard Revenue Impact
  • Consider
  • Opportunity cost of improperly trained or
    untrained personnel
  • Shorter time to productivity through shorter
    training times with OL
  • Increased time on the job (no travel time)
  • Ease of delivering the same training to partners
    and customers (for a fee?)

101
Soft Competitive Benefits
  • Just-in-time capabilities
  • Consistency in delivery
  • Certification of knowledge transfer
  • Ability to track users and gather data easily
  • Increased morale from simultaneous roll-out at
    different sites

102
Individual Values
  • Less wasted time
  • Support available as needed
  • Motivation from being treated as an individual

103
Talking about ROI
  • As a percentage:
  • ROI = (Payback - Investment) / Investment × 100
  • As a ratio:
  • ROI = Return / Investment
  • As time to break even:
  • Break-even time = (Investment / Return) × Time
    Period (see the sketch below)
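A worked sketch of the three framings above. The formulas come from the slide; the dollar figures are hypothetical, for illustration only.

```python
# Three ways of expressing the same return, per the formulas above.
# All figures are hypothetical.

investment = 550_000    # up-front cost (e.g., LMS plus content)
payback = 880_000       # measured benefit over the evaluation period
period_months = 24      # length of that evaluation period

roi_percent = (payback - investment) / investment * 100   # 60.0
roi_ratio = payback / investment                          # 1.6
break_even = (investment / payback) * period_months       # 15.0 months

print(f"ROI: {roi_percent:.0f}%  ratio: {roi_ratio:.2f}  "
      f"break-even: {break_even:.1f} months")
```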

104
What is ROI Good For?
  • Prioritizing Investment
  • Ensuring Adequate Financial Support for Online
    Learning Project
  • Comparing Vendors

105
The Changing Face of ROI
  • Return-on-investment isnt what it used to be
    The R is no longer the famous bottom line and the
    I is more likely a subscription fee than a
    one-time payment (Cross, 2001)

106
More Calculations
  • Total Admin Costs of Former Program - Total
    Admin Costs of OL Program = Projected Net Savings
  • Total Cost of Training / # of Students = Cost
    Per Student (CPS)
  • Total Benefits × 100 / Total Program Cost = ROI
    (a worked example follows below)
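A worked example of the three calculations above; every number here is invented purely to show the arithmetic.

```python
# Worked example of the three calculations above (hypothetical figures).

former_admin_costs = 400_000
ol_admin_costs = 250_000
projected_net_savings = former_admin_costs - ol_admin_costs   # 150,000

total_training_cost = 300_000
num_students = 1_200
cost_per_student = total_training_cost / num_students         # 250.0 (CPS)

total_benefits = 450_000
roi = total_benefits * 100 / total_training_cost              # 150.0 (%)

print(projected_net_savings, cost_per_student, roi)
```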

107
Pause: How are costs calculated in online
programs?
108
ROI Calculators
109
Success Story 1 (Sitze, March 2002, Online
Learning): EDS and GlobalEnglish
  • Charge: Reduce money spent on English training
  • Goal: 80% online in 3 months
  • Result: 12% use in 12 months
  • Prior Costs: $1,500-$5,000/student
  • New Cost: $150-$300/user
  • Notes: Email to participants was helpful in
    expanding use; rolling out other additional
    languages.

110
Success Story 2 (Overby, Feb 2002, CIO): Dow
Chemical and Offensive Email
  • Charge: Train 40,000 employees across 70
    countries; 6 hours of training on workplace
    respect and responsibility.
  • Specific Results: 40,000 passed
  • Savings: Saved $2.7 million ($162,000 on record
    keeping, $300,000 on classrooms and trainers,
    $1,000,000 on handouts, $1,200,000 in salary
    savings due to less training time).

111
Success Story 3 (Overby, Feb 2002, CIO): Dow
Chemical and Safety/Health
  • Charge: Train 27,000 employees on environmental
    health and safety work processes.
  • Results: Saved $6 million; safety incidents have
    declined while the number of Dow employees has
    grown.

112
Success Story 4 (Overby, Feb 2002, CIO): Dow
Chemical and e-learning system
  • Charge: $1.3 million e-learning system
  • Savings: $30 million in savings ($850,000 in
    manual record-keeping, $3.1 million in training
    delivery costs, $5.2 million in reduced classroom
    materials, $20.8 million in salaries since the
    Web required 40-60% less training time).

113
Success Story 5 (Ziegler, e-learning, April
2002): British Telecom sales training
  • Charge: Train 17,000 sales professionals to sell
    Internet services using an Internet simulation.
  • Result: Customer service rep training reduced
    from 15 days to 1 day; sales training reduced
    from 40 days to 9 days.
  • Savings: Millions of dollars saved; sales
    conversion went up 102 percent; customer
    satisfaction up 16 points.

114
At the End of the Day...
  • Are all training results quantifiable?
  • NO! Putting a price tag on some costs and
    benefits can be very difficult
  • NO! Some data may not have much meaning at face
    value
  • What if more courses are offered and annual
    student training hours drop simultaneously? Is
    this bad?

115
Evaluation Cases (homework)
  1. General Electric Case
  2. Financial Services Company
  3. Circuit Board Manufacturing Plant Safety
  4. Computer Company Sales Force
  5. National HMO Call Center

116
Part V
  • Collecting Evaluation Data: Online Evaluation
    Tools

117
Collecting Evaluation Data
  • Learner Reaction
  • Learner Achievement
  • Learner Job Performance
  • Manager Reaction
  • Productivity Benchmarks

118
Forms of Evaluation
  • Interviews and Focus Groups
  • Self-Analysis
  • Supervisor Ratings
  • Surveys and Questionnaires
  • ROI
  • Document Analysis
  • Data Mining (changes pre- and post-training,
    e.g., sales, productivity)

119
How to Collect Data?
  • Direct Observation in Work Setting
  • By supervisor, co-workers, subordinates, clients
  • Collect Data By Surveys, Interviews, Focus Groups
  • Supervisors, Co-workers, Subordinates, Clients
  • Self-Report by learners or teams
  • Email and Chat

120
Learner Data
  • Online surveys are the most effective way to
    collect online learner reactions
  • Learner performance data can be collected via
    online tests
  • Pre- and post-tests can be used to measure
    learning gains
  • Learner post-course performance data can be used
    for Level 3 evaluation
  • May look at on-the-job performance
  • May require data collection from managers

121
Example: Naval Phys. Training Follow-Up Evaluation
  • A naval training unit uses an online
    survey/database system to track performance of
    recently trained physiologists
  • Learners self-report performance
  • Managers report on learner performance
  • Unit heads report on overall productivity

122
Learning System Data
  • Many statistics are available, but which are
    useful?
  • Number of course accesses
  • Log-in times/days
  • Time spent accessing course components
  • Frequency of access for particular components
  • Quizzes completed and quiz scores
  • Learner contributions to discussion (if
    applicable)

123
Computer Log Data: Chen, G. D., Liu, C. C., & Liu,
B. J. (2000). Discovering decision knowledge from
Web log portfolio for managing classroom
processes by applying decision tree and data cube
technology. Journal of Educational Computing
Research, 23(3), 305-332.
  • In a corporate training situation, computer log
    data can correlate online course completions with
    actual job performance improvements such as
  • fewer violations of safety regulations,
  • reduced product defects,
  • increased sales, and
  • timely call responses (see the sketch below).
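An illustrative sketch of the idea only: the data below are invented, and Chen et al. (2000) applied decision trees and data cube technology rather than this simple correlation.

```python
# Sketch: correlating module completions from LMS logs with a job metric.
# Data are hypothetical; Chen et al. (2000) used decision trees and data
# cube technology, so treat this simple correlation as illustrative only.

from statistics import correlation  # Python 3.10+

modules_completed = [2, 5, 8, 10, 4, 9, 7, 10]        # per employee, from logs
monthly_sales     = [11, 14, 19, 24, 13, 21, 18, 23]  # per employee

r = correlation(modules_completed, monthly_sales)
print(f"Pearson r between completions and sales: {r:.2f}")
```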

124
Learner System Data
  • IF learners are being evaluated based on number
    and length of accesses, it is only fair that they
    be told
  • Much time can be wasted analyzing statistics
    that don't tell much about the actual impact of
    the training
  • Bottom line: Easy data to collect, but not always
    useful for evaluation purposes
  • Still useful for management purposes

125
Benchmark Data
  • Companies need to develop benchmarks for
    measuring performance improvement
  • Managers typically know the job areas that need
    performance improvement
  • Both pre-training and post-training data need to
    be collected and compared
  • Must also look for other contextual factors

126
Online Survey Tools for Assessment
127
Web-Based Survey Advantages
  • Faster collection of data
  • Standardized collection format
  • Computer graphics may reduce fatigue
  • Computer-controlled branching and skip sections
    (sketched below)
  • Easy to answer by clicking
  • Wider distribution of respondents
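A toy sketch of the branching/skip-section advantage; the question wording and flow are invented, not taken from any of the tools named below.

```python
# Toy sketch of survey branching/skip logic (questions are hypothetical).
# Respondents who have not taken an online course skip the follow-ups.

def run_survey() -> dict:
    answers = {}
    answers["took_course"] = input("Taken an online course? (y/n) ").lower()
    if answers["took_course"] == "y":  # branch: follow-ups only if relevant
        answers["satisfaction"] = input("Rate your satisfaction (1-5): ")
        answers["applied"] = input("Applied it on the job? (y/n) ")
    return answers

if __name__ == "__main__":
    print(run_survey())
```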

128
Sample Survey Tools
  • Zoomerang (http://www.zoomerang.com)
  • IOTA Solutions (http://www.iotasolutions.com)
  • QuestionMark (http://www.questionmark.com/home.html)
  • SurveyShare (http://SurveyShare.com, from
    Courseshare.com)
  • Survey Solutions from Perseus
    (http://www.perseusdevelopment.com/fromsurv.htm)
  • Infopoll (http://www.infopoll.com)

129
(No Transcript)
130
(No Transcript)
131
(No Transcript)
132
Online Testing Tools (see
http://www.indiana.edu/~best/)
133
(No Transcript)
134
Test Selection Criteria (Hezel, 1999; Perry &
Colon, 2001)
  • Easy to Configure Items and Test
  • Handle Symbols, Timed Tests
  • Scheduling of Feedback (immediate?)
  • Flexible Scoring and Reporting
  • (first, last, average, by individual or group)
  • Easy to Pick Items for Randomizing
  • Randomize Answers Within a Question
  • Weighting of Answer Options
  • Web Resource: http://www.indiana.edu/~best/

135
Tips on Authentication
  • Check e-mail access against list
  • Use password access
  • Provide keycode, PIN, or ID
  • (Futuristic/Other: palm print, fingerprint, voice
    recognition, iris scanning, facial scanning,
    handwriting recognition, picture ID)

136
Ziegler, April 2002, e-Learning
  • "The key is not to measure every possible angle,
    but rather to focus on metrics that are pragmatic
    and relevant to both human and business
    performance at the same time."

137
E-Learning Evaluation Measures
  • So which of the 16 methods would you use???
  • Something ridiculous???

138
Some Final Advice
Or Maybe Some Questions???