Developing Performance Measures at UCEDD: Moving Forward with The Logic Model - PowerPoint PPT Presentation

About This Presentation
Title:

Developing Performance Measures at UCEDD: Moving Forward with The Logic Model

Description:

Director Consulting Services, Performance Management. The Performance Institute ... Assumptions: improving looks = better self image. Factors: Health History. Inputs ... – PowerPoint PPT presentation

Slides: 98
Provided by: iank154
Learn more at: http://www.aucd.org

Transcript and Presenter's Notes

Title: Developing Performance Measures at UCEDD: Moving Forward with The Logic Model


1
Developing Performance Measures at UCEDD
Moving Forward with The Logic Model
2
Contact Information
  • Jon Desenberg
  • Director Consulting Services, Performance
    Management
  • The Performance Institute
  • 1515 North Courthouse Road, Suite 600
  • Arlington, VA 22201
  • Phone 703-894-0481 - Fax 703-894-0482
  • www.performanceweb.org
  • Desenberg@performanceweb.org

3
  • Measuring
  • Performance for Results

"If you don't know where you are going, any road will get you there." - Lewis Carroll
4
Why measure?
  • To Plan?
  • To Comply?
  • To Manage?
  • To Optimize?
  • To Innovate?

"What gets measured gets done." - Peter Drucker
5
Performance is not about mandates, it's about management
  • PERFORMANCE MEASURES ALLOW YOU TO
  • STRATEGIZE
  • COMMUNICATE
  • MOTIVATE
  • MANAGE

6
Long-Term Move Towards Performance
  • Performance management is not a new phenomenon: 50 years of work in the making to link resources with results
  • Budget and Accounting Procedures Act (BAPA) of 1950
  • Planning-Programming-Budgeting System (PPBS),
    1965-1971
  • Management by Objectives (MBO) 1973-1974
  • Zero-Based Budgeting (ZBB), 1977-1981

7
Finally a Move in the Right Direction
  • GPRA (1993): Government Performance and Results Act
  • PMA (2001): President's Management Agenda
  • PART (2002): Program Assessment Rating Tool
  • Mandates focusing on performance and accountability: bottom-line results

8
Program Assessment Rating Tool (PART)
  • OMB Tool to Evaluate Effectiveness of Program
    Performance in Budget Submissions for Individual
    Programs
  • Designed to evaluate 20% of programs per fiscal year
  • Methodological
  • Standardized
  • Evidence-based and Transparent
  • Consistent with the Government Performance and Results Act (GPRA), 1993

9
Four Sections of PART
  • Program Relevance
  • Mission and purpose of program
  • Program Planning
  • Focus on the program's strategic objectives
  • Program Management
  • Stewardship by front-line managers
  • Program Results
  • Program accountability to strategic objectives

10
PART Categories
  • Competitive Grant Programs
  • Block/Formula Grant Programs
  • Regulatory-Based Programs
  • Capital Assets and Service Acquisition Programs
  • Credit Programs
  • Research and Development Programs
  • Direct Federal Programs

11
FY06 Budget Lessons Learned
12
Lessons Learned
  • Root Causes of Failure: Poor Plans, Bad Measures, Weak Management
  • Lack of meaningful, results-oriented performance
    measures
  • Failure to address management deficiencies
    identified by GAO, IGs, et al.
  • Ill-defined, conflicting, duplicative program
    purpose

13
Lessons Learned
  • Causes for Success:
  • Good alignment toward Performance Plans
  • Effective management of strategic plans
  • Strong linkage to successful implementation of strategic goals

14
Making the Grade: OMB's weighting for each section
  • Section One: 20%
  • Section Two: 10%
  • Section Three: 20%
  • Section Four: 50%

15
Section One: Relevance
  • Clear agency mission
  • Unique contribution
  • Specific interest, problem or need
  • Optimally Designed to address specifics

16
Section Two: Planning
  • Long-Term Performance Goals
  • Annual Performance Goals
  • Stakeholder Dialogue
  • Budget-Performance Integration

17
Section Three: Management
  • Collection of timely, credible performance
    information
  • Manager Accountability
  • Strict allocation of funds
  • Budget-Performance Integration

18
Section Four: Program Results
  • Demonstrable progress towards goals
  • Achievement of annual goals
  • Improvement in efficiencies and cost effectiveness
  • Favorable performance compared to comparable programs

19
Performance Measurement Issues Manageable by
PART
  • Outcomes are extremely difficult to measure for programs that:
  • Are among many contributors to a desired outcome
  • Have results that will not be achieved for many
    years
  • Relate to deterrence or prevention of specific
    behaviors
  • Have multiple purposes and funding that can be
    used for a range of activities
  • Are administrative or process oriented

20
Performance Management in Non-profits translates a mission into reality and evaluates the results for all stakeholders
  • Strategic Level
  • Measure Progress on Issues
  • Define and Validate Policy Strategies
  • Enhance Stakeholder Satisfaction and Support
  • Operational Level
  • Drive Change to Implement Organizational
    Strategies
  • Ensure Compliance
  • Achieve Efficiencies
  • Improve Cycle Time
  • Individual Level
  • Improved Morale/Retention
  • Achieve Clarity of Responsibilities
  • Transparency of Performance to
  • Donors
  • Elected Leaders
  • Senior Management
  • Oversight Entities
  • Employees
  • Customers
  • Partners

21
Effective Performance Measures are SMART
  • S PECIFIC
  • M EASURABLE
  • A CCOUNTABLE
  • R ESULTS-ORIENTED
  • T IME-BOUND

22
Applying SMART
  • End Outcome: Reduce smoking-related deaths, illness and costs
  • Intermediate Outcome: Reduce the number of new youth smokers (ages 10-18) by 2% each year
  • Results-oriented: Youth smoking is where you can stop the habit before it takes hold and has a lasting health impact
  • Specific: number of new youth smokers (ages 10-18)
  • Accountable: You have the ability to make it happen
  • Measurable: reduce by 2%
  • Time-Bound: per year
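The same check can be sketched in code. This is an illustration only, not part of the presentation: the record fields and values below are hypothetical, encoding the youth-smoking example so the five SMART attributes can be verified mechanically.

    from dataclasses import dataclass

    @dataclass
    class Measure:
        specific: str             # what is counted
        measurable_target: float  # e.g. a 2 percent reduction
        accountable_owner: str    # who can make it happen
        results_oriented: bool    # tied to an outcome, not an activity
        time_bound: str           # e.g. "per year"

    def is_smart(m: Measure) -> bool:
        # Every SMART attribute must be filled in (non-empty / non-zero).
        return all([m.specific, m.measurable_target, m.accountable_owner,
                    m.results_oriented, m.time_bound])

    youth_smoking = Measure(
        specific="number of new youth smokers (ages 10-18)",
        measurable_target=2.0,
        accountable_owner="tobacco prevention program",
        results_oriented=True,
        time_bound="per year",
    )
    print(is_smart(youth_smoking))  # True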

23
Selecting Performance Measures: The Doctor Analogy
  • Outcome Goal (End Outcome)
  • Achieve and Maintain Proper Health and Quality of Life
  • Measures: Dozens (Pulse, Years Lived, Satisfaction Surveys, etc.)
  • Objective Goals (Intermediate Outcomes)
  • - Achieve Appropriate Fitness for Age, Gender
  • - Stop Pain in Y
  • - Restore Function of X
  • Measures: Thousands (Weight, Blood Analyses, Scans, etc.)

24
Performance Measure Selection Criteria
(Step 4)
25
Performance Measure Selection Criteria
(Step 4)
26
Performance measurement is a culture shift
  • From:
  • "These measures are draining valuable resources and are a data burden."
  • "I can't measure my outcomes; I can only measure activities."
  • "I need these measures because my employees feel it is important."
  • "You can't measure my program."
  • To:
  • "We are committed to tracking measures that matter most."
  • "We are accountable for delivering our outputs and our intermediate outcomes."
  • "We are responsible for our end outcomes."

27
  • Identifying Characteristics of Effective
    Performance Management Systems

28
8 Critical Success Factors for Effective
Performance Management Systems
  • Defining and Aligning to Enterprise Strategy
  • Developing Meaningful Performance Measures
  • Increasing Data Availability
  • Maximizing Data Integrity
  • Enhancing Performance Reporting
  • Improving Evaluation and Analysis
  • Achieving Performance Integration
  • Driving Decision-Making

29
1 Defining and Aligning to Enterprise Strategy
  • 1.1 Has clearly defined its mission, vision and
    values
  • 1.2 Has specific strategies in place to achieve
    organizational results (based on a SWOT or other
    strategic landscape analysis)
  • 1.3 All structures (divisions, support functions)
    are fully aligned with enterprise-wide
    strategies
  • 1.4 A formal strategic plan is clearly
    communicated to all employees at all levels of
    the organization

Survey Questions 1, 2, 3, 4
30
2 Developing Meaningful Performance Measures
  • 2.1 Reliable measurement and reporting on
    Outcomes
  • 2.2 Reliable measurement and reporting on
    Strategies
  • 2.3 Organizational process metrics (Quality,
    Cycle Time, Efficiency)
  • 2.4 Goals and measures enjoy support and buy-in
    from internal and external stakeholders

Survey Questions 5, 6, 7, 8
31
3 Increasing Data Availability
  • 3.1 Data sources are identified and readily
    accessible
  • 3.2 Data burden is worth the information gleaned

Survey Questions 9, 10
32
4 Maximizing Data Integrity
  • 4.1 Data is collected, managed, and analyzed in a
    uniform and consistent manner
  • 4.2 Data is validated or verified through
    sampling or independent means

PI Management Survey Question 11, 12
33
5 Enhancing Performance Reporting
  • 5.1 Internal reporting produces information for frontline managers and senior decision-makers on a real-time basis.
  • 5.2 Has a reporting system that produces
    comprehensive performance reports that include
    measures, analysis, trends, suggestions for
    improvement

PI Management Survey Questions 13, 14
34
6 Improving Evaluation and Analysis
  • 6.1 For process measures, benchmarks and service
    levels are evaluated (1-2 year cycles)
  • 6.2 For outcome and strategy measures, program
    performance is evaluated for cause-effect (2-5
    year cycles)

PI Management Survey Questions 15, 16
35
7 Achieving Performance Integration
  • 7.1 INTERNAL Integration: Support services' contributions (HR, IT, Finance, etc.) to program performance are documented and managed
  • 7.2 EXTERNAL Integration: Performance contributions of multiple contributors in the same measurement area are tracked and compared

PI Management Survey Questions 17
36
8 Driving Decision-Making
  • 8.1 Budgets and investments are made based on
    clear contributions to performance
  • 8.2 Supply chain partners are held accountable
    for products and services
  • 8.3 Employee bonuses and pay increases are linked
    to individual performance evaluations.

PI Management Survey Questions 18, 19, 20
37
Top Five/Bottom Five by EXECUTION
TOP FIVE INITIATIVES
BOTTOM FIVE INITIATIVES
  • All structures are fully aligned with
    enterprise-wide strategies (1.3)
  • Reliable measurement and reporting on strategies
    (2.2)
  • Internal integration of support service alignment
    to performance (7.1)
  • Data is collected, managed, and analyzed in a
    uniform manner (4.1)
  • Goals and measures enjoy support from
    internal/external stakeholders (2.4)
  • Publishing a strategic plan (1.4)
  • Has specific strategies in place to achieve
    organizational results (1.2)
  • Organizational process metrics (2.3)
  • Evaluation of process measures, benchmarks (6.1)
  • Budgets and investments are made based on contributions to performance (8.1)

38
Top Five/Bottom Five by IMPACT
TOP FIVE INITIATIVES
BOTTOM FIVE INITIATIVES
  • Has clearly defined its mission, vision and
    values (1.1)
  • Data burden is worth the information gleaned
    (3.2)
  • Publishing a Strategic Plan (1.4)
  • Comprehensive performance reports (5.2)
  • Data is validated through sampling or independent
    means (4.2)
  • All structures are fully aligned with
    enterprise-wide strategies (1.3)
  • Reliable measurement and reporting on strategies
    (2.2)
  • Employee bonuses and pay increases are linked to
    individual performance (8.3)
  • Budgets and investments are made based on contributions to performance (8.1)
  • Internal reporting produces real-time data for
    decision-making (5.1)

39
  • Module Three
  • Understanding Logic Models

40
What is a logic model?
  • Logical chain of events providing a blueprint for mission achievement
  • Graphic representation that illustrates the
    rationale behind a program or organization
  • Depicts causal relationships between activities,
    strategies, and end results
  • Contains goals and performance measures
  • Integrates various program activities into a
    cohesive whole
  • Vehicle for dialogue, planning, program
    management and evaluation

41
What does a logic model look like?
  • Graphic display of boxes and arrows, vertical or horizontal
  • Relationships, linkages
  • Any shape
  • Circular, dynamic
  • Cultural adaptations, storyboards
  • Level of detail
  • Simple
  • Complex
  • Multiple models

42
Logic modeling is based on mapping and defining linkages between what we do and why we do it: a series of IF-THEN relationships.
IF I work out for one hour each day (INPUTS), THEN I will burn more calories than I consume, and THEN I lose fat and build muscle (OUTPUTS); THEN I improve my looks and health, and THEN I have a better image, feel better, and live longer (OUTCOMES).
Assumptions: improving looks = better self image
Factors: Health History
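For readers who prefer code, the workout chain above can be sketched as an ordered list of stages; the data structure and stage labels below are illustrative assumptions, not part of the original slide.

    # Each stage is (logic-model element, statement); labels follow the slide above.
    chain = [
        ("Inputs",   "I work out for one hour each day"),
        ("Outputs",  "I will burn more calories than I consume"),
        ("Outputs",  "I lose fat and build muscle"),
        ("Outcomes", "I improve my looks and health"),
        ("Outcomes", "I have a better image, feel better, and live longer"),
    ]
    assumptions = ["improving looks = better self image"]
    factors = ["health history"]

    # Read the chain back as a series of IF-THEN statements.
    for (_, cause), (_, effect) in zip(chain, chain[1:]):
        print(f"IF {cause} THEN {effect}")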
43
Clarifying the terms
Inputs: People and resources required to achieve outcomes
Activities/Outputs: What the inputs produce
End Outcome: End goal or ultimate benefit
Immediate and Intermediate Outcomes: Changes required to achieve the end outcome
Assumptions: Beliefs or evidence that support your IF-THEN logic
Factors: External influences beyond your control that affect IF-THEN relationships
44
What is your goal at your annual dental check-up?

IF I have toothpaste, floss, and a toothbrush (INPUTS), and I brush twice a day and floss once a day (OUTPUTS), THEN I remove plaque (decrease plaque in my mouth), and THEN I have fewer (ideally zero) cavities (OUTCOMES).
Assumptions: Plaque causes tooth decay
Factors: Genetics
45
The Logic Model "V": from Bottom-Line Investment up to Top-Line Return, connected through Alignment, Linkage, and Measurement.
46
Logic Model V Performance Dimensions
47
Value Chain Diagram
Output: Distribute program grants
So That...
Intermediate Outcome: Child violence and abuse can be prevented and detected
So That...
Intermediate Outcome: Child health and development can be protected and maintained
So That...
High-Level Outcome (Ultimate Program Intent): Children can grow into productive citizens and attain their intended impacts on society
(The Department's degree of influence decreases as the chain moves toward the high-level outcome.)
48
Global Logic Model: Childhood Lead Poisoning Program
Early Activities (If we do...): Outreach; Screening; Identification of kids with elevated levels
Later Activities (And we do...): Case management of EBLL kids; Refer EBLL kids for medical treatment; Train family in in-home detection techniques; Assess environment of EBLL child; Refer environment for clean-up
Early Outcomes (Then...): EBLL kids get medical treatment; Family performs in-home techniques; Lead source identified; Environment gets cleaned up; Lead source removed
Later Outcomes (And then...): EBLL reduced; Developmental slide stopped; Quality of life improves
Definition: EBLL = Elevated Blood Lead Levels
49
Most logic models incorporate the following elements: Inputs, Activities, Outputs, Intermediate Outcomes (Attitudes, Behaviors, Conditions), and End Outcomes. Inputs, activities and outputs are the HOW and sit where you have the most CONTROL; intermediate and end outcomes are the WHY and represent the EFFECT you are working to create.
50
Drivers → Inputs → Activities → Outputs → Intermediate Outcomes → End Outcomes
51
Drivers → Inputs → Activities → Outputs → Intermediate Outcomes → End Outcomes (this slide: Activities)
  • Processes and roles
  • What the program does
  • Subject of on-going process improvement and
    strategy change
  • System integration through linkages to
  • Stakeholder satisfaction
  • Assessment quality
  • Desired outcomes
  • Efficiency measures

52
Drivers → Inputs → Activities → Outputs → Intermediate Outcomes → End Outcomes (this slide: Outputs)
  • Products and services delivered (e.g. grants,
    audits, research studies, impact
    assessments, etc.)
  • Indicate strategy deployment
  • Foundation step for attainment of all types of
    outcomes
  • Often measured by low-level outcome types
  • -Process vital signs
  • -Customer satisfaction
  • Good source of short-term, readily available
    results
  • 1-2 months to 1 year

53
Drivers → Inputs → Activities → Outputs → Intermediate Outcomes → End Outcomes (this slide: Intermediate Outcomes)
  • Show cross-agency/program accountabilities
  • Can be one or several in number
  • Often measured by attitudes, behaviors and
    conditions
  • Can show short- to medium-term change
  • 1-5 years

54
Connecting strategies, intermediate outcomes and
measures
Intermediate Outcomes

U.S. Department of Labor Women's Bureau Strategy: Provide training in high-growth, demand-driven occupations to women.
Intermediate Outcome: Increase hard skills in high-growth, demand-driven occupations for participants.
Intermediate Outcome Performance Measure: % of women participants who successfully complete training, education or certification for high-growth, demand-driven occupations (example only).
55
Drivers → Inputs → Activities → Outputs → Intermediate Outcomes → End Outcomes (this slide: End Outcomes)
  • Shows ultimate benefit to the taxpayer
  • Often measured by long-term indicators
  • Changes in economic, policy conditions
  • Captured in the strategic plan as end goals
  • 5-10 years

56
Connecting mission, goals, end outcomes and
measures
End Outcomes

U.S. Department of Labor Women's Bureau Mission: Improve the status of wage-earning women, improve their working conditions, increase their efficiency, and advance their opportunities for profitable employment.
Goal: Improve the status of working women through better jobs.
End Outcome: Increase women's employment in high-growth, demand-driven occupations.
End Outcome Performance Measure: % of women participants who find employment in a high-growth, demand-driven occupation.
57
Drivers → Inputs → Activities → Outputs → Intermediate Outcomes → End Outcomes (this slide: Efficiency Measures)
  • Efficiency Measures
  • Ratio of outputs to inputs
  • Usually unit costs or service units per FTE
  • Examples
  • Cost per assessment
  • Average value of grants
  • Investigations completed per FTE
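A minimal worked example of the two efficiency ratios named above, using assumed figures:

    total_cost = 250_000.0   # annual program spending in dollars (assumed)
    assessments = 500        # assessments completed (assumed)
    investigations = 120     # investigations completed (assumed)
    fte = 8                  # full-time-equivalent staff (assumed)

    cost_per_assessment = total_cost / assessments   # 500.0 dollars per assessment
    investigations_per_fte = investigations / fte    # 15.0 investigations per FTE
    print(cost_per_assessment, investigations_per_fte)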

58
Appreciating outcomes vs. outputs
Drivers → Inputs → Activities → Outputs → Intermediate Outcomes → End Outcomes
59
Relating Logic Models to Strategic Plan Elements

Drivers → Inputs → Activities → Outputs → Intermediate Outcomes → End Outcomes
Mission and Values, the Internal/External Environment, and Statutory Authority (based on a strategic assessment) frame the model.
Outcome Goals: Developed from the statute-based mission
Strategies: Changes in attitudes, behaviors or conditions required to achieve outcome goals
Programs: Projects, tasks or initiatives designed to contribute to end outcomes
Performance Measures (per FTE and overall) run across the chain.
60
Control and Influence

Drivers → Inputs → Activities → Outputs → Intermediate Outcomes → End Outcomes
The diagram contrasts the Typical Program Focus, the Typical Agency Focus, and the Suggested Agency Focus across this chain.
61
End Outcomes Where is UCEDD?
  • More individuals are independent and self-sufficient, participate in and contribute to the life of their communities.
  • Ensure better access to services
  • # of individuals receiving direct services
  • A leadership position in the developmental disabilities field

62
Logic Models and Performance Systems
Beginnings: If your assumptions about the factors that influence your issues hold true...
Planned Work: ...then the activities you plan to do, which build on these assumptions...
Intended Results: ...should contribute to the results you expect based on this theory of change.
The logic model ("what we hope to do") sits at the center of the performance system: the strategic plan guides planning and design, the management plan describes "how we will do what we said" during implementation, and performance reports capture "what we've done" for evaluation and communication.
Adapted from the W.K. Kellogg Foundation, Logic Model Development Template.
63
The Performance Logic Model
64
Introducing The Performance Logic Model
What activities can be performed, and what products and services can be delivered, to achieve the outcomes? What are the results of specific strategies that will contribute to achieving end outcomes? What changes in attitudes, behaviors and conditions are required? What are the ultimate benefits to the public? Resources (FTE) feed each stage, and performance measures are attached throughout the chain.
65
What does The Performance Logic Model do?
66
The Performance Logic Model Framework captures senior leadership and operational management outcomes. Both levels work through the same chain of goals, outcomes, outputs and activities (with their attitudes, behaviors and conditions, FTE resources, and metrics), and the framework is used to Align, Prioritize, Link, Manage and Measure, exposing any gap between the senior leadership view and the management view.
Comment: This does not represent a physical breakout of the Logic Model, but rather an illustration to distinguish the differing levels of key stakeholders (e.g., agency-program, executive-manager, congress-agency, etc.).
67
Why use The Performance Logic Model?
  • Brings detail to broad goals; helps in planning, evaluation, management, and communications.
  • Builds understanding and promotes consensus about what the organization does and how it will work, which builds buy-in and teamwork.
  • Helps to clarify what is appropriate to evaluate, and when, so that evaluation resources are used wisely.
  • Summarizes complex programs to communicate with stakeholders, funders, and audiences.

68
Performance Logic Model
Inputs → Activities and Outputs → Intermediate Outcomes → End Outcomes
The same chain ties together the organization's planning and management documents:
  • Strategic Plan
  • Annual Performance Plan
  • Human Capital Plan
  • Competitive Sourcing/Contracting
  • Information Technology/E-Government Plan
  • Activity-Based Costing/Performance Budgeting
  • Improved Financial Management
  • Manager and Employee Performance Plans
  • Accountability and Performance Report
69
UCEDD Intermediate Outcomes and Measures: Focusing on the ABCs of Change, Not the End State
70
If your organization were successful, what would
you SEE, HEAR, FEEL and DO?
71
Identifying outcomes
  • End outcomes are grounded in mission and statute and assess progress toward strategic goals
  • Intermediate outcomes evaluate progress toward end outcomes and assess the impact of strategies; they measure the changes in attitudes, behaviors or conditions required to achieve end outcomes
  • When goals and strategies are results-based, outcomes leap off the page.

72
Every outcome has a measure
  • Measures are the indicators of results. Good measures align activities and resources to achieve outcomes. Measures communicate whether, or to what extent, activities have delivered the desired outcomes.
  • Outcome: Increase basic literacy skills in youth ages 14-17 who are basic-skills deficient
  • Measure: % of youth enrolled in basic literacy skills training programs who increase basic skills by 1 educational functioning level
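With hypothetical counts, the literacy measure above is a straightforward percentage:

    enrolled = 240           # youth enrolled in basic literacy training (assumed)
    gained_one_level = 156   # youth who gained >= 1 educational functioning level (assumed)

    measure = 100.0 * gained_one_level / enrolled
    print(f"{measure:.1f}% of enrolled youth increased basic skills")  # 65.0%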

73
Performance Measures Definitions
  • Performance Measures: Indicators, statistics, or metrics used to gauge program performance
  • Target: Quantifiable characteristic that communicates to what extent a program must accomplish a performance measure
  • Outcome Measures: The intended result of carrying out a program; they define an event or condition that is external to the program and of direct importance to the public
  • Output Measures: Describe the level of activity that will be provided over a period of time
  • Efficiency Measures: Measures that capture the skillfulness in executing programs, implementing activities, and achieving results

74
Current state of measurement
  • Too Many Measures
  • Wrong Kinds of Measures
  • Too process and activity oriented
  • No clearly defined Logic Model
  • No measures of strategy
  • Few measures of end outcome
  • Dumbing-down of Measures
  • Measuring only the things you can count rather
    than things that are strategically important

75
Recommendations for Measures
  • Balance across three arms of performance
  • Efficiency (e.g. reduced cost, reduced cycle
    time)
  • Effectiveness (e.g. improved customer access,
    awareness or satisfaction)
  • Quality (e.g. reduced error rate, increased
    compliance)
  • Select measures for each outcome, output and
    activity (metrics)
  • Validate measures
  • Do they pass the gut check? Are they good measures?
  • Can you identify the data needed to calculate the
    measure?
  • Is the data readily available?
  • Evolve measures over time as outputs change or as
    efficacy of measures decreases

76
Checking measures
  • Is the measure results-oriented?
  • Does it indicate achievement of outcomes?
  • Is this measure specific and meaningful?
  • Can this be measured? (Consider how?)
  • Will someone be held accountable for the measure?
  • Is the measure time bound?
  • Does it indicate when the results will be
    realized?

77
Identifying intermediate outcomes
  • Intermediate outcomes measure the results of
    strategies deployed to achieve the end outcome.
    Intermediate outcomes target the center of
    gravity of a particular problem to cause a
    change in the direction of the end outcome.
    Often problems addressed by the government and
    social services are the result of specific
    attitudes, behaviors, or conditions. Strategies
    and their intended outcomes (intermediate
    outcomes) target these attitudes, behaviors and
    conditions.
  • Attitudes Behaviors Conditions

78
Intermediate outcomes target the changes in
attitudes, behaviors or conditions that are
required to achieve end outcomes
  • Reducing teen smoking
  • Attitudes: Alter the belief that smoking is "cool"
  • Behaviors: Decrease the number of new smokers ages 12-15
  • Conditions: Reduce the number of cigarettes sold to underage smokers

79
Identifying intermediate outcomes through Center
of Gravity Analysis
  • Center of Gravity Analysis
  • 1. What attitude, behavior or condition needs to change to achieve the end outcomes? (Target)
  • 2. Identify who possesses the critical capability to cause the change or achieve the end outcomes. What must they do? (Who and What)
  • 3. How can you get them to do that? (How)

80
Module Six: 7 Steps to Building a Performance Logic Model (Welfare to Work)


81
Building a Performance Logic Model
Step 1: Identify end outcomes and their measures, grounded in mission and values
Step 2: Identify intermediate outcomes and their measures, informed by assumptions and factors
Step 3: Identify the activities and outputs, and their metrics, required to achieve outcomes
Step 4: Narrow-cast and choose measures for management
Step 5: Set targets for chosen measures
Step 6: Allocate resources required to achieve outcomes
Step 7: Clearly define, collect, analyze and report on measures
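As a rough illustration of what the seven steps produce, the sketch below captures a hypothetical Welfare to Work model as plain data; every name, measure and number in it is an assumption made for illustration, not content from the presentation.

    welfare_to_work = {
        "end_outcome": {                                  # Step 1
            "statement": "More participants achieve lasting employment",
            "measure": "% of participants employed 12 months after placement",
            "target": 60.0,                               # Step 5: target (vs. baseline)
        },
        "intermediate_outcomes": [                        # Step 2
            {"statement": "Participants gain job-ready skills",
             "measure": "% of participants completing job-readiness training",
             "target": 75.0},
        ],
        "activities_and_outputs": [                       # Steps 3 and 4
            {"statement": "Deliver job-readiness workshops",
             "metric": "number of workshops delivered",
             "resources_fte": 3.0},                       # Step 6: resources allocated
        ],
    }
    # Step 7: define, collect, analyze and report on the chosen measures.
    print(welfare_to_work["end_outcome"]["measure"],
          "target:", welfare_to_work["end_outcome"]["target"])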
82
Step 1: Identify end outcome measures grounded in mission and customer values
  • What is the bottom line of your program?
  • If you had to defend your program's value/benefit before a grand jury, what 2-3 pieces of evidence would prove you were a success rather than a failure?
  • What is the end benefit to the taxpayer or
    society from your program?
  • How will you know you have been so successful
    that you can shut your program down?
  • How will you measure this outcome?

83
Clarifying the Logic of the Program (Step 1)
Logic Model Template: The mission of the ______ (Program) is to produce/provide ______ (Products or Services) to ______ (Target of Change) so that they can ______ (Intermediate Outcome Change), resulting ultimately in ______ (End Outcome Goal).
84
Step 2: Identify intermediate outcome measures informed by assumptions and factors
  • Given the end outcomes you seek
  • What attitudes, behaviors or conditions must
    change to achieve your end outcomes?
  • What must increase, decrease or stay the same to
    achieve your end outcomes?
  • What must change in the status quo to create the
    conditions necessary to achieve your end outcome?
  • How will you measure this outcome?

85
Step 3a: Identify activities, outputs and metrics required to achieve outcomes.
  • For each intermediate outcome:
  • What specific things can this agency do to cause change to happen?
  • What specific things can you do to influence that target of change?
  • What products could you produce?
  • What services could you provide?
  • What is the actual workload that is to be handled?
  • (Note: Don't include administrative items inside your program. Think of what actually leaves the four walls of your program.)

86
Separating Activities from Outputs (Step 3a)
Activity Definition Template: The purpose of ______ (Specific Program Work Activity) is to produce/provide ______ (Output) to ______ (Target of Change) so that they can ______ (Intermediate Outcome Change).
87
Step 3b: Develop metrics for activities/outputs
  • What metrics can and should be tracked to assess the progress of activities conducted?
  • Metrics assess the progress of activities conducted; they may be quantitative or qualitative, give the numerical value of outputs, and are also known as operational measures, efficiency measures, workload measures, or productivity measures
  • Examples: number of contracts processed, percent of contracts processed right the first time, unit cost per contract

88
Identifying metrics for activities/outputs (Step 3b)
  • Metrics assess the progress of activities conducted; they may be quantitative or qualitative and are also known as operational measures, efficiency measures, workload measures, or productivity measures
  • Examples: number of contracts processed, percent of contracts processed right the first time, unit cost per contract

89
Checking metrics (Step 3b)
  • Do metrics reflect that which is most important
    to the customer?
  • Will these metrics track progress and activity
    completion?
  • Can this data be collected and analyzed?
  • Is the data collection burden worth the result?

90
Selecting Your Measures: The Program Performance Assessment Window
The window plots each factor's Importance (vertical axis, 1-4) against its Performance (horizontal axis, 1-4), producing four quadrants: Attention Needed (high importance, low performance), Proven Success (high importance, high performance), Exit Opportunity (low importance, low performance), and Resources Available (low importance, high performance).
Example factors: a (I4, P2), b (I3, P3), c (I2, P1), d (I1, P4).
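A minimal sketch of the window as a quadrant lookup, using the example factors above; the 1-4 scales and the 2.5 midpoint are assumptions:

    def quadrant(importance: float, performance: float, midpoint: float = 2.5) -> str:
        # High importance with low performance needs attention; low importance with
        # high performance suggests resources could be redirected elsewhere.
        if importance >= midpoint:
            return "Proven Success" if performance >= midpoint else "Attention Needed"
        return "Resources Available" if performance >= midpoint else "Exit Opportunity"

    factors = {"a": (4, 2), "b": (3, 3), "c": (2, 1), "d": (1, 4)}
    for name, (imp, perf) in factors.items():
        print(name, quadrant(imp, perf))
    # a Attention Needed, b Proven Success, c Exit Opportunity, d Resources Available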
91
Developing performance targets
(Step 5)
  • Targets:
  • the numerical value of the performance measure
  • establish desired results within a specific timeframe and measure degrees of progress toward outcomes
  • are established from baseline data; targets should NOT be established without baseline data
  • Example: 55% of women participants will successfully complete training, education or certification for high-growth, demand-driven occupations
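A small illustration, with an assumed baseline, of why a target is only meaningful relative to baseline data:

    baseline_completion_rate = 48.0   # % completing training in the baseline year (assumed)
    target_completion_rate = 55.0     # the 55% target from the example above

    gap = target_completion_rate - baseline_completion_rate
    print(f"The target asks for a {gap:.1f}-point improvement over the baseline")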

92
Crafting a System of Performance Measures
Performance measures cascade from the Agency to Departments, to Programs, and to Individuals: measures are defined at each level and reported back up the chain.
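A minimal sketch (hypothetical program names and counts) of measures being rolled up through the cascade described above:

    program_results = {
        "Program A": {"clients_served": 1200},   # program-level measures (assumed)
        "Program B": {"clients_served": 800},
    }
    # Measures defined at the top are reported back up: program -> department -> agency.
    department_total = sum(p["clients_served"] for p in program_results.values())
    agency_report = {"Department X": {"clients_served": department_total}}
    print(agency_report)   # {'Department X': {'clients_served': 2000}}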
93
Cascading The Performance Logic Model
  • It is possible to:
  • use one model for all required measures from all funding sources
  • drill down and roll up logic models
  • build logic models across functions and operations
  • capture both internal and external stakeholders

94
Benefits of The Performance Logic Model
  • Program Integration: Tracks and coordinates the contributions of multiple programs, bureaus, agencies, levels of government, and sectors of society (non-profits, for-profits, etc.)
  • Systems Integration: Allows for the comprehensive integration of different elements of the management agenda, such as IT, acquisition, human resources, etc.
  • Accountability: Links program goals to individual achievement and vendor/contractor performance

95
2005 UCEDD Logic Model
  • End Outcome
  • More Individuals with developmental disabilities
    are independent and self-sufficient. More
    individuals with developmental disabilities
    participate in and contribute to the life of
    their communities

Is this the vision? The Ultimate Goal? Can we
measure it?
96
2005 UCEDD Logic Model
  • Intermediate Outcomes
  • More models are field tested and expert knowledge
    relevant to the DD field is gained and
    disseminated
  • More students and leaders are trained and remain in the field of DD.
  • More individuals with DD receive services from
    trained individuals.
  • More communities and policymakers are
    knowledgeable about DD issues.
  • More individuals with DD receive high quality
    services from trained individuals and through
    improved access and expanded capacity
  • Are these outcomes, outputs or a little of both?

97
2005 UCEDD Logic Model
  • Outputs
  • # of trainees who gained knowledge and skills
  • # of individuals in the community who gained
  • # who received services and support
  • # of research and evaluation activities conducted
  • # who remain in the field
  • # of products developed and disseminated
  • # of recipients of products disseminated

Do these overlap or align with the intermediate
outcomes?