Title: Evaluation/Performance Measurement/Logic Models
1 Evaluation/Performance Measurement/Logic Models
- Maria Aristigueta
- Management Decision Making
2 Topics Covered
- Evaluation
- Performance Measures
- Social Indicators
- Logic Models
3 Performance Measurement and Evaluation
- Performance measurement allows us to monitor the results, outputs, inputs, and efficiency of a program.
- The purpose of evaluation is to increase our understanding of the major relationships embedded in the design of social programs (Blalock 1999).
4 Differences
- Performance measurement is the ongoing monitoring and reporting.
- Continuous
- Questions the what
- Evaluation is the systematic program assessment.
- Infrequent
- Questions the why and how
5 Synergies between Performance Measurement and Evaluation
- Performance measures provide data for evaluation.
- Monitoring is of key importance to improve
program performance.
- Evaluation provides information for performance management.
- Evaluation is necessary to determine program impact and causality, and to explore alternative applications.
6 Evaluations to Improve Performance
- Identifies program goals
- Objectives
- Performance Indicators
- Data sources
- Analyses
7 Evaluability Assessment
- Used to determine a program's readiness for evaluation by determining:
- If program goals, objectives, important side effects, and priority information needs are well defined
- Goals and objectives are plausible
- Performance data is attainable
- Agreement on use of evaluation information
8 Evaluability Assessment
- Shows not only whether a program can be meaningfully evaluated, but also whether evaluation will contribute to improved program performance.
9 Key Steps
- Involve intended users--Evaluators often work in isolation.
- Clarify program intent--Clarify the assumed relationships among program resources, program activities, and expected outcomes from the perspective of the key policy makers, managers and staff, and interest groups.
10 Key Steps (continued)
- Program reality--operations and results may reveal that program reality is far from the program intent of higher management and policy makers.
- Reach agreement on needed changes in the program design--these improvements may be made before a more formal evaluation; this is a component of the qualitative evaluation process.
11 Evaluability Assessment
- Explore Alternative Evaluation Designs--Measurements that could be taken, comparisons that could be made, likely costs, uses for resulting information.
- Agree on evaluation priorities and intended use of information.
12 Gaining Management Support
- Evaluators need mechanisms that will convince managers it is worth their while to become and stay engaged in the evaluation process.
- Managers' skepticism is overcome by quickly providing objective and credible information relevant to problems the managers face.
13 Keys to Securing Necessary Decisions
- Hold the interest of those in charge of the program by providing evaluability assessment products
- Continue interaction with evaluation users
- Brief key managers and policymakers on findings and options; get their opinions
- Provide additional information and help the program prepare for implementation
14 Areas for Evaluation Management
- Clarify the evaluation mandate
- Gain initial agreement
- Check the mandate during the evaluation process--shifts occur.
- Staffing
- Level, type, knowledge and experience required.
- Mix senior and junior staff.
15 Evaluators as Change Agents
- Understand the organization's decision-making apparatus--structure the evaluation approach around it.
- Other needed attributes: critical thinking, credibility, and objectivity.
- Patton--utilization-focused evaluation.
- Weiss--evaluation to enlighten.
16 Effective recommendations
- Timely
- Realistic
- Directed at appropriate person/entity
- Comprehensible
- Specific
- Link recommendations with findings
- In some cases, options are appropriate instead of recommendations.
17 Useful exhibits found in Wholey et al.
- Briefings
- Graphics
- Sharing results
18 Quality Control of Evaluation Process
- Provide for peer review of evaluation design and evaluation reports.
- Program staff should be given the opportunity to respond to reports.
- Peer review of the evaluation office
- Regularly review the work of evaluation contractors.
- Ensure that benefits of evaluation exceed costs
19 Trends in Evaluation
- Program improvements
- Legislative branch interested in outcomes
- Monitoring of program quality and results
- Benchmarking
- Client feedback procedures
- Technology for trained observers
- Less expensive and faster data entry and analysis
- Linking of program financial rewards to
performance
20 Performance Measurement and Social Indicators
21 Measurement Models
- Measurement models serve as an impetus for accountability to the funding source and the public.
- Measurement models include:
- Social Indicators--called benchmarks or milestones in U.S. state government; also frequently used in the non-profit sector.
- Performance Measures
22 Measures Driving Reforms
- "Ideally, these improvements, to include social indicators, will alter policymaking and policy management by facilitating the emergence of outcome-based accountability systems, systems-wide coordination and integration efforts, performance-based competitive service models, and public sector privatization and democratization schemes" (Corbett 1997, xix).
23 Definitions
- Performance measurement is the monitoring on a regular basis of the results and efficiency of services or programs.
- Social indicators are descriptions of conditions that are intended to inform public opinion and policy making (Duncan, 1974).
- Social indicators measure the well-being of society, while agency performance measures measure the input, output, or outcome of a specific program.
24 Principles of Performance Measurement
- Result Oriented
- Selective
- Reliable
- Useful
- Shared
25 Criteria for Performance Measures
- Relevance
- Reliability
- Validity
- Coverage
- Cost Effectiveness
26 Limitations of Performance Measures
- Do not tell why the outcome occurred--this requires program evaluation
- Some outcomes cannot be measured directly.
- The information by itself is not sufficient for
decisions--need political judgments, leadership.
27 Performance measures should
- Focus on outcomes
- Capture data that is accurate, verifiable, and consistent over time
- Yield information that helps make reality-based decisions
- Be reported regularly
- Be logically and directly related to an agency's goals, strategies, and functionality
- Be worth the cost of collecting and analyzing
28 Types of Performance Measures
- Outcome Measures
- -observable changes in desired skills, attitudes, knowledge, behavior, functioning, etc.
- Efficiency Measures
- -measures resource costs in dollars, employee time, or equipment used per unit produced or service output
29 Types of Performance Measures (continued)
- Output Measures
- -count the goods and services produced by an agency
- Input Measures
- -show the resources used to produce services
- Social Indicators
- -use societal information for public decision
making
30 Examples of Performance Measures
- Outcome Measures (intermediate and end outcomes)
- Evidence of increased learning by students.
- Recovering owed child support payments from absent parents
- People completing employment training programs where program participants are volunteering (intermediate outcome).
31 Examples (continued)
- Efficiency Measures
- - Average cost per client served
- - Average cost per inspection
- - Cost per mile maintained-Asphalt
- - Average wait time per customer (measure of
quality service delivery)
32 Examples (continued)
- Output Measures
- - Number of clients served
- - Number of criminal cases prosecuted
- - Number of inspections conducted
- - Number of lane miles maintained per year
- - Number of customers served per day
- Input Measures
- amount of resources used (expressed as funds or number of employee-years, or both); see the sketch below
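To make the relationships among these measure types concrete, here is a minimal Python sketch with entirely hypothetical figures: inputs (funds, staff time) and outputs (clients, inspections) combine to give the efficiency measures listed above.

```python
# Hypothetical program data (illustrative only).
total_cost = 250_000.00     # input measure: funds expended
employee_years = 4.0        # input measure: staff resources used
clients_served = 1_250      # output measure: clients served
inspections_done = 800      # output measure: inspections conducted

# Efficiency measures relate resource costs to units of output or service.
avg_cost_per_client = total_cost / clients_served
avg_cost_per_inspection = total_cost / inspections_done

print(f"Inputs: ${total_cost:,.0f} and {employee_years} employee-years")
print(f"Average cost per client served: ${avg_cost_per_client:,.2f}")
print(f"Average cost per inspection:    ${avg_cost_per_inspection:,.2f}")
```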
33 Intermediate Outcomes
- Intermediate outcomes are expected to lead to the ends desired but are not themselves ends.
- Service quality as an intermediate outcome--how well a service was delivered, NOT what results occurred after the service.
34 Desirable Prerequisites
- High-level support for performance measurement
- Reasonable program stability
- At least some computerized data-processing capability
- Agreement on use of measures
- Adequate time allocated for data collection.
35 Outcome Indicator
- Identifies a specific numerical measurement that indicates progress towards achieving an outcome. It is not usually an outcome.
- Percent of, ratio of, proportion of (see the sketch below)
36 Social Indicators Further Defined
- The quantification of societal phenomena for public decision making.
- Time-series measurements allowing for the identification of long-term trends, periodic changes, and fluctuations in the rate of change in conditions affecting the well-being of individuals and communities (see the sketch below).
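A minimal sketch, using invented values, of a social indicator tracked as a time series: year-to-year differences surface fluctuations in the rate of change, and the average annual change summarizes the long-term trend.

```python
# Hypothetical community indicator (e.g., a teen unemployment rate, in percent).
years = [2000, 2001, 2002, 2003, 2004]
rate = [14.2, 13.8, 13.1, 13.4, 12.6]

# Year-to-year changes reveal fluctuations in the rate of change.
changes = [later - earlier for earlier, later in zip(rate, rate[1:])]
for year, delta in zip(years[1:], changes):
    print(f"{year}: {delta:+.1f} percentage points vs. prior year")

# The average annual change summarizes the long-term trend.
avg_annual_change = (rate[-1] - rate[0]) / (len(years) - 1)
print(f"Average annual change, {years[0]}-{years[-1]}: {avg_annual_change:+.2f} points")
```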
37 Examples of Social Indicators
- Perhaps best known are the indicators of children's well-being (KIDS COUNT) in the 50 states, funded through the Casey Foundation.
- Other social indicator reports in the United States include the Bureau of the Census, Social Indicators; the National Center for Education Statistics, The Condition of Education; and the National Center for Health Statistics, Health, United States.
- In addition, states maintain their own indicators on state conditions, including children, families, environment, crime, and perceptions of community and government.
38 States with Indicators
- Coverage: Cross-Sector or Children & Families Only
- Alaska
- Arizona
- Colorado
- Connecticut
- Delaware
- Georgia
- Hawaii
- Indiana
- Iowa
- Maryland
- Michigan
- Minnesota
- Missouri
- New York
- North Carolina
- Ohio
- Oregon
- Rhode Island
39 Uses for Indicators
- State governments are using social indicators as basic tools of governance for
- state and local planning initiatives,
- pre-policy work, to help define a problem or provide a perspective,
- accountability efforts to share program initiatives to address the problem
- For example, Delaware's pre- and post-natal care programs to address low-birth-weight babies.
40 Other Uses
- Tools for policy development
- Program design and administration
- Resource allocation
- Intergovernmental relations
- Alignment with program performance measures
- Community projects to assist in provision of
services to the public
41 Findings for Use of State Indicators by Non-profits
- Policy decision-making
- Internal management
- Accountability to the public
- A source for community projects
- Professional knowledge/growth
- Awareness
42 Criteria to Develop Indicators
- Indicators should assess well-being across a broad array of outcomes, behavior, and processes
- Be easily and readily understood by the public and the policy makers
- Assess positive and negative aspects of well-being
- Meet standards for validity and reliability
- And be reflective of social goals.
43 Misuse of Social Indicators
- Data that is available sometimes becomes the indicator rather than what is needed.
- Explanation: Data on indicators are expensive and sometimes difficult to obtain.
- Unevenness in coverage of quantifiable societal conditions sometimes occurs due to personal or political areas of interest.
- Explanation: The process for using social indicators--selection, collection, mode of presentation, choice of baseline or comparative data--is subjective.
44 Misuse
- Agencies may be held accountable for social indicators.
- Factors may include societal trends (increase in births to single parents) or the nation's demographics (increase in social service needs among the elderly population).
- Data presented without explanation.
- Possibilities for unwarranted conclusions or
alternate interpretations.
45 Benefits of Social Indicators
- Social reporting de-mystifies politics.
- Provides a common ground on which everyone can learn something about the status of the community's well-being.
- Useful in discussions of the limitations of government.
- Stimulates debate for the best solutions to social problems.
46 Basic Types of Breakouts for Outcome Data
- Workload or customer characteristic
- Organizational unit or project
- Geographical location
- Type and amount of services
- Note: breakouts may be used for both social indicators and performance measures (see the sketch below).
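A minimal sketch, with hypothetical client records, of breaking out a single outcome measure (program completion rate) by two of the characteristics listed above: geographical location and type of service.

```python
from collections import defaultdict

# Hypothetical client-level outcome records.
records = [
    {"district": "North", "service": "job training", "completed": True},
    {"district": "North", "service": "job training", "completed": False},
    {"district": "South", "service": "job training", "completed": True},
    {"district": "South", "service": "counseling",   "completed": True},
    {"district": "South", "service": "counseling",   "completed": False},
]

def breakout(rows, key):
    """Completion rate (the outcome) broken out by the given characteristic."""
    totals, completions = defaultdict(int), defaultdict(int)
    for row in rows:
        totals[row[key]] += 1
        completions[row[key]] += row["completed"]
    return {group: completions[group] / totals[group] for group in totals}

print("By geographical location:", breakout(records, "district"))
print("By type of service:      ", breakout(records, "service"))
```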
47 Common Problems in Communicating Performance Data
- Failing to provide context and interpretation for performance data
- Failing to anticipate different legislative information needs
- Providing too much or too little detail
48 Making the Data Decision: A Balancing Act
- Too many measures reported overwhelm the process
- Performance measures should be layered
- Let them know what data you have without necessarily including it in the report
- Too few measures will not provide the information needed to assess performance
49 Types of Data to Report
- Make sure the measures report the bottom line
- Are things getting better, worse, or remaining
the same?
50 Communicating in a Legislative Environment
- Make performance data
- Stimulating
- Informative
- Self-Contained
- Provide human element when possible in discussion
51 Logic Models
52 Logic Models
- A logic model describes a critical set of
governance, production, and contextual
relationships that work collectively to achieve
desired outputs and end outcomes
53 Logic Model Examples
- Organizational/administrative models--map relationships among administrative units or governmental levels.
- Process models--map relationships across key processes that transform inputs into results. Process models unify responsibilities, cutting across organizational boundaries or governmental levels.
54 Logic Model Example: Training and Development (see the sketch below)
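A minimal sketch, with illustrative content only, of how a training-and-development logic model might be laid out as a chain from inputs through activities and outputs to intermediate and end outcomes.

```python
# Hypothetical training-and-development logic model, represented as an
# ordered chain of stages (inputs -> activities -> outputs -> outcomes).
logic_model = {
    "inputs": ["training budget", "instructor time", "curriculum materials"],
    "activities": ["design courses", "deliver workshops", "coach on the job"],
    "outputs": ["employees trained", "workshops held"],
    "intermediate outcomes": ["improved skills and knowledge"],
    "end outcomes": ["improved on-the-job performance"],
}

for stage, items in logic_model.items():
    print(f"{stage:>22}: {', '.join(items)}")
```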
55 Logic Model Discussion Questions
- What logic model underlies your delivery system?
- Do your agency plan and/or program plans incorporate structures consistent with the underlying logic models?