Title: Understanding Evaluation from the Start
1. Understanding Evaluation from the Start
- Mario Rivera, Program Evaluator
- CDPHE
- Colorado MCH Conference
- May 7, 2009
2. Introductory questions
- What roles have you had in program evaluation?
- What are you hoping to learn from this session?
3. What you may learn (and not learn) from this presentation
- You won't learn how to do evaluation, but rather
- How to use the process of evaluation so you can get the most out of it by
- Learning things to consider when conducting, overseeing, or planning for evaluation.
4. Two major themes in the presentation
- 1st theme: How do you know if your program is ready for evaluation?
- 2nd theme: Suggestions or guidelines on how to get the most out of program evaluation
5. A goal in mind
- Hopefully, these ideas will help you to be a more astute user of evaluation.
- Additionally, there are by-products that can be derived from the evaluation process.
6. Brainstorming question
- Why do we evaluate programs?
7. A definition of evaluation
- Program evaluation is the systematic collection of information about the activities, characteristics, and results of programs to:
  - Make judgments about the program
  - Improve or further develop program effectiveness
  - Inform decisions about future programming
  - Increase understanding
  - Obtain future funding
8. 1st Theme
- How do you know if your program is ready for
evaluation?
9. Brainstorming question
- What should you look at to see if your program
can be evaluated?
- Determine the evaluation readiness of a program with an Evaluability Assessment.
- This involves 4 tasks.
10. Conducting an Evaluability Assessment - Task 1. Study the program history, design, and operation
- History
- What is the problem that the program addresses?
- How did the program start?
- How long has the program been operating?
- Is it part of a larger agency?
11. Conducting an Evaluability Assessment - Task 1. Study the program history, design, and operation
- Design
- What are the goals and objectives?
- Is there a logic model?
- What resources are in place?
- What is the program's maximum capacity?
- What is the duration of the program?
- Are there known problems with implementation?
12. Conducting an Evaluability Assessment - Task 1. Study the program history, design, and operation
- Operation
- How are the clients' needs assessed?
- What services are provided?
- How is staff trained?
- Is there an implementation plan?
13. Conducting an Evaluability Assessment - Task 2. Watch the program in action
- Does the program's work plan (e.g., MCH Plan) differ from the program in practice?
14. Conducting an Evaluability Assessment - Task 3. Determine the program's capacity for data collection, management, and analysis
- What data are collected?
- What is the quality of the data?
- Are additional data needed? (a minimal data-quality check is sketched below)
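To make Task 3 concrete, below is a minimal sketch of the kind of completeness check an evaluator might run on a program's service records. The record fields, sample values, and the 90% completeness threshold are hypothetical assumptions for illustration, not part of any MCH data system described in this presentation.

```python
# Hypothetical sketch: checking the completeness of program service records.
# Field names, sample data, and the 90% threshold are illustrative assumptions.

records = [
    {"client_id": "A01", "service_date": "2009-03-02", "service_type": "prenatal visit"},
    {"client_id": "A02", "service_date": "2009-03-05", "service_type": None},
    {"client_id": "A03", "service_date": None, "service_type": "home visit"},
]

required_fields = ["client_id", "service_date", "service_type"]

def completeness(rows, field):
    """Return the share of rows with a non-missing value for the given field."""
    filled = sum(1 for row in rows if row.get(field) not in (None, ""))
    return filled / len(rows)

for field in required_fields:
    rate = completeness(records, field)
    status = "OK" if rate >= 0.90 else "needs attention"
    print(f"{field}: {rate:.0%} complete ({status})")
```

A check like this helps answer the slide's questions about data quality and whether additional data collection is needed before an evaluation can proceed.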
15. Conducting an Evaluability Assessment - Task 4. Assess the likelihood that the program will reach its goals and objectives
- Are the objectives realistic and achievable?
16. Two common reasons why a program is not ready for an evaluation
- Program Design Issues
  - No formal program design or model is in place
  - The program model or design is unsound: problems with objectives and activities
- Program Implementation Issues
  - Is the intended population served?
  - Does the program have resources (e.g., staff, equipment)?
  - Are the activities being implemented as specified in a work plan? (a simple readiness checklist is sketched below)
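As a rough illustration of how these two sets of issues might be summarized, here is a hypothetical readiness checklist. The specific items and the simple pass/fail rule are assumptions for illustration, not a formal evaluability-assessment scoring method.

```python
# Hypothetical evaluability checklist, grouped by the two issue areas above.
# Items and their True/False values are illustrative assumptions.

design_checks = {
    "formal program design or model is in place": True,
    "objectives and activities are sound": True,
}
implementation_checks = {
    "intended population is being served": True,
    "resources (staff, equipment) are in place": False,
    "activities follow the work plan": True,
}

def report(label, checks):
    """Print whether an issue area looks ready and list any failing items."""
    failed = [item for item, ok in checks.items() if not ok]
    print(f"{label}: {'no issues found' if not failed else 'issues found'}")
    for item in failed:
        print(f"  - issue: {item}")

report("Program design", design_checks)
report("Program implementation", implementation_checks)
```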
17. 2nd Theme
- How to get the most out of program evaluation
18. How to get the most out of program evaluation
- Use the Personal Factor
- Use a collaborative approach such as Utilization-Focused Evaluation
- Learn how to counteract threats to utility
- When to do program evaluation
- Who can do evaluation
- Peek inside the head of a program evaluator
- Questions to ask a potential evaluator
19. Brainstorming question
- Is your MCH evaluation planning a collaborative
process?
20. 1. Use the Personal Factor (use the help of others)
- Clearly identify people who can benefit from an evaluation
- Narrow the list of people to a more specific group of primary intended users
- Remember: people, not organizations, use evaluation information
- The Personal Factor represents the leadership, interest, enthusiasm, determination, commitment, assertiveness, and caring of specific people
- These are users of the evaluation who seek information to make judgments, reduce decision uncertainties, and want to increase their ability to predict the outcomes of a program's activities
21. 2. The Steps in a Utilization-Focused Evaluation Process (a useful process for evaluation planning)
- Bring together the group of intended users of the evaluation (using the Personal Factor)
- The group determines the focus of the evaluation (weighing the importance of goals, program implementation, and/or theory of action)
- The group determines the methods, measurement, and design
- The group interprets findings, makes judgments, and generates recommendations
- The group decides about dissemination of evaluation reports
22. 3. Learn how to counteract threats to utility
- Relevance: does the evaluation relate to my program's needs?
- Understandability: do the findings make sense?
- Conclusiveness: are there conclusions that have program implications?
- Actionability: is there something concrete that can be done with the findings?
- Political viability: does the evaluation consider political or other real-world realities?
- Fairness: can the findings be trusted?
- Utility: overall, is the program evaluation useful enough to spend time on?
23. 4. When can evaluations be conducted?
- Anytime, but beginning at the planning stage is ideal
- The Plan-Do-Study-Act Cycle
24. 5. Who can do evaluation?
- Trained evaluators and program staff
- Examples:
  - evaluation of this conference
  - educational training programs
  - access to prenatal care
25. 6. Peek inside the head of a program evaluator
- Evaluators want to:
  - Be clear about everything
  - Be specific
  - Focus and prioritize
  - Be systematic (e.g., plan your work and work your plan)
  - Make assumptions explicit (determine what can or cannot be evaluated)
  - Operationalize program concepts, ideas, and objectives (a sketch of checking such an objective follows this slide)
    - example: By September 30, 2009, decrease the motor vehicle hospitalization rate by 1
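As a small illustration of what it means to operationalize an objective so it can be checked against data, here is a hypothetical sketch. The indicator, baseline and follow-up rates, and the reading of the target as a decrease of 1 hospitalization per 100,000 are assumptions for illustration only.

```python
# Hypothetical sketch: checking an operationalized objective against data.
# Indicator, rates, deadline, and target are illustrative assumptions.

objective = {
    "indicator": "motor vehicle hospitalization rate per 100,000",
    "deadline": "2009-09-30",
    "baseline_rate": 58.0,    # assumed baseline value
    "target_decrease": 1.0,   # assumed target: decrease of 1 per 100,000
}

follow_up_rate = 56.5         # assumed rate measured at the deadline

decrease = objective["baseline_rate"] - follow_up_rate
met = decrease >= objective["target_decrease"]
print(f"Observed decrease: {decrease:.1f} per 100,000")
print("Objective met by the deadline" if met else "Objective not met by the deadline")
```

Writing the objective this way, with an indicator, baseline, target, and deadline, is what makes it measurable rather than a general intention.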
26. 6. Peek inside the head of a program evaluator (continued)
- Evaluators want to:
  - Distinguish processes from outcomes (see the sketch after this slide)
    - example: children enrolled in CHP
    - process measure (creating a network to assess enrollment is an output)
    - outcome measure (the enrollment rate will be increased by 25)
  - Draw conclusions
  - Separate data-based statements of fact from interpretations and judgments
  - Limit generalizations and causal explanations to what the data support
  - Understand cultural variations and factors
27. Brainstorming question
- What are some questions to ask a potential
evaluator?
28. 7. What questions are good to ask a potential evaluator?
- Besides asking about:
  - Formal education
  - Experience conducting evaluation of programs similar to yours
  - His or her resume
  - References from directors of programs that the evaluator has worked with in the past
  - Samples of the evaluator's work
29. 7. What questions are good to ask a potential evaluator? (continued)
- There are two more key areas to ask about:
  - Evaluation philosophy: different evaluators view the evaluation process differently. Look for an evaluator who believes that evaluation is a collaborative process. Some formal names for this type of evaluation philosophy include participatory evaluation, utilization-focused evaluation, and empowerment evaluation.
  - Communication skills: beyond being able to clearly present findings in written and oral form, a collaborative evaluator will work with the group by initiating and facilitating discussions throughout the evaluation process to foster program development.
30. Brainstorming question
- What are some benefits from evaluation besides
the findings?
31. By-products
- Additionally, there are by-products that can be derived from the evaluation process beyond just using the evaluation findings
- Enhances shared understanding
  - Examples: agreement on the program's model and expected outcomes, gives voice to diverse perspectives
- Supports and reinforces the program intervention
  - Example: desired program outcomes are achieved in part through the effects of data collection
- Increases engagement, self-determination, and ownership
  - Examples: learning evaluation by doing evaluation, building evaluation capacity, reflective practice
- Supports program and organizational development
- Demonstrates success to continue funding

Hopefully, these ideas will help you to be a more astute user of evaluation.
32. References
- Patton, Michael Quinn. Utilization-Focused Evaluation. 4th ed. Thousand Oaks, CA: Sage Publications, 2008.
- Evaluability Assessment: Examining the Readiness of a Program for Evaluation. Office of Juvenile Justice and Delinquency Prevention.
- Hiring and Working with an Evaluator. Office of Juvenile Justice and Delinquency Prevention.