Title: PPA 503
PPA 503: The Public Policy-Making Process
Evaluating Public Programs
- Program evaluation is a way of bringing to public
decision-makers the available knowledge about a
problem, about the relative effectiveness of past
and current strategies for addressing or reducing
that problem, and about the observed
effectiveness of particular programs.
Administrative Purposes for Evaluation
- Policy formulation: to assess or justify the need for a new program and to design it optimally on the basis of past experience.
- Information on the problem addressed by the program: How big is it? What are its frequency and direction? How is it changing?
- Information on the results of past programs that dealt with the problem: Were those programs feasible? Were they successful? What difficulties did they encounter?
- Information allowing the selection of one program over another: What are the comparative costs and benefits? What kinds of growth records were experienced? (A minimal cost-benefit comparison is sketched below.)
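To make the comparative-costs-and-benefits question concrete, the sketch below compares two candidate programs on net present value and a simple benefit-cost ratio. The program names, dollar figures, three-year horizon, and 5% discount rate are hypothetical assumptions for illustration only, not figures from the course material.

```python
# A minimal sketch of comparing two candidate programs on costs and
# benefits. All figures are hypothetical and purely illustrative.

def net_present_value(annual_benefits, annual_costs, discount_rate):
    """Discount each year's net benefit back to present-day dollars."""
    return sum(
        (b - c) / (1 + discount_rate) ** year
        for year, (b, c) in enumerate(zip(annual_benefits, annual_costs), start=1)
    )

programs = {
    # program name: (projected benefits per year, projected costs per year)
    "Job training": ([120_000, 150_000, 170_000], [100_000, 90_000, 90_000]),
    "Wage subsidy": ([140_000, 140_000, 140_000], [110_000, 110_000, 110_000]),
}

for name, (benefits, costs) in programs.items():
    npv = net_present_value(benefits, costs, discount_rate=0.05)
    bcr = sum(benefits) / sum(costs)  # undiscounted benefit-cost ratio
    print(f"{name}: NPV = ${npv:,.0f}, benefit-cost ratio = {bcr:.2f}")
```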
Administrative Purposes for Evaluation
- Policy execution: to ensure that a program is implemented in the most cost-effective and technically competent way.
- Information on program implementation: How operational is the program? How similar is it across sites? Does it conform to the policies and expectations formulated? How much does it cost? How do stakeholders feel about it? Are there delivery problems, or error, fraud, and abuse?
Administrative Purposes for Evaluation
- Policy execution: to ensure that a program is implemented in the most cost-effective and technically competent way.
- Information on program management: What degree of control exists over expenditures? What are the qualifications and credentials of the personnel? What is the allocation of resources? How is program information used in decision making?
- Ongoing information on the current state of the problem or threat addressed by the program: Is the problem growing? Is it diminishing? Is it diminishing enough that the program is no longer needed? Is it changing in terms of its significant characteristics?
Administrative Purposes for Evaluation
- Accountability in public decision making: to determine the effectiveness of an operating program and the need for its continuation, modification, or termination.
- Information on program outcomes or effects: What happened as a result of program implementation?
- Information on the degree to which the program made or is making a difference: What change in the problem or threat has occurred that can be directly attributed to the program?
- Information on the unexpected (and expected) effects of the program.
Functions and Roles of Evaluation Sponsors
- Executive branch (federal, state, local).
- Program managers (cost-effectiveness).
- Agency heads and top policy makers (need, effectiveness).
- Central budget or policy authorities (effectiveness, need).
Functions and Roles of Evaluation Sponsors
- Legislative branch:
- Congressional and legislative policy and evaluation offices (all aspects).
- Legislative authorization, appropriations, and budget committees (program funding and refunding).
- Oversight committees (all aspects).
- Regardless of sponsor, evaluators should clearly specify the objectives and limitations of each evaluation.
Functions and Roles of Evaluation Sponsors
- As a general rule, public administrators should expect their work on program effectiveness and feasibility to be of more general use than their work on implementation, which will be of most use to program managers and agency heads.
- Information needs will be larger for large programs than for small ones, and for new programs than for old ones.
Evaluation Approaches
- Front-end analysis: evaluative work conducted before a decision to move ahead with a program.
- Evaluability assessment: the reasonableness of assumptions and objectives, comparison of objectives to program activities, and the feasibility of a full-scale evaluation.
Evaluation Approaches
- Process evaluation: describes and analyzes the processes of implemented program activities, including management strategies, operations, costs, and interactions.
- Effectiveness or impact evaluation: How well has a program been working? Are the changes the result of the program? (A minimal impact comparison is sketched below.)
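One common way to approach the "are the changes the result of the program?" question is a difference-in-differences comparison: the change observed in program sites is compared with the change that occurred anyway in comparison sites. The sketch below uses hypothetical outcome figures and is only one of several possible impact designs.

```python
# A minimal difference-in-differences sketch for an impact question:
# did the change observed in program sites exceed the change that
# occurred anyway in comparison sites? All numbers are hypothetical.

from statistics import mean

# Outcome measure (e.g., employment rate, %) before and after the program.
program_before    = [52, 48, 55, 50]   # program sites, baseline
program_after     = [61, 58, 63, 60]   # program sites, follow-up
comparison_before = [51, 49, 53, 50]   # comparison sites, baseline
comparison_after  = [54, 52, 55, 53]   # comparison sites, follow-up

change_program    = mean(program_after) - mean(program_before)
change_comparison = mean(comparison_after) - mean(comparison_before)

# The difference-in-differences estimate: change attributable to the
# program, net of the change that would have happened without it.
impact_estimate = change_program - change_comparison
print(f"Change in program sites:    {change_program:+.1f} points")
print(f"Change in comparison sites: {change_comparison:+.1f} points")
print(f"Estimated program impact:   {impact_estimate:+.1f} points")
```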
Evaluation Approaches
- Program and problem monitoring: continuous rather than snapshot; informs on problem characteristics or tracks program or problem progress in several areas.
- Metaevaluation or evaluation synthesis: reanalyzes findings from several analyses to determine what has been learned. (A minimal synthesis calculation is sketched below.)
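One simple way to synthesize findings from several prior evaluations is to pool their effect estimates with inverse-variance weights, so that more precise studies count for more. The study labels, effect sizes, and standard errors below are hypothetical, and this fixed-effect pooling is only a minimal illustration of evaluation synthesis.

```python
# A minimal evaluation-synthesis sketch: pool effect estimates from
# several prior evaluations with inverse-variance weights, so more
# precise studies count for more. All figures are hypothetical.

studies = [
    # (study label, estimated effect, standard error of the estimate)
    ("State pilot, 2018",  0.30, 0.10),
    ("City program, 2020", 0.12, 0.08),
    ("County trial, 2021", 0.22, 0.15),
]

weights = [1 / se**2 for _, _, se in studies]
pooled_effect = sum(w * eff for (_, eff, _), w in zip(studies, weights)) / sum(weights)
pooled_se = (1 / sum(weights)) ** 0.5

print(f"Pooled effect estimate: {pooled_effect:.2f} (SE {pooled_se:.2f})")
```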
Introduction to Evaluation Procedures
- Program evaluation is the use of social research methods to systematically investigate the effectiveness of social intervention programs.
- Draws on the techniques and concepts of the social science disciplines.
- Intended to be used for improving programs and informing social action aimed at ameliorating social problems.
Introduction to Evaluation Procedures
- Modern evaluation research grew from pioneering efforts in the 1930s and burgeoned in the post-war years as new methodologies were developed.
- The social policy and public administration movements have contributed to the professionalization of the field and to the sophistication of the consumers of evaluation research.
Introduction to Evaluation Procedures
- The need for program evaluation is undiminished in the 2000s and may even be expected to grow.
- Contemporary concern over the allocation of scarce resources makes it more essential than ever to evaluate the effectiveness of social interventions.
Introduction to Evaluation Procedures
- Evaluation must be tailored to the political and
organizational context of the program to be
evaluated.
Introduction to Evaluation Procedures
- The assessment of one or more program domains:
- The need for the program
- The design of the program
- The program implementation and service delivery
- The program impact or outcomes
- Program efficiency
- Accurate description of program performance and assessment against relevant standards or criteria.
Introduction to Evaluation Procedures
- Program evaluation presents many challenges to the evaluator:
- Changes in circumstances and activities during an evaluation.
- Appropriate balance between science and pragmatism.
- Diversity of perspectives and approaches.
Introduction to Evaluation Procedures
- Most evaluators are trained as social scientists or social researchers.
- Complex evaluations may require specialized staffs.
- A basic knowledge of evaluation procedures is valuable both to researchers and to consumers of evaluation research.
Tailoring evaluations
- Every evaluation must be tailored to the
circumstances of the program to yield credible
and useful answers to specific questions while
still allowing practical implementation.
Tailoring evaluations
- Influences on evaluation plans include the purpose of the evaluation:
- Provide feedback for program improvement to program managers and sponsors.
- Establish accountability to decision-makers with responsibility to ensure that the program is effective.
- Contribute to knowledge about some form of social intervention.
Tailoring evaluations
- Influences also include the nature of the program's structure and circumstances.
- The evaluation must be responsive to:
- How new or open to change the program is.
- The degree of consensus or conflict among stakeholders about the nature and mission of the program.
- The values and concepts inherent in the program rationale and design.
- The way in which the program is organized and administered.
Tailoring evaluations
- Evaluation planning must also accommodate limitations on resources.
- Resources include:
- Funding
- Time for completion
- Pertinent technical expertise
- Program and stakeholder cooperation
- Access to important records and program material.
- Balance between what is desirable and what is feasible.
Tailoring evaluations
- The evaluation design can be structured around three issues:
- The questions the evaluation is to answer.
- The methods and procedures to be used to answer these questions.
- The nature of the evaluator-stakeholder interactions during the course of the evaluation.
Tailoring evaluations
- Deciding on the appropriate relationship between the evaluator and the evaluation sponsor, as well as other major stakeholders, is an often neglected but critical aspect of an evaluation plan.
- An independent relationship is often expected.
- But a participatory or collaborative relationship may enhance stakeholders' skills or political influence.
Tailoring evaluations
- Evaluation questions and methods fall into five categories:
- Need for services
- Program conceptualization and design
- Program implementation
- Program outcomes
- Program efficiency
Tailoring evaluations
- Evaluation terms corresponding to these categories include needs assessment, process evaluation, and impact assessment.
- Much of evaluation planning consists of identifying the evaluation approach corresponding to the type of questions to be answered and tailoring its specifics to the program situation.
Identifying issues and formulating questions
- A critical phase in evaluation planning is the identification and formulation of the questions that the evaluation will address.
- These questions focus the evaluation on the areas of program performance most at issue for key stakeholders and guide the design so that it will provide meaningful information about program performance.
Identifying issues and formulating questions
- Good evaluation questions must identify clear, observable dimensions of program performance that are relevant to the program's goals and represent domains in which the program can realistically be expected to have accomplishments.
Identifying issues and formulating questions
- What most distinguishes evaluation questions, however, is that they involve criteria by which the identified dimensions of program performance can be judged.
- If the formulation of the evaluation questions can include performance standards on which key stakeholders agree, evaluation planning will be easier and the potential for disagreement with the results reduced.
Identifying issues and formulating questions
- To ensure that matters of greatest significance are covered in the evaluation design, the evaluation questions are best formulated through interaction and negotiation with the evaluation sponsors and other stakeholders who represent significant groups or are distinctly positioned in relation to program decision-making.
Identifying issues and formulating questions
- Although stakeholder input is critical, the evaluator must be prepared to identify program issues that warrant inquiry.
- The evaluator should conduct a somewhat independent analysis of the assumptions and expectations on which the program is based.
Identifying issues and formulating questions
- Make the program theory explicit.
- Program theory:
- The program's organizational plan
- The service utilization plan
- The impact theory
Identifying issues and formulating questions
- Program theory describes the assumptions inherent in a program.
- It encompasses impact theory, which links program actions to intended outcomes, and process theory, which describes a program's organizational plan and its scheme for ensuring utilization of its services by the target population.
Identifying issues and formulating questions
- When these procedures have generated a full set of evaluation questions, the evaluator must organize them into related clusters.
- Draw on stakeholder input and professional judgment to set priorities.
- With the priority evaluation questions determined, the evaluator is ready to design the part of the evaluation devoted to answering them.
Meeting the Need for Evaluation
- Three basic questions:
- Can the results of the evaluation influence decisions about the program?
- Can the evaluation be done in time to be useful?
- Is the program significant enough to merit evaluation?
Choices Facing Evaluators
- Evaluation design
- What are the evaluation questions?
- What comparisons are needed?
- What measurements are needed?
- How will the resulting information be used?
- What breakouts (disaggregations of data) are needed, such as by facility or type of client? (A minimal breakout sketch follows this list.)
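A breakout is simply the same measure tabulated separately for each facility or client group. The sketch below illustrates the idea with hypothetical service records and field names; a real evaluation would draw such records from program data systems.

```python
# A minimal sketch of a "breakout": disaggregating an outcome measure
# by facility so differences across sites are visible. Records and
# field names are hypothetical.

from collections import defaultdict
from statistics import mean

records = [
    {"facility": "North clinic", "client_type": "new",       "wait_days": 12},
    {"facility": "North clinic", "client_type": "returning", "wait_days": 9},
    {"facility": "South clinic", "client_type": "new",       "wait_days": 21},
    {"facility": "South clinic", "client_type": "returning", "wait_days": 18},
]

by_facility = defaultdict(list)
for r in records:
    by_facility[r["facility"]].append(r["wait_days"])

for facility, waits in sorted(by_facility.items()):
    print(f"{facility}: average wait {mean(waits):.1f} days (n={len(waits)})")
```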
Choices Facing Evaluators
- Data Collection
- What are the primary data sources?
- How should data be collected?
- Is sampling required? Where and how?
- How large a sample is needed? (A minimal sample-size calculation is sketched after this list.)
- How will data quality be ensured?
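For the sample-size question, one standard starting point is Cochran's formula for estimating a proportion, adjusted for a finite population. The sketch below assumes a 95% confidence level, a 5-point margin of error, a worst-case proportion of 0.5, and a hypothetical population of 5,000 clients; different assumptions would yield a different sample size.

```python
# A minimal sample-size sketch for a simple random sample estimating a
# proportion (e.g., the share of clients satisfied with a service).
# The population figure and survey purpose are hypothetical.

import math

def sample_size_for_proportion(population, margin_of_error=0.05,
                               z=1.96, p=0.5):
    """Cochran's formula with a finite-population correction."""
    n0 = (z**2 * p * (1 - p)) / margin_of_error**2   # infinite-population size
    n = n0 / (1 + (n0 - 1) / population)             # finite-population correction
    return math.ceil(n)

print(sample_size_for_proportion(population=5_000))  # roughly 357 clients
```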
Choices Facing Evaluators
- Data Analysis
- What analytical techniques are available (given the data)?
- Which analytical tools will be most appropriate?
- In what format will the data be most useful?
- Getting Evaluation Information Used
- How should evaluation findings be packaged for different audiences?
- Should specific recommendations accompany evaluation reports to encourage action?
- What mechanisms can be used to check on implementation of recommendations?