Title: CLASS I - INTRODUCTION TO PROGRAM EVALUATION
Upon completion of this class, students should be able to:
- Explain the course expectations and organization.
- Define extension evaluation.
- Define basic terms of evaluation.
- Compare formative and summative evaluations.
- Describe the main difference between research and evaluation.
- Describe the main steps of the evaluation process.
Course Organization
- This will be delivered as an online course.
- The Internet is the major communication channel.
- Visit the course website and familiarize yourself with the course syllabus, class schedule, assignments, tests, and exams.
- Use WebCT Vista for course-related work.
Course Organization
- Learning materials related to each class will be posted at least one week prior to the class date.
- Students are expected to read the materials and complete learning assignments.
- The class discussion board will be used to exchange students' views.
Exams
- There are two (take-home, open-book) exams.
- The mid-term exam counts for 15% of the course grade.
- The final exam counts for 20% of the course grade.
Reading Assignments
- Most of the classes will have a reading assignment. Reading assignments account for 20% of the course grade.
Assignments
- Evaluation model: counts for 10% of the course grade.
- Evaluation article critique: counts for 5% of the course grade.
- Term project: this is a group project and accounts for 20% of the course grade.
Class Participation
- Students are expected to participate in class discussions.
- WebCT Vista will be used to facilitate class discussions.
- Class discussions account for 10% of the course grade.
9Course Grading
- This course will be graded using the and -
system. The breakdown of the grading system is as
follows - A 97-100 A 94-96 A-
90-93 B 87-89 B 84-86
B- 80-83 C 77-79 C
74-76 C- 70-73 D 67-69
D 64-66 D- 60-63 F 59
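As a quick check, the component weights listed on the preceding slides (mid-term 15%, final 20%, reading assignments 20%, evaluation model 10%, article critique 5%, term project 20%, class participation 10%) add up to 100%. Below is a minimal sketch, assuming those weights and using purely hypothetical component scores, of how a weighted course grade could be computed:

```python
# Component weights taken from the course breakdown above; scores are hypothetical.
weights = {
    "mid_term": 0.15,
    "final_exam": 0.20,
    "reading_assignments": 0.20,
    "evaluation_model": 0.10,
    "article_critique": 0.05,
    "term_project": 0.20,
    "class_participation": 0.10,
}
scores = {
    "mid_term": 88,
    "final_exam": 92,
    "reading_assignments": 95,
    "evaluation_model": 90,
    "article_critique": 85,
    "term_project": 93,
    "class_participation": 100,
}

# The weights should cover the whole grade (100%).
assert abs(sum(weights.values()) - 1.0) < 1e-9

overall = sum(weights[c] * scores[c] for c in weights)
print(f"Weighted course grade: {overall:.1f}")  # roughly 92, an A- on the scale above
```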
WHAT DO WE MEAN BY EVALUATION?
- We do evaluation in our everyday life. Most of the time it is informal.
- Informal evaluation is a basic form of human behavior. Sometimes it is thorough, structured, and formal. More often, it is impressionistic and private (Worthen, Sanders, and Fitzpatrick, 1997, p. 7).
- Example: evaluating a burger or a cookie for the qualities you like.
WHAT DO WE MEAN BY EVALUATION?
- Formal evaluation is based on systematic efforts to define explicit criteria and obtain accurate information about alternatives (Worthen, Sanders, and Fitzpatrick, 1997, p. 7).
- It is a comparison between the actual situation and the expected situation in order to make decisions about the program.
WHAT DO WE MEAN BY EVALUATION?
- Evaluation is the systematic application of social research procedures for assessing the conceptualization, design, implementation, and utility of social intervention programs (Rossi and Freeman, 1993).
- It is the process whose duty is the systematic and objective determination of merit, worth, or value. Without such a process, there is no way to distinguish the worthwhile from the worthless (Scriven, 1991).
WHAT DO WE MEAN BY EVALUATION?
- Program evaluation is the systematic collection of information about the activities, characteristics, and outcomes of programs to make judgments about the program, improve program effectiveness, and/or inform decisions about future programming (Patton, 2002).
PROGRAM EVALUATION
[Diagram: program evaluation as a systematic comparison of the actual outcome ("what is") with the expected outcome, the program goal ("what should be"). The actual outcome may be exceeding the goal, just achieving the goal, or not reaching the goal.]
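The comparison in the diagram can be stated as a simple rule. Below is a minimal illustrative sketch; the function name and the numbers in the example are hypothetical:

```python
def compare_to_goal(actual_outcome: float, program_goal: float) -> str:
    """Classify the actual outcome against the expected outcome (the program goal)."""
    if actual_outcome > program_goal:
        return "exceeding the goal"
    if actual_outcome == program_goal:
        return "just achieving the goal"
    return "not reaching the goal"

# Hypothetical example: the goal was 100 workshop completions, 120 actually completed.
print(compare_to_goal(actual_outcome=120, program_goal=100))  # exceeding the goal
```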
EVALUATION PROCESS
ESSENTIAL ELEMENTS OF EVALUATION DEFINITIONS
- It is a systematic process.
- It is associated with a program.
- It is a comparison.
- It is about objectives and outcomes.
- It facilitates value judgments about the program.
WRITE YOUR OWN DEFINITION OF PROGRAM EVALUATION
PROGRAM EVALUATION
It is the process by which educational outcomes are systematically compared with the goals and objectives in order to make value judgments about the educational program.
Knowledge and Skills Essential for Program Evaluation
- Learning to define key terms of program evaluation.
- Learning to apply evaluation standards.
- Learning to work with key stakeholders.
- Reviewing evaluation approaches.
- Reviewing evaluation models.
- Learning to use evaluation models to focus the evaluation.
- Learning to conduct an evaluability assessment.
- Designing data collection methods and tools.
- Collecting data.
- Analyzing data.
- Writing evaluation reports.
- Utilizing the evaluation.
- Conducting meta-evaluation.
FORMATIVE AND SUMMATIVE EVALUATION
- Evaluation has two functional roles, namely formative and summative (Scriven, 1967).
- Formative evaluation is conducted to provide program staff with evaluative information useful in improving the program.
- Summative evaluation is conducted and made public to provide program decision makers and potential consumers with judgments about the program's worth or merit in relation to important criteria.
FORMATIVE AND SUMMATIVE EVALUATION
- Both formative and summative evaluations are essential because decisions are needed during the developmental stages of a program to improve and strengthen it, and again, when it has stabilized, to judge its final worth or determine its future (Worthen, Sanders, and Fitzpatrick, 1997, pp. 14-15).
FORMATIVE VS. SUMMATIVE EVALUATION
FORMATIVE AND SUMMATIVE EVALUATION
- When the cook tastes the soup, that's formative; when the guests taste the soup, that's summative.
BASIC TERMS OF EVALUATION
- Benchmarking: recording the initial situation or condition before an intervention.
- Cost-benefit analysis: the comparison of program expenditures and benefits in dollar terms.
- Evaluability assessment: the procedure used to lay the foundation for an evaluation, based on stakeholders' interests.
- Evaluand: the thing being evaluated (Scriven, p. 73).
- Evaluation instrument: a survey questionnaire designed to collect evaluation data.
- Educational program: a sequence of planned educational activities designed to achieve a set learning goal.
- Immediate outcome: benefits or results participants can derive by the end of a program.
BASIC TERMS OF EVALUATION
- Logic model: a systematic presentation of the resources, activities, and results of a program, used to visualize the dynamic relationships among them (see the sketch after this list).
- Long-term outcomes: benefits or results participants derive a long time (after six months) after completing a program. Long-term outcomes are similar to impact.
- Meta-evaluation: a critical review of an implemented evaluation for further improvement.
- Outcomes: results, changes, or benefits derived from a program.
- Program impact: improvement in a condition or situation as a result of a program.
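To make the logic model concrete, here is a minimal sketch of one represented as a simple data structure. The class name, fields, and the example program and its values are hypothetical, purely for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """Resources (inputs), activities, and results of a program, as defined above."""
    inputs: list[str] = field(default_factory=list)        # resources allocated for the program
    activities: list[str] = field(default_factory=list)    # planned educational activities
    outputs: list[str] = field(default_factory=list)       # materials and activities delivered
    immediate_outcomes: list[str] = field(default_factory=list)   # benefits by the end of the program
    long_term_outcomes: list[str] = field(default_factory=list)   # benefits after six months; similar to impact

# Hypothetical extension nutrition program, for illustration only.
model = LogicModel(
    inputs=["two educators", "curriculum", "program budget"],
    activities=["six weekly workshops"],
    outputs=["workshop handouts delivered", "120 participants reached"],
    immediate_outcomes=["participants can plan a balanced meal"],
    long_term_outcomes=["improved dietary habits six months later"],
)
print(model)
```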
BASIC TERMS OF EVALUATION
- Program input: resources allocated for a program.
- Program output: educational materials and activities developed and delivered.
- Qualitative methods: approaches used to explore the evaluation situation in detail, in its natural setting, to answer evaluation questions.
- Quantitative methods: approaches used to generate numerical data to answer evaluation questions.
- Stakeholder: an individual who may be involved in, interested in, or affected by an extension program.
What is the Difference between Research and Evaluation?
- The main difference between research and evaluation is that research is usually conducted with the intent to generalize the findings from a sample to a larger population. Evaluation, on the other hand, usually focuses on an internal situation, such as collecting data about specific programs, with no intent to generalize the results to other settings and situations. In other words, research generalizes; evaluation particularizes (Priest, 2001).
What is the Difference between Research and Evaluation?
- The main purpose of evaluation is improvement and accountability, whereas the main purpose of research is testing or investigating a concept or theory. That is why some people say "evaluation is to improve; research is to prove."
MAIN STEPS OF THE EVALUATION PROCESS
- Identify the key stakeholders of the program.
- Clarify the evaluation expectations of the key stakeholders.
- Plan the evaluation by carrying out an evaluability assessment prior to undertaking the full-scale evaluation.
- Design data collection methods and tools.
- Collect data.
- Analyze and interpret data.
- Write evaluation reports.
- Utilize the evaluation.
- Conduct meta-evaluation.
SUMMARY
- Reviewed class objectives
- Reviewed the course organization and expectations
- Explored the meaning of evaluation
- Defined basic evaluation terms
- Developed a definition for extension evaluation
- Compared formative and summative evaluations
- Reviewed main steps of the evaluation process
REFERENCES
- Patton, M. Q. (2002). Qualitative research and evaluation methods (3rd ed.), p. 10. Thousand Oaks, CA: Sage Publications.
- Priest, S. (2001). A program evaluation primer. Journal of Experiential Education, 24(1), 34-40.
- Rossi, P. H., & Freeman, H. E. (1993). Evaluation: A systematic approach (5th ed.). Newbury Park, CA: Sage Publications.
- Shadish, W. R., Cook, T. D., & Leviton, L. C. (1991). Foundations of program evaluation, p. 73. Newbury Park, CA: Sage Publications.
- Scriven, M. (1991). Evaluation thesaurus (4th ed.), p. 4. Newbury Park, CA: Sage Publications.
- Scriven, M. (1967). The methodology of evaluation. In R. E. Stake (Ed.), Curriculum evaluation. American Educational Research Association Monograph Series on Evaluation, No. 1, pp. 39-83. Chicago: Rand McNally.
- Wholey, J. S., Hatry, H. P., & Newcomer, K. E. (2004). Handbook of practical program evaluation (2nd ed.), pp. 33-60. San Francisco: Jossey-Bass.
- W. K. Kellogg Foundation. (1998). Evaluation handbook. http://www.publichealth.arizona.edu/chwtoolkit/PDFs/Logicmod/chapter1.pdf
- Worthen, B. R., Sanders, J. R., & Fitzpatrick, J. L. (1997). Program evaluation: Alternative approaches and practical guidelines (2nd ed.), p. 7. New York: Longman Publishers.