Title: DESIGNING EVALUATION INSTRUMENTS
1. DESIGNING EVALUATION INSTRUMENTS
2. Upon completion of this lesson, students should be able to:
- List the step-by-step procedures for developing quality evaluation instruments
- Describe the errors that must be controlled in evaluation instruments
- Develop different forms of questions to record outcomes such as changes in knowledge, attitudes, skills, aspirations, and behaviors
- Write process evaluation questions
- Describe reliability and validity
- Identify double-barreled questions, and
- Develop an evaluation instrument.
3. How to Design Your Data Collection Instrument?
4. Begin with the Information Needs of Key Stakeholders
- Information needs for program improvement
- Information needs for accountability
5. Designing Instruments - Step 1: Identify the Type of Data and Information You Need to Collect
- Focus on the information needs of key stakeholders.
- Clearly identify what data and information need to be collected for this purpose.
- Identify the major categories of information that you need to collect.
- List subcategories of information under the major categories.
6. Designing Instruments - Step 2: Develop the Sketch of Your Instrument
- List the major items in your instrument to structure it.
- Organize the structure of your instrument to collect the needed data.
- Organize subcategories under each major topic.
- Include the demographic data collection section at the very end of the instrument.
7. Designing Instruments - Step 3: Identify Necessary Scales and Questions
- Determine the scales you need to include in your instrument.
- Determine the types of questions you need to ask.
8. Designing Instruments - Step 4: Be Consistent in Numbering Answer Choices and Scales
- It is a good idea to use low numbers for lower manifestations of the variable being measured.
- For example, levels-of-education answer choices should be coded as follows:
  1. High school diploma
  2. Bachelor's degree
  3. Master's degree
  4. Doctorate
- By using a consistent pattern throughout the instrument, you can easily interpret results (see the sketch below).
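A minimal sketch (Python; the codes below are a hypothetical illustration, not part of the original slides) of how a consistent low-to-high coding scheme keeps results easy to interpret:

```python
# Hypothetical coding scheme: lower numbers for lower levels of education.
EDUCATION_CODES = {
    "High school diploma": 1,
    "Bachelor's degree": 2,
    "Master's degree": 3,
    "Doctorate": 4,
}

responses = ["Master's degree", "High school diploma", "Doctorate"]
coded = [EDUCATION_CODES[r] for r in responses]
print(coded)                              # [3, 1, 4]
# Because higher codes always mean "more" of the variable, averages
# and comparisons read the same way everywhere in the instrument.
print(round(sum(coded) / len(coded), 2))  # 2.67, between Bachelor's and Master's
```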
9. Designing Instruments - Step 5: Writing Questions
- As a general rule, when writing a question, ask yourself: why am I asking this question?
- Always keep your evaluation information needs in mind.
- Think about the answer before you write any question.
- There are two ways to write a question:
- Open-ended
  - Example: What methods do you use to educate farmers on sustainable agriculture?
- Closed-ended
  - Example: What methods do you use to educate farmers on sustainable agriculture?
    - Field days
    - Workshops
    - Seminars
    - Printed materials
    - Electronic materials
    - Other (please specify) ___________________
10. Designing Instruments - Writing Open-Ended Questions
- Things to remember when writing questions:
- Write questions clearly and concisely.
- Start with the least sensitive or non-threatening questions.
- Write questions with the reading level of the target population in mind.
- Avoid double negatives.
- Avoid double-barreled questions.
  - Example: Are you satisfied with the place and time of the program? (This asks about two things in one question.)
11. Designing Instruments - Writing Open-Ended Questions
- Open-ended questions are useful for exploring a topic in depth.
- However, open-ended questions are difficult to
  - Analyze
  - Respond to
- Therefore, limit the number of open-ended questions to the needed minimum.
- When you need to ask a sensitive question, it is appropriate to use a closed-ended question with response categories for the sensitive information.
  - Example: Asking income or age (ask "What is your age group?" and provide age categories instead of asking "How old are you?").
12. Designing Instruments - Writing Closed-Ended Questions
- When writing closed-ended questions:
- Make sure to include all possible response categories.
- If you have not included all possible answer categories, it is a good idea to include a category called "Other" and instruct respondents to specify what they mean under this category.
- Make sure that your answer categories are mutually exclusive (see the sketch after this example).
- Example: What is your age group?
- Less than 20 years
- 20-30 years
- 31-40 years
- 41-50 years
- Above 50 years
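A minimal sketch (Python, using the age categories from the example above) showing that mutually exclusive, exhaustive categories place every respondent in exactly one bin:

```python
def age_group(age: int) -> str:
    """Assign an age to exactly one of the mutually exclusive categories."""
    if age < 20:
        return "Less than 20 years"
    elif age <= 30:
        return "20-30 years"
    elif age <= 40:
        return "31-40 years"
    elif age <= 50:
        return "41-50 years"
    return "Above 50 years"

# Boundary ages fall into one and only one category, so tallies
# have no overlaps and no gaps.
print(age_group(20))  # 20-30 years
print(age_group(31))  # 31-40 years
print(age_group(65))  # Above 50 years
```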
13. Designing Instruments - Writing Closed-Ended Questions
- Closed-ended questions are
  - Easy to analyze.
  - Not exploratory in terms of searching for information.
14. Scale Development
- Develop scales if you need to include them in your instrument.
15. Guidelines for Scale Development
- Scales are developed to measure elusive phenomena that cannot be observed directly. Examples: attitudes, aspirations.
- Therefore, scale development should be based on the theories related to the phenomenon to be measured.
- Thinking clearly about the content of a scale requires thinking clearly about the construct being measured.
16. Guidelines for Scale Development - Generate an Item Pool
- The properties of a scale are determined by the items that make it up.
- At this stage, you need to develop more items than you plan to include in the final scale.
17. Characteristics of Good Items
- Unambiguous.
- Avoid exceptionally lengthy items.
- Consider the reading levels of the target respondents.
- Include positively and negatively worded items. The purpose of wording items both positively and negatively within the same scale is usually to avoid acquiescence, affirmation, or agreement bias (see the reverse-scoring sketch below).
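A minimal sketch (Python, with hypothetical item names and a 5-point format) of the usual way negatively worded items are reverse-scored before item scores are combined:

```python
# Hypothetical responses to a 4-item scale coded 1-5
# (1 = Strongly disagree ... 5 = Strongly agree).
responses = {"item1": 5, "item2": 4, "item3_neg": 1, "item4_neg": 2}
negatively_worded = {"item3_neg", "item4_neg"}
SCALE_MAX = 5

def score(item: str, value: int) -> int:
    """Reverse-score negatively worded items so a high score always
    indicates a more favorable response."""
    return SCALE_MAX + 1 - value if item in negatively_worded else value

scored = {item: score(item, value) for item, value in responses.items()}
print(scored)                              # {'item1': 5, 'item2': 4, 'item3_neg': 5, 'item4_neg': 4}
print(sum(scored.values()) / len(scored))  # scale score = 4.5
```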
18. Guidelines for Scale Development - Determine the Format for Measurement
- There are different formats.
- Identify the format you would like to use with your items.
- Determine how many response categories you need to include in your format.
19. Guidelines for Scale Development - Determine the Format for Measurement
- The number of response categories should be limited to the respondents' ability to discriminate meaningfully.
- Normally, 5-7 response categories are adequate for extension and education program evaluations.
- Example: 1. Strongly disagree  2. Disagree  3. Somewhat agree  4. Agree  5. Strongly agree
20. Guidelines for Scale Development - Likert Scale
- Named after Rensis Likert.
- This is the most common format.
- The response options should be worded so as to have roughly equal intervals with respect to agreement. That is, the difference in agreement between any adjacent pair of response options should be about the same as for any other adjacent pair of response options.
- Common choices for a midpoint include "Neither agree nor disagree" and "Neutral."
21. Guidelines for Scale Development - Likert Scale
- Example of response options for items in Likert format:
- Strongly Disagree
- Disagree
- Somewhat Agree
- Agree
- Strongly Agree
22. Guidelines for Scale Development - Semantic Differential Scaling
- There are several numbers between the adjectives that constitute the response options.
- Example: The quality of the training session
  Poor  1  2  3  4  5  6  7  Excellent
23. Instrument Development - Step 6: Provide Necessary Instructions to Complete the Survey
- Clear instructions are essential to facilitate the responding process.
- Instructions should be clearly and politely stated.
- Clear instructions increase your return rate as well as the accuracy of your data.
24. When You Develop a Questionnaire
- Keep it short, simple, and clear.
- Include only the questions needed for your indicators.
- Make it compatible with the reading level of the respondents.
- When you use closed-ended questions, make sure to include all possible answer choices.
25. Instrument Development - Step 7: Format Your Instrument
- The appearance and editing of your instrument are important determinants of response rate.
- Therefore, format, structure, and edit your instrument professionally.
26. Instrument Development - Step 8: Establish Validity and Reliability of Your Instrument
- Reliability refers to the extent to which a measuring instrument is consistent in measuring what it measures.
- Test-retest method: administer the instrument to a sample of subjects on two occasions and correlate the paired scores to establish reliability (see the sketch below).
- Validity refers to the extent to which an instrument measures what it is intended to measure.
- Use experts' views to establish validity.
27. APPLICATION OF STEPS
28. Determine Your Evaluation Questions
- Identify the precise questions that need to be answered.
- Use the logic model to narrow the focus of the evaluation.
29. LOGIC MODEL - Measuring Program Impact
INPUTS -> OUTPUTS (Activities, Participation) -> OUTCOMES (Learning, Action) -> IMPACT

- INPUTS: What resources does your program need to achieve its objectives?
  - Staff, volunteer time, money, materials, equipment, technology, partners
- OUTPUTS - Activities: What should you do in order to achieve program goals and objectives?
  - Workshops, meetings, camps, demonstrations, publications, media, web site, projects, field days
- OUTPUTS - Participation: Who should participate, be involved, or be reached?
  - Number of target clients, their characteristics, their reactions
- OUTCOMES - Learning: What do you expect the participants will know, feel, or be able to do immediately after the program?
  - Awareness, knowledge, attitudes, skills, aspirations
- OUTCOMES - Action: What do you expect that participants will do differently after the program?
  - Behavior, practices, decisions, policies, social action
- IMPACT: What kind of impact can result if the participants behave or act differently?
  - Social, economic, environmental
30. Possible Question Categories
- Process evaluation questions (these are mostly open-ended)
  - Questions on client characteristics
    - Are we reaching the target clients?
  - Questions on program delivery
    - Did the program use effective teaching methods?
    - What are the weaknesses?
- Impact evaluation questions
  - Questions on clients' satisfaction
    - Did the target clients find the program useful?
  - Outcomes
    - Did the program participants change their KASA (knowledge, attitudes, skills, and aspirations)?
    - Did the program participants change their practices?
  - Impacts
    - Did the participants save money or improve their health?
31. What Data Are Needed for Program Improvement?
- Were participants satisfied with
- Information received
- Instructors
- Facilities
- Quality of training
- What did they like/dislike about the training?
- Did the training meet their expectations?
  - If not, why?
- Ideas for further improvement
- Look for data that you can use to fix weaknesses
and build on strengths.
32. How to Collect Training Improvement Data?
Please circle the appropriate number for your level of response.
33. How to Collect Training Improvement Data?
- Did the training session meet your expectations?
  - 1. Yes
  - 2. No
- Would you recommend this training workshop to others?
  - 1. Yes
  - 2. No
  - If not, why? ____________________________________
- What did you like the most about this training?
- What did you like the least about this training?
- How could this training be further improved?
34. Other Data
- Demographics
- What is your gender?
  - ____ Male  ____ Female
- How do you identify yourself?
- ___African American
- ___American Indian/Alaskan
- ___Asian
- ___Hispanic/Latino
- ___Native Hawaiian/Pacific Islander
- ___White
- ___Other
35. What Data Are Needed for Program Accountability?
- You need impact data
  - To prove that your program achieved its objectives.
36. How to Document Perceived Knowledge Change?
Example for Agriculture
37. How to Document Levels of Aspirations?
- At the end of a successful training session, participants will have a heightened level of aspiration to apply what they learned.
- They are ready to take charge of what they learned.
- Participants are asked whether they intend to apply what they learned.
- Example: As a result of this training, do you intend to drink reduced-fat milk? The answers to this question would be
  - No
  - Maybe
  - Yes
  - I'm already doing this
38. How to Document Aspirations?
Example for FCS
Please circle the number that best describes your answer.
39. Retrospective Pre and Post Evaluations
- Advantages
  - Simple; easy to collect data
- Disadvantages
  - Not appropriate for collecting data from very young audiences or low-literacy adult audiences, because they will not be able to compare the before and after situations retrospectively.
40. Pre and Post Evaluations
- The pre-evaluation is administered before your training session.
- The post-evaluation is administered at the end of your training session.
- You need to match pre- and post-evaluations for comparison.
- Pre- and post-evaluations will document three impact indicators:
  - Change in knowledge
  - Change in skills
  - Levels of aspirations
41. How to Document Change in Knowledge?
- Ask the same set of questions before and after your educational session and compare the answers to document the knowledge gained from the program (see the sketch below).
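A minimal sketch (Python, with a made-up answer key and responses purely for illustration) of scoring the same knowledge questions before and after a session and reporting the gain:

```python
# Hypothetical answer key for five true/false knowledge questions.
answer_key = {"q1": True, "q2": False, "q3": True, "q4": True, "q5": False}

def knowledge_score(responses: dict) -> int:
    """Count how many answers match the answer key."""
    return sum(responses[q] == correct for q, correct in answer_key.items())

# One participant's answers before and after the session (made up).
pre_responses  = {"q1": True, "q2": True,  "q3": False, "q4": True, "q5": True}
post_responses = {"q1": True, "q2": False, "q3": True,  "q4": True, "q5": False}

pre, post = knowledge_score(pre_responses), knowledge_score(post_responses)
print(f"Pre: {pre}/5  Post: {post}/5  Gain: {post - pre}")  # Pre: 2/5  Post: 5/5  Gain: 3
```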
42. How to Document Change in Knowledge?
Example for FCS
43. How to Write Knowledge-Testing Questions
- Don't use general knowledge questions.
- Don't include attitudinal or perceptual statements.
  - Example of a statement to avoid: Growers should practice conservation tillage. __True __False
44. True/False Questions vs. Multiple-Choice Questions
- True/false questions save your time and respondents' time.
- They are easy to analyze.
- They help you keep your survey short.
45. How to Document Change in Skills?
- Skill changes are measured indirectly by using participants' levels of confidence in carrying out the specific tasks addressed in the program. Example: participants' confidence in calibrating equipment.
46. How to Document Change in Skills?
- Record participants' levels of confidence in carrying out specific tasks before and after the program on a Likert scale.
- Compare pre and post responses to document the skill change.
47. How to Document Change in Skills?
Example for Agriculture
48. Pre and Post Evaluations
- Advantages
  - Comparison is more accurate than with a retrospective pre and post evaluation.
  - Appropriate for young and low-reading-level audiences.
- Disadvantages
  - If you want to compare pre- and post-evaluations to assess change, you must match the pre- and post-evaluations for each participant (see the matching sketch below).
  - This is somewhat challenging.
49. Change in Attitudes
- Difficult to measure
- Need to be very careful in designing scales to measure attitudes
- Not a practical indicator
- Pre/post tests
50. CHECKING ATTITUDES
To what extent do you agree or disagree with each of the following statements?
Response scale: 1 = Strongly Disagree, 2 = Disagree, 3 = Undecided, 4 = Agree, 5 = Strongly Agree
- Conservation tillage is profitable.  1  2  3  4  5
- (You need to include at least 10-15 items to achieve the desired level of validity and reliability; see the sketch below.)
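The slides do not name a statistic for multi-item reliability, but a common check of internal consistency for an attitude scale is Cronbach's alpha; a minimal sketch (Python, with made-up responses purely for illustration):

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a scale: item_scores is a list of items,
    each a list of per-respondent scores in the same respondent order."""
    k = len(item_scores)
    item_var_sum = sum(pvariance(item) for item in item_scores)
    totals = [sum(vals) for vals in zip(*item_scores)]
    return (k / (k - 1)) * (1 - item_var_sum / pvariance(totals))

# Made-up data: three attitude items, six respondents, coded 1-5.
items = [
    [4, 5, 3, 2, 4, 5],
    [4, 4, 3, 2, 5, 5],
    [5, 4, 2, 3, 4, 4],
]
print(f"Cronbach's alpha: {cronbach_alpha(items):.2f}")  # about 0.87 for these data
```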
51. How to Document Behavior Change?
- You need to understand the behavior change process in order to design evaluation questions.
52. Understanding the Behavior Change Process
- Behavior change is a process.
- Prochaska and DiClemente developed a model to explain the human behavior change process. This model is called the Transtheoretical Model.
- According to the Transtheoretical Model, there are five stages in the behavior change process.
53. Prochaska and DiClemente's Stages of Change
Prochaska, J. O., & DiClemente, C. C. (1994). The Transtheoretical Approach: Crossing Traditional Boundaries of Therapy. Malabar, FL: Krieger Publishing Company.
54. Evaluation Template
For each of the following practices, please circle the number that best describes your current behavior.
55. How to Collect Impact Data from Multi-Session Programs
- A benchmark survey is administered before the Extension program.
- An end-of-program survey is administered at the end of the Extension program.
- By comparing the benchmark and end-of-program surveys, you will be able to document the change in participants' behaviors/practices and skills (see the sketch below).
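A minimal sketch (Python, with made-up counts purely for illustration) of comparing a benchmark survey with an end-of-program survey to report the change in the share of participants using a recommended practice:

```python
# Hypothetical counts from the two surveys.
benchmark      = {"using_practice": 12, "respondents": 60}
end_of_program = {"using_practice": 33, "respondents": 55}

def share(survey: dict) -> float:
    """Proportion of respondents reporting use of the practice."""
    return survey["using_practice"] / survey["respondents"]

before, after = share(benchmark), share(end_of_program)
print(f"Benchmark: {before:.0%} using the practice")      # 20%
print(f"End of program: {after:.0%} using the practice")  # 60%
print(f"Change: {after - before:+.0%} points")            # +40% points
```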