Title: What is Evaluation?
1 What is Evaluation?
- David Dwayne Williams
- Brigham Young University
- David_Williams_at_byu.edu
2 Evaluation, Assessment, Measurement, and Research
- Evaluation usually includes describing what is and what should be, then judging or comparing the two, as in a balance.
- Measurement is an essential tool for gathering information about what is (the John Brown example).
- Assessments involve using measurement processes regularly for established purposes.
- Research involves measuring what is, then seeking to understand and explain, not to judge.
3 Vocational Rehabilitation Examples
- Utah Statewide ASSESSMENT of the Rehabilitation Needs of Individuals with Disabilities 2006-07 Final Report.
- Michael Leahy's presentation yesterday on a synergistic program evaluation MODEL PARTNERSHIP.
- From the announcement of this conference: consumer satisfaction studies, surveys, case file reviews, comprehensive needs assessments, economic impact studies, and use of other quality assurance measures.
- Program Evaluation and Justification Review of the Rehabilitation Program Administered by the Department of Labor and Employment Security, Report No. 98-04, July 1998.
4 Vocational Rehabilitation Evaluation
- Has a long history in the literature (I found resources from the 1970s forward).
- However, as in many fields, evaluations may turn out to be assessments, measures, or research rather than full evaluation.
- Let's look at what the field of evaluation says about evaluation, and then we can decide whether Vocational Rehabilitation might gain from what it offers.
5 An Evaluation Framework Based on Ideas from Several Theorists
- Alkin, 2004
- Fetterman, 2001
- Guba & Lincoln, 1989
- Patton, 2002, 2008
- Stake, 2004
- Stufflebeam, 2001, 2007
- Weiss, 1998
- Fitzpatrick, Sanders, & Worthen, 2003
6 Evaluation Framework Overview
Evaluation Checklist:
- Background Information
- Audience & Stakeholders
- Evaluand Information
- Stakeholder Concerns
- Judging Criteria
- Questions to Answer
- Data Collection Processes
- Data Analysis
- Reporting Strategies
- Results
- Recommendations
- Resource Valuation
- Budget and Schedule
- Self-Critique using meta-evaluation
Program Evaluation Meta-Evaluation Checklist:
- Overview
- Meeting Requirements for Utility
- Meeting Requirements for Feasibility
- Meeting Requirements for Propriety
- Meeting Requirements for Accuracy
7 Context for Understanding an Evaluation
- What does the literature associated with the evaluand say are the key issues?
- How did this evaluand come to be of interest to you?
- What is your background that is relevant to this evaluation?
- What evaluation has been done on this evaluand already?
- Is the evaluand evaluable at present?
- Why is an evaluation appropriate now?
- What approaches to evaluation were considered, which will be used, and why?
8 Possible VR Context Questions
- What does the literature about Vocational Rehabilitation say ought to be included in a study?
- How did this program, this counseling technique, or this client come to be of interest to you?
- What perspectives are you taking on this evaluation because of your particular background?
- What might you be missing because of that?
- What alternative views do you need to insist on including, besides your own?
- What evaluation have you or others already done on this evaluand?
- What has been learned from previous evaluations?
9 Who are the stakeholders who care? Why?
- Who asked for the evaluation and why?
- Who stands to benefit from the evaluation and how?
- Who is served by the evaluand, or should be?
- Who is likely to use the evaluation results to do something helpful?
- Who does not usually have a voice in matters associated with the evaluand but has a stake in it?
10 VR evaluators might ask about stakeholders
- Who else besides me cares about this treatment, these resources, or this program?
- Have any of them asked for an evaluation?
- If so, why? If not, why not?
- Why do I and these other people care about this program?
- What do we stand to lose or gain by what happens with this program?
- Who else is served by this program, or should be, and therefore should have an interest in its evaluation?
- Are the administrators, other counselors, family members, employers, or others likely to use evaluation results to do something different?
11 What is the evaluand, or thing the stakeholders care about?
- What do you already know about the evaluand?
- What or who it is
- What its or their objectives are
- How it works or what they are doing
- What more do you need to learn to refine the description and definition of the evaluand so you can focus your evaluation on it or them?
12 What things or people might VR evaluators evaluate?
- One key evaluand may be the counselors themselves,
- Or it may be the curriculum or program they're using,
- Or a particular technique they are piloting,
- Or their clients' current performance, employment, concerns, and associated needs for improvement,
- Or the relationships among several components of a program,
- Or a test used to ascertain growth in client performance.
13 What criteria do stakeholders have for judging the evaluand?
- What values do the stakeholders manifest regarding the evaluand?
- What do they think the evaluand should be accomplishing (criteria for success)?
- What standards do they have, or how completely do they hope the evaluand will meet the criteria?
- How will they know when the evaluand is successful to their satisfaction?
14 VR evaluators might ask these criteria questions
- What do we and other stakeholders value that should guide our evaluation efforts?
- What should clients who participate in this program activity be able to do when they finish?
- How well should clients perform on the selected criteria if the program is going to be considered successful?
- What should counselors be doing, and at what level of performance, to help clients be successful?
15 What questions do stakeholders want to answer?
- Based on the previous points, what evaluation questions should be asked?
- Based on a rating or ranking of all possible questions raised, which are the highest priorities?
- Which questions will this study address, and why?
16 VR evaluators might ask these questions to match the criteria
- How are clients performing compared to the ideal?
- Is there a need for an intervention change?
- How well was the program implemented?
- How many of the clients performed at or above 80 on the job placement test? (A small worked sketch follows this list.)
- How well did this counselor do in preparing their clients to apply for a job?
- How well are we evaluating our interventions in terms of implementation and outcome?
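The cut-score question above invites a small worked example. The sketch below (in Python; not part of the original slides) uses hypothetical client scores, the cut score of 80 from the question, and an assumed 75% target standard to show the basic move of the framework: describe "what is," state "what should be," and judge the comparison.

```python
# Minimal sketch: compare "what is" (observed pass rate on a job placement
# test) with "what should be" (a standard the stakeholders agreed on).
# All numbers here are hypothetical illustrations, not real program data.

placement_scores = [62, 85, 91, 78, 80, 88, 74, 95, 83, 69]  # one score per client
cut_score = 80           # criterion: "perform at or above 80"
target_pass_rate = 0.75  # standard: stakeholders hope 75% of clients reach the criterion

passed = sum(1 for score in placement_scores if score >= cut_score)
pass_rate = passed / len(placement_scores)

print(f"{passed} of {len(placement_scores)} clients scored at or above {cut_score} "
      f"({pass_rate:.0%}).")
if pass_rate >= target_pass_rate:
    print("What is meets what should be for this criterion.")
else:
    print("What is falls short of what should be; the gap becomes evaluative evidence.")
```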
17 What processes will be used to collect and analyze data to answer the questions and compare the evaluand to its criteria?
- For each question listed earlier, what information will be collected and analyzed?
- Using what data collection procedures?
- By whom and when?
- How will each procedure be refined to ensure validity, reliability, credibility, trustworthiness, etc.?
18 VR evaluators may collect and analyze quantitative or qualitative data by
- Drawing upon formal measures developed by others or creating their own tests and performance activities,
- Conducting informal interviews and observations,
- Engaging clients in dialogues and digitally recording them for analysis by the clients or others,
- Analyzing these and other data both qualitatively and quantitatively,
- Comparing these descriptions of what is to the criteria and standards identified earlier (a small sketch follows this list).
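For the final point, here is a minimal sketch of what such an analysis step could look like; the interview codes, readiness scores, and the standard of 75 are hypothetical placeholders rather than material from the presentation, and a real evaluation would substitute the stakeholders' own measures and standards. The aim is only to show a qualitative tally and a quantitative summary sitting side by side and being compared with "what should be."

```python
# Minimal sketch: a simple qualitative tally alongside a quantitative summary,
# both feeding the comparison of "what is" to "what should be".
# All codes, scores, and the standard below are hypothetical examples.
from collections import Counter
from statistics import mean

# Codes assigned to client interview excerpts during qualitative analysis
interview_codes = ["confidence", "transportation barrier", "confidence",
                   "employer bias", "transportation barrier", "confidence"]

# Scores from a formal measure, e.g. a job-readiness test adopted or created earlier
readiness_scores = [72, 85, 78, 90, 66, 81]
standard = 75  # "what should be": the mean score stakeholders hope to see

print("Most common interview themes:", Counter(interview_codes).most_common(2))
observed = mean(readiness_scores)
verdict = "meets" if observed >= standard else "falls short of"
print(f"Mean readiness score {observed:.1f} {verdict} the standard of {standard}.")
```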
19 What reporting and recommendation strategies will be used?
- What interim reports will be given, to whom, and when?
- What final reports will be given, to whom, and when?
- How will the reports be organized, and around what points?
- Will there be oral reports? Written reports? Other formats?
- How will results be organized and displayed?
- What are the results, or what results are anticipated?
- Where will recommendations come from?
- Will you be qualified to make recommendations, and why?
- What recommendations are there, who should implement them, and how?
20 VR evaluators report results and recommendations through the use of
- Informal oral reports for their own program evaluations,
- Interim reports to share with others,
- Formal written reports with charts and tables,
- Reports on study progress and stakeholder involvement,
- Implications for future evaluation activities,
- Evaluative judgments about the quality of evaluands,
- Realistic recommendations, using processes that involve the stakeholders who will implement the recommendations.
21 Meta-evaluation of Evaluation Plans, Implementation, and Outcomes
- Encourages high-quality evaluations
- Can be done internally or externally
- Could involve the Standards established by the Joint Committee of thoughtful professionals
- Seeks to enhance evaluation quality in terms of
- Utility
- Feasibility
- Propriety
- Accuracy
22 VR evaluators should meta-evaluate to enhance quality
- When anticipating conducting an evaluation,
- While conducting one, and/or
- While reviewing evaluations performed by themselves or others,
- Using the Joint Committee Standards to help clarify what they want to evaluate,
- Using the Standards to judge how well they are evaluating,
- By clarifying whom they are serving with their evaluation and how they value the results of their evaluation efforts.
23 Implications for VR Participants
- Use Measurement and Assessment in a broader Evaluation context to enhance VR programs by
- Attending to context, background, and literature,
- Serving the values and interests of all stakeholders,
- Involving stakeholders in clarifying the evaluand, criteria, and standards they care most about,
- Targeting stakeholders' questions with a variety of data collection and analysis methods that involve measures of high quality to assess how well "what is" matches up with "what should be" for the stakeholders,
- Sharing results and recommendations that are realistic and useful for the stakeholders, in ways they can use.
24 Come Learn More This Afternoon at a Workshop. We will
- Review the evaluation framework presented here,
- Discuss the premise that measurement and assessment are means for doing evaluation and research,
- Discuss and write down current practices and questions about evaluating your work activities,
- Develop plans for applying these ideas to your practice,
- Share emerging plans with other participants for feedback,
- Receive guidance and feedback from the presenter,
- Accept the challenge to apply this plan at home and to contact the presenter with questions or for further guidance if wanted.
25 For more information or questions,
- Contact
- David Williams
- 150 G MCKB
- Brigham Young University
- Provo, UT 84602 USA
- David_Williams_at_byu.edu