Title: The Role of Evaluation & Assessment in EE
1 The Role of Evaluation & Assessment in EE
M. Lynette Fleming, Ph.D.
Research, Evaluation & Development Services, Tucson, AZ
fleming_at_cox.net
2 Improving Programs with Evaluation & Assessment
Adapted from:
- Improving Programs with Evaluation & Assessment, M. Lynette Fleming, Ph.D. & Suzanne Dhruv, M.A. Candidate, Arizona Association for Environmental Education Leadership Clinic, 2003
- The Role of Evaluation & Assessment in EE Leadership Clinics, Kim A. Freier, Ed.D., L. Kate Wiltz, M.S., & M. Lynette Fleming, Ph.D., LCDW, 2004
3 Session Goals
- By the end of the session, participants will be able to:
  - Define evaluation, stakeholder, assessment, and authentic assessment
  - Compare and contrast evaluation and assessment
  - Name three types of evaluation that parallel three stages of program development
  - Identify reasons for evaluating and assessing EE Leadership Clinics
  - List at least five tools that can be used to collect data for an evaluation or assessment
  - Describe SMART program objectives and their role in assessing participant outcomes
  - Identify resources for evaluation tools and information
4 What Is Evaluation?
- Evaluation is the process of establishing judgments of merit or worth, based on evidence, about a program or product.
- It is a form of disciplined inquiry that uses methods similar to research but for different purposes.
5 What Is Assessment?
- Assessment is an evaluation of the skills and knowledge acquired by learners during a learning experience.
6 Assessment Goals
- To improve instruction, and consequently:
  - Help students become more successful learners
  - Help teachers become more effective educators
- To measure or judge learning against criteria
7 Evaluate Your Program and Assess Your Participants' Outcomes
- Assessment is a part of evaluation.
8 Program Evaluation is...
- the SYSTEMATIC COLLECTION of information about
- the ACTIVITIES, CHARACTERISTICS, and OUTCOMES of programs
- to MAKE JUDGMENTS about the program, IMPROVE program effectiveness,
- and/or INFORM DECISIONS about future programming.
(Patton, 1997)
9 Whose Judgments Count?
- Stakeholders: individuals & groups who have an interest in the success (or failure) of the program
  - Administrators, scientists, staff, volunteers
  - Participants, learners, teachers & other educators, audience
  - Parents, children, significant others
  - Commerce and industry
  - Sponsors, potential funding agencies
  - Legislators
  - Municipal officials
  - General public, community residents
  - Experts, professionals in the field
10 Where Does Evaluation Fit?
[Diagram: the program development cycle (Plan, Design, Develop, Do, Results) paired with the evaluation types. Front-end evaluation asks "What should we do?" at the Plan stage; formative evaluation asks "What is going on?" during Design and Develop; remedial evaluation asks "How can it be improved?" during Do; summative evaluation asks "What happened?" at Results.]
11 Evaluation at the Start
- Front-end Evaluation / Needs Assessment
  - Gathers information/data about the gap between desired and current audience knowledge, behaviors, and attitudes.
  - Front-end evaluation is conducted during the early stages of program or exhibit development.
  - The data help determine audience appropriateness, identify stakeholders and collaborators, and define goals and objectives.
12 Evaluation in the Middle
- Formative Evaluation
  - Gathers information/data about an audience's reactions to and learning from a pilot program or prototype exhibit. Changes are made as a result of formative evaluation.
- Remedial Evaluation
  - Gathers information/data about problems with the delivery and outcomes of a program after implementation. Corrections are often made as a result of remedial evaluation.
13 Evaluation at the End
- Summative Evaluation
  - Gathers information/data about an audience's knowledge, behaviors, and attitudes after the implementation of a program. Summative evaluation data inform funders about the value of a program.
  - Gathers information about the process of program development. Summative evaluation data can offer insights into the development of the next project.
  - Decisions regarding the continuation, expansion, or contraction of a program are generally made as a result of summative evaluation.
14 Evaluation and Your Program
- No matter when the data collection takes place, planning for and using evaluation information is an ongoing part of planning and implementing your program.
- The time to think about evaluation is now!
16 Design an Evaluation
- Define the purpose of the evaluation.
  - What questions will the evaluation answer? Who wants the evaluation? Is everyone involved committed to it?
  - How will it help decision making?
- Determine the resources available.
  - Budget, personnel, time
- Identify information source(s).
  - Who can provide the answers to the question(s) you want answered?
17 Design an Evaluation, 2
- Choose the tool(s) for collecting data.
  - Check the literature. Talk with colleagues.
- Develop (or borrow!) tools/instrument(s).
- Define sampling procedures.
- Decide how to report the results.
  - How will you record the data?
  - How will you analyze and interpret the data? (see the sketch after this list)
- Carry out your plan: data gathering, entry, analysis, and reduction occur now.
- Report your results.
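To make the recording and analysis steps concrete, here is a minimal sketch in Python. It assumes, purely for illustration, that 1-5 satisfaction ratings from a clinic survey were entered into a CSV file with a column named satisfaction; the file name, column name, and scale are hypothetical, not part of the original presentation.

```python
# Minimal sketch: summarize hypothetical 1-5 satisfaction ratings
# entered into a CSV file after a clinic.
import csv
from collections import Counter

ratings = []
with open("clinic_survey.csv", newline="") as f:   # hypothetical file name
    for row in csv.DictReader(f):
        ratings.append(int(row["satisfaction"]))   # hypothetical column name

print(f"Responses: {len(ratings)}")
print(f"Mean satisfaction: {sum(ratings) / len(ratings):.2f}")
for score, count in sorted(Counter(ratings).items()):
    print(f"  Rated {score}: {count} response(s)")
```

Even a simple tally like this feeds directly into reporting: the response counts are demographics-style data, and the mean rating is a satisfaction measure of the kind funders often ask for.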
18 What About the Leadership Clinic?
- Evaluating your clinic is a piece of your EE program evaluation, so evaluate your clinic and assess your participants' learning.
- It still follows these steps, just on a smaller scale than evaluating your whole EE program.
19 What Do You Want to Know?
- Clinic goals and objectives
  - What did your participants learn?
- Broader goals and objectives for your EE program
- Reporting data for funders and agencies
  - Demographics, satisfaction ratings
- Standard questions across programs
  - "How'd you hear about our organization?"
- Are these questions
  - Reactions or demographics?
  - Outcomes or long-term impacts?
20 What Can Evaluation & Assessment Do for Environmental Educators?
- Determine audience appropriateness
- Measure efficiency in delivery
- Compare effectiveness
- Describe value to agency
- Identify how a program meets resource goals
- Measure effectiveness, efficiency, fairness or justice, acceptability, aesthetics
- and MORE
21 What Can't Evaluation & Assessment Do for You?
- Can't tell you "the answer"
  - May tell you what's happening, but often can't tell you exactly why or what to do about it
- Can't answer all your questions
- Can't make your decisions for you
- Can't resolve all conflicts
22 Choosing Your Tool(s)
- Choose the tools that are the easiest and least expensive ways to provide data you can analyze to answer your questions.
- Choose the tools that maximize advantages and minimize disadvantages given the situation, audience, and resources.
- The time, money, skill, and philosophy of the evaluator strongly influence tool choices.
- No tool is perfect in all situations.
23 Tools (with each you can gather qualitative & quantitative data)
- Observations
  - By people; by media (audio, video, electronic/magnetic devices)
- Questionnaires/Surveys/Tests
  - Self-administered (mail, newspaper/magazine, other); partially self-administered
- Interviews
  - Individual, in-person (informal or structured); focus groups (usually informal, but structured)
- Plus document or product review, photographs and artwork, performances/presentations, case studies, concept maps, behavior trace measures, and web site hits
24 Approaches to Assessment
- Two important views when considering the process of assessment:
  - 1) One type of assessment cannot meet the needs of all purposes or audiences.
  - 2) Assessment is a critical part of learning. Assessment can build on instruction and create a dynamic learning environment.
25 Authentic Assessment
- Performance assessments: not artificial or contrived.
- Characteristics of authentic assessment:
  - Authentic, true-to-life, in touch with learners
  - Student-centered and experiential
  - Can be ongoing throughout the school year (e.g., portfolios)
  - Allows for a range of possible answers
  - Rubrics & checklists tell learners the expectations and how they will be judged
  - Encourages student involvement
  - Integrated into lessons & activities
  - Data can be both quantitative and qualitative
26 Authentic Assessment Tools
- Journals/field notes
- Written or oral stories or anecdotes
- Comment books/logs/letters
- Performances/presentations/demonstrations
- Audiotape/videotape
- Photographs and artwork
- Portfolios
- Concept maps
- Judged using rubrics and other scoring tools
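Since these tools are judged with rubrics, here is a minimal sketch of what a rubric-based scoring tool can look like in Python. The criteria, the three-point scale, and the score_work helper are illustrative assumptions, not taken from the presentation.

```python
# Hypothetical rubric: criteria mapped to a 1-3 scale with level descriptions.
RUBRIC = {
    "content accuracy": {1: "major errors", 2: "minor errors", 3: "accurate"},
    "use of evidence":  {1: "none cited",   2: "some cited",   3: "well supported"},
    "presentation":     {1: "unclear",      2: "mostly clear", 3: "clear and engaging"},
}

def score_work(scores):
    """Check each score against the rubric scale and return the total."""
    for criterion, score in scores.items():
        if score not in RUBRIC[criterion]:
            raise ValueError(f"{criterion}: {score} is not on the rubric scale")
    return sum(scores.values())

# Example: scoring one portfolio entry against the rubric.
entry = {"content accuracy": 3, "use of evidence": 2, "presentation": 3}
print(score_work(entry))  # 8 out of a possible 9
```

Whatever form the scoring tool takes, the earlier point stands: learners should see the criteria and scale before they are judged.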
27 Getting Started: Writing SMART Objectives
- Most programs are goal-based.
- Measurable objectives can translate your goals into evaluation questions.
- SMART objectives will tell you:
  - What to ask, of whom, and when (see the sketch below)
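The transcript ends here, so the following is a sketch under stated assumptions rather than the presenters' own example: SMART is commonly expanded as Specific, Measurable, Achievable, Relevant, and Time-bound, and an objective written that way maps directly onto "what to ask, of whom, and when". The class, field names, and example objective below are all hypothetical.

```python
# Hypothetical sketch: a SMART objective as structured data, so the
# evaluation question ("what to ask, of whom, and when") falls out directly.
from dataclasses import dataclass

@dataclass
class SmartObjective:
    audience: str    # whom to ask
    behavior: str    # the specific, measurable thing they will do
    criterion: str   # how well (the achievable, relevant benchmark)
    timeframe: str   # by when (time-bound)

    def evaluation_question(self) -> str:
        return (f"Did {self.audience} {self.behavior} "
                f"{self.criterion} {self.timeframe}?")

objective = SmartObjective(
    audience="clinic participants",
    behavior="name the three types of evaluation",
    criterion="with at least 80% accuracy",
    timeframe="by the end of the clinic",
)
print(objective.evaluation_question())
# Did clinic participants name the three types of evaluation
# with at least 80% accuracy by the end of the clinic?
```

The example deliberately reuses one of the session goals (naming the three types of evaluation) to show how a clinic goal becomes a measurable objective and then an evaluation question.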