Title: Avoiding Data Analysis and Decision-Making Pitfalls: How an Evaluation Guide Can Help
1. Avoiding Data Analysis and Decision-Making Pitfalls: How an Evaluation Guide Can Help
- Karin Aure Dixon, Ed.D.
- 87th Annual CERA Conference
- December 4, 2008
2. How it all began
3. Data Analysis and Decision-Making Pitfalls
We began by identifying problems in our data analysis and decision making:
- Misuse of data (e.g., CST data to place individual students)
- Misinterpretation of data (e.g., CST comparisons across grade levels)
- Disproportionate valuation of data (e.g., CELDT as the only measure of ELD)
- Inappropriate data for program purpose (e.g., CST data for the Afterschool Program)
- Lack of comprehensive data (e.g., all outcome data)
4. The Old Stupid vs. The New Stupid
The OLD Stupid: Resistance to performance measures
The NEW Stupid: Reflexive reliance on a few simple metrics
- "Today's enthusiastic embrace of data has waltzed us directly from a petulant resistance to performance measures to a reflexive and unsophisticated reliance on a few simple metrics, namely graduation rates, expenditures, and the reading and math test scores of students in grades 3 through 8. The result has been a nifty pirouette from one troubling mind-set to another; with nary a misstep, we have pivoted from the 'old stupid' to the 'new stupid.'" (Frederick M. Hess, 2008)
5. Solutions to the Pitfalls
Next, we began to identify solutions to the pitfalls:
- Address and improve the process of evaluation
- Create a protocol for evaluating programs and practices
- Create a guide describing the new protocol
- Educate stakeholders about the protocol
- Let the protocol become a habit of mind
6. Guidance for Using Data to Inform Decision Making
From the December 2008/January 2009 issue of Educational Leadership:
- Focus on questions, not data
- Be skeptical of easy answers
- Become assessment literate
- Think beyond test scores
- Use informed judgment
7. Evaluation Wheel
Then came the wheel
Actually, it started as a pie
8. Evaluation Guide
Then came the guide
9. Evaluation Steps
1. Identify Question
2. Identify System of Focus
3. Identify and Collect Data
4. Organize and Analyze Data
5. Interpret Data
6. Use Evaluation Findings
10. Step 1: Identify Question
- What do you want to know?
- Guiding questions:
  - Focus your evaluation
  - Define your purpose
  - Determine data to be collected
- Characteristics of good questions:
  - Open-ended
  - Allow for all possibilities
  - Feasible
11. Evaluate these sample questions
- Are counseling groups at the high school worthwhile?
- How does participation in a girls' group affect students' understanding of bullying and harassment?
- Are LUSD graduates becoming contributing members of society?
- What is the effect of Head-Pollett math instruction on student performance in Measurement and Geometry?
- What are grade students' favorite colors?
12. Step 2: Identify System of Focus
- Cognitive System
- Metacognitive System
- Self System
- Focuses evaluation
- Aligns with district professional development and strategic direction
13. Step 3: Identify and Collect Data
- Identify existing data that address your question (e.g., state tests, local tests, staff development records, observation records)
- Determine what information is lacking and design new data collection tools/techniques as necessary (e.g., survey, focus group, assessment); see the sketch after this list
- Consult the Evaluation Wheel to ensure comprehensive coverage (e.g., outcome and process data, multiple data sources)
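A data inventory like this usually ends with joining existing and newly collected sources. The sketch below, in Python with pandas, shows one way that join can also reveal what information is still lacking; the table names, columns, student IDs, and values are hypothetical illustrations, not actual district files.

```python
import pandas as pd

# Hypothetical existing outcome data (e.g., a state test extract)
outcomes = pd.DataFrame({
    "student_id": [101, 102, 103],
    "cst_ela_scale_score": [298, 355, 341],
})

# Hypothetical newly collected process data (e.g., a short survey)
survey = pd.DataFrame({
    "student_id": [101, 102, 104],
    "engagement_rating": [4, 2, 5],
})

# A left join keeps every student with outcome data and makes
# coverage gaps in the new collection visible as missing values
combined = outcomes.merge(survey, on="student_id", how="left")
print(combined)
print("Students missing survey data:", combined["engagement_rating"].isna().sum())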
14. Step 4: Organize and Analyze Data
- Summarize the data (e.g., percentage of students proficient, average student growth, common patterns in student perceptions); see the sketch after this list
- Disaggregate by important factors (e.g., CELDT proficiency level, grade, program status, implementation, level of support)
- Use charts and graphs to analyze data visually
- Identify limitations of tools and data collection strategies (e.g., no comparison group or data, small sample, imperfect assessments)
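To make "summarize, then disaggregate" concrete, here is a minimal sketch in Python with pandas. Every detail (the column names, the sample records, and the 350-point proficiency cutoff) is an invented illustration, not an LUSD data definition.

```python
import pandas as pd

# Hypothetical student-level results, for illustration only
df = pd.DataFrame({
    "grade":          [3, 3, 4, 4, 5, 5],
    "program_status": ["R180", "Non-R180", "R180", "Non-R180", "R180", "Non-R180"],
    "score":          [312, 355, 340, 368, 351, 329],
})

CUTOFF = 350  # assumed proficiency cut score, not a real one
df["proficient"] = df["score"] >= CUTOFF

# Summarize: overall percentage of students proficient
print(f"Overall proficient: {df['proficient'].mean():.0%}")

# Disaggregate by important factors: grade and program status
by_group = df.groupby(["grade", "program_status"])["proficient"].mean()
print(by_group)

# Chart the disaggregated rates visually (requires matplotlib)
# by_group.unstack().plot(kind="bar")
```

With one student per grade-and-program cell in this toy table, the sketch also illustrates the last bullet: small samples are a limitation worth naming before anyone acts on the numbers.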
15. Step 5: Interpret Data
- Review summarized data
- Generate broad statements of results
- Consult the Evaluation Wheel to identify important context factors and relationships
- Formulate explanations of the data
- Determine recommendations
- What accounted for these results, and what should be done in response?
16. Step 6: Use Evaluation Findings
- Create an action plan
- Include continued data collection
- Determine next review cycle
- Get to work!
17. Putting the Guide to Work: Examples
- Systematic English Language Development (SELD)
  - Question: How has EL student academic performance changed since the implementation of SELD?
  - Outcome data: CELDT, CST ELA, Express test
  - Process data: classroom observations, review of materials
- READ 180
  - Question: How are R180 students performing academically over time and compared to non-R180 students in Reading?
  - Outcome data: CST, SRI Lexiles, rSkills, independent reading quizzes
  - Process data: teacher survey, classroom observation protocol, principal response, software zones
18. Putting the Guide to Work: Examples (cont.)
- Dual Language Immersion Program
  - Question: How are Dual Immersion students performing over time and compared to non-dual students in ELA and SLA?
  - Outcome data: CST ELA, STS, Aprenda, CAT/6, student writing samples (Spanish)
  - Process data: student presentations, teacher observations
- Reading Proficiency
  - Question: How are elementary students performing in Reading?
  - Outcome data: CST ELA Reading Comprehension, CAT/6 Reading, SRI, DRA, classroom reading data
  - Process data: principal response, literacy framework review
19. Questions and Comments
- Thank you!
- Karin
- kdixon@lindsay.k12.ca.us