Title: Data-Based Decision Making
1. Data-Based Decision Making
2. Reading Review
- Stanovich (2010)
- Fuchs & Fuchs, Progress Monitoring
3- "It ain't so much the things we don't know that
get us into trouble. It's the things we know that
just ain't so." - -Josh Billings
-
Perhaps the second most famous humor writer and
lecturer in the United States in the second half
of the 19th century after Mark Twain
4. We Never Know for Sure
- Even practices with the best research base may not work for some students.
- So if you are using a research-based intervention: implement and COLLECT DATA!
- And if you are struggling to identify a research-based intervention: implement and COLLECT DATA!
5. Critical Concept: Data-Based Decision Making
- A continuous, purposeful process of collecting, interpreting, presenting, and using data to inform actions that support positive educational outcomes.
- Data-based decision making considers the learner's progress within the contexts of instruction, curriculum, and environment.
6. Necessary Components of Assessment
- When a student is experiencing difficulty, several related, complementary types of assessment should be performed
  - Assessment of the Learner (Student)
  - Assessment of Instruction (or Intervention), Curriculum, and Environment
- Learner
- Instruction/Intervention
- Curriculum
- Environment
7. Measuring ICE: Instruction, Curriculum, Environment
- What questions might you have about the instruction/intervention or curriculum?
  - Are the instructional/intervention methods research based?
  - Implementation fidelity?
- Is the classroom environment suitable to learning?
  - Time on task
  - Instructional time
  - Academic engaged time
  - Opportunities to respond / correct responses
  - Positive-to-negative ratio
  - Student problem behavior
8. Models for Data-Based Decision Making
- Problem-Solving Models
- Outcomes-Driven Models
9. [PBIS diagram: four interlocking elements. OUTCOMES supporting social competence and academic achievement; DATA supporting decision making; SYSTEMS supporting staff behavior; PRACTICES supporting student behavior.]
10. Outcomes-Driven Model
- In an Outcomes-Driven Model, the bottom line is achievement of essential educational or social outcomes.
- What are the desired outcomes?
- Are students attaining the necessary skills to be successful?
- If not, what changes can we make?
- Are the changes increasing student progress?
11. Research-Based Frameworks Needed
- How do we know what to measure, and when?
- Reading
  - RTI, Big 5 Ideas of Reading
- Math
  - RTI
- Behavior
  - PBIS, function of behavior (ABA)
12. Big 5 Ideas of Reading
- Reading Comprehension
- Vocabulary
- Oral Reading Fluency and Accuracy
- Phonics (Alphabetic Principle)
- Phonemic Awareness
[Diagram also lists the stages of learning: acquisition, fluency, maintenance, generalization.]
13. Accurately identify those who are on track and those who will need more support
- We must identify struggling students BEFORE they fall too far behind.
- Good, Simmons, & Smith (1998)
14. Response to Intervention
[RTI triangle (circa 1996), shown for both academic and behavioral systems: intensive supports for 1-5% of students, targeted supports for 5-10%, and universal supports for 80-90%.]
15. Team-Initiated Problem Solving (TIPS) Model
[Cycle diagram: Identify Problems → Develop Hypothesis → Discuss and Select Solutions → Develop and Implement Action Plan → Evaluate and Revise Action Plan, with Collect and Use Data at the center, all resting on Problem-Solving Meeting Foundations.]
16. Purposes of Assessment
- Screening
- Which students need more support?
- Progress Monitoring
- Is the student making adequate progress?
- Diagnostic
- What and how do we need to teach this student?
- Outcome
- Has our instruction been successful?
17. Outcomes-Driven Model
[Diagram: the assessment cycle across the school year - screening, then diagnostic assessment and progress monitoring for students who need support (repeated), ending with outcome assessment.]
18. Effective Data Collection
19. Use the right tools for the right job
- Screening
- Progress Monitoring
- Diagnostic Assessment
- Outcomes
20. Use Good Tools: Technically Adequate
- Reliability = consistency
- The extent to which an assessment is consistent in finding the same results across conditions (across different administrators, across time, etc.)
- If the same measure is given several times to the same person, their scores should remain stable, not randomly fluctuate, as in the sketch below.
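A minimal sketch (not from the original slides; the students and scores are hypothetical) of how test-retest reliability is often estimated: correlate the same students' scores from two administrations of the same measure.

```python
from statistics import correlation

# Hypothetical scores for the same eight students on two
# administrations of the same reading measure, one week apart.
first_admin  = [42, 55, 61, 38, 70, 49, 58, 66]
second_admin = [44, 53, 63, 36, 72, 51, 55, 68]

# Test-retest reliability is commonly estimated as the Pearson
# correlation between the two sets of scores; values near 1.0
# indicate stable (reliable) measurement.
r = correlation(first_admin, second_admin)
print(f"Test-retest reliability estimate: r = {r:.2f}")
```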
21. Use Good Tools: Technically Adequate
- Validity = the extent to which an assessment measures what it is supposed to measure
- First we need to know what we should be measuring!
- Research-based frameworks for measurement
- Students who do well on valid reading tests are proficient readers
- Valid: assessing reading by having the student read a passage aloud and monitoring errors and rate (see the scoring sketch after this slide)
- Not valid: assessing reading by having a student match printed letters on a page (this is an assessment of matching visual figures)
[Example task: "Draw a line to match the letters," with letters such as A, f, U, p to be matched against w, E, A, f, I, U, v, B, p.]
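As a concrete illustration of the "errors and rate" idea, here is a minimal sketch of the standard words-correct-per-minute (WCPM) calculation used to score timed oral reading; the function name and the numbers are ours, not the slides'.

```python
# WCPM = (words attempted - errors) / minutes of reading,
# the standard metric for CBM oral reading fluency.
def words_correct_per_minute(words_attempted: int, errors: int,
                             seconds: float) -> float:
    """Score a timed oral reading sample as WCPM."""
    words_correct = words_attempted - errors
    return words_correct / (seconds / 60)

# Example: a student reads 112 words in 60 seconds with 7 errors.
print(words_correct_per_minute(112, 7, 60))  # 105.0 WCPM
```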
22. Use Good Tools: A Concern for Self-Developed Assessments
- Technical adequacy can be a problem with self-developed measures
- This is a challenge with the Professional Learning Team model, which often relies on teacher-developed assessments to measure important student outcomes and guide decision making
23. Low Inference
- Students are tested using materials that are directly related to important instructional outcomes
- Low inference: making judgments about a child's reading skills based on listening to them read out loud
- High inference: making judgments about a child's emotional state based on pictures they've drawn
24. Use the Tools Correctly: Standardized Administration
- Administered, scored, and interpreted in the same way
- Directions given to students are consistent
- Student responses are scored in the same way
- Every student has the exact same opportunity on the assessment
25. Efficiency
- Time is precious in classrooms, so efficiency is an important consideration
- When evaluating the efficiency of an assessment tool, we must consider the time and personnel required to design, administer, and score it:

Tool   | Design                                      | Administration and Scoring
PNRTs  | Already designed                            | Time intensive (1-2 hours/child)
CBA    | Some already designed, some teacher-created | Quick and easy (1-10 min/child)
CBM    | Already designed                            | Quick and easy (1-10 min/child)

(PNRT = published norm-referenced test; CBA = curriculum-based assessment; CBM = curriculum-based measurement)
26. Screening
27. 1. Compare ALL students to the same grade-level standard
- ALL students are assessed against the grade-level standard, regardless of instructional level
- "If you don't know where you are going, you will wind up somewhere else." (Yogi Berra)
28. 2. Be efficient, standardized, reliable, and valid
- Robust indicator of academic health
- Brief and easy to administer
- Can be administered frequently
- Must have multiple, equivalent forms (if the metric isn't the same, the data are meaningless)
- Must be sensitive to growth
29. 3. Accurately identify those who are on track and those who will need more support
- We must identify struggling students BEFORE they fall too far behind.
- Good, Simmons, & Smith (1998)
30. 4. Evaluate the quality of your schoolwide instructional system
- Are 80% of your students proficient?
- Are 80% of students reaching benchmarks and on track for the next goal? (A minimal calculation is sketched below.)
- If not, then the core curriculum needs to be addressed
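A minimal sketch of the 80% check; the scores and the benchmark cut point below are invented for illustration.

```python
# Hypothetical winter screening scores and a grade-level benchmark.
scores = [48, 61, 39, 72, 55, 44, 67, 58, 70, 35]
benchmark = 50

pct_at_benchmark = 100 * sum(s >= benchmark for s in scores) / len(scores)
print(f"{pct_at_benchmark:.0f}% of students at or above benchmark")

# If fewer than 80% reach benchmark, the problem is systemic:
# examine the core (Tier 1) curriculum, not just individual students.
if pct_at_benchmark < 80:
    print("Below 80%: address the core curriculum.")
```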
31. What are Screening Tools?
- DIBELS
- Oral Reading Fluency
- Maze
- EasyCBM
- CBM Math Computation
- CBM Writing (Story Starters)
- CBM Algebra
- CBM Early Numeracy
- Quick Phonics Screener
- QRI-IV
- DRA2
- Running Records
- Report cards
- Meeting OAKS standards
- Core curriculum weekly tests on skills that are learned
32. [Image: one page of a 3-page CBM in Math Concepts and Applications (24 total blanks)]
33. Previous Year's Discipline Data
- Who needs to be on our radar from Day 1?
- Who had FBA/BSPs last year?
- Which students moved on? Which are returning this year?
- Can we get data for our incoming class and new students?
- Decision rule?
34. Progress Monitoring
35. Progress Monitoring Tools
- Brief and easy
- Sensitive to growth
- Equivalent forms
- Frequent
36.
- Where are we?
- What is our goal?
- What course should we follow?
- How are we doing?
37. Progress Monitoring: The GPS for Educators!
38. Purpose of Progress Monitoring
- Answers the question(s)
- Are the children learning?
- How can we tell?
- Are they making enough progress?
- Can we remove some of our supports?
- Do we need to change or intensify our supports?
39. How often do you progress monitor students?
- Determined by district decision rules and level of need
- Best practice recommendations:
  - Intensive: 1-2 times per week
  - Strategic: 1-2 times per month
40. How do we know if a student is making adequate progress?
- Decision rules
41. Questions to Consider
- How many data points below the line (aimline) before you make a change in instruction/intervention? (One common rule is sketched below.)
- What do you change?
  - Group size?
  - Time?
  - Curriculum?
  - Other factors?
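A minimal sketch of one common decision rule, often called the four-point rule: if the four most recent consecutive data points all fall below the aimline, change the intervention. The baseline, goal, timeline, and scores below are hypothetical.

```python
def aimline(week: int, baseline: float, goal: float,
            total_weeks: int) -> float:
    """Expected score at a given week, assuming linear growth to the goal."""
    weekly_growth = (goal - baseline) / total_weeks
    return baseline + weekly_growth * week

baseline, goal, total_weeks = 40.0, 70.0, 20
scores = {14: 52, 15: 54, 16: 53, 17: 55}  # week -> observed WCPM

# Compare each recent observation to where the aimline says it should be.
below = [score < aimline(week, baseline, goal, total_weeks)
         for week, score in sorted(scores.items())]

if len(below) >= 4 and all(below[-4:]):
    print("Four consecutive points below aimline: change the intervention.")
else:
    print("Continue the current intervention and keep monitoring.")
```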
42. Progress Monitoring
[Graph: progress monitoring data for a student receiving the Phonics for Reading intervention]
43. We do not use progress monitoring data to
- select specific short-term instructional goals
- take a lot of time away from instruction
- diagnose educational problems
- assign grades to students
- evaluate teachers
44. What are Progress Monitoring Tools?
- Progress Monitoring Tools:
  - DIBELS
  - Oral Reading Fluency
  - Maze
  - EasyCBM
  - CBM Math Computation
  - CBM Writing (Story Starters)
  - CBM Algebra
  - CBM Early Numeracy
- Not Progress Monitoring Tools:
  - Quick Phonics Screener
  - QRI-IV
  - DRA2
  - Running Records
  - Report cards
  - Meeting OAKS standards
  - Core curriculum weekly tests on skills that are learned
45.
- Progress monitoring data tell us WHEN a change is needed
- Progress monitoring data do not always tell us WHAT change is needed
46. Point Card
47. Look at the individual student graph for targeted student(s)
48. Diagnostic Assessment
- Answers the question: Why?
- WARNING! Critical thinking skills may be required
49. Collecting Diagnostic Data
- The major purpose for administering diagnostic tests is to provide information that is useful in planning more effective instruction.
- Diagnostic tests should only be given when there is a clear expectation that they will provide new information about a child's difficulties learning to read that can be used to provide more focused, or more powerful, instruction.
50. Diagnostic Assessment Questions
- Why is the student not performing at the expected level? (Defining the Problem)
- What is the student's instructional need? (Designing an Intervention)
51. Digging Deeper
- In order to be diagnostic:
  - We need to know the sequence of skill development
  - Content knowledge may need further development
52. Enabling Skills
- Enabling skills are skills that could be considered prerequisites for demonstrating proficient performance on larger assessment measures
- They represent the sub-skills of higher-order performance demonstrations
- Deficiencies in enabling skills will often result in lower performance on assessments
53. Phonemic Awareness Developmental Continuum (Vital for the Diagnostic Process!)
From hardest to easiest:
- Phoneme deletion and manipulation
- Blending and segmenting individual phonemes
- Onset-rime blending and segmentation
- Syllable segmentation and blending
- Sentence segmentation
- Rhyming
- Word comparison
[Diagram annotation: if difficulty is detected at one level, drop back to the easier skills below it.]
54. Reading diagnostic assessments may include
- In-curriculum assessments
- Quick Phonics Screener
- Weekly assessment data
- Unit and benchmark assessment data
- Survey Level Assessments
- Error Analysis or Running Records
- Any formal or informal assessment that answers the question: Why is the student having a problem?
56. Survey Level Assessment
- Start at the expected level and move backward until specific skill deficits are identified
- Match interventions to address the specific skill deficits
- Example (see the sketch after this list):
  - 2nd grade math assignment: double-digit math facts sheet (+, -, x, /); the student cannot do it
  - Progress backward in the assessment to see where the student can be successful
  - Cannot do basic facts: division, multiplication, or double-digit subtraction or addition
  - Can do single-digit addition to 5 successfully
57. Error Analysis
- Select a 250-word passage on which you estimate that the student will be 80-85% accurate.
- Record the student's errors on your copy of the reading probe.
- Use at least 25 errors for students in grade 1 to conduct an error analysis, and at least 50 errors for students in second grade and above.
- Use an error analysis sheet to conduct the error analysis (a minimal tally sketch follows).
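A minimal sketch of tallying recorded errors by category, as an error analysis sheet would; the categories and sample errors are hypothetical.

```python
from collections import Counter

# Hypothetical recorded errors: (category, what was read / intended).
errors = [
    ("substitution", "horse/house"),
    ("omission", "the"),
    ("substitution", "went/want"),
    ("insertion", "very"),
    ("substitution", "said/sad"),
]

# Tally errors by category, most frequent first.
tally = Counter(category for category, _ in errors)
for category, count in tally.most_common():
    print(f"{category}: {count}")

# Patterns in the tally (e.g., many medial-vowel substitutions)
# point to the specific skill to target in instruction.
```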
58. Error Analysis
59. We do not use diagnostic data
- for all students
- to monitor progress towards a long-term goal
- to compare students to each other
60. Outcome
- Was the goal reached?
- Oftentimes it is the same assessment as your screener
- Can be CBM, state testing (OAKS), or other high-stakes assessments
- Should be linked to district standards and benchmarks