Title: Benchmark Screening: What, Why and How
1. Benchmark Screening: What, Why and How
- A module for pre-service and in-service professional development - MN RTI Center
- Author: Lisa H. Stewart, PhD
- Minnesota State University Moorhead
- www.scred.k12.mn.us (click on RTI Center)
2. MN RTI Center Training Modules
- This module was developed with funding from the MN legislature
- It is part of a series of modules available from the MN RTI Center for use in preservice and inservice training
3. Overview
- This module is Part 1 of 2
- Module 1: Benchmark Screening: What, Why and How
  - What is screening?
  - Why screen students?
  - Criteria for screeners/what tools?
  - Screening logistics
- Module 2: Using Benchmark Screening Data
4. Assessment: One of the Key Components in RTI
[Diagram of four key RTI components: Curriculum and Instruction; Assessment; School-Wide Organization; Problem-Solving Systems (teams, process, etc.)]
Adapted from Logan City School District, 2002
5. Assessment and Response to Intervention (RTI)
- A core feature of RTI is identifying a measurement system
  - Screen large numbers of students
  - Identify students in need of additional intervention
  - Monitor students of concern more frequently
    - 1 to 4x per month
    - Typically weekly
  - Diagnostic testing used for instructional planning to help target interventions as needed
6. Why Do Screening?
- Activity:
  - What does it mean to screen students?
  - Why is screening so important in a Response to Intervention system? (e.g., what assumptions of RTI require a good screening system?)
  - What happens if you do NOT have an efficient, systematic screening system in place in the school?
7. Screening Is Part of a Problem-Solving System
- Helps identify students at risk in a PROACTIVE way
- Gives feedback to the system about how students progress throughout the year at a gross (3x per year) level
  - If students are on track in the fall, are they still on track in the winter?
  - What is happening with students who started the year below target? Are they catching up?
- Gives feedback to the system about changes from year to year
  - Is our new reading curriculum having the impact we were expecting?
8. What Screening Looks Like in a Nutshell
- School decides on brief tests to be given at each grade level and trains staff in the administration, scoring, and use of the data
- Students are given the tests 3x per year (Fall, Winter, Spring)
- A person or team is assigned in each building to organize data collection
- All students are given the tests for their grade level within a short time frame (e.g., 1-2 weeks or less); some tests may be group administered, others are individually administered
- Benchmark testing takes about 5 minutes per student, desk to test (individually administered)
- Administered by special ed, reading, or general ed teachers or paras
- Entered into a computer/web-based reporting system by clerical staff
- Reports show the spread of student skills and list student scores, etc., to use in instructional and resource planning
9. Example Screening Data: Spring Gr 1 Oral Reading Fluency
- 10/51 (20%) high risk
- 22/51 (43%) some risk
- 19/51 (37%) low risk (on or above target)
Class lists then identify the specific students (and scores) in each category.
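The category percentages above are simple proportions of the 51 students screened. A minimal sketch (using the counts from this slide) of how such a summary report is computed:

```python
# Counts from the slide: 51 first graders screened in spring.
counts = {"high risk": 10, "some risk": 22, "low risk": 19}
total = sum(counts.values())  # 51 students

for label, n in counts.items():
    pct = round(100 * n / total)          # percent of students screened
    print(f"{n}/{total} ({pct}%) {label}")
```

Rounding to whole percents reproduces the 20% / 43% / 37% figures on the slide.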
10. Screening Data
- Gives an idea of what the range of student skills is like in your building and how much growth students are making over time
11. Screening Data Can Be Linked to Progress Monitoring
- The goal is to have a cohesive system.
- If possible, use the same measures for both screening and progress monitoring (e.g., CBM).
[Diagram labels: Screen ALL students 3x per year (F, W, S); Strategic Support and Monitoring for Students at Some Risk; Intensive Support and Monitoring for Students at Extreme Risk]
12. A Smart System Structure
School-Wide Systems for Student Success
- Intensive, Individual Interventions
  - Individual students
  - Assessment-based
  - Intense, durable procedures
[Pyramid graphic; percentage labels (5-10%, 10-15%) for the tiered groups]
13. Terminology Check
- Screening: collecting data on all or a targeted group of students in a grade level or in the school
- Universal Screening: same as above, but implies that all students are screened
- Benchmarking: often used synonymously with the terms above, but typically implies universal screening done 3x per year, with data interpreted using criterion target or benchmark scores
14. Benchmark Screening
- Schools typically use cutoff or criterion scores to decide whether a student is at risk. Those scores or targets are also referred to as benchmarks, thus the term benchmarking.
- Some states or published curricula also use the term benchmarking, but in a different way (e.g., to refer to the documentation of achieving a specific state standard) that has nothing to do with screening.
15. What to Measure for Screening? Create a Measurement Net
16. How Do You Decide What Measures to Use for Screening?
- There are lots of ways to measure reading in the schools:
  - Measures of Academic Progress (MAP)
  - Guided Reading (Leveled Reading)
  - Statewide accountability tests
  - Published curriculum tests
  - Teacher-made tests
  - General Outcome Measures (Curriculum-Based Measurement family)
  - STAR Reading
  - Etc.
- Not all of these are appropriate. Some are not reliable enough for screening; others are designed for another purpose and are not valid or practical for screening all students 3x per year.
17. Characteristics of an Effective Measurement System for RTI
- Valid, reliable, simple, quick, inexpensive, easily understood, can be given often, sensitive to growth over short periods of time
Credit: K. Gibbons, M. Shinn
18. Effective Screening Measures
- Specific: identifies at-risk students who really are at risk
- Sensitive: students who pass really do go on to do well
- Practical: brief and simple (cheap is nice too)
- Do no harm: if a student is identified as at risk, will they get help, or is it just a label?
Reference: Hughes & Dexter, RTI Action Network
19. Buyer Beware!
- Many tools may make claims about being a good screener.
20. Measurement and RTI Screening
- Reliability coefficients of at least r = .80; higher is better, especially for screening specificity
- Well-documented predictive validity
- Evidence that the criterion (cut score) being used is reasonable and does not create too many false positives (students identified as at risk who aren't) or false negatives (students who are at risk but aren't identified as such)
- Brief, easy to use, affordable, and results/reports are accessible almost immediately
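The false-positive/false-negative language above maps onto the standard sensitivity and specificity calculations. A minimal sketch with hypothetical scores and a hypothetical cut score (none of these numbers come from the module):

```python
# Hypothetical data: each student has a fall screening score and a
# later outcome (True = ended the year below target, i.e., truly at risk).
students = [
    (31, True), (45, True), (52, False), (58, True), (64, False),
    (70, False), (72, False), (85, False), (40, True), (66, True),
]
CUT = 60  # hypothetical cut score: below 60 is flagged "at risk"

flagged = [truly for score, truly in students if score < CUT]
passed = [truly for score, truly in students if score >= CUT]

true_pos = sum(flagged)               # flagged and truly at risk
false_pos = len(flagged) - true_pos   # flagged, but not at risk
false_neg = sum(passed)               # missed: at risk, but passed the screen
true_neg = len(passed) - false_neg    # passed and not at risk

sensitivity = true_pos / (true_pos + false_neg)  # at-risk students caught
specificity = true_neg / (true_neg + false_pos)  # not-at-risk students cleared
print(f"sensitivity={sensitivity:.2f} specificity={specificity:.2f}")
```

Moving the cut score trades one error type for the other, which is why the evidence behind a published cut score matters.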
21. National Center on RTI Review of Screening Tools
Note: The center only reviews tests that are submitted; if a tool is not on the list, it doesn't mean it is bad, just that it wasn't reviewed. www.rti4success.org
22. RTI, General Outcome Measures, and Curriculum-Based Measurement
- Many schools use Curriculum-Based Measurement (CBM) general outcome measures for screening and progress monitoring
- You don't have to use CBM, but many schools do
- The most common CBM tool in Grades 1-8 is Oral Reading Fluency (ORF)
  - A measure of reading rate (number of words correct per minute on a grade-level passage) and a strong indicator of overall reading skill, including comprehension
- Early literacy measures are also available, such as Nonsense Word Fluency (NWF), Phoneme Segmentation Fluency (PSF), Letter Name Fluency (LNF), and Letter Sound Fluency (LSF)
23. Why GOMs/CBM?
- Typically meet the criteria needed for RTI screening and progress monitoring
  - Reliable, valid, specific, sensitive, practical
- Also some utility for instructional planning (e.g., grouping)
- They are INDICATORS of whether there might be a problem, not diagnostic!
  - Like taking your temperature or sticking a toothpick into a cake
- Oral reading fluency is a great INDICATOR of reading decoding, fluency, and reading comprehension
- Fluency-based because automaticity helps discriminate between students at different points of learning a skill
24. GOM / CBM / DIBELS / AIMSweb
[Diagram relating General Outcome Measures, CBM, and the DIBELS and AIMSweb tool families]
DRAFT May 27, 2009
25. CBM Oral Reading Fluency
- Give 3 grade-level passages using standardized administration and scoring; use the median (middle) score
- 3-second rule (tell the student the word, point to the next word)
- Discontinue rule (discontinue after 0 correct in the first row; if <10 correct on the 1st passage, do not give the other passages)
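The median-score rule above can be sketched in a few lines. This is only an illustration of the scoring logic with hypothetical scores, not an official CBM implementation:

```python
# Illustrative ORF scoring: median of three passage scores, with the
# discontinue rule (if fewer than 10 words correct on the 1st passage,
# the other passages are not given and the 1st score stands).
def orf_benchmark_score(passage_scores):
    """passage_scores: words correct per minute, in the order administered."""
    first = passage_scores[0]
    if first < 10:                        # discontinue rule
        return first
    return sorted(passage_scores)[1]      # median (middle) of the 3 scores

print(orf_benchmark_score([42, 55, 48]))  # median of three passages -> 48
print(orf_benchmark_score([6]))           # discontinued after passage 1 -> 6
```

Using the median rather than the mean keeps one unusually easy or hard passage from distorting the benchmark score.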
26. Fluency and Comprehension
- The purpose of reading is comprehension.
- A good measure of overall reading proficiency is reading fluency, because of its strong correlation with measures of comprehension.
27. Screening Logistics
- What materials?
- When to collect?
- Who collects it?
- How to enter and report the data?
28. What Materials?
- Use a computer- or PDA-based testing system, OR
- Download reading passages, early literacy probes, etc. from the internet
  - Many sources of CBM materials are available free or at low cost: AIMSweb, DIBELS, Edcheckup, etc.
  - Often organized as booklets for ease of use
  - Can use plastic covers and markers for scoring to save copy costs
29. Screening Materials in K and Gr 1
- Screening measures will change slightly from Fall to Winter to Spring
  - Early literacy subskill measurement is dropped as reading develops
- Downloaded materials and booklets
30. K and Gr 1 Measures: AIMSweb Early Literacy and R-CBM (ORF)
[Color-coded chart: General Literacy Risk Factor = black; Alphabetic Principle = green; Phonemic Awareness = purple; Vocabulary = blue; Fluency with Connected Text / Comprehension = red]
31. Gr 2 to 12: AIMSweb Early Literacy and CBM Measures
32. Screening Logistics: Timing
- Typically 3x per year: Fall, Winter, Spring
- Have a district-wide testing window! (all grades and schools collect data within the same 2-week period)
- In Fall, kindergarten programs sometimes either test right away and again a month later, or wait a little while before testing
- Benchmark testing takes about 5 minutes per student (individually administered)
  - In the classroom
  - In stations in a commons area, lunchroom, etc.
33. Screening Logistics: People
- Administered by trained staff: paras, special ed teachers, reading teachers, general ed teachers, school psychologists, speech-language clinicians, etc.
- Good training is essential!
- A measurement person is assigned in each building to organize data collection
- Data are either collected electronically or entered into a web-based data management tool by clerical staff
34. Screening Logistics: Math Quiz
- If you have a classroom with 25 students, and administering the screening measures takes approx. 5 min. per student (individual assessment time)...
- How long would it take 5 people to screen the entire classroom?
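For the presenter: one way to work the arithmetic behind this quiz (total testing time split across testers, ignoring transition time between students):

```python
# Working the slide's numbers: 25 students x 5 minutes each, 5 testers.
students = 25
minutes_per_student = 5
testers = 5

total_minutes = students * minutes_per_student  # 125 minutes of testing
minutes_per_tester = total_minutes / testers    # 25 minutes per tester
print(f"{total_minutes} minutes of testing; "
      f"about {minutes_per_tester:.0f} minutes with {testers} testers")
```

With 5 trained testers working in parallel, a classroom is screened in roughly 25 minutes, which is why sharing the data-collection load matters.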
35. Remember: Garbage In, Garbage Out
- Make sure your data are reliable and valid indicators, or they won't be good for nuthin'
- Training
- Assessment integrity checks/refreshers
- Well-chosen tasks/indicators
36. Use Technology to Facilitate Screening
37. Using Technology to Capture Data
- Collect the data using technology such as a PDA
  - Example: http://www.wirelessgeneration.com/
  - http://www.aimsweb.com
- Students take the test on a computer
  - Example: STAR Reading, http://www.renlearn.com/sr/
38. Using Technology to Organize and Report Data
- Enter data into a web-based data management system
- Data get back into the hands of teachers and teams quickly, in meaningful reports for problem solving
- Examples:
  - http://dibels.uoregon.edu
  - http://www.aimsweb.com
  - http://www.edcheckup.com
39. Screening Is Just One Part of an Overall Assessment System for Making Decisions
40. Remember: Screening Is Part of a Problem-Solving System
- Helps identify students at risk in a PROACTIVE way
- Gives feedback to the system about how students progress throughout the year at a gross (3x per year) level
  - If students are on track in the fall, are they still on track in the winter?
  - What is happening with students who started the year below target? Are they catching up?
- Gives feedback to the system about changes from year to year
  - Is our new reading curriculum having the impact we were expecting?
41. Build in Time to USE the Data!
Schedule data retreats or grade-level meeting times immediately after screening so you can look at and USE the data for planning.
42. Common Mistakes
- Not enough professional development and communication about why these measures were picked, what the scores do and don't mean, the rationale for screening, etc.
- Low or questionable quality of administration and scoring
- Too much reliance on a small group of people for data collection
- Teaching to the test
- Limited sample of students tested (e.g., only Title students!)
- Slow turnaround on reports
- Data are not used
43. Using Screening Data: See Module 2!
44. Articles Available with This Module
- Stewart & Silberglitt (2008). Best practices in developing academic local norms. In A. Thomas & J. Grimes (Eds.), Best Practices in School Psychology V (pp. 225-242). NASP Publications.
- NRCLD RTI Manual (2006). Chapter 1: School-wide screening. Retrieved 6/26/09 from http://www.nrcld.org/rti_manual/pages/RTIManualSection1.pdf
- Jenkins & Johnson. Universal screening for reading problems: Why and how should we do this? Retrieved 6/23/09 from the RTI Action Network site: http://www.rtinetwork.org/Essential/Assessment/Universal/ar/ReadingProblems
- Kovaleski & Pederson (2008). Best practices in data analysis teaming. In A. Thomas & J. Grimes (Eds.), Best Practices in School Psychology V. NASP.
- Ikeda, Neessen, & Witt (2008). Best practices in universal screening. In A. Thomas & J. Grimes (Eds.), Best Practices in School Psychology V (pp. 103-114). NASP Publications.
- Gibbons, K. (2008). Necessary Assessments in RTI. Retrieved 6/26/09 from http://www.tqsource.org/forum/documents/GibbonsPaper.doc
45. RTI-Related Resources
- National Center on RTI: http://www.rti4success.org/
- RTI Action Network (links for Assessment and Universal Screening): http://www.rtinetwork.org
- MN RTI Center: http://www.scred.k12.mn.us/ (click on the RTI Center link)
- National Center on Student Progress Monitoring: http://www.studentprogress.org/
- Research Institute on Progress Monitoring: http://progressmonitoring.net/
46. RTI-Related Resources (Cont'd)
- National Association of School Psychologists: www.nasponline.org
- National Association of State Directors of Special Education (NASDSE): www.nasdse.org
- Council of Administrators of Special Education: www.casecec.org
- Office of Special Education Programs (OSEP) toolkit and RTI materials: http://www.osepideasthatwork.org/toolkit/ta_responsiveness_intervention.asp
47. Key Sources for Reading Research, Assessment, and Intervention
- University of Oregon IDEA (Institute for the Development of Educational Achievement) Big Ideas of Reading site: http://reading.uoregon.edu/
- Florida Center for Reading Research: http://www.fcrr.org/
- Texas Vaughn Gross Center for Reading and Language Arts: http://www.texasreading.org/utcrla/
- American Federation of Teachers reading resources (What Works 1999 publications): http://www.aft.org/teachers/pubs-reports/index.htm (reading)
- National Reading Panel: http://www.nationalreadingpanel.org/
48. Recommended Sites with Multiple Resources
- Intervention Central, by Jim Wright (school psychologist from central NY): http://www.interventioncentral.org
- Center on Instruction: http://www.centeroninstruction.org
- St. Croix River Education District: http://scred.k12.mn.us
49. Quiz
- 1.) A core feature of RTI is identifying a(n) _________ system.
- 2.) Collecting data on all or a targeted group of students in a grade level or in the school is called what?
  - A.) Curriculum
  - B.) Screening
  - C.) Intervention
  - D.) Review
50. Quiz (Cont'd)
- 3.) What is a characteristic of an efficient measurement system for RTI?
  - A.) Valid
  - B.) Reliable
  - C.) Simple
  - D.) Quick
  - E.) All of the above
51. Quiz (Cont'd)
- 4.) Why screen students?
- 5.) Why would general education teachers need to be trained on the measures used if they aren't part of the data collection?
52. Quiz (Cont'd)
- 6.) True or False? If possible, the same tools should be used for screening and progress monitoring.
- 7.) List at least 3 common mistakes when doing screening and how they can be avoided.
53. The End
- Note: The MN RTI Center does not endorse any particular product. Examples used are for instructional purposes only.
- Special Thanks:
  - Thank you to Dr. Ann Casey, director of the MN RTI Center, for her leadership.
  - Thank you to Aimee Hochstein, Kristen Bouwman, and Nathan Rowe, Minnesota State University Moorhead graduate students, for editing work, writing quizzes, and enhancing the quality of these training materials.