Title: Curriculum Based Measurement (CBM) Training
1. Curriculum Based Measurement (CBM) Training
- Middle School CBM Training
2. Characteristics of General Outcome Measures (GOMs)
- Powerful measures that are
  - Simple
    - Easier to obtain data (less time and good data)
  - Accurate
    - Very specific data
  - Efficient
    - Only a few minutes to administer
  - Generalizable
  - Reliable
    - Can compare and contrast student performance across schools, districts, and the country
Adapted from www.aimsweb.com
3. General Outcome Measures (GOMs) from Other Fields
Medicine measures height, weight, temperature, and/or blood pressure. The Federal Reserve Board measures the Consumer Price Index. Wall Street measures the Dow Jones Industrial Average. Companies report earnings per share. McDonald's measures how many hamburgers they sell. In education, Curriculum Based Measurement is a General Outcome Measure.
Adapted from www.aimsweb.com
4. Using Curriculum Based Measures as General Outcome Measures
- It's about using General Outcome Measures (GOMs) for formative assessment/evaluation to
  - Inform teaching
  - AND
  - Ensure accountability.
- It's different from, but related to, summative high-stakes testing/evaluation, which
  - Doesn't inform teaching.
  - Is mostly used for accountability/motivation.
Adapted from www.aimsweb.com
5. Using Curriculum Based Measurement as a General Outcome Measure
- Universal (school-wide) screening using CBMs allows us to add systematic Formative Evaluation to current practice.
  - For Teachers (and Students)
    - Early Identification of At-Risk Students
    - Instructional Planning
    - Monitoring Student Progress
  - For Parents
    - Opportunities for Communication/Involvement
    - Accountability
  - For Administrators
    - Resource Allocation/Planning and Support
    - Accountability
Adapted from www.aimsweb.com
6. Using Curriculum Based Measurement as a General Outcome Measure: Research
- Curriculum-Based Measurement (CBM) was developed more than 20 years ago by Stanley Deno at the University of Minnesota through a federal contract to develop a reliable and valid measurement system for evaluating basic skills growth.
- CBM is supported by more than 25 years of school-based research by the US Department of Education.
- Supporting documentation can be found in hundreds of articles, book chapters, and books in the professional literature describing the use of CBM to make a variety of important educational decisions.
Adapted from www.aimsweb.com
7. Summary of Research Validating Curriculum Based Measurement
- Reliable and valid indicator of student achievement
- Simple, efficient, and of short duration to facilitate frequent administration by teachers
- Provides assessment information that helps teachers plan better instruction
- Sensitive to the improvement of students' achievement over time
- Easily understood by teachers and parents
- Improves achievement when used to monitor progress
Adapted from www.aimsweb.com
8. Curriculum Based Measurement: Advantages
- Direct measure of student performance
- Helps target specific areas of instructional need for students
- Quick to administer
- Provides visual representation (reports) of individual student progress and how classes are acquiring essential reading skills
- Sensitive to even small improvements in performance
- Capable of having many forms
- Monitoring frequently enables staff to see trends in individual and group performance and compare those trends with targets set for their students.
- Correlates strongly with best practices for instruction and assessment, and research-supported methods for assessment and intervention.
9. Curriculum Based Measurement: Things to Remember
- Designed to serve as indicators of general reading achievement; CBM probes don't measure everything, but they measure the important things.
- Standardized tests to be given, scored, and interpreted in a standard way.
- Researched with respect to psychometric properties to ensure accurate measures of learning.
- Are sensitive to improvement in brief intervals of time.
- Tell us how students earned their scores (qualitative information).
- Designed to be as short as possible to ensure do-ability.
- Are linked to decision making for promoting positive achievement and problem solving.
Adapted from www.aimsweb.com
10. Curriculum Based Measurement
- CBM has been shown to possess high levels of reliability.
  - Reliability: the extent to which the measurements of a test remain consistent over repeated tests of the same subject under identical conditions.
- 42 one-minute CBM-type assessments in reading, math, and written expression for grades K-5 were found to have reliability coefficients between .90 and .99 with just three one-minute administrations (Jenkins, 2002).
11. Curriculum Based Measurement
- Discriminant Validity: Does it appear to measure what it's supposed to measure?
  - AND
  - Doesn't associate with constructs that shouldn't be related.
- Several studies have demonstrated the ability of CBM to differentiate between students receiving special education services, students receiving Chapter 1 services, and students not receiving any of those services (Deno, Marston, Shinn, & Tindal, 1983; Marston & Deno, 1982; Shinn & Marston, 1985; Shinn, Tindal, Spira, & Marston, 1987).
12. What is Curriculum Based Measurement?
- Curriculum-based measurement
  - Data collection tools derived directly from the curriculum that the student is expected to learn
  - CBM: assessment tools created by the teacher (pull material from the class curriculum)
  - CBA: assessments pulled from a package (e.g., Skill Builders, DIBELS, AIMSweb)
13. Curriculum Based Measurement
- CBM is believed to reduce the gap between assessment and instruction
- Aids teachers in generating superior student achievement
- Improved communication
- Higher level of sensitivity
- Enhancement of the database
- Administration time is shorter
- More cost effective
14. Why Fluency Measures?
- Lots of good data can be obtained in small amounts of time
- Fluency measures are significantly related to longer tests
- Not just what you know, but how well you know it
15. Student Driver
16. Correlation Studies Looking at EOG and CBM Assessments
- EOG and ORF correlation coefficients (see the calculation sketch after this list)
  - 3rd grade: .69
  - 4th grade: .59
  - 5th grade: .53
- EOG and Maze Fluency correlation coefficients
  - 3rd grade: .61
  - 4th grade: .63
  - 5th grade: .63
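The coefficients above are correlations between students' EOG scores and their CBM scores. As a rough illustration of how such a coefficient is computed, here is a minimal Python sketch; the paired scores are hypothetical examples, not the district's actual data.

```python
# Minimal sketch: correlating paired EOG and CBM (ORF) scores.
from statistics import correlation  # Pearson correlation, Python 3.10+

eog_scores = [339, 342, 351, 348, 337, 355, 344]  # hypothetical EOG scale scores
orf_scores = [92, 104, 131, 118, 88, 140, 110]    # hypothetical words correct per minute

r = correlation(eog_scores, orf_scores)
print(f"EOG x ORF correlation: r = {r:.2f}")
```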
17. Correlation Studies Looking at District Performance on EOG and CBM Assessments
- EOG and Skill Builder Word Problem probes correlation coefficients
  - 3rd grade: .64
  - 4th grade: .49
  - 5th grade: .60
18. Cleveland County Schools EOG/CBM Data 2007
19. Local Norms for Both Elementary and Middle Schools
- Currently have CBM norms for grades K-5
- Currently gathering data from all middle schools for middle school norms
- Norm sheet handouts
20. 5th Grade End-of-Year Norms
21. Curriculum Based Measurements
- Any skill can be measured with a curriculum based measure
- Example
22. Curriculum Based Measurements at the Middle School Level
- Oral Reading Fluency
- Maze Fluency
- Math Computation
- Math Word Problems
- Written Expression
23. MAZE Fluency (Comprehension)
- Students read silently for 3 minutes from AIMSweb Standard Reading MAZE Passages
- Determine the number of correct answers
- Record the total number of correct answers followed by the total number of errors (e.g., 35/2, 45/0)
24. Student Copy
25. Examiner Copy
26. Administering the MAZE Probes
- MAZE is a standardized test.
- Procedures and directions must be uniform.
- Once students are familiar with the test
directions, the shortened familiar directions
may be used.
27. Important Points
- Administer a simple practice test to familiarize the student with the procedure.
- Attach a cover sheet to the student's probe so that the student does not begin the test prematurely.
- Monitor the student to ensure that he/she is circling the answers instead of writing them.
- Discard the MAZE passage and administer another if there are any interruptions.
28. (No Transcript)
29. (No Transcript)
30. (No Transcript)
31. Scoring MAZE
- Score MAZE probes.
- Use the answer key and put a slash (/) through incorrect words.
- Determine the number of correct answers.
- Subtract the number of incorrect answers from the total number of items attempted.
- Record the total number of correct answers and the total number of errors (e.g., 20/4, 15/0). A small calculation sketch follows below.
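The scoring rule above reduces to simple arithmetic: correct answers equal items attempted minus errors, recorded as correct/errors. A minimal sketch of that arithmetic; the function name and input format are illustrative and not part of the AIMSweb materials.

```python
def score_maze(item_results):
    """Score a MAZE probe from the attempted items, each marked True (correct) or False (error)."""
    attempted = len(item_results)
    errors = item_results.count(False)
    correct = attempted - errors           # correct = attempted - incorrect
    return f"{correct}/{errors}"           # recorded as correct/errors, e.g., 20/4

# Example: 24 items attempted, 4 of them marked as errors
print(score_maze([True] * 20 + [False] * 4))   # -> "20/4"
```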
32. Threats to Validity
- Patterns of responses that may suggest the student's performance on a MAZE probe may be invalid:
  - A high number of correct responses with a high number of errors
  - Correct beginning responses followed by many errors
  - Suspected cheating
33. Oral Reading Fluency
- Many ways to obtain data
- DIBELS is nice because it has standardized directions
- DIBELS also has standardized passages by grade level
- DIBELS only goes up to 6th grade, so we use AIMSweb standardized passages in middle school
34. DIBELS Oral Reading Fluency (DORF)
- Examiner shows the reading passage to the student. Student reads the passage.
- Score: number of words read correctly in 1 minute.
35. Oral Reading Fluency Probes: Example
Examiner Copy
Student Copy
36. Materials
- Administrator copy
- Student passage
- Clipboard
- Stopwatch
- Pen or pencil
37. Directions for Administration
- Place the scoring booklet on the clipboard and position it so that the student cannot see what you record.
- Place the reading passage in front of the student.
38. Directions
- Say these specific directions to the student:
- "Please read this (point) out loud. If you get stuck, I will tell you the word so you can keep reading. When I say 'stop' I may ask you to tell me about what you read, so do your best reading. Start here (point to first word of the passage). Begin."
39. Directions
- Start your stopwatch after the student says the first word of the passage.
- Follow along on the examiner scoring page. Put a slash (/) over words read incorrectly.
- If the student hesitates on a word for 3 seconds, supply the word for the student.
- At the end of 1 minute place a bracket (]) after the last word read, say "Stop," and stop your stopwatch.
- Record the total number of words read correctly on the bottom of the scoring page.
40. Timing Rule for DORF: Continuous for 1 Minute
- Start your stopwatch after the student says the first word.
- At the end of 1 minute place a bracket (]) after the last word read, say "Stop," and stop your stopwatch.
41. Wait Rule for DORF: 3 Seconds
- Maximum time for each word is 3 seconds.
- If the student does not read a word within 3 seconds, say the word and mark the word as incorrect.
- If necessary, indicate for the student to continue with the next word.
42. Discontinue Rule, Part I: Zero (0) Words in the First Row
- If the student does not read any words
correctly in the first row of the first
passage, discontinue administering the
passage and record a score of zero (0).
43. Discontinue Rule, Part II: Fewer than Ten (10) Words in First Passage
- If the student reads fewer than 10 words per
minute in the first passage, do not administer
the next two passages. Record the score from the
first passage.
44. Directions for Scoring
- Put a slash (/) over any word read incorrectly or omitted.
- Do not mark words read correctly or any words added or repeated.
45. Scoring Examples
- Mispronounced Words
  - A word is scored as correct if it is pronounced correctly in the context of the sentence.
  - If the word is mispronounced in the context, it is scored as an error.
46. Scoring Examples
- Numerals
- Numerals must be read correctly in the context
of the sentence.
47. Scoring Examples
- Repeated Words
- Words that are repeated are ignored in scoring.
48. Scoring Examples
- Inserted Words
  - Inserted words are ignored and not counted as errors.
  - The student does not get additional credit for inserted words.
49. Scoring Examples
- Omitted Words
- Omitted words are scored as incorrect.
50. Scoring Examples
- Word Order
- All words that are read correctly but in the
wrong order are scored as incorrect.
51. Scoring Examples
- Abbreviations
- Abbreviations should be read in the way you
would normally pronounce the abbreviation in
conversation.
52. Note
- Self Corrects
  - A word is scored as correct if it is initially mispronounced but the student self-corrects within 3 seconds.
  - Mark "SC" above the word and score as correct.
53. Note
- Articulation and Dialect
  - The student is not penalized for imperfect pronunciation due to dialect, articulation, or a different first language.
  - Example: The student consistently says /th/ for /s/ and reads "rest" as "retht."
54. Final Score: Scoring Page
- Add the number of words read correctly up to the bracket. Record the total number of words read correctly in the space provided in the lower right-hand corner of the scoring page. A sketch of the arithmetic follows below.
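The final score is simple arithmetic: the words read up to the bracket minus the words marked with a slash. A minimal sketch with illustrative numbers only; actual scoring is done by hand on the examiner page.

```python
def words_correct_per_minute(words_up_to_bracket, slashed_errors):
    """DORF final score: words attempted in 1 minute minus words marked incorrect."""
    return words_up_to_bracket - slashed_errors

# Example: the student reached the 97th word before the bracket, with 5 slashed errors
print(words_correct_per_minute(97, 5))   # -> 92 words correct per minute
```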
55. What Else Can We Tell From Oral Reading Fluency?
- Is the student highly fluent (both speed and accuracy)?
- Does the student use effective strategies to decode words?
- Does the student adjust pacing (i.e., slow down and speed up) according to the level of text difficulty?
- Does the student read with expression and attend to punctuation?
- Does the student possess prediction orientation, i.e., seem to look ahead and read at a sentence/paragraph level?
- Does the student self-correct?
- Does the student make only meaning-preservation errors?
- Does the student display automaticity on reread words?
56. Math Computation
- Using AimsWeb standardized math probes
57. (No Transcript)
58. (No Transcript)
59. CBM Procedures
- Correct Sequences for written expression (see the sketch below this list)
  - Two words form a sequence; a word and punctuation form a sequence.
  - Most words and punctuation are used twice.
- 4 minutes
  - 1 minute to think
  - 3 minutes to write/edit
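The sequence-counting rule above can be sketched in code, though only roughly: real written-expression scoring also requires judging spelling, capitalization, and grammar in context. The sketch assumes the scorer has already marked each writing unit (word or essential punctuation) as acceptable or not.

```python
def correct_writing_sequences(units):
    """Count correct sequences from a hand-scored list of writing units.
    Each unit is (text, acceptable); a sequence is two adjacent acceptable units,
    which is why most words and punctuation marks are counted twice."""
    return sum(1 for left, right in zip(units, units[1:]) if left[1] and right[1])

# Example: "The dog ran." scored as three words plus ending punctuation, all acceptable
sample = [("The", True), ("dog", True), ("ran", True), (".", True)]
print(correct_writing_sequences(sample))   # -> 3 correct sequences
```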
60. (No Transcript)
61. AIMSweb Data Management
62. AIMSweb Data Management
63. Other Types of CBMs: www.interventioncentral.org
- Website has many CBM probes available for free
- You can create multiple forms of early literacy and numeracy probes
- Many national norms available for comparison
64. Time Series Analysis Graph in Reading
[Graph: words correct per minute plotted over weeks]
65. Graph Current Status
[Graph: words correct per minute over weeks; Class = 24, Egbert = 11]
66. Determine Goal: Class = 1.5 words of growth per week; Egbert's goal = 2 words of growth per week
[Graph: words correct per minute over weeks, showing class growth, Class = 24, Egbert = 11, and Egbert's goal line]
A sketch of how a goal line is computed follows below.
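A goal line like Egbert's is just the baseline score plus the weekly growth target multiplied by the number of weeks. A minimal sketch using the numbers on this slide (baseline of 11 WCM, 2 words of growth per week); the function itself is illustrative.

```python
def goal_line(baseline_wcm, weekly_growth, weeks):
    """Expected words correct per minute at each week, from baseline to the goal date."""
    return [baseline_wcm + weekly_growth * week for week in range(weeks + 1)]

# Egbert: baseline of 11 WCM with a goal of 2 words of growth per week over 20 weeks
print(goal_line(11, 2.0, 20))   # week 0 -> 11.0, week 10 -> 31.0, week 20 -> 51.0
```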
67. Monitor Egbert's Progress Relative to Goal
[Graph: words correct per minute over weeks, showing class growth, Class = 24, Egbert = 11, and Egbert's goal line]
68. Formative Evaluation: Change Intervention
[Graph: words correct per minute over weeks, with the intervention change marked, showing class growth, Class = 24, Egbert = 11, and Egbert's goal line]
69. Continue Intervention and Monitor Progress
[Graph: words correct per minute over weeks, with the intervention change marked, showing class growth, Class = 24, Egbert = 11, and Egbert's goal line]
70. Raise Goal to 2.5 WCM Growth
[Graph: words correct per minute over weeks, with the intervention change and goal change marked, showing class growth, Class = 24, Egbert = 11, and Egbert's goal line]
71. Continue Intervention and Monitor Progress
[Graph: words correct per minute over weeks, with the intervention change and goal change marked, showing class growth, Class = 24, Egbert = 11, and Egbert's goal line]
72. Determine Goal: Class = 1.5 words of growth per week; Egberta's goal = 2 words of growth per week
[Graph: words correct per minute over weeks, showing class growth, Class = 24, Egberta = 11, and Egberta's goal line]
73. Monitor Egberta's Progress Relative to Goal
[Graph: words correct per minute over weeks, showing class growth, Class = 24, Egberta = 11, and Egberta's goal line]
74. Change Egberta's Intervention
[Graph: words correct per minute over weeks, with the intervention change marked, showing class growth, Class = 24, Egberta = 11, and Egberta's goal line]
75. Implement Revised Intervention and Continue to Monitor Progress
[Graph: words correct per minute over weeks, with the intervention change marked, showing class growth and Egberta's goal line]
76. Implement Second Intervention Revision
[Graph: words correct per minute over weeks, with the intervention change marked, showing class growth and Egberta's goal line]
77. Implement Second Intervention Revision and Monitor Results
[Graph: words correct per minute over weeks, with the intervention change marked, showing class growth]
78. Gap Not Closing: Consider Eligibility and More Intensive Interventions
[Graph: words correct per minute over weeks, with the intervention change marked, showing class growth; Class WCM = 54, Egberta WCM = 32]
A sketch of how such a gap can be projected forward follows below.
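One way to judge whether the gap is closing is to project both trajectories forward: unless the student's weekly growth exceeds the class's, the gap stays the same or widens. A rough sketch using the WCM values on this slide; the growth rates are assumptions for illustration only.

```python
def projected_gap(weeks, class_wcm, class_growth, student_wcm, student_growth):
    """Projected gap (class minus student) in words correct per minute after some weeks."""
    return (class_wcm + class_growth * weeks) - (student_wcm + student_growth * weeks)

# Class at 54 WCM growing 1.5 words/week; Egberta at 32 WCM growing 1.0 word/week (assumed)
print(projected_gap(0, 54, 1.5, 32, 1.0))    # current gap: 22.0 WCM
print(projected_gap(10, 54, 1.5, 32, 1.0))   # after 10 weeks: 27.0 WCM - the gap widens
```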
79. Measuring Behavior
- What about behavior?
- Must consider behavior difficulties just like we consider academic difficulties:
  - Environment (School and Classroom)
  - Curriculum
  - Instruction
  - Learner
- What does this remind you of?
80. Measuring Behavior
- If a behavior still exists after considering Environment, Curriculum, Instruction, and Learner, it is time to determine the FUNCTION of the behavior.
  - Functional Behavior Assessment
- Many times, the function of the behavior is related to the academic difficulties!
- Address both behavior and academics at the same time
- Problem Solving
81. Measuring Behavior
- Functional Behavior Assessment
  - Provides an operational definition of behavior
  - Identifies events that are related to the behavior
  - Identifies consequences that maintain the behavior
  - Forms a hypothesis about the function of the behavior
  - Uses direct observations to confirm the hypothesis
82. Measuring Behavior
- Functional Behavior Assessment
  - Identify Behaviors and Concerns
  - Define the Target Behavior
  - Gather Data, Direct Assessment
  - Context of the Behavior
    - Setting, Physiological, Environmental, Academics
  - Function of the Behavior
    - Attention, Self-Stimulation, Escape, Power/Control
  - Hypothesis
    - When ___ occurs, the student does ___, to get/avoid ___
83. Measuring Behavior
- How do we systematically record behavior? (CBM equivalent?)
- Identify behavior
- Structured observations with comparison peer
- In the structured observation, also include the ratio of interactions
  - 8 positive to each 1 negative
84. Measuring Behavior: Observation Recording Methods
- Event Recording
  - Can only be used for discrete behaviors (obvious beginning and end), e.g., hitting, throwing an object
  - Simple frequency count of the behavior
  - Count is made within a specified observation period (reading group, 10:00-10:30, lunch)
  - Method of choice when the objective is to increase or decrease the number of times a student engages in a discrete behavior
  - Can easily be done on a sticky note with hash marks
  - Examples: number of times Michael talks out in one hour, number of times Joe hit another student in 30 minutes (a rate-conversion sketch follows below)
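Because observation periods differ in length, event-recording counts are easier to compare when converted to a rate. A minimal sketch with hypothetical numbers.

```python
def events_per_minute(event_count, observation_minutes):
    """Convert an event-recording count to a rate so different observation periods can be compared."""
    return event_count / observation_minutes

# Example: Michael talks out 12 times during a 30-minute reading group
print(events_per_minute(12, 30))   # -> 0.4 talk-outs per minute
```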
85. Measuring Behavior: Observation Recording Methods
- Interval Recording
  - A way of recording an estimate of the actual number of times a behavior occurs. Continuous behaviors are better tracked with interval recording:
    - Behaviors that occur at high frequency
    - Behaviors that occur for extended time periods
  - How? Define a specific time period and divide it into equal intervals (e.g., 10 seconds)
  - Record a plus (+) if the behavior occurred at any time during the interval and a minus (-) if the behavior did not occur (a summary sketch follows below)
  - Limitations
    - Actual number of occurrences is not included
    - Difficult to teach a class and conduct this method
    - Difficult to have a comparison student
86. Measuring Behavior: Observation Recording Methods
- Time Sampling
  - Set period of time at intervals (e.g., 15 minutes at 10-second intervals)
  - Note with a plus (+) or minus (-) whether the behavior is happening at the end of the interval
  - Suitable for behaviors that are long in duration and for behaviors that happen with high frequency
  - Can use a comparison student
  - Expressed in terms of percentage
87. Measuring Behavior: Observation Recording Methods
- Duration Recording
  - Focus is on measures of time rather than instances of behavior
  - Used when the concern is the length of time a student engages in a behavior
  - Suitable for discrete behaviors
  - Can be used when event recording does not give the whole picture (length of time student is out of seat)
88. Measuring Behavior: Observation Recording Methods
- Latency Recording
  - Used when the primary concern is how long a student takes to begin performing a behavior once it has been requested
  - Measures the length of time between the presentation of an antecedent stimulus and the initiation of the behavior
89. Measuring Behavior: The Daily Behavior Report Card
- DBRCs have been referred to under a number of different titles, including home notes (Blechman, Schrader, & Taylor, 1981), home-based reinforcement (Bailey, Wolf, & Phillips, 1970), daily report cards (Dougherty & Dougherty, 1977), and home-school notes (Long & Edwards, 1994).
- Within the literature on DBRCs, a consistent description or definition has not evolved, and a variety of options exist when creating a daily rating card.
90. Measuring Behavior: The Daily Behavior Report Card
Thanks to Chris Reilly-Tillman, ECU; interventioncentral.org
- While a common definition or title has not emerged, common characteristics across DBRCs can be identified. These characteristics include:
  - A behavior(s) is specified,
  - Rating of the behavior(s) occurs at least daily,
  - Obtained information is shared across individuals (e.g., parents, teachers, students), and
  - The card is used to monitor the effects of an intervention and/or as a component of an intervention.
91. Measuring Behavior: The Daily Behavior Report Card
Thanks to Chris Reilly-Tillman, ECU; interventioncentral.org
- DBRCs are intuitively appealing to educators, as they can provide a simple, inexpensive, and flexible method of providing frequent feedback to students and parents.
- DBRCs require only minor changes in existing classroom practices.
- DBRCs are effective at monitoring behavior changes.
- DBRCs can potentially serve a dual role as both a monitoring device and an intervention component.
- Another related reason for the appeal of DBRCs relates to the home/school orientation to intervention and data collection.
92. Measuring Behavior: The Daily Behavior Report Card
- www.interventioncentral.org
93. Questions?