Ohio Integrated Systems Model: Intensive Supports
1
Ohio Integrated Systems Model: Intensive Supports
Adapted from an ODE Presentation
A Statewide Model to Close the Achievement Gap
for Students with Disabilities and Other At-Risk
Learners
2
Outline
  • Characteristics and Examples of Intensive Supports and Interventions
  • Culturally Responsive Collaborative Problem
    Solving for Individual Students
  • Involving students in the intervention
  • Applying Elements of Effective Instruction at the
    Intensive Tier
  • Introduction of the Response-To-Intervention
    (RTI) Model for Special Education Determination
  • Using Reading and Behavioral Data for IEP Goal
    Setting and Progress Monitoring

3
Ohio Integrated Systems Model for Academic and
Behavior Supports
1-5% Intensive Individualized Interventions
Decisions about tiers of support are data-based
Adapted from OSEP Effective School-Wide
Interventions
4
The Purpose of the Intensive Tier
  • To provide continued and sustained support for children who do not adequately progress with targeted supports, or whose initial assessment data indicate the need for support at all 3 levels
  • Data indicate that students may need ongoing,
    substantial support to make progress

5
Instruction at the Intensive Tier
  • Time and Opportunities to Practice
  • Reading: Daily sessions in small groups with substantial opportunities to practice
  • Behavior: Increased opportunities to receive explicit instruction, reinforcement, and practice in varied settings
  • Curriculum: Research-based, intensive, sustained intervention; carefully designed, explicit, systematic

6
Instruction at the Intensive Tier (cont.)
  • Instructor: Highly skilled, trained interventionist
  • Group size: Small group (1:3 or 1:1)
  • Progress monitoring: Once a week on the specific skill
  • Regularly scheduled team meetings to review progress

7
Collaborative Problem Solving at the Intensive
Tier
  • Occurs at the:
  • School Level: Systems planning
  • Student Level: Functional assessments (behavior or academic)

8
Tasks for School Planning Team (Intensive Tier)
  • The Building Leadership Team must consider the
    extent to which the following already occurs
  • Effective core curriculum and school-wide social
    skill instruction
  • A continuum of intervention support available
    with increasing intensity
  • A timely, systematic process in place so that team-based collaborative problem solving can occur for students needing individualized supports
  • The use of brief, repeatable, formative
    assessments (school-wide data)

9
Tasks for School Planning Team (Intensive Tier)
  • The Building Leadership Team must consider the
    extent to which the following already occurs
  • Integrated, flexible general and special
    education service delivery
  • Instructional support decisions based on reliable
    and valid data
  • A process to maximize school resources directed
    toward prevention
  • If not... there's remedial work to be done

10
Collaborative Problem Solving (Intensive Tier) - School Level -
  • Selecting Students for Intensive Supports by
  • Examining school-wide DIBELS/SWIS data to
    identify students falling within the intensive
    range of need
  • Examining performance data of students who do not
    make adequate progress with targeted supports

11
Collaborative Problem Solving
12
Collaborative Problem Solving (Intensive Tier) - Student Level -
  • The Individual Student Problem Solving Team must consider the following:
  • Team membership
  • Role of team members
  • Team members' knowledge and use of the collaborative problem solving process
  • Family involvement

13
Forming the Team
  • Family members
  • Student
  • Teachers
  • Support Staff
  • Specialist(s) as needed
  • Counselor
  • Nurse
  • Mental Health
  • Speech Language Pathologist
  • Reading
  • ESL
  • Interpreter
  • Administrator(s)
  • Peers/Friends
  • Community agency support

14
The Individual Student Problem Solving Team
must consider
  • Culturally Responsive Practice
  • Making school norms and hidden rules overt to
    everyone on the team including family members
  • Involving someone who is knowledgeable of both the school culture and the student's home culture, including the home language, to facilitate understanding across family members and school personnel
  • Ensuring that family input is sought and
    decisions are socially valid to school and family
    members

15
Collaborative Problem Solving
Problem Definition: What is the problem?
Problem Analysis: Why is the problem occurring?
Goal Setting: What do we want to see happening, and by when?
Intervention Planning & Implementation: What is our plan for instruction and supports?
Plan Evaluation: Did our plan work?
16
The Collaborative Problem Solving Process is
Functional Assessment
  • Functional assessment is a process for collecting
    meaningful, relevant information about student
    behavior in order to answer specific questions
    about student academic or behavior functioning in
    a particular setting.
  • This information is used to plan effective
    individualized interventions, to make specific
    educational decisions, and to write appropriate
    goals.
  • http://cecp.air.org/fba/default.asp

17

Step 1: Problem Definition - What is the problem?
  • Questions & Tasks
  • Identify concerns related to the problem
    situation
  • Logically group concerns and prioritize top
    concern
  • Gather baseline information using direct
    assessment tools
  • Define priority concern in terms of the
    difference between expected and actual
    performance, in observable, measurable terms

18
Pitfalls to avoid when defining the problem
  • Admiring the problem
  • Talking in vague or general terms
  • Using jargon or labels
  • Jumping to solutions
  • Not collecting baseline data

19
Characteristics of Direct Assessment Tools
  • Linked to the curriculum and instruction
  • Direct measures of observable behavior
  • Repeatable
  • Brief
  • Focused on the individual in context
  • Used in the natural environment
  • Reduce error associated with high-inference assessment tools
  • Includes physical setting, instruction, peers,
    etc.

20

Step 1: Problem Definition - Tools
  • Direct observation
  • Interviews
  • Checklists
  • Curriculum standards
  • DIBELS/CBM
  • Discipline Referrals

21
Graphing Student Data is an Essential Part of
Collaborative Problem Solving
  • Graphs allow teams to
  • Visually inspect data trends across time
  • Draw conclusions about student performance
  • Make timely decisions about instruction

22
Setting Up A Graph to Display Baseline Data
  • Horizontal axis = time
  • Vertical axis = behavior
  • Plot the baseline data
  • Separate baseline from intervention
  • Plot the goal
  • Draw the goal line
  • Plot the data during intervention

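The steps above can be sketched with matplotlib (all data values here are hypothetical; the Agg backend is used so no display is required):

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen; no display needed
import matplotlib.pyplot as plt

baseline = [18, 20, 19]                # words correct per minute (hypothetical)
intervention = [22, 25, 24, 28, 30, 33]
goal = 45                              # end-of-period goal (hypothetical)
weeks = list(range(1, len(baseline) + len(intervention) + 1))

fig, ax = plt.subplots()
ax.set_xlabel("Week")                      # horizontal axis = time
ax.set_ylabel("Words correct per minute")  # vertical axis = behavior
ax.plot(weeks[:3], baseline, "o-", label="Baseline")       # plot baseline data
ax.axvline(x=3.5, linestyle=":", color="gray")             # separate baseline from intervention
ax.plot(weeks[3:], intervention, "o-", label="Intervention")
ax.plot(weeks[-1], goal, "s", label="Goal")                # plot the goal
ax.plot([weeks[3], weeks[-1]], [baseline[-1], goal], "--",
        label="Goal (aim) line")                           # draw the goal line
ax.legend()
fig.savefig("progress_graph.png")
```

The vertical dotted line is the conventional phase-change marker separating baseline from intervention data.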
23
(No Transcript)
24
(No Transcript)
25
Example Problem Definition Statement for the Intensive Tier
  • Sean is reading 20 words correct per minute in
    third grade material. It is expected that third
    graders read 70-110 words correct per minute in
    third grade material.

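A problem definition like Sean's quantifies the gap between expected and actual performance. A minimal sketch using the numbers from the slide (the function name is illustrative):

```python
def discrepancy_ratio(expected: float, actual: float) -> float:
    """Expected performance divided by actual performance."""
    return expected / actual

# Sean reads 20 WCPM; the third-grade expectation is 70-110 WCPM.
actual = 20
expected_low, expected_high = 70, 110

ratio = discrepancy_ratio(expected_low, actual)
print(f"Sean is {ratio:.1f}x below the low end of expectations")  # 3.5x
```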
26

Step 2: Problem Analysis - Why is the problem occurring?
  • Questions & Tasks
  • What factors are contributing to the mismatch
    between actual and desired performance?
  • Consider issues related to the student,
    instructional strategies, curriculum, and factors
    in the physical or social environment.
  • Focus on variables directly relevant to the
    problem situation and intervention planning.
  • Discuss the factors that can be changed

27

Step 2: Problem Analysis - Why is the problem occurring?
  • Questions & Tasks (cont'd)
  • To what extent is there a skill deficit and/or a performance deficit?
  • What happens before (antecedents) and after (consequences)?
  • What seems to trigger the problem?
  • Examine situations when the problem occurs and situations when the problem doesn't occur or gets better.
  • Is this a problem for multiple students?

28

Step 2: Problem Analysis - Why is the problem occurring?
  • Questions & Tasks (cont'd)
  • What function does the problem behavior serve or
    what is sustaining the behavior?
  • Does performing the behavior require prerequisite
    skills that the student may not have?
  • Based on information gathered to address these questions, generate hypotheses about why the problem is occurring or what it would take to diminish the problem.
  • Your hypothesis is based on the function of the
    behavior.
  • The hypothesis directs you towards the
    intervention.

29
A hypothesis is
  • A data-based guess about why the problem is
    occurring.
  • An informed prediction about the actions or
    strategies that will likely resolve the problem
    situation.

30
THE HYPOTHESIS LINKS DIRECTLY TO INTERVENTION
Functions of Social Behavior and Matched Interventions:
  • Escape or Avoidance: Teach coping skills
  • Social Attention: Attention for appropriate behavior / teach skills
  • Control: Structure environment and provide choice
  • Tangible Reinforcement: Reinforce other behaviors
  • Sensory/Neurological: Teach replacement behaviors
31
Problem Behaviors are Typically Sustained by
  • Performance deficit or lack of motivation
  • AND/OR
  • Skill deficit including lacking necessary
    prerequisite skills

32
THE HYPOTHESIS LINKS DIRECTLY TO INTERVENTION
Academic Functions and Matched Interventions:
  • Has the skill but doesn't use it: Motivate
  • Accurate in skill but not fluent: Increase opportunities to practice / structured tasks
  • Moderately accurate: Guided practice with feedback
  • Can only perform the skill in limited situations: Guided practice / self-monitoring
  • Doesn't know the skill: Explicit instruction / frequent practice at level
33
Developing Reading Intervention Plans Involves
Assessing Where to Begin Instruction
  • Determine lowest skill deficit.
  • Start instruction at lowest skill.
  • Begin instruction on next sequential skill as
    soon as skill is mastered.

34
Determining Reading Skill Deficit(s)
  • Determine lowest skill deficit by examining
    current and past DIBELS benchmark data or
    assessing backwards to identify the lowest areas
    of skill deficit.
  • When necessary, gather any additional information about the student's reading performance to develop a hypothesis about where to begin instruction and progress monitoring.
  • Verify the skill deficit(s) by re-assessing the specific skill area(s) identified as low with DIBELS.

35
DIBELS Steppingstones to Reading
36
Core Components of Reading Instruction Across the 3 Tiers
  • 1. Phonemic Awareness: The ability to hear and manipulate sounds in words.
  • 2. Alphabetic Principle (Phonics): The ability to associate sounds with letters and use these sounds to read words.
  • 3. Fluency: The effortless, automatic ability to read words in isolation (orthographic reading) and in connected text.
  • 4. Vocabulary Development: The ability to understand (receptive) and use (expressive) words to acquire and convey meaning.
  • 5. Reading Comprehension: The complex cognitive process involving the intentional interaction between reader and text to extract meaning.

37
Step 2: Problem Analysis - Pitfalls
  • Viewing the problem as internal to the child
  • Failing to examine situations in which the
    problem does not occur
  • Making hypotheses about variables that cannot be changed
  • Jumping to solutions

38
Step 2: Problem Analysis - Tools
  • Gathering information from the team
  • Direct observation (ABC)
  • Teacher report or teacher collected data
  • Examination of work samples
  • DIBELS/CBM
  • Behavioral rating scales
  • Structured Interviews

39
Example Hypothesis Statement
  • Sean is reading 20 words correct per minute in third grade material because he is not fluent in associating sounds with letters and using these sounds to read words. He read 30 letter sounds per minute on the Nonsense Word Fluency measure.
  • If Sean receives increased explicit instruction in the alphabetic principle, along with frequent monitoring and practice opportunities, then he will increase his reading fluency.

40
Step 3: Goal Setting - What do we want to see happen, and by when?
  • Questions & Tasks
  • Consider what you want to see instead of what is happening now. How much of a change is desired?
  • Make sure goals are measurable and linked to baseline data.
  • Consider typical peer performance or a
    performance standard.
  • State the goals for the behavior in terms of what
    you want to see, not exclusively what you want
    eliminated.

41
Step 3: Goal Setting - Pitfalls
  • Goal not linked to target behavior
  • Stating the goals only in eliminating terms
    rather than what is desired
  • Stating a goal that is overly ambitious or not
    ambitious enough
  • Not using an appropriate performance standard

42
Step 3: Goal Setting - Tools
  • Academic content standards
  • Building or class-wide rules
  • CBM instructional guidelines
  • DIBELS benchmarks
  • Typical peer performance
  • Student's past performance
  • Parent or teacher expectations

43
Step 4: Intervention - What instruction and supports will we provide to help the student?
  • Questions and Tasks
  • What can we do to help this learner meet the
    goal(s)?
  • Generate a list of strategies that address the hypothesis.
  • Select strategies to evaluate for possible use.
  • Determine if the selected strategies are research validated and acceptable to team members.
  • Write down the specific steps for implementing strategies selected by the team (e.g., who, what, when, how).

44
Step 4: Intervention - What instruction and supports will we provide to help the student?
  • Questions and Tasks
  • How will we monitor progress toward the goal?
  • How will we make sure the intervention plan is
    implemented as designed?
  • What supports do we need to provide implementers?

45
Step 4: Intervention - Tools
  • Behavior and Reading Intervention Plans
  • Implementation integrity or verification
    checklists/scripts
  • Progress Monitoring
  • Monitor progress using the same assessment tools
    used to identify baseline levels of performance.
  • DIBELS/CBM
  • ORD, Sociograms, etc.

46
Planning for Progress Monitoring in Reading
  • Weekly monitoring of lowest skill deficit
    identified.
  • Periodic checks of progress towards grade-level
    benchmarks
  • Once goal is met, begin instruction and
    monitoring on next sequential skill

47
  • Remember: intervention plans should be based on sound and valid research

48
All Intervention Plans Should Include the
Elements of Strong Instruction
  • Teach Academic or Social Skill(s):
  • Model correct performance
  • Increase practice
  • Reinforce correct performance
  • Correct errors immediately
  • Share clear goals and progress monitoring information with the student
  • Motivation/Performance:
  • Share clear goals and progress monitoring information with the student
  • Set up successes
  • Provide positive consequences for performance
  • Provide natural or logical consequences for not performing

49
Behavior Intervention Plans (BIP)
  • Elements of strong BIP
  • Antecedent and setting event strategies
  • Alternative/Replacement Skills Instruction
  • Consequence Strategies
  • Reinforcement
  • Reduction
  • Long-term Prevention Strategies
  • Support for team members
  • Kincaid, D., University of South Florida

50
Antecedent and Setting Event Strategies
  • Remove the antecedent
  • Modify the antecedent
  • Intersperse other events
  • Neutralize negative events
  • Add events that promote desired behaviors

51
Alternative/Replacement Skills Instruction
  • Direct Instruction in functional skills that meet
    the same need as the behavior of concern
  • Social skills instruction
  • Organizational skills
  • Conflict Resolution

52
Consequence Strategies -- Reinforcement --
  • Reinforcement of appropriate behavior
  • Verbal Praise
  • Proximity
  • Self-monitoring
  • Prompt use of another skill
  • Humor
  • Verbal recognition
  • Provide choices
  • Label behavior, not the student

53
Consequence Strategies -- Reduction --
  • Age-appropriate loss of privileges or restitution
  • Corrective feedback
  • Redirect to another activity
  • Planned ignoring
  • Time-out
  • Crisis Planning

54
Long-Term Prevention Strategies
  • Quality of Life
  • Friendship making skills
  • Recreational skills
  • Opportunities to make choices
  • After school community activities
  • Teach problem solving skills

55
Supports for Implementation
  • Materials
  • Instruction for implementers
  • Encouragement
  • Equitable division of labor

56
Involving Students in Intervention Planning and
Implementation
  • Students participate in the intervention planning
    process as appropriate
  • Get student perspective on what specifically is
    the problem situation
  • Gather student input about why the problem is
    occurring
  • Gather student input about what would help

57
Involving Students in Intervention Planning and
Implementation
  • Students participate in setting goals
  • Through student-teacher conferences, students learn to understand the relationship between their own goals and grade-level expectations
  • Students document their own progress on graphs

58
Verifying Intervention Implementation
  • Plan a way to determine if the intervention was
    implemented as designed
  • Consider
  • Permanent Products
  • Artifacts
  • Checklists
  • Rubrics
  • Self-Ratings and Observer Ratings

59
Step 4: Intervention - Pitfalls
  • Evaluating ideas as they are generated
  • Limiting suggestions to what is currently in
    place
  • Seeing a person or a place as an intervention
  • Selecting interventions unrelated to the
    hypothesis
  • Jumping to an intrusive strategy/restrictive
    setting
  • Giving implementers no support for implementing
    interventions
  • Failure to thoroughly describe the intervention
  • Failure to incorporate a monitoring plan

60
Step 5: Evaluate the Intervention - Did the plan work?
  • Questions & Tasks
  • How is the intervention working?
  • Examine progress monitoring data
  • How much progress has been made toward the goal?
  • Is there a need to:
  • Continue the intervention?
  • Modify the intervention?
  • Improve accuracy of implementation?
  • Redefine the problem or hypothesis?

61
Examining Progress Monitoring Data
  • Graphed data are ready to be interpreted when sufficient data-points have been added to detect a trend (minimum of 7 data points).
  • Meaning can be determined by observing how the data-points are distributed on the graph.
  • (Source: Curriculum-Based Measurement: A Manual for Teachers by Jim Wright, Syracuse (NY) City Schools)

62
When Examining Progress Monitoring Data Look for
  • Change in level of data points
  • Variability of data points
  • Overlap of data points

63
Change in Level of Data-Points
  • The average level at which data-points are clustered on a graph is reviewed both before and after the intervention has been implemented.
  • Typically, increases in level are sought; however, there are situations in which a decrease in level may be desired.

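The level check can be sketched as a comparison of the median of the data-points before and after intervention (sample numbers are hypothetical):

```python
from statistics import median

def change_in_level(before: list, after: list) -> float:
    """Difference between the median data-point level after vs. before intervention."""
    return median(after) - median(before)

baseline = [12, 14, 13, 12]       # e.g., words correct per minute
intervention = [18, 21, 20, 24, 26]

print(change_in_level(baseline, intervention))  # 21 - 12.5 = 8.5
```

A positive value is the "jump in level" the next slide illustrates; for behaviors targeted for reduction, a negative value would be the desired direction.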
64
Degree of Change in Level of Data-Points
  • Comparing level of points before and after
    intervention

(Graphs contrast a substantial jump in level with a minimal jump in level.)
65
Variability of Data-Points
  • Intervention that brings stable, steady improvement in a student's academic behaviors is preferred.
  • A pattern of consistent progress is evident when data-points on the graph are relatively close together, without extreme peaks and valleys.
  • Data with extreme peaks and valleys demonstrate inconsistency, a sign that the student's performance could not be easily predicted on any given day.

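Variability can also be summarized numerically; the standard deviation is one common summary (an assumption here, since the slides describe visual inspection). Sample data are hypothetical:

```python
from statistics import stdev

steady = [20, 22, 23, 25, 26, 28]    # limited variability
erratic = [20, 35, 12, 40, 15, 33]   # extreme peaks and valleys

# A larger standard deviation signals performance that cannot be
# easily predicted on any given day.
print(f"steady:  sd = {stdev(steady):.1f}")
print(f"erratic: sd = {stdev(erratic):.1f}")
```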
66
Variability of Data-Points
  • Comparing variability of points

(Graphs contrast extreme variability with limited variability.)
67
Overlap of Data-Points
  • If the intervention is working and the student's academic behavior is improving, there should be minimal overlap between data-points collected before and those obtained after the intervention has gone into effect.
  • Particularly in the early stages of the intervention, some overlap is expected.
  • Overlap should decrease or disappear as the student develops increased fluency in the skill.

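One common way to quantify overlap is the percentage of non-overlapping data (PND): the share of intervention points that exceed the highest baseline point. The metric name is an assumption here (the slides describe overlap only visually), and the sample data are hypothetical:

```python
def pnd(baseline: list, intervention: list) -> float:
    """Percentage of intervention data-points exceeding the baseline maximum."""
    ceiling = max(baseline)
    above = sum(1 for x in intervention if x > ceiling)
    return 100 * above / len(intervention)

baseline = [18, 20, 19, 21]
intervention = [20, 24, 26, 25, 29, 31]  # early overlap, then clear separation

print(f"PND = {pnd(baseline, intervention):.0f}%")  # 5 of 6 points above 21 -> 83%
```

For behaviors targeted for reduction, the comparison would flip: count intervention points below the baseline minimum.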
68
Overlap of Data-Points
  • Overlap of points before and after intervention

Substantial Overlap
Minimal Overlap
(Source Curriculum-Based Measurement A Manual
for Teachers by Jim Wright, Syracuse (NY) City
Schools)
69
Evaluating Data Using Decision Rules
  • Decision rules are firm guidelines for
    interpreting the pattern of data-points on a
    graph (e.g., Three Data-Point Decision Rule and
    the Tukey Method).
  • Research has shown that teachers who
    systematically use decision rules to interpret
    graphed CBM data and use the information to guide
    them in instructional decision-making achieve
    marked improvements in student learning rates.

70
3 Data-Point Decision Rule
  • If the last 3 data points fall below the aimline, but are:
  • close to the aimline, and
  • headed toward the aimline, THEN:
  • collect more data, and
  • reapply the 3 data-point decision rule

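The rule above can be sketched as a function. "Close to the aimline" needs an explicit tolerance, which is an assumption here, and the fallback branches follow the decision-rule slides later in the deck:

```python
def three_point_rule(points, aimline, tolerance=2.0):
    """Apply the 3-data-point decision rule to the last three data points.

    points and aimline are parallel lists of (observed, expected) values in
    chronological order; tolerance is an assumed definition of "close".
    """
    last3, aim3 = points[-3:], aimline[-3:]
    below = all(p < a for p, a in zip(last3, aim3))
    close = all(abs(p - a) <= tolerance for p, a in zip(last3, aim3))
    rising = last3[0] < last3[1] < last3[2]   # heading toward the aimline
    if below and close and rising:
        return "collect more data and reapply the rule"
    if below:
        return "change the intervention"
    return "continue the intervention"

print(three_point_rule([30, 31, 33], [32, 33, 34]))  # below, close, and rising
```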
71
Apply the rule
72
Apply the rule again
73
Tukey's Method
  • Step 1 Divide the data points into three equal
    sections by drawing two vertical lines. (If the
    points divide unevenly, group them
    approximately).
  • Step 2 In the first and third sections, find the
    median data-point and median instructional week.
    Locate the place on the graph where the two
    values intersect and mark with an X.
  • Step 3 Draw a line through the two Xs,
    extending to the margins of the graph. This
    represents the trend-line or line of improvement.
  • (Hutton, Dubes, Muir, 1992)

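Steps 1-3 can be carried out numerically rather than by drawing; this sketch returns the slope and intercept of the trend line through the two median points (it assumes one data-point per instructional week and divides the points into thirds with integer division):

```python
from statistics import median

def tukey_trend(scores):
    """Trend line (slope per week, intercept) via the Tukey method.

    scores: data-points in chronological order, one per instructional week.
    """
    n = len(scores)
    third = n // 3
    weeks = list(range(1, n + 1))
    # Step 1: take the first and third sections of the data.
    # Step 2: median score and median week in each section -> the two X points.
    x1, y1 = median(weeks[:third]), median(scores[:third])
    x2, y2 = median(weeks[n - third:]), median(scores[n - third:])
    # Step 3: the line through the two X's is the trend line.
    slope = (y2 - y1) / (x2 - x1)
    intercept = y1 - slope * x1
    return slope, intercept

slope, _ = tukey_trend([10, 12, 11, 14, 15, 17, 16, 19, 20])
print(f"improving by about {slope:.1f} per week")
```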
74
Tukey Method
(Graph: a trend line drawn through the median X's, shown against the goal line.)
See also: Split-Middle Trend Estimation
75
Decision Rules
  • Ascending Goal Lines
  • If three consecutive data points fall above the
    goal line, raise the goal
  • If three consecutive data points fall below the
    goal line, change the intervention
  • If there is no consistent pattern of performance,
    continue the intervention

76
Decision Rules (cont'd)
  • Descending Goal Lines
  • If three consecutive data points fall below the
    goal line, lower the goal (expect the behavior to
    decrease at a faster rate)
  • If three consecutive data points fall above the
    goal line, change the intervention
  • If there is no consistent pattern of performance,
    continue the intervention

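Both sets of rules can be combined into a single sketch (function and argument names are illustrative):

```python
def goal_line_decision(points, goal_line, ascending=True):
    """Apply the decision rules to the last three consecutive data points."""
    last3 = list(zip(points[-3:], goal_line[-3:]))
    above = all(p > g for p, g in last3)
    below = all(p < g for p, g in last3)
    if ascending:                  # behavior should increase
        if above:
            return "raise the goal"
        if below:
            return "change the intervention"
    else:                          # descending goal line: behavior should decrease
        if below:
            return "lower the goal"   # decreasing faster than expected
        if above:
            return "change the intervention"
    return "continue the intervention"  # no consistent pattern

print(goal_line_decision([40, 42, 45], [35, 37, 39]))              # raise the goal
print(goal_line_decision([8, 7, 5], [10, 9, 8], ascending=False))  # lower the goal
```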
77
Step 5: Evaluate the Intervention - Pitfalls
  • Evaluating the intervention based on verbal
    report
  • Providing no evidence that the intervention was
    implemented as planned
  • Failing to include the intervention implementer
    in any plans for modifying the intervention
  • Assuming the problem solving process is over if
    the first intervention was successful

78
(No Transcript)
79
(No Transcript)
80
Response-to-Intervention
Determining Eligibility for Special Education in
a Multi-Tiered Model
81
Requirements for Intervention
  • No Child Left Behind
  • IDEIA
  • Ohio's Operating Standards (general education)
  • Ohio Senate Bill 1
  • Ohio's Operating Standards for Schools Serving Students with Disabilities (special education)

82
The Purposes of Considering Special Education
for Students Suspected of Having a Disability
are
  • To provide access to entitlement for individuals
    who need ongoing intensive levels of support in
    order to progress in the general education
    curriculum.
  • To ensure unbiased accuracy in the identification of students who need intensive supports.

83
Characteristics of Interventions Necessary for
Eligibility Decisions
  • Individualized, intensive, explicit instruction
  • Small groups or individually
  • Matched to the student's individual need (hypothesis)
  • Implementers with high levels of expertise

84
Collaborative Problem Solving for Eligibility
Determination
  • Team becomes the MFE team, which includes:
  • _____________________________________
  • Continue use of Collaborative Problem Solving to
    develop/intensify interventions
  • Use formative direct assessment data to determine
    eligibility for special education

85
Selecting Students for Eligibility
  • Data from Tier 3 interventions indicate an
    effective intervention has been found
  • What it takes to produce progress is unique, individualized, expected to be needed over time, and too intensive to be supported with general education resources

86
Do You Suspect a Disability?
  • Progress Monitoring Graph
  • Goal line, aim line, peer comparison, phase
    change lines showing Tier 1, 2, and 3
    interventions
  • Additional Assessments
  • Record Review
  • Vision/hearing, attendance, group testing, etc.
  • Intervention Description or Scripts for Tier 1,
    2, and 3
  • Treatment Integrity data on Tier 1, 2, and 3
    interventions
  • Parent Consent for Evaluation

87
Determining Eligibility
  • Definitions in IDEIA
  • Specialized instruction is necessary
  • Child's needs are not due to lack of instruction in reading or math, or to lack of English language proficiency

88
Writing the IEP
  • Present Levels of Performance from direct
    assessment data
  • Interventions become services
  • Goals and Progress Monitoring stay the same

89
Adapted Material
  • This material has been adapted and expanded from
    its original ODE content to meet the needs of
    the KSU OISM project.