Transcript and Presenter's Notes

Title: Texas Education Agency 2004-2005 Performance-Based Monitoring


1
Texas Education Agency 2004-2005
Performance-Based Monitoring
  • Laura Taylor
  • Rosemary Manges
  • Margaret Mays
  • TCASE January 27, 2005

2
Overall Coordination
Performance-Based Monitoring
Program Monitoring and Interventions
Program Areas
3
Monitoring Defined
  • Monitoring is
  • Using a data-driven, performance-based model to
    observe, evaluate, and report on the public
    education system at the individual student group,
    campus, local education agency, regional, and
    statewide levels across diverse areas, including
    program effectiveness; compliance with federal
    and state law and regulations; financial
    management; and data integrity, for the purpose of
    assessing whether student needs are being met

4
Definition (continued)
  • Monitoring is
  • Promoting diagnostic and evaluative systems in
    LEAs that are integrated with the agency's desk
    audit and intervention process; and
  • Relying on a research-based framework of
    interventions that ensure compliance and enhance
    student success.

5
Guiding Principles of PBM
  • School District Effectiveness: PBM efforts are
    designed to assist school districts in their
    efforts to improve student performance and
    program effectiveness.
  • Statutory Requirements: PBM efforts are designed
    to meet statutory requirements.

6
Guiding Principles (continued)
  • Valid Indicators of Performance: PBM indicators
    are designed to reflect critical areas of student
    performance, program effectiveness, and data
    integrity.
  • Maximum Inclusion: PBM is designed to evaluate a
    maximum number of school districts through the
    use of appropriate alternatives for analyzing
    districts with small numbers of students.

7
Guiding Principles (continued)
  • Individual Program Accountability: PBM
    evaluations are structured to ensure that low
    performance in one area cannot be masked by high
    performance in another area and, likewise, that
    low performance in one area does not lead to
    interventions in program areas where performance
    is high (see the sketch below).
  • High Standards: PBM is designed to encourage
    high standards for all students in all
    districts. Standards will be adjusted over time
    to ensure high expectations continue to be met.
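A note on the no-masking principle above: each indicator is judged on its
own 0-3 scale, where 0 = Met Standard, and results are never averaged across
areas. The sketch below is illustrative only; the function name and data
layout are assumptions, not part of the deck.

```python
# Illustrative sketch (not TEA code): evaluate each indicator independently,
# never as an average, so a concern in one area is flagged even when other
# areas met standard, and strong areas draw no intervention.
def areas_of_concern(levels: dict[str, int]) -> list[str]:
    """levels maps indicator name -> PBMAS performance level (0 = Met Standard)."""
    return [name for name, level in levels.items() if level > 0]

# A strong SPED TAKS result does not mask a BE/ESL concern, and the
# SPED indicator itself triggers no intervention.
print(areas_of_concern({"SPED TAKS": 0, "BE English TAKS": 2}))
# -> ['BE English TAKS']
```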

8
Guiding Principles (continued)
  • Annual Statewide Focus: PBM allows for the
    annual evaluation of a maximum number of school
    districts in the state, and all evaluated school
    districts will have access to their PBM
    performance on a yearly basis.
  • Public Input and Accessibility: The design,
    development, and implementation of PBM are all
    informed by ongoing public input. School
    district performance information that PBM
    generates will be accessible to the public.

9
Guiding Principles (continued)
  • System Evolution: PBM is a dynamic system that
    includes a multi-year phase-in process to allow
    for indicators to be added, revised, and/or
    deleted in response to other changes and
    developments that occur outside of the system.
  • Coordination: PBM is part of an overall agency
    coordination strategy for the data-driven
    evaluation of school district effectiveness.

10
New Strategies for Monitoring
  • Moving toward a data-driven, integrated state
    evaluation system.
  • Moving toward a monitoring system that describes
    performance rather than predicts risk.

11
Overall Goals for Monitoring
  • Achieve an integration of indicators and
    interventions
  • Deliver a consistent and coordinated response to
    identified areas of low performance/program
    ineffectiveness in districts/campuses
  • Take into account both the extent and the
    duration of a district's area(s) of low
    performance/program ineffectiveness
  • Address program and fiscal compliance within the
    overall context of the system

12
PBMAS
  • Performance-Based Monitoring Analysis System
    (PBMAS): an automated data system that reports
    on the performance of school districts and
    charter schools in selected program areas
    (bilingual education/ESL, career and technology
    education, special education, and certain Title
    programs under NCLB).
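To make the report's shape concrete, a minimal sketch of one PBMAS indicator
result as a record follows. The field names are hypothetical; the program
areas and the performance-level designations (0-3, the SA variants, ND, and
NE) come from later slides in this deck.

```python
# Hypothetical record layout for one PBMAS indicator result; field names are
# illustrative, not TEA's. Designations follow this deck: "0" (Met Standard)
# through "3", "0SA"-"3SA" and "SA" for special analysis, "ND" (No Data),
# and "NE" (Not Evaluated).
from dataclasses import dataclass

@dataclass
class PBMASResult:
    district: str           # school district or charter school
    program_area: str       # "BE/ESL", "CTE", "SPED", or "NCLB"
    indicator: str          # e.g., "SPED Annual Dropout Rate"
    performance_level: str  # "0".."3", "0SA".."3SA", "SA", "ND", or "NE"
```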

13
Summary of PBMAS 2004-2005 Indicators of Student
Performance
  • TAKS
  • SPED TAKS
  • LEP TAKS
  • BE English TAKS
  • ESL English TAKS
  • BE Spanish TAKS
  • ESL Spanish TAKS
  • Migrant TAKS
  • CTE TAKS
  • CTE SPED TAKS
  • CTE LEP TAKS
  • CTE Economically Disadvantaged TAKS
  • CTE Tech Prep TAKS

14
PBMAS 2004-2005 Indicators of Student Performance
(continued)
  • RPTE
  • SDAA
  • Performance of exited students
  • SPED Year-After-Exit TAKS
  • LEP Year-After-Exit TAKS
  • BE Year-After-Exit TAKS
  • ESL Year-After-Exit TAKS
  • Annual dropout rate
  • SPED dropout
  • LEP dropout
  • Migrant dropout
  • CTE dropout

15
PBMAS 2004-2005 Indicators of Program
Effectiveness
  • Participation
  • SPED TAKS Only Participation
  • SPED SDAA Only Participation
  • LEP TAKS/SDAA Participation
  • Exemption
  • LEP Exemption
  • Exemption from Statewide Assessments (LDAA
    Takers)

16
PBMAS 2004-2005 Indicators of Program
Effectiveness (continued)
  • Over-identification (SPED, SPED DAEP)
  • Over-representation (SPED)
  • Least Restrictive Environment (SPED)
  • Discipline (SPED and NCLB)
  • Non-traditional course completion (CTE)
  • Highly qualified teachers (NCLB)
  • RHSP/DAP Graduates (SPED, LEP, Migrant)

17
Patterns of Performance in PBMAS
  • Analyzing performance-level data and examining
    patterns or trends across indicators and program
    areas to inform intervention decision-making
  • Taking into account both the extent and the
    duration of a district's area(s) of low
    performance/program ineffectiveness (one way to
    operationalize duration is sketched below)
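One way duration could be operationalized is counting the consecutive
most-recent years an indicator sat above level 0 (Met Standard). The deck
does not define a formula, so the sketch below is an assumption for
illustration only.

```python
# Hypothetical illustration of "duration": consecutive most-recent years in
# which an indicator's performance level was above 0 (0 = Met Standard).
def years_of_continued_concern(levels_by_year: list[int]) -> int:
    """levels_by_year is ordered oldest year first."""
    count = 0
    for level in reversed(levels_by_year):
        if level > 0:
            count += 1
        else:
            break
    return count

# A district that met standard two years ago but has missed it since:
print(years_of_continued_concern([0, 2, 1]))  # -> 2
```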

18
PBMAS 2004-2005 Special Analysis
  • A NEW process for the monitoring system
  • Based on one of the system's guiding principles,
    MAXIMUM INCLUSION
  • PBMAS is designed to evaluate a maximum number of
    school districts through the use of appropriate
    alternatives for analyzing districts with small
    numbers of students.

19
PBMAS Special Analysis (cont.)
  • Special analysis does not apply to indicators for
    which the district has NO students in a
    particular group; these districts received a
    designation of No Data (ND).
  • Special analysis does not apply to indicators for
    which the district performance level is 0 (Met
    Standard), regardless of the student group size.
  • There are two types of special analysis in
    2004-2005 PBMAS: automated special analysis and
    non-automated special analysis.

20
PBMAS Special Analysis (cont.)
  • Automated special analysis
  • Performance level appears as 0SA, 1SA, 2SA, or
    3SA on the PBMAS Report.
  • Applied to districts that did not meet the
    minimum size requirement of 30 in the current
    year but did meet it over two years (e.g., 14
    students in the current year plus 16 students in
    the previous year).
  • The district received a performance level based
    on either the current year's data or the previous
    year's data, whichever resulted in the higher
    performance level (see the sketch below).
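A minimal sketch of the automated rule just described, under two assumptions:
performance levels follow the 0-3 scale with 0 = Met Standard, and "the
higher performance level" means the more favorable (numerically lower)
result. Function and variable names are illustrative, not TEA's.

```python
# Illustrative sketch of automated special analysis (this slide and the
# previous one). Assumes 0 = Met Standard and larger numbers = lower
# performance, so the "higher" of two performance levels is the smaller number.
MIN_SIZE = 30  # minimum student-group size for standard evaluation

def evaluate_indicator(current_count, previous_count,
                       current_level, previous_level):
    """Return the performance-level designation for one indicator."""
    if current_count == 0 and previous_count == 0:
        return "ND"                      # No Data: no students in the group
    if current_level == 0:
        return "0"                       # Met Standard stands at any size
    if current_count >= MIN_SIZE:
        return str(current_level)        # standard single-year evaluation
    if current_count + previous_count >= MIN_SIZE:
        best = min(current_level, previous_level)
        return f"{best}SA"               # automated special analysis
    return "SA"                          # goes to non-automated review

# 14 current-year students plus 16 previous-year students meet the two-year
# size requirement; the more favorable year's level (1) is used.
print(evaluate_indicator(14, 16, 2, 1))  # -> "1SA"
```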

21
PBMAS Special Analysis (cont.)
  • Non-automated special analysis
  • Performance level appears as SA on the PBMAS
    Report.
  • Applied to districts that did not meet the
    minimum size requirement of 30 even when looking
    at two years.
  • The district will receive a performance level
    based on professional judgment. Summary data for
    the district are analyzed by program-area staff
    at the agency, and professional judgment is
    applied.

22
PBMAS Special Analysis (cont.)
  • The possible results of non-automated special
    analysis are
  • Allowing the current year's performance level
    based on small numbers to stand;
  • Elevating the current year's performance level to
    a higher performance level; or
  • Determining that the district's current-year
    performance on the indicator should be Not
    Evaluated (NE).
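For clarity, the three outcomes above can be sketched as a small enum; which
one applies is a professional-judgment call by agency program-area staff, not
an automated rule, and the names are illustrative.

```python
# The three possible non-automated special-analysis outcomes named on this
# slide; choosing among them is professional judgment, not code.
from enum import Enum

class NonAutomatedOutcome(Enum):
    LEVEL_STANDS = "current-year level based on small numbers stands"
    LEVEL_ELEVATED = "current-year level raised to a higher performance level"
    NOT_EVALUATED = "NE"
```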

23
PBMAS Special Analysis (cont.)
  • Update on non-automated special analysis
  • This year's effort began during the middle of
    December.
  • Non-automated special analysis for CTE and NCLB
    has been completed.
  • Non-automated special analysis for BE/ESL and
    SPED is scheduled for completion at the end of
    January.
  • Districts that underwent non-automated special
    analysis will receive a summary report in
    February.

24
PBMAS Special Analysis (cont.)
  • Indicators that will receive a performance level
    of NE for 2004-2005 as a result of non-automated
    special analysis
  • Across program areas (BE/ESL, CTE, NCLB, and
    SPED), all TAKS indicators for writing, social
    studies, and science.
  • In CTE
  • CTE LEP TAKS Passing Rate (CTE 3)
  • CTE SPED TAKS Passing Rate (CTE 5)

25
PBMAS Special Analysis (cont.)
  • In BE/ESL
  • ESL English TAKS Passing Rate (1C)
  • LEP TAKS/SDAA Participation (5)
  • LEP Year-After-Exit English TAKS Passing Rate
    (4A)
  • BE Year-After-Exit English TAKS Passing Rate
    (4B)
  • ESL Year-After-Exit English TAKS Passing Rate
    (4C)
  • In SPED
  • TAKS Only Participation (3)
  • SDAA Gap Closure (7)
  • Year-After-Exit TAKS Passing Rate (14)

26
Major Considerations in the New Monitoring World
  • The focus is on student performance and program
    effectiveness, and the LEA program and student
    data that reflect current levels of functioning.
  • Redevelopment of systems to address student
    performance and program effectiveness concerns.
  • Alignment of interventions with program needs and
    requirements.
  • Alignment of interventions across program and
    monitoring areas, including interventions for
    academically unacceptable performance.

27
Basic Intervention Concepts
  • Graduated Interventions
  • Continuous Improvement
  • Strategies consistent across programs
  • Use of data to inform decisions
  • Local processes for effectiveness and performance
    reviews
  • Stakeholder/constituent input into review
    processes
  • Compliance addressed in proper perspective on a
    program-by-program basis
  • On-site visits as a tool, as reflected in program
    area plans
  • Random selection and/or verification
  • Partnership

28
Basic Intervention Activities
  • Focused Data Analysis
  • Program Effectiveness Review
  • LEA Public Meeting
  • Compliance Review
  • Continuous Improvement Planning

29
Focused Data Analysis
  • A focused review of data indicators for which a
    higher level of performance concern has been
    identified. Traditionally requires a specified
    team of individuals to gather, disaggregate, and
    review data to determine possible causes for the
    performance concern. Results of the analysis
    generally are reflected as findings (strengths
    and areas in need of improvement).

30
Program Effectiveness Review
  • A review of probes and/or data sets which may
    point out data trends, systemic program issues,
    and/or areas of noncompliance with program
    requirements. Traditionally requires a specified
    team of individuals to gather, disaggregate, and
    review data to determine possible causes for
    performance concerns or possible issues in need
    of correction. Results of the analysis generally
    are reflected as findings (strengths and areas in
    need of improvement).

31
LEA Public Meeting
  • An opportunity for local stakeholders and
    constituents to provide input on the effective
    operation and performance of an LEA program
    through participation in focus group discussions.
    Traditionally requires broad-based invitations
    to program stakeholders, with participation of
    LEA staff required. Input received is analyzed,
    and strengths and areas in need of improvement
    are identified.

32
Compliance Review
  • A focused review of compliance issues or
    indicators to ensure that LEAs are implementing
    the program as required by federal statute or
    regulation. The review may include an analysis of
    other requirements as directed by and/or
    specified in state law or administrative code.
    Traditionally requires a specified team of
    individuals to review data to determine possible
    areas of noncompliance, which must be addressed
    in a correction and/or improvement plan.

33
Continuous Improvement Planning
  • A process through which instances of performance
    concern and/or noncompliance are addressed
    through the identification of desired results,
    evidence of change, activities, resources, and
    interim and final review timelines that drive
    positive program change. Emphasis is on a
    continuous improvement process which promotes
    improved student performance and program
    effectiveness over time. Improvement planning
    occurs in a team environment, with required and
    recommended participants identified.

34
Interventions Focus
  • The focus is on a continuous improvement process.
  • Interactions between the agency and school
    districts are ongoing, with the level of TEA
    involvement dependent upon LEA performance and
    improvement.

35
Special Education
36
Special Education
  • Indicators with PBMAS performance levels assigned
    for 2004-2005
  • SPED 1: SPED Identification
  • SPED 2A: SPED African American Representation
  • SPED 2B: SPED Hispanic Representation
  • SPED 3: TAKS Only Participation Rate
  • SPED 4(i-v): SPED TAKS Passing Rate
  • SPED 5: SDAA Only Participation Rate
  • SPED 6: Statewide Assessment Exemption Rate

37
Special Education (continued)
  • Indicators with 2004-2005 performance levels
    (continued)
  • SPED 7: SDAA Gap Closure
  • SPED 8: SPED 3-11 Year Olds LRE Placement Rate
  • SPED 9: SPED 12-21 Year Olds LRE Placement Rate
  • SPED 10: SPED Discretionary Placements
  • SPED 11: SPED Discretionary Expulsions
  • SPED 12: SPED Discretionary Removals to ISS
  • SPED 13: SPED Annual Dropout Rate
  • SPED 14: SPED Year-After-Exit TAKS Passing Rate

38
Proposed Framework for 2004-2005
Performance-Based Monitoring: Special Education
Note: The required level of review and submittal
may vary depending upon the initial data review.
39
Special Education: New Materials Proposed for
2004-2005
  • Focused Data Analysis and FDA with Program
    Effectiveness Review: Guidance Document Expansion
  • FDA Certified Staff Template
  • Patterns of Service Delivery Review Templates (3)
  • Patterns of Service FDA Findings Template
  • Surrogate Parent Information Template
  • Program Effectiveness Review Template
  • Student Placement Matrices
  • Discipline Placement Matrix

40
Special Education Monitoring System Draft
Implementation Plan 2004-2005
[Flowchart reconstructed from the slide's diagram
labels]
  • Ongoing information collection and review (TEA
    data and LEA submission) feeds graduated stages
    of intervention.
  • Stage 1 Intervention: Focused Data Analysis (FDA)
    and CIP.
  • Stage 2 Intervention: FDA, LEA Public Meeting,
    and CIP.
  • Stage 3 Intervention: FDA, LEA Public Meeting,
    Compliance Review, and CIP.
  • At each stage, the evaluation, findings, and CIP
    are submitted to TEA for a desk review of
    self-evaluation results, data, and the continuous
    improvement plan.
  • If the plan is not approved the first time, the
    district resubmits the plan (choosing outside
    support); if not approved a second time, a TEA
    on-site review or contracted on-site review
    follows, with the plan resubmitted.
  • If the plan is approved, the district implements
    the CIP with evidence of change (timely reviews
    and checkpoints), subject to ongoing review.
  • Districts with substantial or imminent risk move
    to a Special Program Compliance Review, a
    targeted TEA on-site review with submission of a
    CIP, and oversight, sanctions, and interventions.
  • Other: random data and self-evaluation checks.
  • Notes: The required level of review and submittal
    may vary depending upon the initial data review.
    Community stakeholders must be part of the
    self-evaluation team at all stages of
    intervention (both required and recommended team
    members are provided in guidance documents).
    CIP = Continuous Improvement Plan.
  • Source: TEA Program Monitoring and Interventions,
    December 2004
41
PBM Resources
  • PBM Division website: http://www.tea.state.tx.us/pbm/
  • PBM Division mailbox: pbm@tea.state.tx.us
  • PMI Division website: http://www.tea.state.tx.us/pmi/
  • PMI Division mailbox: pmidivision@tea.state.tx.us