1
Using A Data System To Inform Instruction
Presented by Tony Tripolone
Technology Leadership Institute for School District Administrators
Lower Hudson Regional Information Center
Westchester Marriott
January 9, 2006
2
Analyzing Assessment Data
  • Purpose of Assessments
  • Appropriate Data Comparison
  • Systemic Change
  • Need to Narrow Focus

3
The Benchmark
  • Snapshot
  • Caution: Use Multiple Measures
  • Item Analysis Data
    - Identifies strengths and weaknesses
    - Helps align curriculum
    - Indicates the need for parallel tasks
  • Cut Points
  • Graphing will show Gaps

4
P Value or Item Difficulty
  • Definition
  • Why P Values Are Used
  • What It Means (High vs. Low P)
  • How To Calculate

5
P Value or Item Difficulty Calculation
  • P Value is the proportion of students in an identified norm group who answer a test item correctly; it is usually referred to as the difficulty index.
  • The reason for using a P Value graph is to allow a district to compare student performance on each assessment to a larger reference group as a benchmark.
  • A high P Value indicates a very easy question, while a low P Value indicates an extremely difficult question purposely designed to discriminate between performance levels.
  • How a P Value is calculated (see the sketch after this list):
    - Multiple Choice questions are either right or wrong and receive a score of 1 or 0. These values are added and then divided by the number of students taking the assessment, resulting in a P Value between 0 and 1.
    - Constructed Response questions have rubric scores greater than 1. The total number of points received on each question is added and then divided by the number of students taking the assessment; the resulting average score is divided by the number of possible points in the rubric, yielding a P Value between 0 and 1.
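A minimal Python sketch of the calculation described above; the function names and sample scores are illustrative, not part of the presentation or of dataMentor:

    def p_value_mc(scores):
        # Multiple choice: each score is 1 (correct) or 0 (incorrect),
        # so the P Value is simply the mean.
        return sum(scores) / len(scores)

    def p_value_cr(scores, max_points):
        # Constructed response: rubric scores run from 0 to max_points.
        # The average score divided by the possible points lands in [0, 1].
        return (sum(scores) / len(scores)) / max_points

    print(p_value_mc([1, 0, 1, 1, 0, 1, 1, 1]))     # 0.75: a fairly easy item
    print(p_value_cr([4, 3, 2, 4, 1, 0, 3, 2], 4))  # ~0.59: mid-difficulty item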

6
Gap: Region Outperforms District
P Value or Item Difficulty: the proportion of students who answered the item correctly. A low P Value reflects a difficult question; a high P Value reflects an easier question.
7
Items Grouped by Key Idea and Performance
Indicators
8
Midrange of Item Difficulty
  • Definition
  • Why Midrange Is Used
  • What It Means
  • How To Calculate

9
Narrowing The Focus: What is the Midrange?
  • The identified gaps provide a focus for
    improvement.
  • In analyzing the data, it is very important to understand that the questions were designed to discriminate between what students need to know and be able to do at the various accountability levels.
  • The data is useless without comparing district
    results to a larger reference group as a
    benchmark.
  • Determination of P Value or Item Difficulty is
    critical in distinguishing between easy and
    difficult questions.
  • The midrange is a narrowing of the range of
    assessment scores to provide a more reasonable
    and manageable focus for teachers to evaluate
    their program.
  • If there is a highlighted gap within the midrange, i.e., the region outperformed your district on questions you would expect your students to answer correctly, then it is reasonable to make meaningful instructional and curricular changes.

10
Midrange of Item Difficulty
  • How To Calculate (a code sketch follows this list):
    - Determine the range of scores by subtracting the lowest regional P Value from the highest on an assessment.
    - Multiply that difference by .20.
    - Add this product to the lowest score.
    - Subtract this product from the highest score.
  • For example, if the scores range from .90 (the highest) down to .50 (the lowest), then:
    .90 − .50 = .40;  .40 × .20 = .08;  .50 + .08 = .58;  .90 − .08 = .82
  • The midrange is .58 to .82 on the P Value graph.
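The same calculation as a short Python sketch; the function name and the 0.20 trim factor mirror the slide's example and are otherwise illustrative:

    def midrange(p_values, trim=0.20):
        # Narrow the focus to the middle band of item difficulty by
        # trimming 20 percent of the P Value range off each end.
        low, high = min(p_values), max(p_values)
        margin = (high - low) * trim
        return low + margin, high - margin

    lo, hi = midrange([0.50, 0.62, 0.75, 0.90])  # regional P Values from .50 to .90
    print(round(lo, 2), round(hi, 2))            # 0.58 0.82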

11
Midrange of item difficulty is .44 to .82. Highlight the Key Idea, PI, and Question Number of each GAP above.
12
Midrange of item difficulty is .44 to .82. Highlight the Key Idea, PI, and Question Number of each GAP above.
13
Trend Summary Chart
  • Displays 5 Years of Assessment Items
  • For Collaborative Discussion

14
(No Transcript)
15
The Trend Summary Chart
  • Captures the proportion of questions asked in relation to the identified Standards and Performance Indicators or Subskills assessed over a five-year period.
  • Highlights identified Gaps or areas of
    weakness.
  • Reflects only those weaknesses, identified from the midrange on the assessment data charts, that could reasonably be considered in need of attention.
  • Shows whether multiple choice or extended
    response questions contributed to the weakness.
  • Reveals curriculum balance issues (a minimal sketch of such a chart follows).
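A hedged sketch of how such a trend summary might be assembled with pandas; the column names and sample records below are hypothetical, not dataMentor's actual schema:

    import pandas as pd

    # Hypothetical item-level records: one row per question per assessment
    # year, with a flag for questions where the region outperformed the district.
    items = pd.DataFrame({
        "year":     [2002, 2002, 2003, 2003, 2004, 2004, 2005, 2005, 2006, 2006],
        "key_idea": ["KI1", "KI2", "KI1", "KI3", "KI2", "KI1", "KI3", "KI1", "KI2", "KI1"],
        "gap":      [False, True, False, True, True, False, True, False, True, False],
    })

    # Proportion of questions per Key Idea per year (curriculum balance).
    balance = pd.crosstab(items["key_idea"], items["year"], normalize="columns")

    # Count of identified gaps per Key Idea per year (areas of weakness).
    flagged = items[items["gap"]]
    gaps = pd.crosstab(flagged["key_idea"], flagged["year"])

    print(balance.round(2))
    print(gaps)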

16
(No Transcript)
17
Considerations After Viewing Trend Data
  • Number and frequency of items asked
  • Performance Indicators that seem to be targeted
  • Any emerging patterns
  • Consistent areas of weaknesses on assessments
  • Consistent areas of weaknesses in student work
    throughout the year
  • Appropriate balance and emphasis in your
    curriculum as indicated by the assessments
  • Curriculum alignment issues
  • Instructional changes that address the above considerations

18
Narrowing The Focus
  • Select Three Areas Of Weakness
  • Look At Student Work
  • Work Collaboratively
  • Determine Root Cause
  • Align and Map Curriculum (Horizontally and
    Vertically)
  • Develop Parallel Tasks
  • Establish Periodic Benchmarks
  • Analyze Data
  • Begin Again

19
(No Transcript)
20
(No Transcript)
21
(No Transcript)
22
(No Transcript)
23
(No Transcript)
24
Collaboration Activity
25
(No Transcript)
26
(No Transcript)
27
(No Transcript)
28
The Collaborative Effort
  • 1. Look at the actual questions to:
    - Identify SKILLS needed
    - Determine STRATEGIES necessary
  • 2. Look at student work to evaluate:
    - RANGE of RESPONSE
    - What's missing? DECLARATIVE or PROCEDURAL Knowledge
  • 3. Develop Parallel Tasks to:
    - Provide BENCHMARK experiential opportunities
    - Increase RIGOR and RELEVANCE of expectations

29
The Collaborative Process
  • Determine what you know or don't know by doing a Gap Analysis
  • Come to the discussion prepared by viewing dataMentor
  • Alignment of curriculum and instruction to
    standards is the agenda
  • Discussion should be collegial and supportive
  • Decision making is best with active participation
    by everyone
  • Freedom to share identified weaknesses is
    encouraged
  • Examine instruction methods to determine best
    practices or strategies
  • Seek suggestions for improvement
  • Remember: We want to improve student achievement for all students

30
Does the Test Behave?
  • Look at the Range of Responses distribution
  • Determine the number and selection at each accountability level
  • Identify item difficulty by correct responses for multiple choice
  • Check that maximum points in constructed response increase as the accountability level increases (a brief sketch follows)
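A small Python sketch of these sanity checks; the item metadata below is invented for illustration:

    # Hypothetical item metadata: type, accountability level, max points, P Value.
    items = [
        {"q": 1, "type": "MC", "level": 2, "max_pts": 1, "p": 0.88},
        {"q": 2, "type": "MC", "level": 3, "max_pts": 1, "p": 0.61},
        {"q": 3, "type": "CR", "level": 3, "max_pts": 2, "p": 0.55},
        {"q": 4, "type": "CR", "level": 4, "max_pts": 4, "p": 0.34},
    ]

    # A well-behaved test: P Values fall as the accountability level rises,
    # and constructed-response point values grow with the level.
    by_level = sorted(items, key=lambda i: i["level"])
    p_drops = all(a["p"] >= b["p"] for a, b in zip(by_level, by_level[1:]))
    cr = [i for i in by_level if i["type"] == "CR"]
    pts_grow = all(a["max_pts"] <= b["max_pts"] for a, b in zip(cr, cr[1:]))
    print("difficulty increases with level:", p_drops)
    print("CR points increase with level:", pts_grow)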

31
Range Of Responses
32
Range Of Responses
33
Range Of Responses
34
Range Of Responses
35
Go to DataMentor.org to see how this data system can seamlessly facilitate the process of using data to inform instruction (follow the handout materials).
  • Contact Information
  • Tony Tripolone
  • Administrator for Data Management and Analysis
  • Wayne-Finger Lakes BOCES
  • ttripolone@wflboces.org
  • 585-394-9239 ext. 1045