1
Teacher Use of SOL Test Score Data to Improve Instruction
MERC Policy and Planning Council
June 4, 2003
2
Study Team
  • VCU: Jim McMillan, Susan McKelvey
  • Chesterfield: Glen Miller
  • Colonial Heights: Gwen Moseley
  • Hanover: Carol Cash
  • Richmond: Cynthia Gentry
  • Powhatan: Carol Pettis, Sandy Lynch
  • Hopewell: Janet Covington
  • Henrico: Kris Herakovich

3
Purpose
To determine the extent to which teachers have
used SOL test data to change or modify
instruction and to identify procedures that have
promoted effective and accurate use of test
scores.
4
Research Questions
  • What is the nature and extent of teacher use of
    SOL test score data? To what extent has usage
    differed according to grade level and subject?
  • How have teachers used SOL test data to change
    instruction?
  • What procedural factors influence test usage?
  • To what extent is recommended practice being
    utilized in interpreting and using test score
    data?
  • What suggestions do teachers have for increasing
    the use of test scores for changing instruction?
  • What is the nature of the training or inservice
    that teachers have found useful?

5
Presentation Overview
  • Research Design
  • Sample
  • Descriptive Results
  • Relationship Results
  • Written comments
  • Interviews
  • Conclusions
  • Implications

6
Research Design
  • Nonexperimental descriptive study that also examines relationships
  • Quantitative: large-scale survey
  • Qualitative: teacher interviews

7
Sample
  Division             Number of Teachers    % of Total
  Chesterfield                307                 38
  Henrico                     160                 20
  Richmond                    126                 16
  Hanover                      98                 13
  Hopewell                     36                  5
  Colonial Heights             28                  4
  Powhatan                     29                  4

8
Sample
  • Response Rates and Final Sample

9
Sample
  Secondary Subject Areas

  Subject              Number of Teachers    % of Total
  English                     115                 29
  Social Studies               87                 22
  Science                      78                 20
  Math                        116                 29

10
Findings
  SOL Test Scores Received

  Test Scores Received                           Number of Teachers    % of Total
  Yes, 2001-2002 students (last year)                   449                 59
  Yes, 2002-2003 students (this year)                    55                  7
  Yes, both groups of students                          180                 24
  None received                                           37                  5
  None received, only school or division data             41                  5

11
Findings
  Report Formats Received

  Format                  Number of Teachers    % of Teachers
  By Total Scale Score            487                  62
  By Teacher or Class             382                  49
  By Reporting Category           374                  48
  By School                       358                  46
  By Item Analysis                282                  36
  By Division                     188                  24
  By Student Groups                80                  10

12
Findings
  • Percentage of Teachers Changing Instructional
    Practices Somewhat More or Much More

13
Findings
14
Findings
15
(No Transcript)
16
Findings: Relationships with Subject Areas
  • Greater breadth with social studies and science
  • Greater rote memorization with social studies
  • Greater advanced cognitive practices in English,
    social studies, science
  • Greater within-grade collaboration in English

17
Findings: Relationships with When Scores Are Received
  • Greater use when scores are not provided at the
    end of the school year.

18
Findings: Relationships with Report Format
  • Greater instructional change by those receiving
    class or teacher report
  • Consistent finding across grade levels

19
(No Transcript)
20
Findings: Relationships with Type of Assistance Received
  • Teachers receiving assistance from their
    principal or from lead teachers showed more
    change
  • Finding is mostly true for elementary and middle
    school teachers

21
(No Transcript)
22
(No Transcript)
23
Findings: Relationships with Use of Test Score Principles
  • Teachers who had professional development on test score interpretation increased their use of:
    • Direct instruction
    • Test-taking skills
    • Remediation recovery
    • Within-grade, across-grade, and across-content-area collaboration
    • Extending learning time
  • More change with emphasis on reporting category scores, limitations, verification with other data, and patterns over time

24
Findings: Open-Ended Question 1
"Describe briefly the impact of receipt and analysis of the scores on your instruction."
  • 352 of 686 responded
  • 134 identified student strengths and weaknesses
  • 52 indicated little or no impact, especially with
    high achieving students
  • 32 indicated change in planning for instruction
  • 24 used scores for individualization
  • 14 used scores for small group instruction

25
Findings: Open-Ended Question 2
"Make suggestions that would enhance teacher use of scores."
  • 184 of 686 responded
  • Provide more detailed, specific, individualized data on the specific SOLs tested
  • Provide data earlier, during the summer
  • Provide more planning time and teacher workdays
  • Few barriers affecting use of scores were identified

26
Findings: Interviews (n = 12)
  • Teachers making data-driven decisions:
  • "Our scores were really low here, so we need to come up with some alternative activities or assessments or lessons, ideas to boost that part up."
  • "I look at areas that they did poorly with to see why they did poorly, was it my teaching method."

27
Findings: Interviews
  • Teachers collaborated with others:
  • "My team is working on that data this afternoon that we just got from their fourth grade year and trying to figure out how that's going to help us."
  • "We need to work together and be a real cohesive group because, you know, either we pass together, or we fail together."

28
Findings: Interviews
  • Teachers wanted more data
  • Help from the principal is key:
  • "Our administrator does an amazing job of focusing on the positive; his job as a coach has been wonderful."
  • Teachers looked for patterns and trends:
  • "If we see a pattern, then we get together and end up looking over how we have taught the material."

29
Findings: Interviews
  • Teachers wanted data before school begins in the fall:
  • "If I could get the test scores earlier, I mean, if I didn't come in the summertime, I wouldn't get them until September, and that's not going to do me any good."
  • "So this information comes back to us in September, but long after the fact. We would be more effective educators if we could get information back in time to do something with it."

30
Conclusions
  • Teachers are data-driven and use scores to change
    instruction
  • Effective use is related to collaborative teaming
  • Not all teachers receive disaggregated scores
  • Teachers report little group inservice and appear capable of interpreting scores
  • Teachers use most test-use best practice
    principles
  • Elementary teachers report more change in
    instruction than middle and high school teachers

31
Conclusions
  • Few differences between different subjects
  • No relationship between instructional change and whether the scores received were for previous or current students
  • Data received in mid to late summer associated
    with greater change in instruction
  • Teachers working with principals reported more
    changes in instruction
  • More instructional change reported by teachers who indicated familiarity with test-use principles
  • Teachers would like more detailed test results

32
Implications
  • Provide teachers with item analysis as well as
    reporting category and total scores
  • Provide scores in mid to late summer, before
    school opens in the fall
  • Encourage and facilitate collaborative teaming in
    interpreting and using scores
  • Ensure that teachers are knowledgeable about
    test-use principles
  • Encourage and facilitate verification of SOL test scores with additional data from different sources
  • Help teachers identify trends and patterns
  • Provide scores from previous and current students
  • Ensure that principals, department chairs, and
    lead teachers are knowledgeable and skilled in
    test score interpretation and use