Transcript and Presenter's Notes

Title: Value-Added Assessment: One Star in the Constellation of Organizational Development and Transformation


1
Value-Added Assessment: One Star in the Constellation of Organizational Development and Transformation
  • Dr. Jim Lloyd
  • Assistant Superintendent
  • Olmsted Falls City Schools

2
Advanced Organizers
  • Olmsted Falls is a SOAR District
  • Olmsted Falls will become part of BFK's T-CAP
  • Lloyd (2008) DVAS reported:
  • The need for further PD related to using data to
    impact teaching and learning
  • The need to fit EVAAS in with other data sets
  • The need to use EVAAS as an improvement tool

3
Objectives of the Presentation
  • Understand the following points:
  • Value-added data is one very important component of the continuous improvement process.
  • EVAAS is a rear-view-mirror analysis.
  • The story behind the added value is most important.
  • Special programs do not lead to increases in student achievement or progress.
  • Changes in adult behavior do lead to increases in student achievement and progress.
  • Play small ball and do not try to hit a grand slam; get teachers to begin to do things differently and share those experiences.

4
What's in your folder?
  • Part III of a presentation that I gave to our
    middle school staff last year; I handed out the
    exploration questions that were created for the
    groups.
  • An article from the Principal Navigator
  • Chapter V
  • OFCS Power Walkthrough Template
  • OLAC Leadership Development Framework

5
What did Sanders &amp; others tell us?
6
Factors related to student learning District,
School, and Teacher Influence on Student Progress
  • The following inferences were shared at the Governor's Education Symposium (2004).
  • Based on 22 years of value-added study, Dr. Sanders draws the following conclusions:
  • Variation in student academic progress can be attributed this way:
  • 5% attributed to district quality
  • 30% attributed to school quality
  • 65% attributed to teacher quality

7
  • Socio-economic status
  • Early educational opportunities
  • Parents' educational level
  • School factors

Influences on student achievement
8
  • Teacher quality
    • Use of formative assessment
    • Clear learning targets
    • Quality instructional practices
  • School effects
    • Clear mission/vision
    • Goal setting
  • District effects

Influences on student PROGRESS/GROWTH
9
Things People Will Say about EVAAS
  • Districts and schools with high achievement scores can't make gains to demonstrate growth, so this model isn't fair.
  • This model isn't reliable and valid; there is discrepant research in the field about it.

10
How often do students score within the Top 3 Scaled Score Points Two Years in a Row?

Subject            Students Considered    Percentage scoring in the top three OAT scaled scores two years in a row
4th Gd. Reading    26,511                 0.18%
4th Gd. Math       26,511                 0.15%
5th Gd. Reading    26,695                 0.12%
5th Gd. Math       26,695                 0.21%
6th Gd. Reading    26,718                 0.04%
6th Gd. Math       26,718                 0.05%
7th Gd. Reading    26,699                 0.04%
7th Gd. Math       26,699                 0.01%
8th Gd. Reading    27,919                 0.19%
8th Gd. Math       27,919                 0.05%
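
A minimal sketch, assuming per-student score records and a known "top 3 scaled score points" cutoff for each year, of how a repeat rate like those in the table could be computed. The function, field names (student_id, year, score) and the cutoff lookup are illustrative assumptions, not the ODE's actual procedure.

    # Hypothetical repeat-rate computation: share of students who score at or
    # above the "top 3 scaled score points" cutoff in the same subject two
    # years in a row. Field names and the cutoff lookup are assumptions.
    def top3_repeat_rate(rows, cutoffs):
        # rows: iterable of dicts with "student_id", "year", "score"
        # cutoffs: {year: minimum score counted as a top-3 scaled score}
        by_student = {}
        for r in rows:
            by_student.setdefault(r["student_id"], {})[int(r["year"])] = float(r["score"])
        considered = repeated = 0
        for scores_by_year in by_student.values():
            if len(scores_by_year) < 2:
                continue  # need at least two years of scores to be considered
            considered += 1
            # use the two earliest years on record for this student
            (y1, s1), (y2, s2) = sorted(scores_by_year.items())[:2]
            if s1 >= cutoffs[y1] and s2 >= cutoffs[y2]:
                repeated += 1
        return 100.0 * repeated / considered if considered else 0.0

On this reading, the district figures on the next slide are the same calculation: 2 repeaters out of 172 students considered comes back as 100 * 2 / 172, or about 1.16%.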
11
How did the Suburban Districts Do, in particular?
  • The highest percentage of students scoring within the top three scaled scores two years in a row was a little over 2%.
  • Five wealthy Ohio suburban school districts had the following highest (district-best) rates of students scoring within the top 3 scaled scores 2 years in a row:
  • District A 2/172 (1.16%) 8th gd. Reading
  • District B 7/612 (1.14%) 5th gd. Math
  • District C 5/266 (1.88%) 4th gd. Math
  • District D 1/77 (1.30%) 4th gd. Math
  • District E 1/58 (1.72%) 5th gd. Math
  • These were the highest rates these districts saw
    for any grade for students repeating top-3 scaled
    score performances across years within an OAT
    subject

12
Organizational Development Through Collaborative
Exploration
  • Work of the Ohio Leadership Advisory Council
    (OLAC)
  • Things You Should Consider
  • Establish a District Leadership Team
  • Establish Building Leadership Teams
  • Work on the work

13
About exploration
  • "Excellent with Distinction" doesn't mean much when you don't know exactly why.
  • We needed to look at data points in order to see
    our constellation

14
The Leadership for Learning Framework (Reeves,
2006)
15
The Olmsted Falls Effect Constellation
  • End of Course Exams
  • SAT/ACT
  • OAT Data
  • Implementation Data
  • Classroom Walkthroughs
  • SOAR
  • Graduation Data
  • CASL Data
  • EVAAS Data
  • Perception Data
16
We're working on clearly defining the "Cause" constellation now
17
Our exploration mechanism
18
(No Transcript)
19
(No Transcript)
20
Our process
  • Conduct a cause and effect analysis
  • Use an array of data points including both SOAR
    and ODE value-added information
  • Define a very limited number of goals
  • Our district foci: get better at 2 things
  • Clarity of Learning Targets
  • Student Feedback

21
OFCS Goal
  • Stated in measurable terms: By 2011, OFCS will have experienced a 5% increase in proficient students in all buildings in each core subject area when compared to 2008 baseline performance, as measured by the OAT and OGT.
  • Specific, Measurable, Attainable, Results-oriented, Time-bound
  • Increase student proficiency in all buildings in the core. Does this mean we should only aim for proficiency? NO! (A quick check of this goal is sketched after this slide.)

J_Lloyd_2008
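
A small illustration of measuring the goal, assuming the "5% increase" is read as five percentage points of proficient students over the 2008 baseline in each building; the building names and rates below are invented for illustration, not OFCS data.

    # Hypothetical goal check: compare 2011 proficiency rates against the 2008
    # baseline, building by building. Assumes "5% increase" means 5 percentage
    # points; all names and numbers are invented for illustration.
    baseline_2008 = {"Building A": 82.0, "Building B": 76.5}   # percent proficient
    results_2011 = {"Building A": 88.1, "Building B": 80.0}

    for building, base in baseline_2008.items():
        gain = results_2011[building] - base
        status = "met" if gain >= 5.0 else "not yet met"
        print(f"{building}: +{gain:.1f} points -> goal {status}")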
22
How will we accomplish this?
  • Strategy
  • Deconstruct the most important learning targets in each content area into degrees of cognitive complexity, then implement and monitor them, in order to articulate their meaning more clearly to students.

23
What evidence do we need to measure our progress?
  • Make the learning targets clearer for students in the core curriculum in grades PreK-12.
  • Create an implementation system to determine whether or not the essential learning targets are clear to students prior to, during and after instruction (see the tally sketch after this list).
  • Develop a balanced assessment system that
    emphasizes formative feedback to students during
    learning and has points of data collection after
    learning.
  • Provide time and support for teachers to
    collaborate on student learning
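
The implementation-system bullet above can start with something as simple as tallying walkthrough observations. A minimal sketch follows; the observation fields (target_posted, students_could_state_target) are hypothetical stand-ins, not the OFCS Power Walkthrough Template.

    # Hypothetical tally of classroom walkthrough observations used to monitor
    # whether learning targets are clear to students. Field names and data are
    # invented for illustration.
    walkthroughs = [
        {"room": "6A", "target_posted": True, "students_could_state_target": True},
        {"room": "6B", "target_posted": True, "students_could_state_target": False},
        {"room": "7C", "target_posted": False, "students_could_state_target": False},
    ]

    total = len(walkthroughs)
    posted = sum(w["target_posted"] for w in walkthroughs)
    stated = sum(w["students_could_state_target"] for w in walkthroughs)
    print(f"Targets posted: {posted}/{total} ({100 * posted / total:.0f}%)")
    print(f"Students could state the target: {stated}/{total} ({100 * stated / total:.0f}%)")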

24
  • Making the Learning Targets Clearer

J_Lloyd_2008
25
Clarity of Learning Targets
26
Why clarity?
  • It establishes where the learners are in their
    learning.
  • It establishes where they are going.
  • It provides them with advanced organizers on how to get there. If we don't start with clear targets, we won't end with sound assessments.

27
What do we mean by clarity?
  • Start by considering all indicators
  • Identify Power Indicators (PIs) by content area for each grade level
  • Link PIs to course content and course descriptions
  • Learning targets are written in student- and parent-friendly language
  • Unwrap learning indicators for the standards in order to identify concepts, skills, Essential Questions &amp; Big Ideas
  • Use a learning taxonomy to identify the complexity of learning targets

J_Lloyd_2008
28
Benefits of Clarity
  • Research indicates students can hit targets they
    can see
  • Increases opportunities for formative assessment
    and student feedback
  • Teachers talking about and agreeing on targets
    makes them clearer to everyone
  • Posting targets in the classroom and talking
    about them before, during and after instruction
    makes them more relevant
  • Breaking targets down by level of complexity makes them clearer to everyone

29
PD Implications of Clarity
  • Identify Power Indicators and actually use them to make the learning targets clearer for students
  • Student-friendly learning targets prior to, during and after lessons
  • Big Ideas and Essential Questions prior to,
    during and after lessons
  • Asking students if the targets are clear
  • Monitor the implementation of our professional
    development to ensure it is changing
    instructional practice (classroom walkthroughs)

30
(No Transcript)
31
(No Transcript)
32
The Power of Student Feedback
33
High-quality assessment is indistinguishable from high-quality instruction
34
What do we know about classroom assessment?
  • Finding 1 Classroom assessment feedback should
    provide students with a clear picture of their
    progress on learning goals and how they might
    improve.
  • Hattie (1992); Hattie &amp; Timperley (2007)
  • Bangert-Drowns, Kulik, Kulik &amp; Morgan (1991)
  • Telling students whether they were correct or incorrect had a negative effect on their learning.
  • Explaining the correct answer and having students refine their responses was associated with gains in learning (20 percentile points).
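
The percentile-point gains quoted on this and the following slides are the usual translation of an effect size for a student who starts at the 50th percentile: gain = 100 × (Φ(d) − 0.5), where Φ is the standard normal CDF. A quick sketch of that conversion; the effect sizes fed in below are illustrative round numbers, not the exact values from the studies cited.

    # Convert an effect size d (in standard-deviation units) into the expected
    # percentile gain for a student starting at the 50th percentile.
    from statistics import NormalDist

    def percentile_gain(d: float) -> float:
        return 100.0 * (NormalDist().cdf(d) - 0.5)

    # Illustrative round-number effect sizes, not the cited studies' exact figures:
    for d in (0.3, 0.7, 1.0):
        print(f"effect size {d:.1f} -> about {percentile_gain(d):.0f} percentile points")

An effect size of about 0.7, for example, corresponds to roughly the 26-percentile-point gain quoted for well-done formative assessment on a later slide.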

35
What do we know about classroom assessment?
  • Finding 1 Classroom assessment feedback should
    provide students with a clear picture of their
    progress on learning goals and how they might
    improve.
  • Fuchs &amp; Fuchs (1986) analyzed 21 studies
  • Graphic displays of results enhance student learning.
  • Results interpreted by a set of rules (like a rubric) enhanced student achievement by 32 percentile points.

36
What do we know about classroom assessment?
  • Finding 2 Feedback on classroom assessment
    should encourage students to improve
  • Kluger &amp; DeNisi (1996)
  • The manner in which feedback is communicated greatly affects its effect on achievement.
  • When feedback is negative, it decreases achievement by 5.5 percentile points.

37
What do we know about classroom assessment?
  • Marzano (2006) identified 2 characteristics of
    effective feedback.
  • Feedback must provide students with a way to interpret even low scores in a manner that does not imply failure.
  • Feedback must help students realize that effort
    on their part results in more learning.

38
65 D
39
What do we know about classroom assessment?
  • Finding 3 Classroom assessment should be
    formative
  • Black &amp; Wiliam (1998) analyzed 250 studies
  • Formative assessment done well results in student
    achievement gains of about 26 percentile points.
  • It has the highest impact on those students who
    have a history of being low achievers.

40
Our definition
  • FORMATIVE ASSESSMENT
  • is a planned process in which assessment-elicited evidence of students' status is used by teachers to adjust their ongoing instructional procedures or by students to adjust their current learning tactics.
  • Popham, W. J. (2008). Transformative Assessment. Alexandria, VA: ASCD.

41
What do we know about classroom assessment?
  • Finding 4 Formative classroom assessments should
    be frequent
  • Bangert-Drowns, Kulik &amp; Kulik (1991) meta-analysis (29 studies)
  • The frequency of formative classroom assessments is related to student achievement.

42
(No Transcript)
43
The Power of Feedback: gains in student achievement
  • For SPED students
  • Cues &amp; corrective feedback
  • Cues, participation, reinforcement &amp; corrective feedback
  • Reducing class size
  • Rewards &amp; punishment
  • Teacher praise

  • 39 percentile points
  • 37 percentile points
  • 27 percentile points
  • 5 percentile points
  • 5 percentile points
  • 4 percentile points

44
PD Implications of Feedback
  • Establish data/learning teams and structure
    collaborative time
  • Provide opportunities for teachers to learn and
    share feedback strategies
  • Have teachers observe each other to see how it
    occurs
  • Monitor the implementation of our professional
    development to see if it is changing
    instructional practice (classroom walkthroughs)

45
Close Your Knowing-Doing Gap
  • Implement and monitor the things that you're already doing
  • Provide people with time to reflect on the results

46
(No Transcript)
47
(No Transcript)
48
(No Transcript)