1
Georgia Alternate Assessment
Eighth Annual Maryland Conference, October 2007
Melissa Fincher, Georgia Department of Education
Claudia Flowers, UNCC
2
GAA Purpose
  • To ensure all students, including students with
    significant cognitive disabilities, are provided
    access to the state curriculum
  • To ensure all students, including students with
    significant cognitive disabilities, are given the
    opportunity to demonstrate their progress in
    learning and achieving high academic standards

3
Overview of the GAA
  • The GAA is a portfolio of student work provided
    as evidence that a student is making progress
    toward grade-level academic standards.
  • Evidence provided must show instructional
    activities and student work that is aligned to
    specific grade-level standards.
  • The portfolio system is flexible, allowing for the
    diversity of the students participating in the
    GAA.

4
GAA Core Belief and Guiding Philosophy
  • All students can learn when provided access to
    instruction predicated on the state curriculum
  • Educators are key: significant training and
    support surrounding curriculum access is critical
  • Test development and technical documentation are
    ongoing and include documentation of decisions
    surrounding development and implementation
  • Technical expertise is important
  • Georgia's Technical Advisory Committee,
    augmented with an AA-AAS expert

5
Additional Resources
  • Georgia took advantage of
  • Learning from other states
  • The growing understanding in the field of
    alternate assessments and what students with
    significant cognitive disabilities can do
  • The US Department of Education's offer of
    technical assistance
  • We elected to focus on technical documentation
  • Invitation to have the National Alternate
    Assessment Center (NAAC) Expert Review Panel
    review documentation
  • National Center on Educational Outcomes (NCEO)
  • Peer Review

6
Description of GAA
  • Structured Portfolio: a compilation of student
    work that documents, measures, and reflects
    student performance and progress in
    standards-based knowledge and skills over time

7
Overview of the GAA
  • English/Language Arts (Grades K-8 and 11)
  • Entry 1: Reading Comprehension
  • Entry 2: Communication (Writing or
    Listening/Speaking/Viewing)
  • Mathematics (Grades K-5)
  • Entry 1: Numbers and Operations
  • Entry 2: Choice from
  • Measurement and Geometry,
  • Data Analysis and Probability, or
  • Algebra (Grades 3-5)

8
Overview of GAA
  • Mathematics (Grades 6-8 and 11)
  • Entry 1: Numbers and Operations or Algebra
  • Entry 2: Choice from
  • Measurement and Geometry,
  • Data Analysis and Probability, or
  • Algebra
  • Science (Grades 3-8 and 11)
  • Entry 1: Choice from blueprint, paired with a
    Characteristics of Science standard
  • Social Studies (Grades 3-8 and 11)
  • Entry 1: Choice from blueprint
  • The Algebra strand is mandated for Grade 11.

9
Overview of the GAA
  • There are two collection periods for each entry
    over the course of the school year; the minimum
    time between collection periods is 3 weeks.
  • Teachers collect evidence of student performance
    within tasks aligned to a specific grade-level
    content standard.
  • This evidence shows the student's progress toward
    those standards.
  • Each entry consists of 4 pieces of evidence:
  • Primary and Secondary for Collection Period 1
  • Primary and Secondary for Collection Period 2

10
Types of Evidence
  • Primary Evidence
  • demonstrates knowledge and/or skills either
    through work produced by the student or by any
    means that shows the student's engagement in
    instructional tasks.
  • Secondary Evidence
  • documents, relates, charts, or interprets the
    student's performance on similar instructional
    tasks.

11
Entry
12
Permanent Product
13
Captioned photos clearly show the student in the
process of the task as well as his completed
product. The captions describe each step of the
task and annotate the student's success.
14
Rubric Dimensions
  • Fidelity to Standard
  • the degree to which the student's work addresses
    the grade-level standard
  • Context
  • the degree to which the student's work is
    purposeful and uses grade-appropriate materials
    in a natural/real-world application
  • Achievement/Progress
  • the degree of demonstrated improvement in the
    student's performance over time
  • Generalization
  • the degree of opportunity given to the student to
    apply the learned skill in other settings and
    with various individuals across all content areas
    assessed

15
Rangefinding and Scoring
  • Rangefinding took place in Georgia
  • committee of general and special educators
  • scored a representative sample of portfolios
  • provided guidance and a rationale for each score
    point assigned, which were used to create scoring
    guides and training/qualifying sets
  • Scoring took place in Minnesota
  • GaDOE staff on site
  • 15% read-behind and other typical
    quality-control checks

16
2006-2007 Entries by Grade
17
Standard Setting
  • Three performance/achievement standards called
    Stages of Progress
  • Emerging Progress
  • Established Progress
  • Extending Progress
  • Descriptions written by development committee

18
Definitions of Stages of Progress
19
Standard Setting Method
  • Portfolio Pattern Methodology
  • combined critical aspects of the Body of Work
    (Kingston, Kahl, Sweeney, & Bay, 2001) and the
    Massachusetts (Wiener, 2002) models
  • Holistic view of student work and direct tie to
    the analytic rubric as applied to performance
    levels
  • Standards set by grade bands: K-2, 3-5, 6-8,
    and 11
  • Articulation committee reviewed recommendations
    across grade bands and content areas

20
Individual Student Report
Individual Student Report Side 1
Individual Student Report Side 2
21
First Year Look at Data
  • Reliability
  • The potentially largest source of error is the
    test administrator
  • Inter-rater agreement (% exact agreement,
    % adjacent agreement, Kappa)
  • Correlation between scores for Entry 1 and Entry
    2
  • G-study (persons, items, raters)
  • Comparability of scores across years (planned)
  • stability over time
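The agreement statistics named above can be illustrated with a minimal sketch. The rater scores below are hypothetical, not GAA data; the only assumption is that raters assign integer rubric score points, so "adjacent" means within one point.

```python
# Sketch of the three inter-rater agreement statistics listed above:
# exact agreement, adjacent agreement, and Cohen's kappa.
# Rater data is hypothetical, for illustration only.
from collections import Counter

def exact_agreement(r1, r2):
    """Proportion of cases where both raters assign the same score."""
    return sum(a == b for a, b in zip(r1, r2)) / len(r1)

def adjacent_agreement(r1, r2):
    """Proportion of cases where scores differ by at most one point."""
    return sum(abs(a - b) <= 1 for a, b in zip(r1, r2)) / len(r1)

def cohens_kappa(r1, r2):
    """Agreement corrected for chance: (p_o - p_e) / (1 - p_e)."""
    n = len(r1)
    p_o = exact_agreement(r1, r2)          # observed agreement
    c1, c2 = Counter(r1), Counter(r2)
    # expected chance agreement from each rater's marginal distribution
    p_e = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical scores from two raters on 10 portfolio entries
rater1 = [1, 2, 2, 3, 1, 4, 2, 3, 3, 2]
rater2 = [1, 2, 3, 3, 1, 4, 2, 2, 3, 2]
print(exact_agreement(rater1, rater2))     # → 0.8
print(adjacent_agreement(rater1, rater2))  # → 1.0
print(cohens_kappa(rater1, rater2))        # → ~0.714
```

Kappa falling well below raw exact agreement, as in the tables that follow, is expected: it discounts the agreement two raters would reach by chance alone.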

22
Inter-rater Agreement
Exact agreement (%), based on a 15% read-behind:

                       ELA      ELA      Math     Math     Science  Social
                       Entry 1  Entry 2  Entry 1  Entry 2           Studies
Fidelity to Standard   86.4     84.7     88.4     88.3     88.2     89.8
Context                88.0     86.3     88.9     89.6     90.0     91.1
Progress               75.1     74.4     73.3     72.7     76.8     76.0
Generalization         85.3     85.3     85.3     85.3     85.3     85.3
23
Kappa
Kappa, based on a 15% read-behind:

                       ELA      ELA      Math     Math     Science  Social
                       Entry 1  Entry 2  Entry 1  Entry 2           Studies
Fidelity to Standard   .71      .69      .70      .73      .76      .77
Context                .76      .73      .75      .76      .79      .80
Progress               .64      .64      .61      .61      .67      .66
Generalization         .76      .76      .76      .76      .76      .76
24
Correlation Between Entry 1 and Entry 2
                       ELA    Math
Fidelity to Standard   .57    .61
Context                .51    .54
Achievement/Progress   .57    .59
25
Sources of Evidence for Validity
  • Inter-correlation among the dimensions
  • Content and Alignment: Fidelity to Standard
  • Consequential Validity Study
  • Curriculum access (baseline completed)
  • Alternate assessment (planned)
  • Comparability of scores across years (planned)
  • Alignment study (Links for Academic Learning)
  • Content coverage (depth and breadth)
  • Differentiation across grades (vertical
    relationship)
  • Barriers to learning (bias review)
  • Alignment of instruction

26
ELA Entry 1 Correlations for Grade 3
                       Fidelity to  Context  Achievement/  Generalization
                       Standard              Progress
Fidelity to Standard   1.0
Context                .20          1.0
Achievement/Progress   .22          .18      1.0
Generalization         .06          .13      .17           1.0
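An inter-correlation matrix like the one above comes from Pearson correlations between per-student rubric-dimension scores. A minimal sketch, using hypothetical scores rather than GAA data:

```python
# Sketch: Pearson correlation between two rubric dimensions,
# computed from per-student scores. Data is hypothetical.
from math import sqrt

def pearson(x, y):
    """Pearson product-moment correlation of two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / sqrt(sxx * syy)

# Hypothetical Fidelity-to-Standard and Context scores for 6 students
fidelity = [3, 2, 4, 1, 3, 2]
context  = [2, 2, 3, 1, 3, 3]
print(pearson(fidelity, context))
```

Low off-diagonal values, as in the grade-3 matrix above, suggest the rubric dimensions capture largely distinct aspects of student performance.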
27
The Challenge of Documenting Technical Quality of
Alternate Assessments
  • Diversity of the group of students being assessed
    and how they demonstrate knowledge and skills
  • Often flexible assessment experiences
  • Relatively small numbers of students/tests
  • Evolving view/acceptance of academic curriculum
    (i.e., learning experiences)
  • The high degree of involvement of the
    teacher/assessor in administering the assessment
    (Gong & Marion, 2006)
  • A lack of measurement tools for evaluating
    non-traditional assessment approaches (e.g.,
    portfolios, performance tasks/events),
    demonstrating a need to expand the
    conceptualization of technical quality (Linn,
    Baker, & Dunbar, 1991)
  • NHEAI / NAAC

28
Georgia Technical Considerations
  • Heavy investment in training special educators
  • Curriculum access training for teachers started
    two years before implementation of the assessment
  • Four-step process for aligning instruction to
    grade-level standards
  • Teacher Training (Regional and Online)
  • Regional workshops and online presentations
  • Electronic message/resource board
  • Sample activities vetted with curriculum staff

29
GAA Technical Documentation
  • Our documentation includes
  • Rationale for our AA approach (portfolio)
  • Consideration of the purpose of the assessment
    and its role in our assessment system
  • Consideration of who the students are and how
    they build and demonstrate their achievement
  • Consideration of the content assessed and
    establishment of alignment, including monitoring
    plans

30
GAA Technical Documentation
  • Our documentation includes
  • Development of the assessment, including
    rationale for key decisions
  • Training and support for educators responsible
    for compiling portfolios from both a curriculum
    access and assessment perspective
  • Consideration and analysis of potential test bias
  • Scoring and reporting procedures, including steps
    taken to minimize error

31
GAA Technical Documentation
  • Our documentation includes
  • Our plans for specific validity studies
  • Traditional statistics you would expect to find
    in technical documents
  • Inter-rater reliabilities
  • Score point distributions with standard
    deviations
  • Correlations of rubric dimensions by content area
  • Our goal is to collect validity evidence over
    time and systematically document the GAA story

32
Next Steps
  • Continue refining the program
  • Continue training and support of teachers
  • Expanding resource board with adapted lessons and
    materials
  • Conduct a series of validity studies
  • In collaboration with NAAC and other states (GSEG)