What is High-Quality Assessment? Linking Research with Practice

Transcript and Presenter's Notes
1
What is High-Quality Assessment? Linking Research
with Practice
  • Santa Clara County Office of Education
  • June 23, 2014
  • Karin K. Hess, Ed.D.
  • khess@nciea.org or
  • karinhessvt@gmail.com

2
Presentation Overview
  • Clarify understandings of cognitive rigor/DOK
    using sample assessments & rubrics
  • Use the Hess Validation Tools & Protocols (Module
    3) to examine technical criteria for high-quality
    assessments (Formative & Performance)
  • Review tools & strategies to discuss & plan
    future assessment activities and support to
    teachers
  • Karin's coaching tips

3
Rubric Design & Formative Tools
  • Revisit handout from this morning: "What I need
    to do" rubric (citing evidence of proficiency)
  • Handout 2a: Find a half
  • Handout 2b: Hess Cognitive Rigor Matrix
    (Math-Science)
  • Handout 2c: What will this formative assessment
    uncover?
  • Work in small groups to analyze the assessment

4
What do we mean by high-quality performance
assessment?
  • At your tables, brainstorm examples of
    performance assessments in any content area (e.g.,
    arts, writing, science) or real-world assessments
    (driver's test, marriage planning, etc.)
  • Have a recorder write them down
  • You have only 3 minutes

5
Turn & talk: Select one PA from your list and
answer these questions
  • What is it actually assessing (skills &
    concepts)?
  • What makes it a PA?
  • What evidence is captured in the assessment that
    distinguishes poor from best performances?
  • What makes it a good performance assessment?
  • You have 5 minutes

6
Let's generalize
  • With regard to skills & concepts assessed: ______
  • What makes something a PA? ______
  • The kind of evidence that will distinguish poor
    from exemplary performances: _______
  • What makes it a good performance assessment?
    _________

7
What we know (from research) about High-Quality
Assessment
  • Defined by agreed-upon standards/expectations
  • Measures the individual's learning & can take
    different forms/formats
  • Measures the effectiveness of instruction and
    appropriateness of curriculum
  • Is transparent
  • Students know what is expected of them and how
    they will be assessed
  • Assessment criteria are clear and training is
    provided to educators and reviewers/raters.
  • Communicates information effectively to students,
    teachers, parents, administration and the public
    at large

8
Simply put, HQ assessments have
  • Clarity of expectations
  • Alignment to the intended expectations (skills,
    concepts)
  • Reliability of scoring and interpretation of
    results
  • Attention to the intended rigor (tasks & scoring
    guides)
  • Opportunities for student engagement & decision
    making
  • Opportunities to make the assessment fair &
    unbiased for all
  • Linked to instruction (opportunity to learn)

9
2. The DOK Matrix & Instructional Paths:
Instruction & Assessment Decisions
  • [Diagram] Each standard has an assigned Depth of
    Knowledge, and the DOK determines the cognitive
    level of instruction and the assessment format:
    Selected Response, Constructed Response, or
    Performance Tasks.
10
GOAL: Each validated assessment will
demonstrate
  • Clarity of expectations for the student and
    teacher(s)
  • Alignment (task & scoring) to the intended
    standards: content & performance/DOK
  • Opportunities for student engagement
  • Opportunities to make the assessment fair &
    unbiased for ALL students

11
First, we consider alignment
  • It's really about validity: making decisions
    about the degree to which there is a strong
    match between grade-level content standards &
    performance and the assessment/test
    questions/tasks
  • And making valid inferences about learning
    from an assessment score

12
"Validity is a matter of degree, rather than all
or none." (Robert Linn, 2008)
13
Alignment (validity) Questions
  • Is there a strong content match between
    assessment/test questions/tasks and grade-level
    standards?
  • Are the test questions/tasks (and the assessment
    as a whole) more rigorous, less rigorous, or of
    comparable rigor (DOK) to grade-level performance
    standards?

14
Task Validation Protocol: Handout 3 (K. Hess,
2013)
  • Table groups: review the technical criteria and
    descriptions on pages 3-4 in the protocol at your
    tables
  • What's one aspect you feel you (or teachers you
    work with) now do well in most local assessments?
  • What's one aspect you feel you (or your teachers)
    need to understand more deeply as you work with
    them?

15
Uses of the assessment task validation tools &
protocols
  • Develop new assessments
  • Analyze existing assessments
  • Validate a revised assessment or new assessment
    prior to broader administration (or purchase)
  • Provide OBJECTIVE feedback to assessment
    developers
  • Promote collaboration and a shared understanding
    of high-quality assessment

16
Local Validation Teams represent the diversity of
the school
  • Administrator/Leader/Coach
  • All content areas represented
  • All/most grade levels (grade spans) represented
  • PLUS: representation from special education, fine
    arts, HPE, CTE, foreign language, ELL, etc.
  • Decisions may differ depending on school
    configurations and staffing, but diversity in
    teams is critical, especially including special
    educators

17
Frequency of Validations?
  • Initially, learning & debriefing the process
    together serves as calibration, so everyone is
    on the same page, developing a shared
    understanding of what high-quality assessment
    looks like
  • School teams set up their own schedules: once each
    month, every other month, as needed, highest
    priority first, etc.
  • Team members may rotate on and off so more (or
    all) staff are involved over time

18
Getting ready for validation
  • Grade-level or department teams develop the
    assessments using the Basic Validation Protocol
    (e.g., a grade 2 team might develop a common math
    assessment for all grade 2 classes/schools)
  • Developers put the assessment on the local
    (school/district) validation calendar
  • Validation teams prioritize the order of
    validations: common assessments and major
    assessments first, second-round reviews after
    getting feedback, etc.

19
Validation Materials
  • Each team member needs (electronic) validation
    protocols (Handout Module 3, pages 3-4)
  • Each person needs a copy of the cover page with
    the assessment and scoring rubric/answer key
    (Handout Module 3, pages 5-6)
  • There may be additional materials (e.g., anchor
    papers, examples) that do not need to be copied
    for everyone but may be helpful to see during the
    review
  • Each person needs a content-specific DOK
    reference sheet (Handout Module 1, tools 1, 2,
    or 3)

20
Validation Protocols 1
  • Each time, preview norms for working together
  • "I am ..."
  • "I am NOT ..."
  • Choose a recorder to keep an electronic record &
    provide a copy of feedback for the assessment
    developers
  • Date and list validation panel names on the
    official copy (this can be set up ahead of
    time)
  • Individually, take 5-10 minutes to read through &
    make notes before any discussion

21
Sample norms (Source: adapted from Powell, WY)
  • I AM
  • Keeping electronic devices on vibrate/off
  • Listening to understand other points of view
  • Respecting everyone as a professional
  • Focusing on the issues
  • Avoiding side conversations
  • Encouraging everyone to have a turn to speak
  • Refraining from judgmental statements
  • Representing the best interests of all students
  • Asking clarifying questions
  • Demonstrating a commitment to the process
    (attending meetings, on time, etc.)
  • Others?
  • I AM NOT
  • Using killer phrases
  • Preparing my next remark instead of listening
  • Sounding apologetic
  • Engaging in unrelated activities
  • Using negative gestures/body language
  • Others?

22
Optional: Validation Protocols 2
  • Should the authors present the task at the start
    (especially if 2nd round)? There are pros & cons
    to this
  • Go over what is on the cover page/what is
    included and what the purpose of the assessment
    is
  • 2-5 minutes to explain the materials in the
    packet, with no interruptions from the validation
    panel
  • Panel then asks clarifying questions only
  • This is NOT for depth of understanding, just to
    know/clarify what is there BEFORE silently
    reading & discussing

23
Validation Protocols 3
  • Make notes individually before discussion
  • Choose a task manager/timekeeper who keeps things
    moving & reads each indicator on the Validation
    Protocols
  • Have a process to reach consensus (fist to five,
    thumbs up, etc.); be sure to involve each person!
  • Choose 2 people to give feedback to the
    authors/developers & rehearse comments
  • DEBRIEF! Did we honor norms? What went well/needs
    to be refined next time?

24
Giving Feedback
  • Use descriptive language, NOT judgmental language
  • While you may wonder about instructional pieces,
    comments/suggestions about instruction are
    probably not appropriate
  • Your job is NOT to redo the assessment! Keep
    feedback crisp & to the point (e.g., pose a
    question); it is the developers' job to decide
    what to do next to strengthen the assessment
    tasks.

25
Giving Feedback (continued)
  • Well-written, clear feedback guides assessment
    developers to make a stronger assessment in the
    end.
  • Place your positive (and descriptive) comments
    under the feedback section (Module 3, page 7):
    What makes this a HQ (high-quality) assessment?

26
Examples of Feedback (noted on page 7)
  1. We were unable to locate ...
  2. We think this might be DOK 2, not DOK 3,
     because ... What do you think?
  3. We were not clear what the student is expected to
     do or to produce. Did you mean ...?
  4. This might be better aligned to this standard ...
  5. As hard as it will be, avoid saying "we liked ..."
     This implies you did not like other things, and
     your job is NOT to "like" the assessment.
  6. Include the HQ positives! ("The directions are
     clear," "students have authentic choices," etc.)

27
Debrief each time!
  • Did the validation team honor the norms at all
    times?
  • Do we need to modify/revise norms?
  • What went well?
  • What could have gone better?
  • What will we do differently next time?
  • Who/when will we meet with authors to give
    feedback?