Title: Developing Assessments for and of Deeper Learning [Day 2b-afternoon session]
Slide 1: Developing Assessments for and of Deeper Learning
Day 2b, afternoon session
- Santa Clara County Office of Education
- June 25, 2014
- Karin K. Hess, Ed.D.
- khess@nciea.org or karinhessvt@gmail.com
Slide 2: Presentation Overview
- Clarify understandings and common misconceptions about rigor/DOK and deeper learning
- Use the Hess Validation Tools to examine sample performance tasks
- Give rubrics the "chocolate chip cookie taste test"
- Be inspired by Karin's performance assessment coaching tips
- Plan and get feedback on future assessment activities and/or support to teachers
Slide 3: What we know (from research) about High-Quality Assessment
- Is defined by agreed-upon standards/expectations
- Measures the individual's learning; can take different forms/formats
- Measures the effectiveness of instruction and the appropriateness of curriculum
- Is transparent
  - Students know what is expected of them and how they will be assessed
  - Assessment criteria are clear, and training is provided to educators and reviewers/raters
- Communicates information effectively to students, teachers, parents, administrators, and the public at large
Slide 4: Simply put, HQ assessments have...
- Clarity of expectations
- Alignment to the intended content expectations (skills and concepts)
- Reliability of scoring and interpretation of results
- Attention to the intended rigor (tasks and scoring guides)
- Opportunities for student engagement and decision making
- Opportunities to make the assessment fair and unbiased for all
- Links to instruction (opportunity to learn)
Slide 5: 2. The DOK Matrix and Instructional Paths
Instruction and Assessment Decisions:
- Each standard has an assigned Depth of Knowledge (DOK).
- The DOK determines the cognitive level of instruction.
- Assessment formats: Selected Response, Constructed Response, Performance Tasks
Slide 6: First we consider alignment
- It's really about validity: making decisions about the degree to which there is a strong match between grade-level content standards (performance) and the assessment/test questions/tasks
- And making valid inferences about learning resulting from an assessment score
Slide 7: Alignment (validity) Questions
- Is there a strong content match between assessment/test questions/tasks and grade-level standards?
- Are the test questions/tasks (and the assessment as a whole) more rigorous, less rigorous, or of comparable rigor (DOK) to grade-level performance standards?
Slide 8: Some Common Misconceptions about DOK
- "All kids can't think deeply" or "Kids don't need scaffolding to get there."
- Webb's DOK model is a taxonomy (4 vs. 1)
- Bloom's verbs/levels = Webb's DOK
- DOK is about difficulty.
- All DOK levels can be assessed with a multiple-choice question (that's just dumb!)
- Higher-order thinking = deeper learning
- Multi-step tasks, multiple texts, or complex texts always mean deeper thinking
Slide 9: Basic Task Validation Protocol, Handout 2a (K. Hess, Linking Research with Practice, Module 3, 2013)
- Table groups: review the technical criteria and descriptions of the Basic Validation Protocol
- Select a sample assessment task to review
  - Handout 2b: Writing CRM
- Discuss what you see in terms of these criteria:
  - Purpose and use: how might you use the results?
  - Clarity: is it clear what is expected?
  - Alignment: are the task's content and rigor/DOK appropriate for the grade level and the intended use of data?
  - Engagement: is there opportunity for student decision making?