Constructing Rubrics for Open-ended Activities - PowerPoint PPT Presentation

Transcript and Presenter's Notes



1
Constructing Rubrics for Open-ended Activities
  • ASEE 2003 Workshops
  • June 22, 2003

2
Workshop Materials
  • Printed copies of materials will be passed out
    today
  • Also available at the Foundation Coalition web
    page
  • www.foundationcoalition.org

3
Workshop Presenters
  • Susan Haag, Director of Assessment, College of
    Engineering and Applied Sciences, Arizona State
    University
  • Ann Kenimer, Associate Professor, Department of
    Agricultural Engineering, Texas A&M University

4
Terms Used in Workshop
  • Qualitative Assessment
  • Open-ended data
  • -Content analysis
  • -Rubric
  • -Check-list
  • -Inter-rater
  • -Intra-rater
  • Objective Assessment
  • Closed-ended data
  • -Forced-choice response
  • Pre-Determined Criteria
  • Reliability
  • Validity
  • -Theoretical
  • -Face
  • -Criterion
5
What is a Rubric? (Pre-Determined Criteria)
  • Definition of Rubric [3, 9]
  • A systematic scoring methodology to make
    qualitative assessment and evaluation more
    reliable and objective by applying pre-determined
    criteria
  • e.g., Descriptive criteria are developed to serve
    as guidelines for scorers to assess, rate and
    judge student performance

6
What is a Rubric? (Open-ended Data)
  • It is a tool used in the qualitative assessment
    of open-ended data, such as
  • Written or oral narratives
  • Diagrams or models
  • Written or oral enumerations
  • Behavioral demonstrations
  • of a student's knowledge, applied skill, or
    ability to perform

7
How Are Rubrics Used? (Open-ended Data)
  • Advantages and drawbacks of assessing
    open-ended data [7]
  • Advantages
  • Can yield rich information (i.e., individual,
    creative, complex, fine-tuned)
  • Drawbacks
  • Involves subjectivity in interpreting and
    scoring data (i.e., the judgments of individual
    scorers), as contrasted with objective tests
  • Problems with reliability (both inter-rater and
    intra-rater, across time)

8
How Are Rubrics Used? (Open-ended Data)
  • Other methods of Qualitative Assessment used with
    open-ended data
  • Content analysis and coding [10]
  • Inventory checklists [11]
  • Rubrics

9
How Are Rubrics Used? (Diagnostic Feedback)
  • Descriptions of performance standards may serve
    to communicate to students what is expected of
    quality performance [5]
  • e.g., Ideal, expected performance described in a
    rubric can be explicitly compared with individual
    performance in order to convey what aspects of
    performance need improvement

10
Rubric Types
  • Rubrics may be used holistically or
    analytically
  • Holistic Rubric [5]
  • The entire response is evaluated and scored as a
    single performance category
  • Analytical Rubric [5]
  • The response is evaluated with multiple
    descriptive criteria for multiple performance
    categories (see the sketch below)
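As a concrete illustration of the two types, a minimal Python sketch follows; the criteria, levels, and weights are hypothetical examples, not taken from the workshop materials. A holistic rubric returns one category for the whole response, while an analytical rubric records a level for each criterion and combines them.

```python
# Minimal sketch of holistic vs. analytical scoring.
# All criteria, levels, and weights below are hypothetical examples.

HOLISTIC_LEVELS = {6: "Exemplary", 4: "Competent", 2: "Inadequate"}

ANALYTICAL_CRITERIA = {
    "explanation": 0.4,   # clarity and coherence of the explanation
    "diagram": 0.2,       # quality of the supporting diagram
    "math_ideas": 0.4,    # understanding of the mathematical ideas
}

def holistic_score(level):
    """The entire response is judged once, as a single performance category."""
    return f"{level} ({HOLISTIC_LEVELS[level]})"

def analytical_score(levels_by_criterion):
    """Each criterion is judged separately; a weighted sum is one way to combine them."""
    return sum(ANALYTICAL_CRITERIA[c] * level
               for c, level in levels_by_criterion.items())

print(holistic_score(6))                    # 6 (Exemplary)
print(analytical_score({"explanation": 5,
                        "diagram": 3,
                        "math_ideas": 6}))  # 5.0
```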

11
Rubric Types Example
  • Holistic Rubric for Open-Ended Math Problems [11]
  • Criteria for Demonstrated Competence (6 points):
    Description of Exemplary Response
  • Gives a complete response with a clear,
    coherent, unambiguous, and elegant explanation;
    includes a clear and simplified diagram;
    communicates effectively to the identified
    audience; shows understanding of the problem's
    mathematical ideas and processes; identifies all
    the important elements of the problem; may
    include examples and counter-examples; presents
    strong supporting arguments.

12
Rubric Types Example
  • Holistic Rubric for Open-Ended Math Problems
  • Criteria for Inadequate Response (2 points):
    Description of a Response that Begins, but Fails
    to Complete, the Problem
  • Explanation is not understandable; diagram may
    be unclear; shows no understanding of the problem
    situation; may make major computational errors.

13
Characteristics of a Rubric (Reliability)
  • A good rubric must possess reliability
  • Definition of Reliability [4]
  • The extent to which the measuring instrument
    yields responses that are consistent and stable
    across time (intra-rater) and between different
    scorers (inter-rater); see the worked example
    below
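Inter-rater reliability is often quantified as simple percent agreement or as Cohen's kappa, which corrects agreement for chance; applying the same calculation to one scorer's ratings at two points in time gives an intra-rater check. A minimal sketch, using made-up ratings for illustration:

```python
from collections import Counter

def percent_agreement(ratings_1, ratings_2):
    """Fraction of responses that received the same score from both raters."""
    return sum(a == b for a, b in zip(ratings_1, ratings_2)) / len(ratings_1)

def cohens_kappa(ratings_1, ratings_2):
    """Agreement corrected for chance: kappa = (p_o - p_e) / (1 - p_e)."""
    n = len(ratings_1)
    p_o = percent_agreement(ratings_1, ratings_2)
    c1, c2 = Counter(ratings_1), Counter(ratings_2)
    # Expected chance agreement from each rater's marginal score frequencies.
    p_e = sum((c1[k] / n) * (c2[k] / n) for k in set(c1) | set(c2))
    return (p_o - p_e) / (1 - p_e)

# Hypothetical scores from two raters on the same eight responses.
rater_a = [6, 4, 2, 4, 6, 2, 4, 6]
rater_b = [6, 4, 2, 2, 6, 2, 4, 4]
print(percent_agreement(rater_a, rater_b))  # 0.75
print(cohens_kappa(rater_a, rater_b))       # about 0.63
```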

14
Characteristics of a Rubric (Validity)
  • A good rubric must possess validity
  • Definition of Validity [1]
  • The extent to which what is being measured by an
    instrument is actually what is intended. Are the
    test and rubric actually measuring the desired
    performance outcomes? (Construct, Criterion, and
    Face Validity)

15
Team Activity I
  • Evaluate a Rubric


16
The Sample Rubric
  • Developed for use in a freshman-level
    introduction to design class
  • Used to evaluate oral presentations made by
    freshman design teams
  • Used by a panel of 3 to 4 faculty
  • course instructor
  • faculty invited for one day to serve on the
    review panel
  • panel membership changed over the 5 days of
    presentations

17
Your Task
  • With your group, discuss
  • The merits of the sample rubric and how it was
    used
  • Potential problems with the sample rubric and how
    it was used
  • What you might do to improve the rubric and its
    use
  • We'll share ideas in about 15 minutes

18
Your Ideas
  • What are your thoughts on
  • The merits of the sample rubric and how it was
    used
  • Potential problems with the sample rubric and how
    it was used
  • What you might do to improve the rubric and its
    use

19
Sample Rubric: 2nd Example
  • Analytical Rubric
  • Scoring rubrics for program objectives
  • Life-long learning
  • Impact in a global/societal context
  • Diana Briedis, Michigan State University
  • Group Presentation Rubric

20
Sample Rubric: 2nd Example
  • Review the Group Presentation Rubric
  • Discuss
  • Strengths
  • Opportunity for Improvement

21
Constructing a Rubric
  • Note that there are two components involved in
    this assessment and evaluation methodology
  • The test instrument given to the students
  • The scoring rubric used by the evaluators

22
Constructing a Rubric [3, 6, 9]
  • 1. Develop appropriate performance goals and
    objectives
  • 2. Select the assessment tasks that reflect and
    demonstrate the performance goals
  • 3. Differentiate between performance levels and
    assign relative values to each of the levels;
    establish expert level; establish target
    students' developmental level

23
Constructing a Rubric
  • 4. Develop descriptive criteria for each level
    of performance which correspond with local norms
    (holistic or analytical); see the sketch below
  • 5. Train scorers in application of the rubric
  • 6. Pilot both test and scoring rubric for
    inter-rater and intra-rater consistency; apply
    cross-checking methods
  • 7. Modify test items and scoring rubric based
    upon scoring results and content analysis of
    responses
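To keep steps 3 and 4 concrete, the performance levels and their descriptive criteria can be laid out as a simple table and checked for gaps before scorers are trained; in the minimal sketch below, the levels, criteria, and descriptors are placeholders rather than material from the workshop.

```python
# Placeholder rubric structure for steps 3-4; the levels, criteria, and
# descriptors are hypothetical examples.

PERFORMANCE_LEVELS = ["expert", "target", "developing"]  # step 3: differentiated levels

RUBRIC = {  # step 4: a descriptive criterion for each level of each category
    "problem definition": {
        "expert": "States the problem precisely, with constraints and assumptions.",
        "target": "States the problem clearly; some constraints missing.",
        "developing": "Problem statement is vague or incomplete.",
    },
    "supporting evidence": {
        "expert": "All claims supported by data or citations.",
        "target": "Most claims supported.",
        # "developing" descriptor left out to show the completeness check below.
    },
}

def missing_descriptors(rubric, levels):
    """Return (criterion, level) pairs that still lack a descriptive criterion."""
    return [(criterion, level)
            for criterion, descriptors in rubric.items()
            for level in levels
            if level not in descriptors]

print(missing_descriptors(RUBRIC, PERFORMANCE_LEVELS))
# [('supporting evidence', 'developing')]
```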

24
Develop Appropriate Performance Objectives and
Tasks: Example [5]
25
Team Activity II
  • Develop a rubric for
  • Laboratory report
  • Engineering design project
  • We'll discuss your rubrics in about 20 minutes

26
Team Activity II
  • Discussion
  • What does your rubric contain?
  • How might you apply this activity to your courses?

27
Common Problems (Transferability and
Repeatability)
  • Transferability and Repeatability
  • of Test Questions and Rubric Criteria
  • Across similar or different courses
  • Over time, or across locales
  • Across populations
  • Across scorers
  • Validity
  • Transferability of assessment question
    interpretation
  • Transferability of specifications for expected
    performance
  • Changes in curriculum or instruction
  • Changes in performance standards
  • Changes in students' prior knowledge

28
Common Problems (Transferability and
Repeatability, cont.)
  • Transferability and Repeatability
  • of Test Questions and Rubric Criteria
  • Across similar or different courses
  • Over time, or across locales
  • Across populations
  • Across scorers
  • Different scorers
  • Changes in scorers' knowledge
  • Reliability (interacts with validity)
  • Inter-rater
  • Intra-rater (tends to be more validity sensitive)

29
Solutions to Common Problems: Validity (Strengthen
with Peer Review)
  • Validity
  • Address:
  • Theoretical validity [2]: Review literature and
    other resources for precedents
  • Criterion validity [2]: Ask a sample of experts,
    novices (if appropriate), and the target
    population to respond
  • Face validity [12]: Ask a relevant sample of
    local users to respond and critique
  • Content: Analyze responses; compare the target
    population to local users, to experts, to
    novices (if appropriate), and to rubric criteria

30
Solutions to Common Problems: Validity (Strengthen
with Peer Review)
  • Validity, cont.
  • Engage external/internal faculty to review the
    rubric for the following criteria
  • Appropriateness of the material (level)
  • Topics covered
  • Skills to be demonstrated
  • Bias (culture and gender free?)
  • Text (examine for wording and jargon)
  • Modify test questions

31
Solutions to Common Problems: Reliability
(Training, inter-rater)
  • Reliability: Train and manage scorers for
    inter-rater consistency
  • Allow raters or scorers to take the test, then
    score their own and another scorer's test
  • Ask raters to justify their scoring to a third
    party
  • Ask raters to review and re-score the 1st test
    after they complete the 5th test

32
Solutions to Common Problems: Reliability
(Training, intra-rater)
  • Reliability: Train and manage scorers for
    intra-rater consistency
  • Duplicate a sampling of all tests and have all
    scorers evaluate and score each test
  • Ask all scorers to review each other's scoring
    of the common set of rubrics
  • Ask them to discuss discrepancies (see the
    sketch below)
  • Arrive at consensus on interpretation and
    application of rubric criteria
  • Ask scorers to jointly re-score tests
  • Periodically review each other's tests
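One lightweight way to surface discrepancies for that consensus discussion is to compare every scorer's marks on the common, duplicated set of tests and flag any test whose scores spread beyond an agreed tolerance. The scorer names, tests, and scores in this sketch are made up for illustration.

```python
# Hypothetical scores: each scorer rated the same duplicated sample of tests.
scores = {
    "scorer_1": {"test_A": 6, "test_B": 4, "test_C": 2},
    "scorer_2": {"test_A": 6, "test_B": 2, "test_C": 2},
    "scorer_3": {"test_A": 5, "test_B": 4, "test_C": 2},
}

def flag_discrepancies(scores, tolerance=1):
    """Return tests whose scores differ across scorers by more than `tolerance`."""
    tests = next(iter(scores.values())).keys()
    flagged = {}
    for test in tests:
        marks = [by_test[test] for by_test in scores.values()]
        if max(marks) - min(marks) > tolerance:
            flagged[test] = marks
    return flagged

# test_B spreads from 2 to 4, beyond the 1-point tolerance, so it is queued
# for the group's discussion; test_A (5-6) and test_C (all 2s) are not.
print(flag_discrepancies(scores))  # {'test_B': [4, 2, 4]}
```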

33
Resources & Citation References
  • Bergeson, Dr. Terry. Office of Superintendent of
    Public Instruction web page. "Scoring the WASL
    Open-Ended Items." 1998. 1 May 2002.
    <http://www.k12.wa.us/assessment/assessproginfo/subdocuments/TechReports/g4part4.pdf>
  • Cronbach, Lee J., and Meehl, Paul E. "Construct
    Validity in Psychological Tests." Psychological
    Bulletin (1955). 11 June 2002.
    <http://psychclassics.yorku.ca/Cronbach/fl>
  • Ebert-May, Diane. "Classroom Assessment
    Techniques: Scoring Rubrics." Field-tested
    Learning Assessment Guide (FLAG) web site, 1999.
    11 June 2002.
    <http://www.flaguide.org/cat/rubrics/rubrics1.htm>
  • Graduate School of Education & Information
    Studies, CRESST, UCLA.
    <http://www/Rubrics/CRESSTUCLAassementglossary.html>

34
Resources & Citation References
  • Davis, D.C., Gentili, K.L., Calkins, D.E., and
    Trevisan, M.S. "Transferable Integrated Design
    Engineering Education (TIDEE) Project." October
    1998. 29 May 2002.
    <http://www.cea.wsu.edu/TIDEE/monograph.html>
  • Moskal, Barbara M. "Scoring rubrics: what, when
    and how?" Practical Assessment, Research &
    Evaluation (2000). 1 May 2002.
    <http://ericae.net/pare/getvn.asp?v=7&n=3>
  • Rowntree, Derek. Home Page. "Designing an
    assessment." June 2000. 11 June 2002.
    <http://iet.open.ac.uk/pp/D.G.F.Rowntree/derek.html>
  • Rudner, Lawrence M. "Reducing Errors due to the
    Use of Judges." ED355254, ERIC/TM Digest (1992).
    11 June 2002.
    <http://ericae.net/db/edo/ED355254.htm>
35
Resources & Citation References
  • Seattle School District. "What is a rubric?"
    (2000). 1 May 2002.
    <http://ttt.ssd.k12.wa.us/dwighth/rubricclass.htm>
  • Stemler, Steve. "An overview of content
    analysis." Practical Assessment, Research &
    Evaluation (2001). 11 June 2002.
    <http://ericae.net/pare/getvn.asp>
  • Summer Technology Institute at Western
    Washington University. "Rubric for Open-Ended
    Math Problems." California CAP Math Report
    (1989). 11 June 2002.
    <http://ttt.ssd.k12.wa.us/dwighth/rubricclass.htm>
  • Trochim, William M.K. "Measurement Validity
    Types." William M.K. Trochim, Cornell University
    Home Page (2002). 11 June 2002.
    <http://trochim.human.cornell.edu/kb>