Title: Constructing Rubrics for Open-ended Activities
1 Constructing Rubrics for Open-ended Activities
- ASEE Annual Conference
- 16 June 2002
2 Form Groups
- This workshop includes two group activities
- Form groups of 4 or 5
- Group with people you don't yet know
3 Workshop Presenters
- Rita Caso, Director of Assessment & Evaluation, Educational Achievement Division, College of Engineering, Texas A&M University
- Ann Kenimer, Associate Professor, Department of Agricultural Engineering, Texas A&M University
4 Rita Caso, Texas A&M
- Ph.D., Applied Research and Evaluation in Education, Counseling and Psychology
- 20 years' experience in teaching, administration, research, assessment, evaluation, and accreditation-review preparation in K-12, Adult and Higher Education, in Human Services, and in National Market Research
- 7 years' specific experience assessing and evaluating university-level engineering programs and Science, Math, Engineering, and Technology (SMET) programs
5 Ann Kenimer, Texas A&M
- B.S., M.S., Agricultural Engineering, Virginia Tech
- Ph.D., Agricultural Engineering, University of Illinois
- Teaches engineering design processes, fundamental problem solving, environmental engineering
- FC Assessment and Evaluation involvement since 2000
6 Workshop Agenda
- Introduction to the Foundation Coalition
- Rubrics
- What is a Rubric?
- How Are Rubrics Used?
- Examples of Rubrics
- Characteristics of a Rubric
- Team Activities
- Use & Evaluate a Rubric
- Develop a Rubric
- Common Problems and Solutions
- Resources
- Wrap up
7 The Foundation Coalition
- Six cooperating universities
- Arizona State University
- Rose-Hulman Institute of Technology
- Texas AM University
- University of Alabama
- University of Massachusetts Dartmouth
- University of Wisconsin-Madison
- Funded by NSF
8 The Foundation Coalition
- Mission
- Establish improved curricula and learning environments
- Attract and retain a more demographically diverse student body
- Graduate engineers who possess those transforming attributes that fully reflect the FC outcomes
9 Holistic Model for Engineering
- Technology enhanced classrooms
- Clustered students in common courses
- Course integration
- Teaming in the classroom
- Active/cooperative learning pedagogy
- Faculty team teaching
- Industry in the classroom
- Course modules for EC2000
- Assessment and evaluation
10 Terms Used in Workshop
- Qualitative Assessment
  - Open-ended data
  - Content Analysis
  - Rubric
  - Check-list
  - Inter-rater
  - Intra-rater
- Objective Assessment
  - Closed-ended data
  - Forced-choice response
- Pre-Determined Criteria
- Reliability
- Validity
  - Theoretical
  - Face
  - Criterion
11 What is a Rubric? (Open-ended Data)
- It is a tool used in the qualitative assessment of open-ended data, such as:
- Written or oral narratives
- Diagrams or models
- Written or oral enumerations
- Behavioral demonstrations
- ...of a student's knowledge, applied skill, or ability to perform
12 How Are Rubrics Used? (Open-ended Data)
- Advantages and drawbacks of assessing open-ended data [7]
- Advantages
- Can yield rich information (i.e., individual, creative, complex, fine-tuned)
- Drawbacks
- Involves subjectivity in interpreting and scoring data (i.e., the judgments of individual scorers), as contrasted with objective tests
- Problems with reliability (both inter-rater and intra-rater, across time)
13 How Are Rubrics Used? (Open-ended Data)
- Other methods of qualitative assessment used with open-ended data
- Content analysis and coding [10]
- Inventory checklists [11]
- Rubrics
14 What is a Rubric? (Pre-Determined Criteria)
- Definition of Rubric [3,9]
- A systematic scoring methodology to make qualitative assessment and evaluation more reliable and objective by applying pre-determined criteria.
- e.g., Descriptive criteria are developed to serve as guidelines for scorers to assess, rate, and judge student performance.
15 How Are Rubrics Used? (Diagnostic Feedback)
- Descriptions of performance standards may serve to communicate to students what is expected of quality performance [5].
- e.g., Ideal, expected performance described in a rubric can be explicitly compared with individual performance in order to convey what aspects of performance need improvement.
16 How Are Rubrics Used? (Rubric Types)
- Rubrics may be used holistically or analytically
- Holistic Rubric [5]
- The entire response is evaluated and scored as a single performance category
- Analytical Rubric [5]
- The response is evaluated with multiple descriptive criteria for multiple performance categories
17 How Are Rubrics Used? (Rubric Types Example)
- Holistic Rubric for Open-Ended Math Problems [11]
- Criteria for Demonstrated Competence (6 points): Description of Exemplary Response
- Gives a complete response with a clear, coherent, unambiguous, and elegant explanation; includes a clear and simplified diagram; communicates effectively to the identified audience; shows understanding of the problem's mathematical ideas and processes; identifies all the important elements of the problem; may include examples and counter-examples; presents strong supporting arguments.
18 How Are Rubrics Used? (Rubric Types Example, cont.)
- Holistic Rubric for Open-Ended Math Problems [11]
- Criteria for Inadequate Response (2 points): Description of a Response that Begins, but Fails to Complete, the Problem
- Explanation is not understandable; diagram may be unclear; shows no understanding of the problem situation; may make major computational errors.
19 How Are Rubrics Used? (Rubric Types Example, cont.)
- Analytical Rubric for TIDEE Design Knowledge Test [5]; Design Process question subcategories: Information Gathering, Problem Definition, Idea Generation, Evaluation, Decision Making, Implementation, Process Development
- Score descriptions for subcategory: Information Gathering
  - 1: No information gathered specifically to support design
  - 2:
  - 3: Information gathered primarily once or from a single source; aware that information varies in quality
  - 4:
  - 5: Varied sources used to obtain information; some judgment of information quality; information gathered multiple times
20 How Are Rubrics Used? (Rubric Types Example, cont.)
- Analytical Rubric for TIDEE Design Knowledge Test [5]; Design Process question subcategories: Information Gathering, Problem Definition, Idea Generation, Evaluation, Decision Making, Implementation, Process Development
- Score descriptions for subcategory: Implementation (a small representation sketch follows this slide)
  - 1: No deliverables produced, or they fail to meet requirements
  - 2:
  - 3: Design decisions converted to deliverables; design products meet primary requirements
  - 4:
  - 5: Decisions integrated to yield design products that satisfy system requirements; products delivered on time and within allowed resources
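For teams that want to tabulate analytical scores electronically, the structure above maps naturally onto a small lookup table: one set of benchmark descriptors per subcategory, with scores 2 and 4 left unanchored. The following is a minimal Python sketch of that idea; the dictionary layout and the describe() helper are illustrative only, not part of the TIDEE instrument.

```python
# Illustrative representation of an analytical rubric: one set of score-level
# descriptors per subcategory. Only the two subcategories shown on slides 19-20
# are filled in; the layout is hypothetical, not prescribed by TIDEE.

ANALYTICAL_RUBRIC = {
    "Information Gathering": {
        1: "No information gathered specifically to support design",
        3: "Information gathered primarily once or from a single source; "
           "aware that information varies in quality",
        5: "Varied sources used; some judgment of information quality; "
           "information gathered multiple times",
    },
    "Implementation": {
        1: "No deliverables produced, or they fail to meet requirements",
        3: "Design decisions converted to deliverables; products meet primary requirements",
        5: "Decisions integrated to yield products that satisfy system requirements; "
           "delivered on time and within allowed resources",
    },
}

def describe(subcategory, score):
    """Return the benchmark descriptor for a subcategory score (2 and 4 are unanchored)."""
    return ANALYTICAL_RUBRIC[subcategory].get(score, "between adjacent benchmarks")

print(describe("Implementation", 3))
print(describe("Information Gathering", 4))
```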
21 Characteristics of a Rubric (Reliability)
- A good rubric must possess reliability
- Definition of Reliability [4]
- The extent to which the measuring instrument yields responses that are consistent and stable across time (intra-rater) and between different scorers (inter-rater).
22 Characteristics of a Rubric (Validity)
- A good rubric must possess validity
- Definition of Validity [1]
- The extent to which what is being measured by an instrument is actually what is intended. Are the test and rubric actually measuring the desired performance outcomes? (Construct, Criterion, and Face Validity)
23 Team Activity I
24 Team-Based Design Communication Knowledge Assessment: TIDEE Entering-Junior Design Knowledge Assessment Instrument
- Short-Answer Test Item on Communication [5]
- OBJECTIVE: Demonstrate your knowledge of key elements in the engineering design process, teamwork, and communication associated with team-based engineering design.
- ASSIGNMENT: Respond to the following questions/statements. You have 5 minutes.
- In team-based design, documentation and exchange of design information are important. Describe communication qualities and how communication occurred among team members in a design project assignment.
25 Team-Based Design Communication Knowledge Assessment (Criteria)
- Criteria Elements: Knowledge of Effective Communication
- Five specific elements to be articulated by students:
- Structure (i.e., organization, highly understandable, flow of thoughts)
- Content (i.e., details, key points, clarity of ideas, complete and accurate information)
- Relevance to audience (i.e., communicated well and understandable to audience)
- Team attitude (i.e., co-operation, listening)
- Involvement (i.e., planning meetings, effective interaction between members)
26 Team-Based Design Communication Knowledge Assessment (Scoring Rubric)
- 5 Points: Student's response shows detailed knowledge of the listed elements of effective communication if it includes 5 of 5 elements.
- 4 Points: Student's response shows above-moderate knowledge if it includes 4 elements of effective communication.
- 3 Points: Student's response shows moderate knowledge of the subject if it includes 3 elements of effective communication.
- 2 Points: Student's response shows little knowledge of the subject if it includes 2 elements of the effective communication criteria.
27 Team-Based Design Communication Knowledge Assessment (Scoring Rubric)
- 1 Point: Student's response shows little knowledge of effective communication if only one element is indicated.
- 0 Points: Student's response shows no knowledge about effective communication in a team-based design project.
- Note: The body of the rubric provides a scale of 0-5 points with benchmarks at 0, 1, 2, 3, 4, and 5. However, the students were also scored at half values (i.e., 2.5, 3.5) to provide a more sensitive distinction between performance levels. A score of 5 was given to very well articulated, comprehensive responses. (A scoring sketch follows this slide.)
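In practice, scorers count which of the five communication elements a response covers and map that count to the 0-5 scale, with optional half-point adjustments. Below is a minimal Python sketch of that bookkeeping; the element names come from slide 25, but the function and variable names are illustrative and not part of the TIDEE instrument.

```python
# Illustrative sketch of the element-count scoring described on slides 25-27.
# Element names come from the workshop slides; everything else is hypothetical.

ELEMENTS = ["structure", "content", "relevance_to_audience",
            "team_attitude", "involvement"]

def score_response(elements_present, half_point_adjustment=0.0):
    """Map the number of communication elements a scorer found to 0-5 points.

    elements_present: set of element names the scorer judged to be covered.
    half_point_adjustment: optional +/-0.5 tweak for borderline responses,
    mirroring the half-value scoring noted on slide 27.
    """
    unknown = set(elements_present) - set(ELEMENTS)
    if unknown:
        raise ValueError(f"Unrecognized elements: {unknown}")
    base = len(set(elements_present))          # 0..5 elements -> 0..5 points
    return min(5.0, max(0.0, base + half_point_adjustment))

# Example: the weak answer on slide 30 touches only on Structure -> 1 point.
print(score_response({"structure"}))                      # 1.0
# Example: the strong answer on slides 28-29 covers all five elements -> 5 points.
print(score_response(set(ELEMENTS)))                      # 5.0
```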
28 Team-Based Design Communication Knowledge Assessment (Strong Answer Example)
- The Test Question: In team-based design, documentation and exchange of design information are important. Describe communication qualities and how communication occurred among team members in a design project assignment.
- Each member must be able to speak and explain things clearly so that the other members should understand information well.
- Ability to speak and write concisely and accurately; a member must have knowledge to convey a subject clearly.
- Focus on the work on hand during team meetings and not on other things. Speak what is of interest for team members.
29 Team-Based Design Communication Knowledge Assessment (Strong Answer Example, cont.)
- Open discussions and open-minded team members, willingness to compromise, listen to other ideas, be patient, and allow all members their opinions.
- Equal input of ideas by each member, ask questions to clarify problems and set up meetings.
- SCORE: 5
- Student's response shows all five elements of effective communication, and thus shows detailed knowledge of the subject.
30 Team-Based Design Communication Knowledge Assessment (Weak Answer Example)
- The Test Question: In team-based design, documentation and exchange of design information are important. Describe communication qualities and how communication occurred among team members in a design project assignment.
- Communication should be often. Other team members should understand your point.
- Common design formats should be used.
- SCORE: 1
- Content, Relevance to Audience, Team Attitude, and Involvement are not mentioned. However, the student has made an effort to touch upon one element of effective communication (Structure) without detail; hence a score of 1 is given.
31 Team Activity I
- Discussion
- What did you like about the sample rubric?
- What would you change?
32 Constructing a Rubric
- Note: there are two components involved in this assessment and evaluation methodology:
- the test instrument given to the students
- the scoring rubric used by the evaluators
33 Constructing a Rubric [3,6,9]
- 1. Develop appropriate performance goals and objectives
- 2. Select the assessment tasks that reflect and demonstrate the performance goals
- 3. Differentiate between performance levels and assign relative values to each of the levels: establish expert level; establish target students' developmental level
34 Constructing a Rubric
- 4. Develop descriptive criteria for each level of performance which correspond with local norms: holistic or analytical
- 5. Train scorers in application of the rubric
- 6. Pilot both test and scoring rubric for inter-rater and intra-rater consistency; apply cross-checking methods
- 7. Modify test items and scoring rubric based upon scoring results and content analysis of responses
35 Develop Appropriate Performance Objectives and Tasks: Example [5]
36 Team Activity II
- Develop a rubric for
- Laboratory report
- Engineering design project
37 Team Activity II
- Discussion
- What does your rubric contain?
- How might you apply this activity to your courses?
38 Common Problems (Transferability & Repeatability)
- Transferability and Repeatability of Test Questions and Rubric Criteria
  - Across similar or different courses
  - Over time, or across locales
  - Across populations
  - Across scorers
- Validity
  - Transferability of assessment question interpretation
  - Transferability of specifications for expected performance
  - Changes in curriculum or instruction
  - Changes in performance standards
  - Changes in students' prior knowledge
39 Common Problems (Transferability & Repeatability, cont.)
- Transferability and Repeatability of Test Questions and Rubric Criteria
  - Across similar or different courses
  - Over time, or across locales
  - Across populations
  - Across scorers
- Different Scorers
  - Changes in scorers' knowledge
- Reliability (interacts with validity)
  - Inter-rater
  - Intra-rater (tends to be more validity sensitive)
40 Solutions to Common Problems (Transferability & Repeatability, cont.)
- Validity
- Address:
  - Theoretical validity [2]: Review literature and other resources for precedents
  - Criterion validity [2]: Ask a sample of experts, novices (if appropriate), and the target population to respond
  - Face validity [12]: Ask a relevant sample of local users to respond and critique
- Content-analyze responses; compare the target population to local users, to experts, to novices (if appropriate), and to the rubric criteria
41 Solutions to Common Problems (Transferability & Repeatability, cont.)
- Validity, cont.
- Modify test questions, if necessary, as indicated by discrepancies between the response content analysis results of the target population and/or local users, and the rubric
- Modify rubric criteria or scoring standards to align with expert content and performance levels, or with local user content and performance levels if these differ from expert results
42 Solutions to Common Problems (Transferability & Repeatability, cont.)
- Reliability: Train and manage scorers for intra-rater consistency
- By having them take the test, then score their own and another scorer's test, then justify their scoring to a third party
- By having them review and re-score the 1st test they scored after they have completed scoring their 5th test, and
- By having them review and re-score the first 5 tests scored after having completed scoring 10 tests, and continuing the pattern.
43 Solutions to Common Problems (Transferability & Repeatability, cont.)
- Reliability: Train and manage scorers for inter-rater consistency
- By duplicating a sampling of all tests and having all scorers evaluate and score each test
- By having all scorers review each other's scoring of this common set of tests, having them discuss discrepancies, arrive at consensus on interpretation and application of rubric criteria, and having them jointly re-score discrepant tests.
- By having all scorers periodically and repeatedly review each other's scored tests, individually re-score them, then discuss, and jointly re-score two tests.
44 Solutions to Common Problems (Transferability & Repeatability, cont.)
- Reliability Controls
- Halfway through the scoring job, have an outsider sample each scorer's scored tests, and have each scorer justify his/her scoring of the same items across several tests.
- Report both intra-rater inconsistencies and inter-rater inconsistencies noted to scorers for their correction
- Repeat the process near the end of the scoring job
- Also calculate and examine inter-rater and intra-rater consistency rates by test subject and by test item, as well as inter-item correlations [8] (a computation sketch follows this list)
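The consistency checks named above are straightforward to compute once scores are tabulated. Below is a minimal Python sketch, assuming two hypothetical scorers have each rated the same tests on the 0-5 rubric scale; the data, function name, and choice of statistics (exact-agreement rate and Pearson correlation) are illustrative assumptions, since the slides do not prescribe particular formulas.

```python
# Illustrative consistency checks for rubric scores; names and data are hypothetical.
from statistics import correlation  # Pearson correlation, available in Python 3.10+

def agreement_rate(scores_a, scores_b, tolerance=0.0):
    """Fraction of tests on which two score lists agree within `tolerance` points."""
    assert len(scores_a) == len(scores_b)
    hits = sum(abs(a - b) <= tolerance for a, b in zip(scores_a, scores_b))
    return hits / len(scores_a)

# Hypothetical rubric scores (0-5 scale) for eight tests.
rater1         = [5, 3, 4, 2, 5, 1, 3, 4]
rater2         = [5, 2, 4, 2, 4, 1, 3, 4]   # same tests, second scorer (inter-rater)
rater1_rescore = [5, 3, 4, 3, 5, 1, 3, 4]   # rater 1 re-scoring the same tests later (intra-rater)

print("inter-rater agreement:", agreement_rate(rater1, rater2))
print("intra-rater agreement:", agreement_rate(rater1, rater1_rescore))

# Inter-item correlation: scores on two different test items across the same students.
item_A = [5, 4, 3, 2, 5, 1, 3, 4]
item_B = [4, 4, 3, 2, 5, 2, 3, 5]
print("inter-item correlation:", correlation(item_A, item_B))
```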
45 Resources: Citation References
- 1. Bergeson, Terry. Office of Superintendent of Public Instruction web page. "Scoring the WASL Open-Ended Items," 1998. Accessed 1 May 2002. <http://www.k12.wa.us/assessment/assessproginfo/subdocuments/TechReports/g4part4.pdf>
- 2. Cronbach, Lee J., and Meehl, Paul E. "Construct Validity in Psychological Tests." Psychological Bulletin (1955). Accessed 11 June 2002. <http://psychclassics.yorku.ca/Cronbach/fl>
- 3. Ebert-May, Diane. "Classroom Assessment Techniques: Scoring Rubrics." Field-tested Learning Assessment Guide (FLAG) web site, 1999. Accessed 11 June 2002. <http://www.flaguide.org/cat/rubrics/rubrics1.htm>
- 4. Graduate School of Education & Information Studies, CRESST, UCLA. <http://www/Rubrics/CRESSTUCLAassementglossary.html>
46 Resources: Citation References
- 5. Davis, D.C., Gentili, K.L., Calkins, D.E., and Trevisan, M.S. "Transferable Integrated Design Engineering Education (TIDEE) Project." October 1998. Accessed 29 May 2002. <http://www.cea.wsu.edu/TIDEE/monograph.html>
- 6. Moskal, Barbara M. "Scoring rubrics: what, when and how?" Practical Assessment, Research & Evaluation (2000). Accessed 1 May 2002. <http://ericae.net/pare/getvn.asp?v=7&n=3>
- 7. Rowntree, Derek. Home Page. "Designing an assessment," June 2000. Accessed 11 June 2002. <http://iet.open.ac.uk/pp/D.G.F.Rowntree/derek.html>
- 8. Rudner, Lawrence M. "Reducing Errors due to the Use of Judges." ED355254, ERIC/TM Digest (1992). Accessed 11 June 2002. <http://ericae.net/db/edo/ED355254.htm>
47 Resources: Citation References
- 9. Seattle School District. "What is a rubric?" (2000). Accessed 1 May 2002. <http://ttt.ssd.k12.wa.us/dwighth/rubricclass.htm>
- 10. Stemler, Steve. "An overview of content analysis." Practical Assessment, Research & Evaluation (2001). Accessed 11 June 2002. <http://ericae.net/pare/getvn.asp>
- 11. Summer Technology Institute at Western Washington University. "Rubric for Open-Ended Math Problems." California CAP Math Report (1989). Accessed 11 June 2002. <http://ttt.ssd.k12.wa.us/dwighth/rubricclass.htm>
- 12. Trochim, William M.K. "Measurement Validity Types." William M.K. Trochim, Cornell University Home Page (2002). Accessed 11 June 2002. <http://trochim.human.cornell.edu/kb>
48 Wrap Up
- Please complete the workshop evaluation forms
- Thank you!