Title: Morphing Rubrics to Adapt to Developmental Student Needs
1. Morphing Rubrics to Adapt to Developmental Student Needs
- Aja Henriquez, MFA; EdD Student, CSUSB
- English Instructor, Crafton Hills College and California Baptist University
2. What is a Rubric?
- Definition of rubric (ˈruːbrɪk):
1. a title, heading, or initial letter in a book, manuscript, or section of a legal code, esp one printed or painted in red ink or in some similarly distinguishing manner
2. a set of rules of conduct or procedure
3. a set of directions for the conduct of Christian church services, often printed in red in a prayer book or missal
4. instructions to a candidate at the head of the examination paper
5. an obsolete name for red ochre
6. written, printed, or marked in red
3. Why Rubrics?
4. School-Specific Issues
- Approximately 31% of incoming students transfer into the developmental level (English 015) at CHC, while 36% transfer into a lower, remedial-level course (914). Around 33% of students test into college-level English courses (Research Briefs, 2011).
- Approximately 52% of students who pass the Preparation for College Writing course (English 015) persist in the college and successfully complete a college-level writing course (Student Equity Data, 2011).
5. What We Can Control
- The success rates tell us that something is not transferring in our courses.
- There are many things we cannot control: student readiness/commitment, family problems, financial problems, etc.
- We can address our rubrics and how we communicate with students.
- As we know from research, students may simply misunderstand what we say/write when we grade.
6. What is the Purpose of a Rubric?
- We use it to grade for a few reasons:
- To assess specific items in a given assignment
- To help ensure or increase uniformity of assessment
- It is a tool for instructors to communicate with students.
- Students must be able to understand the communication.
7. Language Disconnect
- We may be taking student understanding of our rubrics for granted.
- Developmental courses often act as an introduction into the academic discourse community (Bizzell, 1982):
- "students from different social classes come to school with different abilities to deal with academic discourse: middle-class students are better suited by their socialization in language use to deal with academic discourse's relative formality and abstraction than working-class students are. This unequal removal from academic language is, of course, exacerbated for students whose home language does not resemble the so-called 'standard English'" (192).
8. Student Perspective
- "It's like they [the affluent students] had their own classes, and we [the students from the poor/working class] had our own classes. We were, like, segregated in the same school."
- --Prep for College Writing student
9. Demographics of Basic Skills Students
- Crafton Hills College, then, serves a community
where the constituents must commute to work or
school, where the poverty and unemployment rates
exceed the state average, and where the income is
below the national average.
10. We Should Keep in Mind
- Students may not understand the language we use on rubrics.
- We must ask ourselves whether we have written the rubric for our understanding or for student understanding.
11. Example: Unexamined Rubric
12. A Tool That Doesn't Work
- Does a developmental student know what these terms mean when it comes to writing?
- If students don't understand what the rubric means, then they can't use it to improve.
- How can we make this more understandable?
13. Example: Revised Rubric
14. Other Concerns
- Even with some clearer criteria, this rubric still has issues that run deeper than the language used:
- Validity: does it measure what we want it to measure?
- Reliability: are the scores consistent?
15. Validity
- Content validity: to ensure we are measuring what students should know and not something else (personal preferences), we must match each item on the rubric to a course objective.
- Substantive validity: we should make sure that the items on the rubric connect to different types of cognitive processes and are of varying difficulty.
16. How to Ensure Content Validity
- Review the items on your rubric and make sure they match the course objectives along with any departmental norms.
- English departments often have norming sessions for grading, which these instructors must take into consideration when evaluating the content of their rubrics.
- EXAMPLE: a partial rationalization of content validity follows.
- Some things make it into our grading that should not be there: stapling, niceness of binder or folder, whether it was accompanied with a latte, etc.
- Item one corresponds with objectives 4, 5, 6, and 11, as well as the grading criteria for writing (demonstrates skillful use of vocabulary and syntax; is generally free from errors in mechanics, usage, and sentence structure).
- Item two corresponds with objectives 7, 8, and 9, as well as the grading criteria for writing (focuses clearly on the topic and responds effectively to all aspects of the assignment; explores the issues thoughtfully and in depth; is coherently and logically organized with a thesis statement supported by apt reasons and specific, detailed examples).
17. Substantive Validity
18. Ensuring Substantive Validity
- First, we should examine the items in our rubrics to ensure there is a spread of difficulty.
- By doing this, we are able to measure students at different ability levels (low, mid, and high).
19. Substantive Validity
- Items do not match student ability well.
- It is difficult to establish which items should be more difficult, since each individual will have personal difficulties.
- There should be a greater spread of difficulty.
20. Rasch Model
- The previous slide showed an output table from Winsteps, which is based on the Rasch mathematical model.
- We don't all have this program or the ability to use it.
- There is a simpler way to examine our student scores using Excel, which most of us already have.
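For readers curious about the math behind Winsteps, the standard dichotomous Rasch model (a well-known formula, not taken from these slides) states that the probability a student answers an item correctly depends only on the gap between the student's ability θ and the item's difficulty b:

```latex
P(X_{ni} = 1) \;=\; \frac{e^{\theta_n - b_i}}{1 + e^{\theta_n - b_i}}
```

Rubric items scored 1-4 use a polytomous extension of this (e.g., the rating scale model), but the intuition is the same: scores should rise with student ability and fall with item difficulty. None of this is required to follow the Excel approach on the next slides.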
21. Steps to Examining Scores
- Open an Excel spreadsheet.
- Input the student scores for each item and their total score.
- You do not need to input student names; simply using letters will work.
- Using Excel, array the scores from highest to lowest.
- Examine the scores.
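The Excel steps above can also be sketched in a few lines of Python. The letters and scores below are hypothetical, not data from the presentation:

```python
# Each entry: anonymized student letter and 1-4 scores on three rubric items.
students = [
    ("A", [4, 3, 4]),
    ("B", [2, 4, 2]),
    ("C", [3, 3, 3]),
    ("D", [1, 4, 1]),
]

# Compute each student's total, then array from highest to lowest total,
# as the slide describes doing in Excel.
rows = [(name, items, sum(items)) for name, items in students]
rows.sort(key=lambda r: r[2], reverse=True)

for name, items, total in rows:
    print(name, items, total)
```

Once the rows are sorted, eyeball each item column: in this made-up data, item two scores run 3, 3, 4, 4 from the top of the list down, moving against the totals, which is exactly the kind of inconsistency the next slides discuss.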
22. Classical Analysis
23. Fix Through Metamorphosis
- Item one does not have consistent scores.
- To fix this, I would expand this item, Writing Conventions, into its component parts.
- Adding more items will help pinpoint the exact issues with writing conventions that give students trouble, which will in turn help me differentiate instruction based on student need.
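One classical way to quantify "does not have consistent scores" without Winsteps is an item-total correlation: correlate each item with the total of the remaining items, and flag items whose correlation is near zero or negative. A minimal sketch with hypothetical 1-4 scores (not the author's data):

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation between two equal-length lists of scores."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical 1-4 scores on three rubric items for five students;
# item one was built to be inconsistent with the rest of the rubric.
scores = [
    [2, 4, 4],  # student A
    [4, 4, 4],  # student B
    [3, 2, 2],  # student C
    [2, 2, 1],  # student D
    [4, 1, 2],  # student E
]

# Correlate each item with the total of the *other* items (so an item
# is not correlated with itself).
for i in range(3):
    item = [row[i] for row in scores]
    rest = [sum(row) - row[i] for row in scores]
    print(f"item {i + 1}: item-total correlation = {pearson(item, rest):.2f}")
```

In this toy data, item one's correlation comes out near zero while items two and three correlate positively with the rest, so item one would be the candidate for expansion into component parts, just as the slide proposes for Writing Conventions.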
24. Morphed Rubric, Item One
Dimensions | Exemplary (4) | Accomplished (3) | Developing (2) | Beginning (1) | Score
Spelling | There are no spelling errors. | Spelling errors are minimal (1/page). | Spelling errors are a problem (2-3/page). | Spelling errors are a problem (4/page). | /12
Punctuation | There are no punctuation or capitalization errors. | Punctuation errors are minimal (1/page). | Punctuation errors are a problem (2-3/page). | Punctuation errors are excessive (4/page). | /12
Fragments | There are no fragments in the essay. | Fragments are minimal (1 total). | Fragments are a problem (2 total). | Fragments are excessive (3 or more total). | /12
Run-ons | There are no run-ons in the essay. | Run-ons are minimal (1 total). | Run-ons are a problem (2 total). | Run-ons are excessive (3 or more total). | /12
25. Interpreting the Morph
- Because item one was so problematic, I can interpret that it was not clear to students what they could do to improve, even with explicit in-text notes.
- I morphed it by expanding it, so that both the students and I could understand what wasn't working in their assignments.
- They can work on spelling instead of run-ons.
- I can work on student weaknesses with instruction.
26. Hidden Issues with Rubrics
- The Rasch analysis with Winsteps suggests that most of the grade scale is not necessary.
- Grading on a 100-point scale suggests the ability to differentiate an essay in 100 ways (Dr. Jesunathadas).
- As the Revised Rubric suggests, I would use a 1-4 point scale for each item and then report the final score as a percentage for student understanding.
- The changes to the grading scale will help keep the scoring effective (no more half scores that don't differentiate well).
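The conversion from a 1-4-per-item rubric to a reported percentage is simple arithmetic, sketched below with hypothetical item names and scores:

```python
# Hypothetical scores on a four-item, 1-4-per-item rubric.
item_scores = {"spelling": 3, "punctuation": 4, "fragments": 2, "run-ons": 4}

MAX_PER_ITEM = 4
total = sum(item_scores.values())
max_total = MAX_PER_ITEM * len(item_scores)

# Report the final score as a percentage for student understanding.
percent = 100 * total / max_total
print(f"{total}/{max_total} = {percent:.0f}%")
```

The instructor scores on the coarse 1-4 scale, which is easier to apply consistently, while students still see a familiar percentage.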
27. Item 2
28. Revised Rubric
29. Morphing Your Rubric(s)
- Be sure your items are aligned with objectives and department norms.
- Review your rubric for clear language and explicit (understandable) descriptions of proficiency levels.
- Take one set of assignments and array them in Excel to find inconsistent items.
- Any inconsistent items should be expanded so you can pinpoint what is not working for students.
30. Morph Your Instruction
- If you notice there is something on the rubric that students are consistently unsuccessful at, you can spend extra time on that in class instead of on items they have already mastered.
- The morphed rubric will help you measure whether your instruction is helping or whether you need to choose another tactic.
31. Is It Working?
- Student scores on each item should be consistent with their total scores.
- If not, keep morphing the rubric/instruction.
- Look for growth in scores over the semester if you are using the same rubric each time. (This also helps students understand consistent expectations.)
- Keep in mind that student lives will occasionally impact their scores over the semester, so don't freak out if all students don't show improvement all the time.
32. Final Thoughts
- Questions?
- This presentation is linked at ajahenriquez.wordpress.com
- A screencast of how to manipulate data in Excel is available at http://www.youtube.com/watch?v=nDez8VmlN9A
33. References
- Bizzell, P. (1982). Review: College composition: Initiation into the academic discourse community. Curriculum Inquiry, 12(2), 191-207.
- Messick, S. (1995). Validity of psychological assessment: Validation of inferences from persons' responses and performances as scientific inquiry into score meaning. American Psychologist, 50(9), 741-749.
- rubric. (n.d.). Collins English Dictionary - Complete & Unabridged 10th Edition. Retrieved October 20, 2011, from Dictionary.com website: http://dictionary.reference.com/browse/rubric
- Traub, R. E., & Rowley, G. L. (1991). Understanding reliability. Educational Measurement: Issues and Practice, 10, 37-45.
34. Related Titles
- Bond, T., & Fox, C. (2007). Applying the Rasch Model: Fundamental Measurement in the Human Sciences. Lawrence Erlbaum.
- Koretz, D. (2008). Measuring Up: What Educational Testing Really Tells Us. Cambridge, MA: Harvard University Press.
- Ryan, K., & Shepard, L. A. (2008). The Future of Test-Based Educational Accountability. New York: Routledge.
- Spaulding, D. (2008). Program Evaluation in Practice: Core Concepts and Examples for Discussion and Analysis. San Francisco: Wiley.