Assessing Critical Thinking: Summer Critical Thinking Institute

1
Assessing Critical Thinking
Summer Critical Thinking Institute
  • QEP Team, Faculty Champions, and Academic
    Roundtables

2008
2
Critical Thinking
  • Beyond the Obvious

3
Assessment Basics
  • Purpose of assessment
  • Creating valid and reliable measures
  • Alignment of goals/measures
  • Use of multiple methods

4
Assessment Basics
  • Why do we assess?
  • To see how well we are doing
  • To confirm what we already know
  • To share our progress with others
  • To see where we can improve and change
  • In some cases, to demonstrate what does not work

5
Assessment Basics
  • Why do we assess?

Source: http://www.c-pal.net/course/module2/pdf/Week1_Lesson5.pdf
6
Assessment Basics
  • Does one size fit all?
  • Assessments need to be valid
  • Assessments need to be reliable

7
Validity
  • Does the assessment measure what it is supposed
    to measure?
  • Validation is the process of accumulating
    evidence that supports the appropriateness of
    inferences made from student responses
    (AERA, APA, NCME, 1999)

8
Types of Validity Evidence
  • Content Related - the extent to which a student's
    responses to a given assessment reflect that
    student's knowledge of the content area
  • Construct Related - the extent to which the
    responses being evaluated are appropriate
    indicators of the underlying construct
  • Criterion Related - the extent to which the
    results of the assessment correlate with a
    current or future event
  • Consequential - the consequences or use of the
    assessment results

9
Questions to Examine Validity
  • Content Validity Evidence
  • Do the evaluation criteria address any
    extraneous content?
  • Do the evaluation criteria address all of the
    aspects of the intended content?
  • Is there any content addressed in the task that
    should be evaluated, but is not?

10
Questions to Examine Validity
  • Construct Validity Evidence
  • Are all the important elements of the material
    evaluated through the scoring criteria?
  • Are any of the evaluation criteria NOT relevant
    to the material?

11
Questions to Examine Validity
  • Criterion Validity Evidence
  • What are the important components of the future
    performance that may be evaluated through the use
    of this assessment?
  • How do the scoring criteria measure the
    important components of the future performance?
  • Are there any elements of the future performance
    that are not reflected in the scoring criteria?

12
Reliability
  • Consistency of the assessment scores
  • Types of reliability
  • Interrater Reliability - the degree to which
    scores vary from instructor to instructor
  • Intrarater Reliability - the degree to which a
    single instructor's scores vary from paper to
    paper
  • A test can be reliable and not valid, but never
    valid and not reliable

13
Reliability Concerns
  • Reliability
  • Are the score categories well defined?
  • Are the differences between the score categories
    clear?
  • Would two independent raters arrive at the same
    score for a given student response based on the
    scoring rubric?

14
Improving Scoring Consistency
  • Provide grading rubrics or scoring criteria to
    students prior to assessment
  • Grade papers anonymously
  • Use anchor papers to define levels of proficiency
    for reference
  • Use multiple scorers
  • Calculate reliability statistics during training
    and grading (see the sketch below)
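As an illustration of the last bullet (this sketch is not part of the QEP materials), two common interrater statistics, percent agreement and Cohen's kappa, can be computed for a pair of raters scoring the same papers. The scores below are hypothetical:

```python
from collections import Counter

def percent_agreement(rater_a, rater_b):
    """Fraction of papers on which the two raters assigned the same score."""
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(rater_a)
    observed = percent_agreement(rater_a, rater_b)
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    # Chance agreement: probability both raters independently pick the same
    # category, estimated from each rater's own score distribution.
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical scores (1-4 rubric scale) from two raters on ten papers.
rater_1 = [3, 2, 4, 3, 1, 2, 3, 4, 2, 3]
rater_2 = [3, 2, 3, 3, 1, 2, 4, 4, 2, 3]
print(f"Agreement: {percent_agreement(rater_1, rater_2):.2f}")  # 0.80
print(f"Kappa: {cohens_kappa(rater_1, rater_2):.2f}")           # 0.71
```

Kappa matters because raw agreement can look high by chance alone when most papers fall into one or two score categories.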

15
Assessment Basics
  • Assessment Purpose
  • Everything needs to align (objectives through
    assessment)
  • SPC QEP example

16
Assessment Basics
(Diagram: alignment of Definition, Operational
Elements (KSAs), Measurable Learning Outcomes, and
Appropriate Assessment Measures)
17
SPC Definition
  • Critical thinking is the active and systematic
    process of communication, problem-solving,
    evaluation, analysis, synthesis, and reflection,
    both individually and in community, to foster
    understanding, support sound decision-making, and
    guide action.

18
Student Learning Outcomes
  • In order to link specific and measurable student
    learning outcomes, SPC's definition of critical
    thinking was operationalized.
  • This provided a more concrete, less abstract
    bridge between the student learning outcomes and
    the definition of critical thinking.

19
Student Learning Outcomes
20
Student Learning Outcomes
21
Outcomes to Assessments
  • Student Learning Outcomes were then linked to
    appropriate assessment instruments
  • SPC's QEP contained multiple measures for use in
    assessing student learning in the area of
    critical thinking

22
Outcomes to Assessments
23
Recent Alumni Survey
  • Question 31: Thinking logically and critically to
    solve problems
  • Gathering and assessing relevant information
  • Inquiring about and interpreting information
  • Organizing and evaluating information
  • Analyzing and explaining information to others
  • Using information to solve problems

24
Employer Survey
  • Question 3: Use mathematical and computational
    skills
  • Comfortable with mathematical calculations
  • Uses computational skills appropriately
  • Accurately interprets mathematical data
  • Question 5: Think logically and critically to
    solve problems
  • Gathers and assesses relevant information
  • Inquires about and interprets information
  • Organizes and evaluates information
  • Analyzes and explains information to others
  • Uses information to solve problems

25
CCSSE
  • Question 5: During the current school year, how
    much has your coursework at this college
    emphasized the following mental activities?
  • b. Analyzing the basic elements of an idea,
    experience, or theory
  • d. Making judgments about the value or soundness
    of information, arguments, or methods
  • Question 12: How much has YOUR EXPERIENCE AT THIS
    COLLEGE contributed to your knowledge, skills,
    and personal development in the following areas?
  • e. Thinking critically and analytically

26
Assessment Basics
  • Multiple Measures
  • SPC will determine improvement in students'
    critical thinking skills using multiple
    measures.
  • These include standardized direct instruments,
    authentic assessments, and indirect methods.

27
Student Assessment Points
28
Standardized Direct Instruments
  • Direct assessments include:
  • CAT - Critical Thinking Assessment Test is
    designed to assess and promote the improvement of
    critical thinking and real-world problem solving
    skills. 
  • Measure of Academic Proficiency and Progress
    (MAPP), developed by Educational Testing Services
    (ETS), is a measure of college-level reading,
    mathematics, writing, and critical thinking in
    the context of the humanities, social sciences,
    and natural sciences
  • The iSkills assessment (formerly the ICT Literacy
    Assessment), developed by ETS, is a comprehensive
    test of Information and Communication Technology
    proficiency that uses scenario-based critical
    thinking tasks to measure both cognitive and
    technical skills.

29
Indirect Methods
  • Student, alumni, employer, faculty, and staff
    reports, such as end-of-course, institutional,
    and national surveys and questionnaires, can
    provide indirect measures that help deepen the
    interpretation of student learning (Maki, 2004).

30
Indirect Methods
  • Indirect methods include:
  • Community College Survey of Student Engagement
    (CCSSE), established at UT at Austin, is a tool
    for assessing quality in community college
    education. CCSSE contains specific survey items
    intended to assess various Core Operational
    Elements (KSAs) associated with a student's
    critical thinking.
  • Entering Student Survey, Enrolled Student Survey,
    Graduating Student Survey, and Recent Alumni
    Survey are the primary surveys that have been
    developed to collect student feedback on their
    experiences.
  • Employer Surveys are sent out to employers of
    recent SPC graduates in order to gather
    information on graduates' knowledge and behavior.

31
Authentic Assessments
  • Authentic assessments serve dual purposes of
    encouraging students to think critically and of
    providing assessment data for measuring improved
    student learning.

32
Authentic Assessments
  • Authentic assessments include:
  • Criterion-referenced rubrics. Complex,
    higher-order objectives can be measured only by
    having students create a unique product, whether
    written or oral: in-class essays, speeches, term
    papers, videos, computer programs, blueprints, or
    artwork (Carey, 2000).
  • Student Reflection. Written reflection is
    espoused to have several important benefits: it
    can deepen the quality of critical thinking,
    increase active involvement in learning, and
    increase personal ownership of the new learning
    by the student (Moon, 1999).
  • Student Portfolios. Collections of students' work
    over a course or a program, portfolios can be an
    effective method of demonstrating student
    progress in the area of critical thinking (Carey,
    2000).

33
Rubrics
  • What is a rubric?
  • Scoring guidelines, consisting of specific
    pre-established performance criteria, used in
    evaluating student work on performance
    assessments (a minimal code representation is
    sketched below)
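A minimal sketch, assuming nothing about SPC's actual rubrics: a rubric can be represented as a set of criteria, each mapping pre-established score levels to performance descriptors, so that a graded report pairs every assigned score with the description it certifies. The criteria and descriptors here are hypothetical:

```python
# Hypothetical rubric: criterion -> {score level -> performance descriptor}.
rubric = {
    "Analysis": {
        3: "Compares and contrasts all available solutions accurately",
        2: "Compares solutions but misses key similarities or differences",
        1: "Restates solutions without comparison",
    },
    "Evaluation": {
        3: "Identifies substantive weaknesses of the chosen solution",
        2: "Identifies only superficial weaknesses",
        1: "Identifies no weaknesses",
    },
}

def score_report(scores):
    """Pair each criterion's assigned score with its rubric descriptor."""
    return {crit: (lvl, rubric[crit][lvl]) for crit, lvl in scores.items()}

print(score_report({"Analysis": 3, "Evaluation": 2}))
```

Spelling out the descriptor behind each score is what makes the criteria "pre-established" rather than ad hoc, and it is what two independent raters have to agree on.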

34
Rubrics
  • SPC currently uses rubrics in such programs as
  • College of Education
  • College of Nursing
  • Paralegal

35
Assessment Rubric for Critical Thinking (ARC)

36
Assessment Rubric for CT
  • ARC was designed to:
  • Enhance the QEP
  • Align with the College's definition of critical
    thinking
  • Be flexible for use across multiple disciplines

37
Assessment Rubric for CT
  • ARC is a global rubric template developed to
    provide a snapshot view of how student learning
    is being affected by the critical thinking
    initiative.
  • ARC will be designed to assess a variety of
    student projects from a critical thinking
    perspective. For example, students in a
    composition class may be asked to write a paper
    on a specific topic.
  • ARC rubric template will evaluate the student's
    use of critical thinking skills in the
    development of the paper as opposed to
    specifically evaluating the quality of the
    student's writing skills.

38
Assessment Rubric for CT
  • The ARC rubric template will be designed to be
    flexible enough to address a number of student
    project modalities, including written and oral
    communications.
  • Rubric development is an iterative process; the
    rubric will be improved and strengthened as it is
    used more widely. The first iteration has been
    developed by the QEP faculty champions.

39
Rubric Development Process
  1. Re-examine the learning objectives to be
     addressed by the task
  2. Identify specific observable attributes your
     students should demonstrate
  3. Describe characteristics for each attribute
  4. Write narrative descriptions for each level of
     the continuum
  5. Collect samples of student work
  6. Score student work and identify samples that
     exemplify various levels
  7. Revise the rubric as needed

Repeat as Needed
40
Assessment Rubric for Critical Thinking
41
Assessment Rubric for Critical Thinking
42
Assessment Rubric for Critical Thinking
43
ARC Assignment Profile
  • Designed to provide consistency and accuracy in
    the evaluation of the ARC at the institutional
    level, as well as guidelines for its use at the
    course level
  • ARC is essentially a tool to evaluate critical
    thinking, but for a tool to be effective it must
    be matched to the right job: it would be
    ineffective to use a machete to conduct heart
    surgery.
  • The purpose of the ARC Assignment Profile is to
    outline the most appropriate course assignment

44
ARC Assignment Profile
  • Participating faculty should have one assignment
    during the course that can be evaluated using the
    ARC scoring rubric.
  • Course assignment could be a graded homework
    assignment or a major assessment for the course.
  • Course assignment should include all of the
    elements of the rubric and should be aligned with
    the task outlined for each element.
  • Assignments that only evaluate some of the
    elements or are not aligned with the specific ARC
    tasks will be considered incomplete and not used
    in the institutional analysis.

45
ARC Assignment Profile
  • Faculty may add additional discipline-specific
    rubric elements (such as grammar and punctuation
    in a composition class), but must maintain the
    ARC elements as listed.

46
ARC Assignment Profile
  • Students should be provided a copy of the
    assignment rubric (ARC and any additional
    discipline-specific elements). The specific
    elements and tasks include the following (a
    completeness check over these elements is
    sketched after the list):
  • Communication: Define the problem in your own
    words.
  • Analysis: Compare and contrast the available
    solutions within the scenario.
  • Problem Solving: Select one of the available
    solutions and defend it as your final solution.
  • Evaluation: Identify the weaknesses of your final
    solution.
  • Synthesis: Suggest ways to improve/strengthen
    your final solution (may use information not
    contained within the scenario).
  • Reflection: Reflect on your own thought process
    after completing the assignment.
  • What did you learn from this process?
  • What would you do differently next time to
    improve?
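The previous slide notes that assignments evaluating only some of the ARC elements are considered incomplete and excluded from the institutional analysis. A hypothetical sketch of that completeness check (the element names come from the list above; the numeric scale is an assumption, not specified in the source):

```python
# The six ARC elements named in the assignment profile above.
ARC_ELEMENTS = {"Communication", "Analysis", "Problem Solving",
                "Evaluation", "Synthesis", "Reflection"}

def is_complete(graded_rubric):
    """True only if every ARC element received a score (profile rule)."""
    return ARC_ELEMENTS <= set(graded_rubric)

# A fully graded assignment (hypothetical 1-4 scores) passes the check...
sample = {"Communication": 4, "Analysis": 3, "Problem Solving": 4,
          "Evaluation": 2, "Synthesis": 3, "Reflection": 4}
assert is_complete(sample)
# ...while partial coverage would be excluded from the analysis.
assert not is_complete({"Analysis": 3, "Evaluation": 2})
```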

47
ARC Assignment Profile
  • The evaluation scenario (selected or created)
    should be stated in a manner that allows the
    student to address each of the tasks.
  • The QEP team is willing to assist with the
    creation of a scenario or to identify possible
    sources of existing scenarios that could be used.
  • Completed student assignments should include a
    copy of the scenario, the assignment provided to
    the student (with the rubric), the student's
    work, and the final graded rubric.

48
Deer Population Scenario
  • Three teenagers were seriously injured in a car
    accident while swerving to avoid a deer on a
    two-lane road near a small, rural town in
    Florida. The residents of the town have seen more
    and more deer enter the town's populated areas
    over recent years. Local law enforcement has been
    called numerous times this year to remove the
    animals from backyards and neighborhood streets,
    and one deer even caused considerable damage as
    it entered a restaurant in town. The mayor has
    been charged by the city leaders to keep the town
    residents safe. Local crops have even been
    damaged by the animals. Some long-time residents
    have requested that the hunting season and catch
    limits be extended in order to reduce the deer
    population. One city leader even proposed that
    the city purchase electronic devices to deter the
    deer from entering populated areas. Health
    concerns have recently been elevated as three
    deer carcasses were found at the edge of town and
    local law enforcement suspects that the animals
    had been poisoned.

49
Next Steps
  • Another scoring workshop will be held this fall
  • Pairs of Faculty Champions (scorers) will
    individually score student work samples and
    identify samples that exemplify various levels

50
Next Steps
  • Faculty Champions (scorers) will complete
    evaluation forms regarding the validity and
    reliability of the ARC rubric
  • Interrater reliability will also be calculated
    from ARC ratings

51
Next Steps
  • Faculty champions will make revisions to the ARC
    and the assignment profile as needed.
  • The ARC Development Process will be repeated
    (Steps 5-7)

52
Questions/Next Steps
53
Assessing Critical Thinking
Summer Critical Thinking Institute
  • QEP Team, Faculty Champions, and Academic
    Roundtables

2008