Transcript and Presenter's Notes

Title: Outcomes Assessment 2: Program Assessment

1
Outcomes Assessment 2: Program Assessment
  • Joseph A. Shaeiwitz
  • West Virginia University
  • joseph.shaeiwitz@mail.wvu.edu
  • Daina M. Briedis
  • Michigan State University
  • briedis@egr.msu.edu
2
Outline
  • ABET and engineering criteria
  • Program objectives
  • Program outcomes
  • Assessment
  • performance criteria
  • assessment measures: direct and indirect
  • rubrics
  • Review

4
Initial Quiz
  • What do you know about ABET?
  • Describe the engineering criteria.
  • According to ABET's definitions, what is the
    difference between outcomes and objectives?
  • Identify four assessment methods. Classify each
    as direct or indirect.

9
ABET and Engineering Criteria
  • ABET: Accreditation Board for Engineering and
    Technology
  • Engineering criteria changed to an assessment
    (TQI) basis around 2000
  • Must prove that students are achieving objectives
    and outcomes
  • measure the output: a feedback model
  • previously, a feed-forward model

10
Feed Forward Model
(Diagram: input → curriculum → output assumed)
11
Feedback Model
(Diagram: the education process from entering college
through individual classes and courses to graduate and
alumnus, with feedback loops at each stage)
12
The Two Loops of the Engineering Criteria
13
ABET and Engineering Criteria
  • 8 criteria
  • students
  • program educational objectives
  • program outcomes and assessment
  • professional component
  • faculty
  • facilities
  • institutional support and financial resources
  • program criteria

14
ABET and Engineering Criteria
  • Focus of this workshop
  • program educational objectives
  • definition
  • how to establish
  • how to assess
  • program outcomes and assessment
  • definition
  • how to establish
  • how to assess

15
Minute Paper: Clearest vs. Muddiest Point
  • What have you just learned about ABET and
    assessment?
  • What points are the clearest?
  • What points are the muddiest?

16
Outline
  • ABET and engineering criteria
  • Program objectives
  • Program outcomes
  • Assessment
  • performance criteria
  • assessment measures: direct and indirect
  • rubrics
  • Review

17
Program Objectives
  • broad statements that describe the career and
    professional accomplishments that the program is
    preparing the graduates to achieve
  • "Our graduates will be successful"
  • must define "successful"
  • Criteria for Accrediting Engineering Programs,
    2007-2008 Cycle, ABET, Inc., Baltimore, MD,
    http://www.abet.org

18
Program Objectives
  • must be
  • detailed and published
  • periodically evaluated with input from the
    program's constituencies
  • supported by an educational program that achieves
    the outcomes (defined later) and prepares graduates
    for accomplishments that meet the objectives
  • backed by ongoing evaluation of the extent to which
    the objectives are attained, with results used for
    program improvement
  • Criteria for Accrediting Engineering Programs,
    2007-2008 Cycle, ABET, Inc., Baltimore, MD,
    http://www.abet.org

19
Exercise
  • New program in nanobiomolecular engineering
  • Define two program objectives.

20
Outline
  • ABET and engineering criteria
  • Program objectives
  • Program outcomes
  • Assessment
  • performance criteria
  • assessment measures: direct and indirect
  • rubrics
  • Review

21
Program Outcomes
  • statements of what students are expected to
    know and be able to do by the time of their
    graduation
  • Outcomes must foster attainment of the program
    objectives
  • A process to produce the outcomes
  • An assessment process, with documented results
  • demonstrating measurement
  • demonstrating degree of achievement
  • Evidence that results are used for program
    improvement
  • Criteria for Accrediting Engineering Programs,
    2007-2008 Cycle, ABET, Inc., Baltimore, MD,
    http://www.abet.org

22
Program Outcomes
  • Minimum outcomes: a-k plus program criteria
  • Opportunity to be unique, i.e., define unique
    outcomes, not just repeat a-k
  • Helpful to map program-defined outcomes into a-k
  • Helpful to map outcomes into classes
  • Necessary to map outcomes into objectives

23
Matrix mapping the ABET a-k outcomes onto the nine WVU
program outcomes (each ABET outcome is checked against
one or more WVU outcomes)
  • WVU outcomes: 1 chemical process, 2 communicate,
    3 computers, 4 learn independently and group work,
    5 lab and data analysis, 6 continuing education,
    7 safety/societal/environmental, 8 ethics,
    9 broad education
  • ABET outcomes: a. apply math, sci, engr;
    b. experiments - design, conduct, analyze,
    interpret data; c. design a system;
    d. multidisciplinary teams; e. identify, formulate,
    solve engineering problems; f. professional and
    ethical responsibility; g. communication; h. broad
    education - global impact; i. life-long learning;
    j. contemporary issues; k. use techniques, skills,
    modern engineering tools
24
Matrix mapping the nine WVU program outcomes onto WVU
classes (each course is checked against the outcomes it
addresses)
  • Outcomes: 1 chemical process, 2 communicate,
    3 computers, 4 learn independently and group work,
    5 lab and data analysis, 6 continuing education,
    7 safety/societal/environmental, 8 ethics,
    9 broad education
  • Courses: CHE 201, CHE 202, CHE 230, CHE 310,
    CHE 311, CHE 312, CHE 315, CHE 320, CHE 325,
    CHE 450/451, CHE 455/456
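
Mappings like the two above can also be kept in machine-readable form so that
coverage gaps are easy to spot. Below is a minimal, hypothetical Python sketch;
the course numbers come from the slide, but the outcome-to-course assignments
are placeholders, not the actual WVU mapping.

```python
# Hypothetical sketch: record an outcomes-to-courses mapping as data and check
# that every program outcome is addressed by at least one course.
# The outcome names follow the slide; the per-course assignments are made up.

PROGRAM_OUTCOMES = [
    "chemical process", "communicate", "computers",
    "learn independently and group work", "lab and data analysis",
    "continuing education", "safety/societal/environmental",
    "ethics", "broad education",
]

# For each course, the program outcomes it addresses (illustrative only).
COURSE_OUTCOME_MAP = {
    "CHE 201": {"chemical process", "computers"},
    "CHE 450/451": {"lab and data analysis", "communicate", "ethics"},
    "CHE 455/456": {"chemical process", "communicate",
                    "safety/societal/environmental"},
}

def uncovered_outcomes(outcomes, course_map):
    """Return the outcomes not addressed by any course in the map."""
    covered = set().union(*course_map.values())
    return [o for o in outcomes if o not in covered]

if __name__ == "__main__":
    missing = uncovered_outcomes(PROGRAM_OUTCOMES, COURSE_OUTCOME_MAP)
    print("Outcomes with no course coverage:", missing or "none")
```

The same structure extends naturally to the outcomes-to-a-k and
outcomes-to-objectives mappings.
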
25
Exercise
  • Define three unique program outcomes for the
    nanobiomolecular engineering program.

26
Outline
  • ABET and engineering criteria
  • Program objectives
  • Program outcomes
  • Assessment
  • performance criteria
  • assessment measures: direct and indirect
  • rubrics
  • Review

27
Performance Criteria
  • What will students do to demonstrate achievement
    of an outcome?
  • Example: "an ability to communicate
    effectively" (outcome 3g in the ABET list)
  • What are the attributes of effective communication?

28
Performance Criteria for Effective Communication
  • When making an oral presentation, students will
  • maintain eye contact
  • A written report will
  • follow a prescribed format
  • demonstrate proper grammar and punctuation
  • adhere to commonly accepted word usage

29
Performance Criteria
(Diagram linking the educational objective
"professionally responsible," the learning outcome
"contemporary issues," and two performance criteria:
1. has knowledge of current technological issues
related to XXX engineering and society; 2. is able to
discuss major political and societal issues and their
pertinence to XXX engineering)
30
Exercise
  • Define two or three performance criteria for one
    of the outcomes previously defined.

31
Outline
  • ABET and engineering criteria
  • Program objectives
  • Program outcomes
  • Assessment
  • performance criteria
  • assessment measures: direct and indirect
  • rubrics
  • Review

32
Indirect vs. Direct Assessment
  • Indirect: based mostly on student
    self-evaluation
  • surveys
  • interviews
  • focus groups
  • Direct: evaluation of student performance by
    faculty or some other means (e.g., advisory
    boards evaluating design projects)

33
Indirect vs. Direct Assessment
  • Indirect
  • necessary, but not sufficient; provides an
    uncalibrated snapshot
  • self-assessment is not necessarily reliable
  • terminology may be unfamiliar to students
  • Direct
  • necessary for a quality assessment plan
  • not yet expected of all programs, but new Program
    Evaluators are being trained to look for this
    feature
  • relies on faculty experience, expertise, and
    judgment

34
How do we know if the students have the requisite
outcomes?
  • Not merely when the students think they do, based
    on student survey results
  • But with direct evidence from student work
  • Evidence is the key to accreditation
  • Faculty evaluation of student work is the key to
    providing evidence

35
Assessment Measures
  • Primary assessment of student outcomes should be
    based on student work (direct)
  • e.g., student portfolios, student projects,
    assignments and exams, some employer surveys
    where skill is observed
  • Secondary evidence
  • Senior exit surveys, alumni surveys, employer
    surveys (qualitative evidence based on opinion),
    other
  • Combination of both methods: triangulation (a
    sketch follows below)
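
As a rough illustration of triangulation, the hypothetical sketch below pairs a
direct measure (average faculty rubric score on student work) with an indirect
measure (average senior-survey self-rating) for each outcome; all scores and
the 3.0 target are made-up assumptions, not ABET requirements.

```python
# Hypothetical sketch of triangulation: pair a direct measure (faculty rubric
# scores on student work) with an indirect measure (survey self-ratings) for
# each outcome. All scores and the 3.0 target are made up.

direct_scores = {            # faculty scoring of student work, 1-5 scale
    "communication": 3.8,
    "contemporary issues": 2.7,
}
indirect_scores = {          # senior exit-survey self-ratings, 1-5 scale
    "communication": 4.2,
    "contemporary issues": 3.9,
}

TARGET = 3.0  # illustrative performance target

for outcome, direct in direct_scores.items():
    indirect = indirect_scores[outcome]
    note = ""
    if direct < TARGET:
        note = "below target on direct evidence -- investigate"
    elif indirect - direct > 1.0:
        note = "self-ratings far above faculty scores -- review survey"
    print(f"{outcome}: direct={direct:.1f}, indirect={indirect:.1f}  {note}")
```
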

36
Indirect Assessment Measures
  • Surveys
  • Interviews
  • Course satisfaction surveys

37
Direct Assessment Measures
  • End-of-Course Assessments
  • Targeted Assignments
  • Capstone Experiences
  • Capstone Exams
  • Portfolios

38
End-of-Course Assessments
  • Course should have objectives
  • perhaps set by department committee
  • Related to program outcomes
  • Assigned problems (assignments, exams, projects),
    each related to course objectives
  • Evaluation/reflection by instructor (a tallying
    sketch follows below)
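
One way to organize such an end-of-course assessment is to tag each graded
item with the course objective it addresses and average the class results per
objective. The sketch below is a hypothetical illustration; the items,
objectives, scores, and the 70% reflection threshold are all made up.

```python
# Hypothetical sketch of an end-of-course assessment: tag each graded item with
# the course objective it addresses, then average class results per objective
# so the instructor can reflect on weak spots. All data are made up.

graded_items = [
    # (item, course objective, class-average fraction of full credit)
    ("Exam 1, Prob. 3", "write unsteady-state balances", 0.82),
    ("Exam 2, Prob. 1", "derive transfer functions",     0.61),
    ("Homework 7",      "derive transfer functions",     0.74),
    ("Project report",  "communicate results",           0.88),
]

by_objective = {}
for _item, objective, score in graded_items:
    by_objective.setdefault(objective, []).append(score)

for objective, scores in by_objective.items():
    average = sum(scores) / len(scores)
    flag = "  <- flag for instructor reflection" if average < 0.70 else ""
    print(f"{objective}: {average:.0%} over {len(scores)} item(s){flag}")
```
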

39
End-of-Course Assessments
  • Advantages
  • quick and easy
  • assessment can be done in parallel with grading
  • Disadvantages
  • not comprehensive
  • no big picture
  • Opinion: should be one component of an assessment
    plan

40
Targeted Assignments/Problems
  • Key assignments that relate to specific program
    outcomes
  • Multiple assignments per outcome recommended
  • Integrate through curriculum
  • can demonstrate progress toward achievement of
    program outcome

41
Targeted Assignments/Problems
  • Advantages
  • quick and easy
  • assessment can be done in parallel with grading
  • Disadvantages
  • none really
  • need a consistent evaluation method with good
    inter-rater reliability (one way to check agreement
    is sketched below)
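
Inter-rater reliability can be spot-checked when two faculty members score the
same set of student work with the same rubric. The sketch below computes
Cohen's kappa, one common agreement statistic, for two hypothetical raters;
the ratings are made-up data, not workshop results.

```python
# Hypothetical sketch: check inter-rater reliability with Cohen's kappa for two
# faculty raters scoring the same ten reports on a 3-level rubric.
# The ratings are made-up illustrative data.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two equal-length lists of categorical ratings."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum((counts_a[c] / n) * (counts_b[c] / n)
                   for c in set(counts_a) | set(counts_b))
    return (observed - expected) / (1 - expected)

rater_a = [3, 2, 3, 1, 2, 3, 2, 1, 3, 2]   # rubric levels from rater A
rater_b = [3, 2, 2, 1, 2, 3, 3, 1, 3, 2]   # rubric levels from rater B
print(f"Cohen's kappa: {cohens_kappa(rater_a, rater_b):.2f}")   # about 0.69
```

Values near 1 indicate strong agreement; values near 0 indicate agreement no
better than chance, a signal that the rubric's level definitions need
sharpening.
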

42
Capstone Experiences
  • Can be design, laboratory, research
  • All programs have them
  • Where students are supposed to demonstrate and
    synthesize what learned
  • Usually includes teamwork, communication

43
Capstone Experiences
  • Advantages
  • already required in program
  • assessment can be done in parallel with grading
  • Disadvantages
  • none really
  • need a consistent evaluation method with good
    inter-rater reliability

44
Capstone Exams
  • FE (Fundamentals of Engineering) Exam
  • detailed, subject-related results available
  • Department-generated

45
Capstone Exams
  • Advantages
  • FE is standardized
  • the most direct measure possible
  • Disadvantages
  • FE may not set bar as high as some want
  • students may not take department-generated exam
    seriously if results do not impact grades or
    graduation

46
Portfolios
  • Collection of student work demonstrating outcomes
  • Must also evaluate the portfolio
  • Can also have students reflect on work in
    portfolio

47
Portfolios
  • Advantage
  • comprehensive
  • Disadvantages
  • portfolio evaluation is additional work
  • need a consistent evaluation method with good
    inter-rater reliability

48
Other Direct Assessment Methods
  • Journals
  • Concept maps
  • Oral presentations with follow-up questions (like
    M.S./Ph.D. defense)

49
Exercise
  • Select an assessment method to be used for direct
    assessment of program outcomes previously
    defined.

50
Outline
  • ABET and engineering criteria
  • Program objectives
  • Program outcomes
  • Assessment
  • performance criteria
  • assessment measures: direct and indirect
  • rubrics
  • Review

51
Rubrics
  • A set of categories, developed from the
    performance criteria, that defines the extent to
    which the performance criteria are met (i.e., the
    progression toward attainment)

52
Developing Rubrics
  • Define levels of performance for each performance
    criterion
  • best to define the top and bottom levels of
    performance first, then fill in the middle ground
  • 3-5 levels of performance recommended

53
Developing Rubrics
  • Standardized method to ensure inter-rater
    reliability
  • Specific definitions of terms like "excellent,"
    "understand," "not acceptable," and "exceeds
    expectations"
  • Initial effort and periodic review required

54
Developing Rubrics
  • With five levels of performance, each level can be
    mapped to a letter grade
  • with three levels, each level can map to A/C/F,
    interpolating for B/D (a sketch of this mapping
    follows below)
  • Use the rubric for both assessment and grading
  • Advantage: students and faculty have clearly
    defined criteria for grading work that appears
    subjective (lab and design reports, oral
    reports, etc.)
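
A rubric used for both assessment and grading can be stored as plain data
alongside a grade-mapping rule. The sketch below is a hypothetical illustration
of the three-level A/C/F scheme with interpolated B/D grades; the level
descriptors and cutoff values are illustrative assumptions, not part of the
ABET criteria.

```python
# Hypothetical sketch: a three-level rubric stored as data, with a helper that
# maps the average level to A/C/F and interpolates B/D, as the slide suggests.
# Level descriptors and grade cutoffs are illustrative assumptions.

RUBRIC = {
    "follows prescribed format": {
        3: "report follows the prescribed format throughout",
        2: "minor deviations from the prescribed format",
        1: "format largely ignored",
    },
    "grammar and punctuation": {
        3: "essentially error-free",
        2: "occasional errors that do not impede reading",
        1: "frequent errors that impede reading",
    },
}

def level_to_grade(avg_level):
    """Map an average rubric level (1-3) to a letter grade."""
    if avg_level >= 2.75:
        return "A"
    if avg_level >= 2.25:
        return "B"    # interpolated between the defined A and C levels
    if avg_level >= 1.75:
        return "C"
    if avg_level >= 1.25:
        return "D"    # interpolated between the defined C and F levels
    return "F"

scores = {"follows prescribed format": 3, "grammar and punctuation": 2}
assert scores.keys() == RUBRIC.keys()   # every criterion must be scored
average = sum(scores.values()) / len(scores)
print(f"average level {average:.2f} -> grade {level_to_grade(average)}")
```
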

55
Application of math and science
56
Problem Assessment Form
  • Connects physical model with math model
  • Able to write unsteady state mass balance
  • Makes appropriate substitution for flow terms
  • Makes appropriate simplification for flow terms
  • Converts differential equation into Laplace form
    correctly
  • Understands idea for output/input form of
    transfer function
  • Correctly combines Laplace transforms in series
  • Correct answer
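
As a minimal worked example of the kind of problem this form targets, consider
a single well-mixed tank of constant volume V with volumetric flow q, inlet
concentration c_in(t), and outlet concentration c(t); this is an illustrative
system, not one taken from the course.

```latex
\begin{align*}
V\frac{dc}{dt} &= q\,\bigl(c_{\text{in}}(t)-c(t)\bigr)
  && \text{unsteady-state mass balance, constant } V \\
\tau\frac{dc'}{dt} + c' &= c'_{\text{in}}, \qquad \tau = V/q
  && \text{deviation variables} \\
\tau s\,C(s) + C(s) &= C_{\text{in}}(s)
  && \text{Laplace transform, zero initial conditions} \\
G(s) = \frac{C(s)}{C_{\text{in}}(s)} &= \frac{1}{\tau s + 1}
  && \text{output/input transfer function} \\
G_{\text{series}}(s) &= \frac{1}{(\tau_1 s+1)(\tau_2 s+1)}
  && \text{two such tanks in series (transforms multiply)}
\end{align*}
```

Each line corresponds to a row of the assessment form: writing the balance,
simplifying, transforming, forming the output/input ratio, and combining
Laplace transforms in series.
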

57
Rubrics Assessment Scale: Application of math and
science (5 = high)
  • Level 5
  • Formulates models correctly
  • Applies calculus or linear algebra to solve
    problems
  • Correct calculations
  • Correct statistical analysis
  • . . .
  • Level 3
  • Formulates models with some trouble
  • Some understanding of calc/linear algebra
    applications
  • Minor calc errors
  • Minor statistical errors
  • . . .
  • Level 1
  • Not able to model
  • Cannot apply calc/linear algebra
  • Incorrect calculations
  • Does not apply statistics
  • . . .

58
Performance Criteria
(Diagram linking the educational objective
"professionally responsible," the learning outcome
"contemporary issues," and two performance criteria:
1. has knowledge of current technological issues
related to XXX engineering and society; 2. is able to
discuss major political and societal issues and their
pertinence to XXX engineering)
59
Outcome: a knowledge of contemporary issues
Performance Criteria Scoring Rubric
Rating scale: Needs Improvement (1-2), Met
Expectations (3-4), Exceeded Expectations (5), N/A

Element 1: Has knowledge of current technological
issues related to chemical engineering and society
(global warming, resource depletion, waste
proliferation, etc.)
  • Needs Improvement: has minimal knowledge of
    technological issues and their relevance to
    chemical engineering; has weak connection between
    the issue and scientific principles for analysis;
    has trouble developing solutions
  • Met Expectations: has reasonable knowledge of
    technological issues, some of which may not be
    directly relevant to chemical engineering; can
    apply scientific principles to analysis and suggest
    solutions when guided
  • Exceeded Expectations: has thorough knowledge of
    current technological issues related to chemical
    engineering and is able to analyze them and propose
    solutions using scientific principles

Element 2: Has knowledge of and is able to discuss
major societal and political issues and their
pertinence to chemical engineering
  • Needs Improvement: has minimal knowledge of
    societal and political issues; if given an issue,
    does not see its connection to engineering without
    instruction; is minimally effective in discussing
    and presenting such issues
  • Met Expectations: has reasonable knowledge of
    societal and political issues; recognizes some
    connection to chemical engineering, but misses the
    details; is somewhat effective in discussing and
    presenting such issues when prompted
  • Exceeded Expectations: has thorough knowledge of
    societal and political issues related to chemical
    engineering; recognizes the big picture and the
    details; presents strong discussion of such issues
63
Rubrics
  • Additional rubrics at
  • http://www.che.cemr.wvu.edu/ugrad/outcomes/rubrics/index.php
  • Instructions on rubrics
  • http://webquest.sdsu.edu/rubrics/weblessons.htm

64
Exercise
  • Using the two or three performance criteria
    previously defined for one of the outcomes, and
    assuming the assessment method previously selected,
    begin to develop an evaluation rubric.

65
Outline
  • ABET and engineering criteria
  • Program objectives
  • Program outcomes
  • Assessment
  • performance criteria
  • assessment measures: direct and indirect
  • rubrics
  • Review

66
Recommendation
  • We strongly recommend incorporating direct
    assessment measures into your assessment activities
  • They will be expected in the near future!

67
Exercise
  • What are the two most important things that you
    learned in this workshop?
  • What is still unclear to you about program
    assessment?

68
Questions
  • ?