1
Assessment & Review of Graduate Programs
  • Duane K. Larick, NC State University
  • David L. Wilson, Southern Illinois University
    Carbondale
  • Council of Graduate Schools
  • Pre-Meeting Workshop
  • December 8, 2004

2
Guidelines for This Presentation
  • Please turn off or silence your cell phones
  • Please feel free to raise questions at any time during the presentation
  • We have included a set of discussion questions
    along the way
  • We will also leave time at the end for general
    discussion.
  • We are very interested in your participation

3
Agenda
  • Introduction and Objectives
  • Overview of Graduate Program Review
  • Reasons for Graduate Assessment
  • General Process of Program Review
  • Process or Processes for Development of a Program
    Review Procedure
  • External program review
  • Outcome-based continuous and ongoing review
  • Comparative Data Sources
  • Case Studies
  • Southern Illinois University Carbondale
  • North Carolina State University
  • Summary and Discussion

4
Objectives
  • Discuss various motivators for undertaking
    graduate assessment
  • Increase overall awareness of recent trends in
    Graduate Program Review
  • Demonstrate practical experience/knowledge gained
    related to development and implementation of
    external reviews and outcome-based continuous and
    ongoing procedures for Graduate Program Review
  • Illustrate examples of data and managerial tools
    developed/utilized to improve the efficiency of
    the process

5
Why Assess Graduate Programs (external drivers)?
  • Improvement in the quality of graduate education
  • To help satisfy calls for accountability
  • Especially at the State level
  • Requirement for regional accreditation,
    licensure, etc.

6
Why Assess Graduate Programs (internal drivers)?
  • Meet short-term (tactical) objectives or targets
  • Meet long-term (strategic) institutional/departmental goals
  • Funding allocation/reallocation
  • Funded project evaluation (GAANN, IGERT)
  • Understand sources of retention/attrition among
    students and faculty

8
Accreditation Agencies
  • Southern Association of Colleges and Schools
  • Western Association of Colleges and Schools
  • Northwest Association of Colleges and Schools
  • North Central Association
  • New England Association of Schools and Colleges
  • Middle States Commission on Higher Education

9
SACS Principles of Accreditation
  • Core Requirement 5: The institution engages in
    ongoing, integrated, and institution-wide
    research-based planning and evaluation processes
    that incorporate a systematic review of programs
    and services that (a) results in continuing
    improvement and (b) demonstrates that the
    institution is effectively accomplishing its
    mission.

10
SACS Criterion for Accreditation
  • Section 3: Comprehensive Standards, 16
  • The institution identifies outcomes for its educational programs and its administrative and educational support services; assesses whether it achieves these outcomes; and provides evidence of improvement based on analysis of those results.

11
SACS Principles of Accreditation
  • Section 3: Comprehensive Standards, Standards for All Educational Programs
  • 12. The institution places primary responsibility for the content, quality, and effectiveness of its curriculum with the faculty.
  • 18. The institution ensures that its graduate
    instruction and resources foster independent
    learning, enabling the graduate to contribute to
    a profession or field of study.

12
Northwest Association of Colleges and Schools
  • Standard 2.B
  • The institution identifies and publishes the
    expected learning outcomes for each of its degree
    programs
  • The institution's processes for assessing its educational programs are clearly defined, encompass all of its offerings, are conducted on a regular basis, and are integrated into the overall planning and evaluation plan
  • The institution provides evidence that its
    assessment activities lead to the improvement of
    teaching and learning

13
Intent of Accreditation Agency Effort
  • The intent of the accrediting agencies is to
    encourage institutions to create an environment
    of planned change for improving the educational
    process.

14
State Mandated Reviews and Assessment
  • Illinois Board of Higher Education's Priorities, Quality, and Productivity (P.Q.P.) Initiative (1992)
  • IBHE's Framework for Reviewing Priorities, Productivity, and Accountability (PPA) in Illinois Higher Education (December 2003)

15
So, The Questions We Need To Ask Ourselves Are?
  • What are we currently doing?
  • Why are we currently doing it?
  • Is what we are currently doing accomplishing the external goals just described?
  • Is what we are currently doing accomplishing the internal goals just described?
  • Is there a better way?
  • Who defines better?

16
General Procedure(s) for Review of Graduate
Programs
  • External program review conducted on a 5- to 10-year cycle
  • Standard practice at most Institutions
  • Outcome-based continuous and ongoing program
    review
  • Being implemented by many in response to regional
    and state accreditation requirements and
    institution needs

17
General Process for External Reviews
  • Operational Procedures
  • 8- to 10-year review cycle
  • Components
  • Internal self-study
  • External team review
  • Review team's report
  • Program's response
  • Administrative Meeting

18
General Process for External Reviews
  • Administration
  • Typically administered by the Dean of the Graduate School or centrally through the Provost's Office
  • Initiated by the program or the administering office
  • Often conducted at the Department level
  • Includes multiple degrees/programs

19
Typical Objectives for External Reviews
  • Reviews are conducted to gain a clearer understanding of a program's
  • Purpose(s) within the Institution
  • Effectiveness in achieving purposes
  • Overall quality
  • Future objectives
  • Changes needed to achieve objectives

20
General Process for External Reviews
  • Information Made Available at the Institution
    Level (examples include)
  • Enrollment numbers, demographics
  • Applications
  • numbers
  • applied/admitted/enrolled
  • quality indicators
  • Number of degrees awarded, time to degree
  • Financial support
  • Exit interviews

21
General Process for External Reviews
  • Self-Study
  • Purpose
  • Engage stakeholders in a thoughtful and creative study and evaluation of the program's academic performance in relation to the Institution's mission
  • Philosophy
  • Review must cover all components of the program's mission
  • teaching, research, and outreach

22
General Process for External Reviews
  • Key Self-Study Components
  • Program description including objectives
  • Faculty: distribution and quality
  • Students: need, enrollment, quality, degrees granted, support
  • Curriculum/Instruction
  • Master's and Doctoral degrees granted

23
General Process for External Reviews
  • Key Self-Study Components
  • Teaching, research, and service participation
  • Current research: national comparison, external support, interdisciplinary projects
  • Methods for internal program review
  • Recent changes and why
  • Strengths, Weaknesses, and Opportunities

24
General Process for External Reviews
  • Review Team Make-up
  • On-Campus Representation
  • Often a Graduate School and/or Graduate Faculty
    Representative
  • One or more off-campus external experts
  • Depends on scope of program(s) being reviewed
  • Can add to expense

25
General Process for External Reviews
  • Review Team Visit
  • Often 2-4 days in length
  • Generally meet with University and College
    administration in addition to faculty and students

26
General Process for External Reviews
  • Review Team Report
  • Generally includes some form of analysis of the strengths, weaknesses, opportunities for, and needs of the graduate program from the perspective of its peers

27
General Process for External Reviews
  • Final Administrative Meeting
  • Final meeting to discuss the outcome(s) of the
    review
  • Should include proposed action items with a
    follow-up schedule

28
Discussion Questions?
  • How many of your institutions have a graduate
    program review process similar to what was just
    described?
  • What are some of the variations that exist?
  • How often, or at what frequency, should reviews occur? Remember the words "continuous improvement"

29
Discussion Questions? continued
  • Who should coordinate the review of graduate
    programs? What should the role of the Graduate
    School be?
  • Should the external review be comprehensive in nature, i.e., encompass all roles of the program?
  • Should the review be tied to other reviews (licensure, accreditation, etc.)?
  • Who pays for the external review and how much is
    reasonable?

30
Outcome-Based, Continuous and Ongoing Review of
Graduate Programs
  • There are fewer Institutional models or norms to
    go by when it comes to designing and implementing
    this type of review process
  • Goal is generally to establish an outcomes-based
    program that is continuous rather than sporadic
  • The program periodically reports the nature and
    outcomes of the review process to the Institution
    and appropriate external agencies (State,
    accreditation agencies, etc.)
  • Results are used by the program and Institution
    for planning purposes

31
What Is Outcomes-Based Assessment?
  • The process of (1) determining the indicators of
    an effective program, (2) using those indicators
    as criteria for assessing the program, and (3)
    applying the results of the assessment toward the
    ongoing and continuous improvement of the program.

32
What Is Outcomes-Based Assessment?
  • Shift to student-learning-centered concerns
  • What do we want our students to know?
  • How well does the program promote learning?
  • Moves from the quality of presentation to "How well did the student learn it?"
  • Assesses achievement of the outcomes on a
    continuous rather than episodic basis

33
Potential Benefits of the Assessment Planning Process
  • Gives faculty a voice in defining the program and
    thus a stake in the program
  • Gives faculty an investment in assessing the
    program
  • Provides faculty-approved indicators for gauging
    and improving the effectiveness of the program

34
Where Do We Start When Considering an
Outcome-Based Process?
  • It Sometimes Helps to Ask the Following Questions:
  • Do our graduate programs have clearly stated objectives?
  • Do we have departmental plans to evaluate the
    effectiveness of our degree programs?
  • Do our degree programs have clearly defined
    faculty expectations (outcomes) for students?
  • Are they measurable or observable?
  • Do we use data to assess the achievement of
    faculty expectations for students?
  • Do we make changes in our programs based on the
    outcomes of these assessments?
  • Do we document that assessment is done and
    results are used to make change?

35
Outline of the Process
  • Development of program specific objectives
  • Development of program specific outcomes
  • Development of an assessment plan, or a schedule for assessing and reporting outcomes
  • Development of the necessary database at the
    program and institutional level
  • Development of the appropriate managerial tools

36
Keys to Success
  • The department should want to do this process
  • The department must use the information collected
  • Demonstrate change as a result of findings
  • The institution must use the information
    collected
  • It should somehow tie to resource decisions
  • Use participation in the process as part of
    faculty reviews

37
What Are Objectives?
  • Program objectives are the general goals that
    define what it means to be an effective program.

38
Three Common Objectives
  • Developing students as successful professionals
    in the field
  • Developing students as effective researchers in
    the field
  • Maintaining or enhancing the overall quality of
    the program

39
What Are Outcomes?
  • Program outcomes are specific results the program
    seeks to achieve in order to attain the general
    goals defined in the objectives.
  • These can be thought of as faculty expectations
    of students completing the degree program

40
Example for Objective 2: Effective Researchers
  • 2. To prepare students to conduct research
    effectively in XXXX in a collaborative
    environment, the program aims to offer a variety
    of educational experiences that are designed to
    develop in students the ability to
  • a. read and review the literature in an area of
    study in such a way that reveals a comprehensive
    understanding of the literature
  • b. identify research questions/problems that are
    pertinent to a field of study and provide a focus
    for making a significant contribution to the
    field
  • c. gather, organize, analyze, and report data using
    a conceptual framework appropriate to the
    research question and the field of study
  • d. interpret research results in a way that adds to
    the understanding of the field of study and
    relates the findings to teaching and learning in
    science

41
Objectives and Outcomes
  • Objectives: general, indefinite, not intended to be measured; they set the overall agenda for the program
  • Outcomes: specific, definite, intended to be measured; they establish the particular means by which the agenda is achieved

42
Critical Questions for Assessment
  • What are the indicators of effectiveness for our program? → Objectives and Outcomes
  • How do we determine whether or not our program is meeting the outcomes? → Outcomes Assessment Plan
  • How effective is our program in terms of the outcomes? → Outcomes Assessment
  • What does our assessment suggest for improving the program? → Continuous and Ongoing Improvement

43
Four Questions for Creating an Assessment Plan
  • What types of data should we gather for assessing
    outcomes?
  • What are the sources of the data?
  • How often are the data to be collected?
  • When do we analyze and report the data?

44
Types of Data Used
  • 1. Take advantage of what you are already doing
  • Preliminary exams
  • Proposals
  • Theses and dissertations
  • Defenses
  • Student progress reports
  • Student course evaluations
  • Faculty activity reports
  • Student exit interviews

45
Types of Data Used
  • 2. Use Resources of Graduate School and Your
    Institutional Analysis Group
  • Enrollment statistics
  • Time-to-degree statistics
  • Student exit data
  • Ten-year profile reports
  • Alumni surveys

46
Types of Data Used
  • 3. Use your imagination to find other kinds of data
  • Dollar amount of support for faculty
  • Student CVs
  • Faculty surveys

47
Data: Two Standards to Use in Identifying Data
  1. Appropriateness: Data should provide information that is suitable for assessing the outcome
  2. Accessibility: Data should be reasonable to obtain (time, effort, ability, availability, resources). A brief sketch of screening candidate data against both standards follows.
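
To make the two standards concrete, here is a minimal Python sketch (ours, not from the presentation; the field names and example entries are hypothetical) of screening candidate evidence for appropriateness and accessibility:

    from dataclasses import dataclass

    @dataclass
    class DataSource:
        """A candidate piece of assessment evidence for one program outcome."""
        name: str          # e.g., "student exit interviews"
        outcome: str       # the outcome this evidence is meant to assess
        source: str        # who provides it (students, faculty, Graduate School, ...)
        frequency: str     # "every semester", "annually", "at graduation", ...
        appropriate: bool  # standard 1: does it actually inform the outcome?
        accessible: bool   # standard 2: reasonable to obtain (time, effort, resources)?

    def screen(candidates):
        """Keep only the evidence that passes both standards."""
        return [c for c in candidates if c.appropriate and c.accessible]

    plan = screen([
        DataSource("exit interviews", "professional preparation",
                   "Graduate School", "at graduation", True, True),
        DataSource("national peer survey", "program quality",
                   "external consultant", "ad hoc", True, False),  # too costly to obtain
    ])
    for item in plan:
        print(f"{item.outcome}: {item.name} ({item.frequency})")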

48
Four Questions for Creating an Assessment Plan
  • What data should we gather for assessing
    outcomes?
  • What are the sources of the data?
  • How often are the data to be collected?
  • When do we analyze and report the data?

49
Sources of Data
  • Students
  • Faculty
  • Graduate School
  • Graduate Program Directors
  • Department Heads
  • Registration and Records
  • University Information Technology
  • University Planning and Analysis

50
Four Questions for Creating an Assessment Plan
  • What data should we gather for assessing
    outcomes?
  • What are the sources of the data?
  • How often are the data to be collected?
  • When do we analyze and report the data?

51
Frequency of Data Collection
  • Every semester
  • Annually
  • Biennially
  • When available from individual graduate students
  • At the preliminary exam
  • At the defense
  • At graduation

52
Ordering Outcomes for Assessment
  • More pressing outcomes earlier and less pressing
    ones later
  • Outcomes easier to assess earlier and outcomes
    requiring more complex data gathering and
    analysis later
  • Approximately the same workload each year of the assessment cycle (one way to balance these criteria is sketched below)
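
One way to reconcile the three criteria above is a greedy assignment: take outcomes in priority order (more pressing and easier to assess first) and always place the next one in the report year with the least accumulated work. A minimal Python sketch, with hypothetical outcome names and faculty-assigned urgency/effort scores:

    def schedule_outcomes(outcomes, years):
        """Assign (name, urgency, effort) outcomes to assessment-cycle years.

        Processing in priority order while always filling the lightest year
        puts pressing, easy-to-assess outcomes early and keeps the workload
        roughly level across the cycle.
        """
        load = {y: 0 for y in years}
        plan = {y: [] for y in years}
        for name, urgency, effort in sorted(outcomes, key=lambda o: (-o[1], o[2])):
            year = min(years, key=load.get)  # ties resolve to the earliest year
            plan[year].append(name)
            load[year] += effort
        return plan

    print(schedule_outcomes(
        [("literature review skills", 3, 1), ("research design", 2, 2),
         ("placement of graduates", 1, 3), ("time to degree", 2, 1)],
        years=[2, 4, 6],  # biennial report years in an 8-year cycle
    ))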

53
Four Questions for Creating an Assessment Plan
  • What data should we gather for assessing
    outcomes?
  • What are the sources of the data?
  • How often are the data to be collected?
  • When do we analyze and report the data?

54
Creating an Assessment Timetable
  • Standard practice appears to be to call for a
    short annual or biennial assessment report
  • Longer cycles lose the continuous and ongoing character of the process
  • When possible, combining this with an external review program (i.e., including assessment reports as part of the self-study) is recommended

55
Four Questions for Creating an Assessment Plan
  • What data should we gather for assessing
    outcomes?
  • What are the sources of the data?
  • How often are the data to be collected?
  • When do we analyze and report the data?

56
Discussion Questions?
  • How many of your institutions have an
    outcome-based graduate program review process?
  • How many of you are considering implementing such
    a review program?
  • What are some of the variations that exist?
  • How often are your assessment reports due?

57
Discussion Questions? continued
  • For those of you with an outcome-based review process, or, for that matter, those of you considering implementing such a process: what was (is) the driving force in that decision?
  • What has been the level of campus buy-in?

58
Case Studies
  • Graduate Program Review At
  • Southern Illinois University Carbondale
  • North Carolina State University

59
Impact of State Mandates on SIUC
  • Illinois Board of Higher Education's Priorities, Quality, and Productivity (P.Q.P.) Initiative (1992)
  • IBHE's Framework for Reviewing Priorities, Productivity, and Accountability (PPA) in Illinois Higher Education (December 2003)

60
P.Q.P. in Illinois
  • Statewide Productivity Assessment of Graduate
    Programs
  • Capacity in Relation to Student Demand
  • Capacity in Relation to Occupational Demand
  • Centrality in Relation to Instructional Mission
  • Success of Graduates
  • Program Costs

61
P.Q.P. Guidelines for Elimination of Degree
Programs (IBHE, August 1992)
  • Institutions Should Consider Eliminating Programs:
  • Whose credit hours, enrollments, and degree production significantly deviate from statewide or institutional averages
  • In fields in which projected statewide job openings are low or expected to slow
  • In fields that enroll a relatively small
    proportion of non-majors

62
Elimination guidelines cont.
  • That have been found to have quality deficiencies
    based upon their most recent program reviews
  • That exhibit low job placement rates, lack of student and alumni satisfaction and support, and low graduate admissions or pass rates on licensure exams
  • Whose costs significantly deviate from statewide
    avg. expenditures per FTE in the discipline

63
Role of IBHE, University Boards of Trustees and
Campus in Program Reviews (IBHE, Nov. 1994)
  • IBHE has statutory authority to review public institution instructional programs and to communicate to governing boards any programs that the Board finds to be "educationally and economically not justified"
  • Univ. Boards of Trustees have statutory authority
    to eliminate academic programs
  • Each Univ. has a Mission Statement which sets forth the campus's values and aspirations

64
Role of IBHE cont.
  • Each Univ. has a Focus Statement that describes
    the distinctive strengths and contributions of
    each of the 12 public universities to Illinois
    higher education
  • Each must annually produce a Priorities Statement which should guide decisions to allocate current funds and develop new programs and budget requests. These statements are to be included in each university's annual Resource Allocation and Management Plan (RAMP)

65
Role of IBHE cont.
  • To integrate information from program review into campus, governing board, and statewide decision making in the P.Q.P. initiative, the public university academic program review process was revised (by IBHE) for 1993-94.
  • The revised program review process requires the 12 public universities to submit reviews of similar programs in the same year on an eight-year cycle, and IBHE staff to identify issues to be addressed in a statewide analysis prior to campus reviews.

66
Elements of the Illinois Statewide Program Review
Process (IBHE RAMP Manual, 1993)
  • The IBHE review schedule assures the submission
    of reviews of similar programs by all (twelve
    public) universities at the same time.
  • Prior to the review, an IBHE statewide analysis,
    coordinated with the review schedule, defines
    statewide issues, examines capacity in fields of
    study across universities, and provides
    comparative information for institutional reviews
    of individual programs.

67
Elements of Program Review cont.
  • Illinois universities conduct the program reviews
    according to campus-developed procedures and
    submit the results of reviews to the IBHE.
  • IBHE staff analyze the program review reports and provide recommendations on the educational and economic justification of selected programs in the staff's annual Priorities, Quality and Productivity (P.Q.P.) report.
  • Universities must follow the coordinated review
    schedule but may conduct reviews within a
    reasonable period (e.g. up to 3 years) prior to
    submission date in order to coordinate reviews
    with accreditation and other evaluations.

68
Elements of Statewide Analysis for Each Program
Area to be Reviewed (RAMP Manual, 1993)
  • Trends in enrollment and degrees granted
  • Student characteristics
  • Program costs
  • Occupational demand
  • Recommendations for expansion or reduction of
    programs on a statewide basis
  • Universities will be asked to respond to the
    elements of the statewide analysis in their
    program reviews

69
Elements of the Program Review Reports to the
IBHE (RAMP Manual, 1993)
  • A 1- to 2-page summary of the review, submitted by July 1st of each year, should address the following questions; the key findings and recommendations in each of these areas should constitute its substance:
  • Student demand
  • Occupational demand
  • Centrality to instructional mission
  • Breadth
  • Success of graduates
  • Costs
  • Quality
  • Productivity

70
P.Q.P. Impact on SIUC Graduate Programs
  • From 1993 to 1996, the Graduate Council examined all graduate programs on campus, using data supplied by IBHE and data generated by the Graduate School
  • Programs scheduled for elimination or substantial
    reduction had the opportunity to respond to
    recommendations in a series of Graduate Council
    meetings and special forums

71
P.Q.P. Impact cont.
  • 7 Ph.D. programs were eliminated or consolidated (Communication Disorders, Higher Ed., Molecular Science, Physical Ed., Special Ed., Geology, and Geography)
  • Eventually, Geography (Liberal Arts), Geology
    (Science), and Agribusiness Economics
    (Agriculture) created a new interdisciplinary
    Ph.D. program in Environmental Resources and
    Policy within the Graduate School
  • 5 master's programs and several post-baccalaureate specializations were eliminated
  • Most of these eliminations occurred outside of the regular review process, though some were informed by that process

72
SIUC Graduate Council Response
  • The P.Q.P. initiative illuminated flaws in the
    review process and especially problems with
    inconsistent data about programs
  • GC Program Review Committee took a hard look at
    the problems and suggested substantial changes in
    the campus review process
  • In 1999, a new review format was put in place
    (see handout)

73
Problems with IBHE Statewide Review of Similar
Programs
  • Clearly, IBHE through P.Q.P. hoped to use the
    statewide review of similar programs as a vehicle
    to reduce or eliminate programs seen as
    duplicative, too costly, unproductive, etc.
  • At first, IBHE wanted to appoint the reviewers; this proved to be impractical
  • Accreditation cycles did not march in sync
  • IBHE went through several attempts to revise the
    review cycle mandated in P.Q.P.

74
IBHE Review cont.
  • IBHE created the Illinois Commitment in February 1999, including the stipulation that "By 2004, all academic programs will systematically assess student learning and use assessment results to improve programs."
  • IBHE Redesign of Public Institution Academic Program Approval and Review Processes (April 2002); see handout, especially p. 38
  • SIUC had created an annual outcomes-based assessment requirement for all programs in 1999, in part because of the North Central accreditation process; see http://www.siu.edu/assessment/

75
IBHE Priorities, Productivity, and
Accountability (PPA, December 2003)
  • Illinois' system of higher education must have a
    clear sense of its priorities, ensure the
    efficient and productive use of existing
    resources, and demonstrate public accountability
    before seeking additional assistance from the
    taxpayer and student

76
IBHE PPA in Action
  • The IBHE board chair indicates that the focus of PPA will be on faculty productivity at the 12 publics and 50 community colleges, without regard for the mission or focus of the institution
  • The 12 publics indicate that this approach is not practical given their differing missions and focuses, and the board chair backs off
  • Two subcommittees formed to begin the process

77
PPA in Action, cont. (Committee Minutes, May 25, 2004)
  • Subcommittee A is focusing on mission/focus
    statements, program approval processes, and more
    qualitative issues, including
  • Reviewing statewide enrollment by program (all degree levels), new programs, degrees awarded, and programs discontinued
  • Considering the impact of technology on faculty work

78
PPA in Action, cont.
  • Faculty review includes
  • Faculty roles in terms of hours per week of formal class, preparation, and conferences; supervising remedial or advanced work; keeping up with the discipline; course design
  • Measures of work (quantitative): contact hours, release and/or assigned time, class size
  • Promotion/tenure, including research, teaching, and service

79
PPA in Action, cont.
  • Subcommittee B is focusing on state-level
    accountability mechanisms and processes
  • The subcommittees continue to meet and no
    recommendations have been made as of this date
  • The saga continues

80
Other Factors that Currently Impact or Inform the
Review/Assessment Process at SIUC
  • Southern at 150: Building Excellence Through Commitment (2002; see http://news.siu.edu/s150/)
  • The goal is to articulate a series of
    commitments and actions that will place us among
    the top 75 public research universities in the
    United States by the year 2019, our 150th
    anniversary, while we continue to provide the
    foundation of academic, economic, and social
    progress in Southern Illinois

81
Southern at 150, cont.
  • Commits to offering a "Progressive Graduate Education" and to "Lead in Research, Scholarly, and Creative Activity"
  • By 2019, 25 percent of our total enrollment will
    be graduate students (increasing from
    approximately 4,000 to 6,000 graduate students)

82
Southern at 150, cont.
  • Research and scholarship will be integrated into
    every decision made on campus. Building a
    culture where research becomes an integral part
    of all undergraduate and graduate programs is
    essential. Substantially enhance research and
    scholarly productivity.

83
Southern at 150, cont.
  • A Review of the Research Enterprise at SIUC,
    Washington Advisory Group (July 2003)
  • Focused on Sciences, Engineering, and School of
    Medicine
  • This review looked at all programs in these areas
    and recommended strategies.

84
Southern at 150, cont.
  • Research and Scholarship in the Arts,
    Humanities, and Social Sciences at Southern
    Illinois University Carbondale, (Consultant Team
    Report, October 2004)
  • SIUC's Faculty Hiring Initiative, a long-term commitment of recurring resources each year to meet the strategic goals set by Southern at 150; reviews/assessments play a key role in this program

85
Graduate Program Review at NC State: External Review
  • Current Process: Administration
  • Administered by the Dean of the Graduate School
  • Initiated by program or Graduate School
  • Often at the Department level
  • Includes multiple degrees/programs
  • Partner with College and/or accreditation reviews

86
Graduate Program Review at NC State: External Review
  • Current Process: Objectives
  • Reviews are conducted to gain a clearer understanding of a program's
  • Purpose(s) within NC State
  • Effectiveness in achieving purposes
  • Overall quality
  • Future objectives
  • Changes needed to achieve objectives

87
Graduate Program Review at NC State: External Review
  • Current Process: Operational Procedures
  • 10-year review cycle
  • Components
  • Internal self-study
  • External team review
  • Review team report (oral and written)
  • Program response prepared
  • Administrative Meeting
  • Graduate Dean, Provost, Vice-Chancellor for
    Research, College Administration, Department
    Head, Director of Graduate Programs, Review Team
    Chair

88
Graduate Program Review at NC State: External Review
  • Current Process: Information Made Available
  • Last program review report and response
  • 5-year graduate program profile (updated annually)
  • Enrollment numbers, demographics
  • Applications
  • Numbers applied/admitted/enrolled
  • Quality indicators
  • Number of degrees awarded, time to degree
  • Financial support
  • Exit interviews
  • All thesis and dissertation students

89
Questions We Began to Ask Ourselves?
  • Does each of our degree programs have clearly
    defined outcome goals?
  • Are they measurable or observable?
  • Do we obtain data to assess the achievement of
    degree program outcomes?
  • Do we use assessment results to improve programs?
  • Do we document that we use assessment results to
    improve programs?

90
Motivation For Change
  • Growing culture of program improvement on our campus, both undergraduate and graduate
  • Undergraduate Student Affairs had implemented an
    outcome-based review program that was now
    operational
  • SACS was just around the corner

91
Ultimate Question for NC State Became
  • How could we create a hybrid that evaluated
    program quality and measured student learning?
  • Accomplish administrative goals regarding
    evaluation of quality related to funding and
    institutional goals
  • Accomplish graduate school goals related to
    program improvement
  • The ultimate goal is to improve educational
    programs, not fill out reports to demonstrate
    accountability

92
Studying and Revising the Process
  • Graduate Dean Appointed a Task Force
  • Made up of stakeholders
  • Relied on on-campus expertise
  • Focus groups with administrators, faculty,
    students, etc.
  • Could not utilize Undergraduate Program Review
    personnel
  • Workload issue
  • New perspectives
  • Bottom Line: The opportunity for change is at the faculty level, so we want the process to address improvement at that level.

93
Graduate Program Review at NC State
  • Task Force Goals
  • Study/revise the existing process
  • Evaluate purpose and goals of review
  • Examine current protocols, especially with
    respect to
  • Continuous and ongoing expectation
  • Outcomes-based assessment requirement
  • What should the role of the Graduate School and
    its Administrative Board be?
  • How can the outcome of a review be most
    effective?
  • What infrastructure is necessary to operate the
    process?

94
Graduate Program Review at NC State
  • Task Force Key Findings
  • The current process is fairly typical
  • Graduate program reviews typically are conducted
    on a 6- to 10-year cycle
  • The current process follows Council of Graduate
    School guidelines (as outlined previously)
  • An external review component should be continued
  • Greater emphasis should be placed on student
    learning outcomes

95
Graduate Program Review at NC State
  • Task Force Key Findings continued
  • The revised process should be more continuous and
    ongoing
  • The review process should result in appropriate
    follow-up
  • Current resources do not allow review of all
    graduate programs on a 10-year cycle

96
What We Decided to Do
  • Continue the traditional external review program on an 8-year schedule
  • Continue to partner with external reviews already
    conducted for accreditation or other purposes
  • Emphasize development of program-specific student learning outcomes and assessment procedures to determine whether they are being achieved

97
What We Decided to Do
  • In addition to the External Program Review, we will require each program to:
  • Develop program-specific objectives and outcomes
  • Develop an assessment plan outlining the assessment activities they will conduct annually
  • Complete a biennial assessment report that is submitted online

98
What We Decided to Do
  • Provide the training necessary for programs to
    implement these changes
  • Development of objectives, outcomes, assessment
    plans
  • Partner with University Planning and Analysis and
    other campus units to improve utility of
    centralized data collection and processing
  • Assist in data collection for assessment plans at
    the institutional level

99
What We Decided to Do
  • Increase efforts relative to follow-up after the graduate program review to assess progress on recommendations
  • Tie the annual assessment and biennial reports to
    the external review by incorporating the changes
    made as a result of assessment into the
    self-study
  • Development of an Action Plan
  • Agreed upon by University, Graduate School,
    College and Department Administration

100
Revised Review Process
 
  • Initial Year 1 (Start-Up)
  • Development of objectives, outcomes, and assessment tools
  • Identification of data sources and beginning of data collection
  • Cycle Years 2, 4, and 6
  • Ongoing assessment and self-study by graduate faculty
  • Programmatic changes
  • Brief biennial assessment report
  • Cycle Years 3, 5, and 7
  • Continued data collection pertinent to outcomes and assessment measures
  • Cycle Year 8 (program review)
  • Self-study report
  • External review
  • Review report
  • Program response
  • Action plan

Compact Initiatives
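
For reference, the cycle above can be restated as a small lookup. This is an illustrative Python sketch (the activity labels come from the slide; the function itself is ours, not NC State's actual tooling):

    def activities(cycle_year):
        """Map a year in the 8-year review cycle to its activities."""
        if cycle_year == 1:
            return ["develop objectives, outcomes, and assessment tools",
                    "identify data sources; begin data collection"]
        if cycle_year in (2, 4, 6):
            return ["ongoing assessment and self-study by graduate faculty",
                    "programmatic changes",
                    "brief biennial assessment report"]
        if cycle_year in (3, 5, 7):
            return ["continued data collection for outcomes and measures"]
        if cycle_year == 8:
            return ["self-study report", "external review", "review report",
                    "program response", "action plan"]
        raise ValueError("cycle year must be 1 through 8")

    for year in range(1, 9):
        print(year, "->", "; ".join(activities(year)))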
101
Training Workshops Provided
  • Workshops provide the training necessary for
    programs to implement
  • Graduate Program Review: Where we are, where we are headed, and why
  • Assessing the Mission of Doctoral Research Universities
  • A workshop on outcomes-based assessment put on by outside experts
  • Creating Outcomes and Objectives
  • Creating an Assessment Plan
  • Utilizing the Graduate School Managerial Tools
  • Developing an Institutional Database for Assessment of Graduate Programs (to be developed)

102-105
Managerial Tools Created for Program Review - Website
106-108
Managerial Tools Created for Program Review - Profile Data
109-112
Managerial Tools Created for Program Review - Review Document Management
113
What We Have Learned/ Discussion Points
  • The process of change takes time
  • We have been at this for going on three years (since the start of the Task Force) and have not yet collected the first biennial report
  • Communication is the key to success
  • Clearly communicated goals and expectations are
    important
  • Flexibility: faculty in many programs on our campus prefer the term "faculty expectations" to "outcomes"; so be it

114
What We Have Learned/ Discussion Points continued
  • This kind of change has to be from the ground (faculty) up, not the top (administration) down
  • Even then, faculty are very skeptical about workload versus value
  • This type of review process requires significant human resources
  • Training, data collection, analysis, interpretation, etc.
  • A key to our success will be how much of this can be institutionalized