Transcript and Presenter's Notes

Title: Methods for Benchmarking Academic Programs: A Graduate Program Example


1
Methods for Benchmarking Academic Programs: A Graduate Program Example
Robert L. Armacost
Alicia L. Wilson
University Analysis and Planning Support
University of Central Florida
Florida Association for Institutional Research 2003 Annual Conference
July 17, 2003
Presentation available at http://uaps.ucf.edu
2
Overview of Presentation
  • what is benchmarking?
  • why benchmark academic programs?
  • what to measure?
  • where to get data?
  • how to make the comparisons? (methods)
  • what are the results?
  • how do I use them?

3
The University of Central Florida
  • established in 1963 in Orlando, Florida (first classes in 1968); a Metropolitan Research University
  • grown from 1,948 to 39,000 students in 34 years
  • 32,500 undergraduate and 6,500 graduate students
  • doctoral intensive
  • 76 bachelor's, 62 master's, 3 specialist, and 20 PhD programs
  • second largest undergraduate enrollment in state
  • approximately 1,100 faculty and 2,800 staff
  • six colleges and two schools
  • Arts and Sciences, Business Administration,
    Education, Engineering and Computer Science,
    Health and Public Affairs, Honors, Optics, and
    Hospitality Management

4
Benchmarking Definitions
  • benchmark n. 1. A standard by which something can be measured or judged. 2. Often bench mark. A surveyor's mark made on a stationary object of previously determined position and elevation and used as a reference point in tidal observations and surveys. --benchmark tr.v. To measure (a rival's product) according to specified standards in order to compare it with and improve one's own product. (American Heritage Dictionary, 1996)

5
Benchmarking Definitions
  • the continuous process of measuring our products,
    services and business practices against the
    toughest competitors or those companies
    recognized as industry leaders (Xerox Corp.)
  • a basis for establishing rational performance
    goals through the search for industry best
    practices that will lead to superior performance
    (Camp, 1989)

6
What is Benchmarking?
  • benchmarking involves
  • first examining and understanding your own
    internal work procedures,
  • then searching for "best practices" in other
    organizations that match those you identified,
    and finally,
  • adapting those practices within your organization
    to improve performance. It is, at bottom, a
    systematic way of learning from others and
    changing what you do. (Epper, 1999)
  • process for identifying gaps so that you can
    improve
  • not about performance measurement or rankings
  • although measures are used

7
Baldrige Education Award
  • benchmarks
  • refer to processes and results that represent the
    best practices and performances for similar
    organizations, inside or outside of the education
    community.
  • engage in benchmarking to
  • understand current dimensions of world-class
    performance
  • achieve discontinuous (nonincremental) or
    breakthrough improvement
  • comparative data
  • benchmarks are one form
  • third party data
  • performance data for competitors and comparable
    educational organizations
  • similar organizations in same geographical area

8
Benchmarking Menu (Spendolini, 1992)
9
Approaches to Benchmarking
  • problem-based
  • when a problem comes up, you focus a benchmarking
    effort on the problem
  • process-based
  • focuses on the vital (few) business processes
  • survey support
  • process analysis support
  • assessment support
  • accepted as the correct approach (Camp, 1995)

10
Types of Benchmarking (Spendolini, 1992)
11
Types of Benchmarking
  • competitive benchmarking
  • benchmarking against competitors
  • typically requires customer input
  • requires identification of competitors
  • functional benchmarking
  • benchmark against best in class in the
    operation or process of interest
  • requires identification of best in class

12
Types of Benchmarking Continued
  • performance benchmarking
  • process for identifying benchmarks and
    identifying stretch targets
  • requires identification of key competitors or
    best in class
  • strategic benchmarking
  • process used for identifying world class
    standards, determining gaps in competitiveness,
    developing strategies, and remaining focused and
    aware of developments

13
Another Classification
  • external benchmarking
  • focuses on identifying other institutions
  • internal benchmarking
  • focuses on similar processes inside the
    institution

14
Related, But Not Benchmarking
  • comparative analysis
  • requires identification of comparables for
    whatever objective one has in mind, but not
    generally for improvement purposes
  • key performance indicators (KPI)
  • accountability measures
  • institutional characteristics

15
Benchmarking Processes
  • plan
  • functions or processes to benchmark
  • benchmark measures (key performance variables)
  • who to benchmark (best-in-class, partner)
  • collect data
  • acquire data, observe
  • analyze data
  • identify actions to close gap
  • adapt for improvement
  • specify improvement programs and actions
  • implement plans

16
Approaches
  • lone ranger
  • third party data
  • inference
  • partner
  • win-win
  • mutual exchange on best-in-class processes
  • data exchanges and visits
  • consortium
  • participant
  • observer
  • requires significant effort

17
Benchmarking Experience at UCF
  • common use implies comparison for _____, not
    necessarily for process improvement
  • initial efforts to identify prominent graduate
    programs
  • some comparative analysis as part of academic
    program reviews
  • limited process studies (e.g., transfer credit
    evaluation, 1996)

18
What Do We Want Benchmarking to Mean for Academic
Programs?
  • process improvement
  • process benchmarking: comparison against best-in-class for a specified process
  • external: admissions process
  • internal: departmental advising practices
  • comparative analysis
  • curiosity, potentially leading toward process improvement
  • competitive benchmarking: how are we doing relative to our competitors? (e.g., Florida schools admitting National Merit Scholars)
  • best-in-class benchmarking: how are we doing relative to a specified class of comparable institutions? (e.g., Metropolitan Research Universities)
  • world-class benchmarking: how do we rank among the best universities?

19
Practical Questions
  • what do I measure (benchmark)?
  • who do I compare to?
  • what process should I use?
  • where do I get data?
  • answer: it depends on what you want to accomplish

20
Measures
  • primary determinant of measures is the purpose of
    the comparison
  • process improvement
  • comparison
  • rankings
  • prominence
  • number and type of measures will depend on
    program or process
  • will typically have multiple measures
  • best-in-class will generally not be dominant on
    all measures
  • identification of best-in-class is difficult

21
Who Do I Compare To?
  • identification of class
  • peers: similar institutions
  • differ by program
  • differ by process
  • requires insight and knowledge; no reference lists generally available
  • comparables
  • similar-sized operations in similar-sized
    institutions
  • best-in-class
  • strong reputation

22
What Process to Use?
  • partner approach is good for non-competitors
  • consortium is preferred approach for process
    improvement benchmarking

23
Getting Benchmark Data
  • published data
  • reports
  • websites
  • information sharing
  • establish relationships with benchmark institutions

24
UCF Graduate Programs of Prominence
  • strategic initiative: programs of prominence
  • who determines who is prominent?
  • what are the key measures used for this judgment?
  • what actions are anticipated when rankings are
    known?
  • new strategies
  • improve marketing?
  • improve accomplishments?
  • how to identify candidate programs?

25
Identifying Programs
  • Deans' recommendations
  • selected programs
  • Computer Science (MS, PhD)
  • Counselor Education (MS, PhD)
  • Criminal Justice (MS)
  • Environmental Engineering (MS, PhD)
  • Applied Experimental and Human Factors Psychology
    (MS, PhD)
  • K-8 Math/Science Education (MS)
  • Nursing (MS)
  • Optics (MS, PhD)

26
Process
  • led by Vice Provost and Dean of Graduate Studies
  • UAPS provided guidance and technical support
  • initial meeting to define terms, set goals, evoke commitment
  • agreement on common measures
  • bi-weekly progress meetings
  • templates
  • report
  • data formats
  • Dean sign-offs on approach, measures, results
  • final report
  • follow-up meetings to develop action plans

27
What Measures?
  • both program characteristics and performance
    measures
  • national studies
  • National Research Council
  • US News and World Report
  • TheCenter Report
  • discipline-specific studies (e.g., American
    Association of Nursing)
  • varies by program
  • set of core measures for all programs
  • discipline specific measures
  • looked at both raw data and ratios

28
Sample Benchmark Measures
  • student characteristics
  • # of students
  • # of minority students
  • # of international students
  • GRE scores
  • # of students supported (GTAs, GRAs)
  • # of national fellowships (other fellowships)
  • program characteristics
  • # of degrees awarded
  • amount of lab space
  • faculty characteristics
  • # of faculty
  • # of society fellows
  • # of national awards
  • # of publications
  • # of faculty publishing
  • # of faculty with research support
  • amount of external and federal funding

29
Where's the Data?
  • if program is general enough
  • national studies
  • discipline specific studies
  • number of graduates from IPEDS
  • if program is more narrowly defined (e.g.,
    environmental engineering)
  • more difficult to find data
  • program data is grouped with other programs
    (e.g., civil engineering) or departments (e.g.,
    optics)
  • create consortiums or partnerships
  • request data directly from colleagues

30
How To Get the Data?
templates
31
How To Get the Data?
  • department representative contacted colleagues
  • sent template (with program specific questions)
    via email
  • followed up with multiple phone calls
  • took 2-3 months to gather data
  • not all benchmark programs cooperated
  • attempt to fill in missing pieces
  • Web of Science for publications (labor intensive)
  • search discipline-specific journals
  • search university web pages
  • follow-up with colleagues

32
How To Organize The Data?
33
Comparison Methods
  • looking at summary data to develop impressions of
    where programs ranked
  • dominance ranking
  • primary method used
  • data envelopment analysis (DEA)
  • hierarchical weight and rate approaches

34
Dominance Ranking
  • based on an approach used in TheCenter's annual
    report, The Top American Research Universities
  • classification of universities into groups based
    upon quality indicators
  • method relies on counts of the number of times
    that a university is included in the top 25 on a
    given measure or in the second group (26-50)
  • number of counts of those occurrences in the
    first tier or second tier are used to group or
    rank the institutions

35
Dominance Ranking
  • UCF graduate programs are compared to 6-10 other
    graduate programs
  • comparable approach is to rank order the programs
    for each measure and use that ranking to
    determine in which tier each program falls
  • Tier 1: ranked number one or two
  • Tier 2: ranked number three or four
  • Tier 3: ranked below number four
  • uses counts of those occurrences, ranking first by the number of counts in Tier 1, then by the counts in Tier 2, and then by the counts in Tier 3 (see the sketch below)
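
The tier-counting logic above can be expressed as a short script. Below is a minimal Python sketch, assuming scores are kept in a simple dictionary; the program names and measure values are illustrative placeholders, not UCF data, and missing values are pushed to Tier 3 as noted on the next slide.

```python
# Minimal sketch of the tier-based dominance ranking described above.
# Program names and measure values are illustrative, not actual data.

def dominance_rank(scores):
    """scores: {program: {measure: value or None}}; higher values are better.
    Returns programs ordered by Tier 1, then Tier 2, then Tier 3 counts."""
    measures = {m for vals in scores.values() for m in vals}
    tier_counts = {p: [0, 0, 0] for p in scores}

    for m in measures:
        # Rank programs on this measure; programs without data are not ranked.
        ranked = sorted((p for p in scores if scores[p].get(m) is not None),
                        key=lambda p: scores[p][m], reverse=True)
        for position, p in enumerate(ranked, start=1):
            if position <= 2:
                tier_counts[p][0] += 1      # Tier 1: ranked 1st or 2nd
            elif position <= 4:
                tier_counts[p][1] += 1      # Tier 2: ranked 3rd or 4th
            else:
                tier_counts[p][2] += 1      # Tier 3: ranked below 4th
        for p in set(scores) - set(ranked):
            tier_counts[p][2] += 1          # missing data treated as Tier 3

    # Order by Tier 1 counts, breaking ties with Tier 2, then Tier 3 counts.
    order = sorted(scores, key=lambda p: tier_counts[p], reverse=True)
    return order, tier_counts

if __name__ == "__main__":
    example = {
        "Program A": {"# of faculty": 30, "# of publications": 120, "external funding": 2.5},
        "Program B": {"# of faculty": 45, "# of publications": 90,  "external funding": None},
        "Program C": {"# of faculty": 25, "# of publications": 150, "external funding": 3.1},
    }
    order, counts = dominance_rank(example)
    for p in order:
        print(p, counts[p])
```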

36
Dominance Ranking
  • data acquisition and limitations
  • methodology depends on having consistent data
  • serious limitation associated with missing data
  • missing data results in a university not being
    ranked on those given measures (equivalent to
    being in Tier 3)
  • for an otherwise highly ranked university,
    missing data will necessarily lower its rank
  • assumes that each measure is equally important
  • could conduct the analysis using only those
    measures for which complete data are available
  • the overall ranking needs to be used with care

37
Example Faculty Summary
38
Example Student Summary
39
Example Program Summary
40
Data Envelopment Analysis
  • multi-dimensional approach
  • analyzes inputs and outputs
  • assesses overall effectiveness
  • advantages
  • assigns mathematically optimal weights
  • simultaneous comparisons of performance measures
  • calculates distance from best-practice
    organizations
  • efficiency
  • weighted sum of outputs (more is better) divided
    by weighted sum of inputs (less is better)
  • find best set of weights to maximize efficiency (best possible case); use Solver in Excel (linear program); see the sketch below
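
As a rough illustration of the ratio above, here is a minimal sketch of the standard CCR multiplier form of DEA, solved with scipy's linprog rather than the Excel Solver used in the deck; the inputs, outputs, and program names are illustrative assumptions, not UCF data.

```python
# Minimal DEA (CCR, input-oriented multiplier form) sketch.
# Maximizes weighted outputs / weighted inputs for one unit, subject to
# no unit's efficiency exceeding 1, via the usual linear-program form.
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(inputs, outputs, k):
    """Efficiency of unit k given input matrix (n_units x n_inputs) and
    output matrix (n_units x n_outputs); more output and less input is better."""
    X = np.asarray(inputs, dtype=float)
    Y = np.asarray(outputs, dtype=float)
    n, m_in = X.shape
    m_out = Y.shape[1]

    # Decision variables: output weights u (m_out) then input weights v (m_in).
    c = np.concatenate([-Y[k], np.zeros(m_in)])            # maximize u . y_k
    A_eq = np.concatenate([np.zeros(m_out), X[k]]).reshape(1, -1)
    b_eq = [1.0]                                            # v . x_k = 1
    A_ub = np.hstack([Y, -X])                               # u . y_j - v . x_j <= 0
    b_ub = np.zeros(n)

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (m_out + m_in), method="highs")
    return -res.fun  # efficiency score in (0, 1]

if __name__ == "__main__":
    # Illustrative data: input = # of faculty; outputs = # of publications,
    # external funding ($M).
    inputs  = [[30], [45], [25]]
    outputs = [[120, 2.5], [90, 1.8], [150, 3.1]]
    for k, name in enumerate(["Program A", "Program B", "Program C"]):
        print(name, round(dea_efficiency(inputs, outputs, k), 3))
```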

41
Sample Excel DEA Output
42
Weight and Rate Approach
  • identify key benchmark measures
  • create hierarchy to group similar dimensions
  • develop weights for each measure to determine
    relative importance
  • use any decision analysis method
  • pairwise comparisons--Analytic Hierarchy Process
    (AHP)
  • use value of measure or separate evaluation to
    rate institution on the measure
  • compute overall score to rank institutions (see the sketch below)
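
Below is a minimal Python sketch of this weight-and-rate flow, using the row geometric-mean approximation to AHP weights (in place of the full eigenvector calculation) and a simple weighted sum of ratings; the measures, pairwise judgments, and ratings are illustrative assumptions only.

```python
# Minimal weight-and-rate sketch: AHP-style weights from a pairwise
# comparison matrix, then a weighted sum of ratings per institution.
import numpy as np

def ahp_weights(pairwise):
    """Approximate AHP priority weights via the row geometric-mean method."""
    A = np.asarray(pairwise, dtype=float)
    g = A.prod(axis=1) ** (1.0 / A.shape[1])
    return g / g.sum()

def overall_scores(weights, ratings):
    """ratings: {institution: [rating on each measure, 0-1 scale]}."""
    return {inst: float(np.dot(weights, r)) for inst, r in ratings.items()}

if __name__ == "__main__":
    # Pairwise judgments for three illustrative measures (faculty
    # productivity, student quality, external funding): productivity judged
    # 3x as important as student quality, 2x as important as funding, etc.
    pairwise = [[1,   3,   2],
                [1/3, 1,   1/2],
                [1/2, 2,   1]]
    w = ahp_weights(pairwise)

    ratings = {"Program A": [0.8, 0.6, 0.7],
               "Program B": [0.6, 0.9, 0.5],
               "Program C": [0.9, 0.7, 0.9]}
    for inst, score in sorted(overall_scores(w, ratings).items(),
                              key=lambda kv: kv[1], reverse=True):
        print(inst, round(score, 3))
```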

43
Rating Hierarchy
  • determine weights for relative importance of measures
  • determine ratings of individual measures
  • compute overall score for institution
44
Results
  • programs used the results of the dominance rankings to identify which areas were competitive with other top institutions (e.g., faculty productivity) and which areas needed improvement (e.g., number of faculty)
  • programs submitted a report which included
  • programs benchmarked
  • measures used
  • results of the dominance ranking
  • plan of action
  • review held with Vice Provost of Graduate Studies
    to further refine action plans

45
Action Plans
  • identified what was needed to elevate the program
    to prominence
  • examples
  • # of additional faculty needed
  • necessary increases in faculty productivity
  • revise recruitment strategies
  • market the program
  • increase in lab space
  • increase financial support for students
  • identified areas where additional data was needed
  • breakdown of master's vs. doctoral
  • benchmark clinical costs

46
How Will the Results Be Used?
  • assist programs to get the help they need
  • marketing
  • recruiting
  • provides documentation and support for requesting
    additional funds and university specials
  • strategic plan provides justification for
    additional funding and support for those programs
    identified as programs at or near national or
    international prominence
  • results will be presented to the Provost in the
    near future

47
Resources
  • internal sources
  • IR http://pegasus.cc.ucf.edu/%7Eirps/character/current.html
  • OEAS http://oeas.ucf.edu/BenchmarkingComparativeResources
  • UAPS http://pegasus.cc.ucf.edu/uaps/Benchmarking.html
  • websites to universities
  • IR Offices http://airweb2.org/links/offices.cfm
  • Florida Colleges and Universities http://iea.fau.edu/fair/flacol.htm
  • Florida Colleges and Universities IR Offices http://iea.fau.edu/fair/flair.htm
  • Southern Association IR Offices http://sair.org/Resources/Links.htm#SAIR%20IR%20Office%20Web%20Sites
  • Coalition of Urban Metropolitan Universities http://uc.iupui.edu/cumu/

48
Resources
  • websites to industry resources
  • Higher Education Associations http://iea.fau.edu/fair/edasoc.htm
  • Higher Education Research Centers http://iea.fau.edu/fair/edres.htm
  • Higher Education Research Meta Index http://www.irp.panam.edu/more_html/utpa_erlist.html
  • Institutional Research Internet Resources http://www.airweb3.org/air-new/page.asp?page=21
  • http://airweb2.org/links/linkmap.html
  • Knight Collaborative Project at the University of Pennsylvania (CRI) http://www.irhe.upenn.edu/research/crihome.html
  • National Center for Educational Statistics http://www.nces.ed.gov/
  • National Organizations http://oeas.ucf.edu/RelatedLinks.html

49
Resources
  • websites to data
  • Common Data Sets http://airweb2.org/links/cds.cfm
  • Census Data http://airweb2.org/links/census.cfm
  • Data Warehousing http://airweb2.org/links/datawarehouse.cfm
  • Enrollment Statistics http://airweb2.org/links/enroll.cfm
  • Environmental Scanning http://airweb2.org/links/scanning.cfm
  • Peer Comparison Data http://airweb2.org/links/peers.cfm
  • Performance Indicators http://airweb2.org/links/indicators.cfm
  • Statistics/Research Methods http://airweb2.org/links/stats.cfm
  • Southern Universities-Common Data Sets http://sair.org/Resources/Links.htm#SAIR%20School%20Common%20Datasets
  • Resource Identification for Programs http://oeas.ucf.edu/SourceID.html

50
Resources
  • websites to data
  • Resource Sheet Identifying Available Data Elements http://oeas.ucf.edu/InstitutionResourceSheet.html
  • Variables Available About Specific Programs http://oeas.ucf.edu/VariablesAvailableAboutSpecificPrograms.html
  • Population Characteristics http://site.conway.com/ez/
  • Student Characteristics http://nces.ed.gov/ipeds/
  • http://www.usnews.com/usnews/edu/college/coworks.htm
  • Faculty Characteristics http://nces.ed.gov/ipeds/
  • Financial Characteristics http://nces.ed.gov/ipeds/
  • http://www.nsf.gov
  • List of Published Rankings http://oeas.ucf.edu/PublishedRankings.html
  • List of Program Rankings http://oeas.ucf.edu/ProgramRankings.html

51
Resources
  • websites to data
  • Benchmarking Literature
  • American Productivity and Quality Center http://www.apqc.org/best/
  • Free Resources http://www.apqc.org/search/dispRelatedItems.cfm?ProductType=Free&ProductID=1257
  • Ebenchmarking http://ebenchmarking.com/ (note: scroll down page for list of additional resources, including industry specific)
  • Consortium for Higher Education Benchmarking Analysis http://www.cheba.com/
  • National AAU Peer Benchmarking for Quality http://www.ir.ufl.edu/compare/intro.htm

52
Questions
???
  • Contacts
  • Dr. Robert L. Armacost
  • Director, University Analysis and Planning Support
  • University of Central Florida
  • 12424 Research Parkway, Suite 488
  • Orlando, FL 32826-3207
  • 407-882-0286
  • armacost@mail.ucf.edu
  • http://uaps.ucf.edu
  • Ms. Alicia L. Wilson
  • Coordinator, Computer Applications
  • University of Central Florida
  • 12424 Research Parkway, Suite 488
  • Orlando, FL 32826-3207
  • 407-882-0287
  • awilson@mail.ucf.edu
  • http://oeas.ucf.edu