Title: Benchmarking Academic Programs: Methods and Examples
1. Benchmarking Academic Programs: Methods and Examples
Robert L. Armacost and Alicia L. Wilson
University Analysis and Planning Support, University of Central Florida
2004 AIR Annual Forum, June 2, 2004
Presentation available at http://uaps.ucf.edu
2. Overview of Presentation
- what is prominence?
- what is benchmarking?
- why benchmark academic programs?
- what to measure?
- where to get data?
- how to make the comparisons? (methods)
- what are the results?
- how to use them?
3. The University of Central Florida
From Promise to Prominence: Celebrating 40 Years
- established in 1963 in Orlando, Florida (first classes in 1968); a metropolitan research university
- grown from 1,948 to 41,700 students in 35 years
- 34,400 undergraduates and 7,300 graduate students
- 12 instructional sites in regional campus system
- doctoral intensive
- 84 bachelor's, 64 master's, 3 specialist, and 23 PhD programs
- second largest undergraduate enrollment in state
- projected largest undergraduate enrollment in 2005
- approximately 1,100 faculty and 3,100 staff
- eight colleges
- Arts and Sciences, Business Administration,
Education, Engineering and Computer Science,
Health and Public Affairs, Honors, Optics and
Photonics, and Hospitality Management
4. UCF Strategic Initiative
- increase prominence in graduate studies
- UCF will increase its emphasis on high-quality
graduate education, providing professional
education to meet the needs of the metropolitan
area while achieving international prominence in
engineering, optics, education, and the physical,
biological, social, environmental, and space
sciences, as well as other selected programs.
5. What is Prominence?
- prominent adj. 1. Projecting outward or upward from a line or surface. 2. Immediately noticeable; conspicuous. 3. Widely known; eminent. (American Heritage Dictionary, 1996)
- who determines who is prominent?
- what are the key measures used for this judgment?
- what actions are anticipated when rankings are known?
- new strategies
- improve marketing?
- improve accomplishments?
6. What is Benchmarking?
- benchmark n. 1. A standard by which something can be measured or judged. 2. Often bench mark. A surveyor's mark made on a stationary object of previously determined position and elevation and used as a reference point in tidal observations and surveys. --benchmark tr.v. To measure (a rival's product) according to specified standards in order to compare it with and improve one's own product. (American Heritage Dictionary, 1996)
7. What is Benchmarking?
- "the continuous process of measuring our products, services and business practices against the toughest competitors or those companies recognized as industry leaders" (Xerox Corp.)
- "a basis for establishing rational performance goals through the search for industry best practices that will lead to superior performance" (Camp, 1989)
8. What is Benchmarking?
- benchmarking involves
- first examining and understanding your own internal work procedures,
- then searching for "best practices" in other organizations that match those you identified, and finally,
- adapting those practices within your organization to improve performance. It is, at bottom, a systematic way of learning from others and changing what you do. (Epper, 1999)
- process for identifying gaps so that you can improve
- not about performance measurement or rankings
- although measures are used
9. Baldrige Education Award
- benchmarks
- refer to processes and results that represent the best practices and performances for similar organizations, inside or outside of the education community.
- engage in benchmarking to
- understand current dimensions of world-class performance
- achieve discontinuous (nonincremental) or breakthrough improvement
- comparative data
- benchmarks are one form
- third party data
- performance data for competitors and comparable educational organizations
- similar organizations in same geographical area
10. Benchmarking Menu (Spendolini, 1992)
11. Approaches to Benchmarking
- problem-based
- when a problem comes up, you focus a benchmarking effort on the problem
- process-based
- focuses on the vital (few) business processes
- survey support
- process analysis support
- assessment support
- accepted as the correct approach (Camp, 1995)
12. Types of Benchmarking
- competitive benchmarking
- benchmarking against competitors
- typically requires customer input
- requires identification of competitors
- functional benchmarking
- benchmark against best-in-class in the operation or process of interest
- requires identification of best-in-class
13. Types of Benchmarking (Continued)
- performance benchmarking
- process for identifying benchmarks and identifying stretch targets
- requires identification of key competitors or best-in-class
- strategic benchmarking
- process used for identifying world-class standards, determining gaps in competitiveness, developing strategies, and remaining focused and aware of developments
14. Another Classification
- external benchmarking
- focuses on identifying other institutions
- internal benchmarking
- focuses on similar processes inside the
institution
15. Approaches to Benchmarking
- problem-based or process-based
- types
- competitive
- functional
- performance
- strategic
- internal or external
16. Related, But Not Benchmarking
- comparative analysis
- requires identification of comparables for whatever objective one has in mind, but not generally for improvement purposes
- key performance indicators (KPI)
- accountability measures
- institutional characteristics
17. Benchmarking Experience at UCF
- common use implies comparison for _____, not necessarily for process improvement
- initial efforts to identify prominent graduate programs
- some comparative analysis as part of academic program reviews
- limited process studies (e.g., transfer credit evaluation, 1996)
18. What Do We Want Benchmarking to Mean for Academic Programs?
- process improvement
- process benchmarking: comparison against best-in-class for a specified process
- external: admissions process
- internal: departmental advising practices
- comparative analysis
- curiosity, potentially leading toward process improvement
- competitive benchmarking: how are we doing relative to our competitors? (e.g., Florida schools admitting National Merit Scholars)
- best-in-class benchmarking: how are we doing relative to a specified class of comparable institutions? (e.g., Metropolitan Research Universities)
- world-class benchmarking: how do we rank among the best universities?
19. Benchmarking Processes
- plan
- functions or processes to benchmark
- benchmark measures (key performance variables)
- who to benchmark (best-in-class, partner)
- collect data
- acquire data, observe
- analyze data
- identify actions to close gap
- adapt for improvement
- specify improvement programs and actions
- implement plans
- focus
- assessment: continuous improvement
- benchmarking: discontinuous improvement
20. Approaches
- lone ranger
- third party data
- inference
- partner
- win-win
- mutual exchange on best-in-class processes
- data exchanges and visits
- consortium
- participant
- observer
- requires significant effort
21. Practical Questions
- what do I measure (benchmark)?
- who do I compare to?
- what process should I use?
- where do I get data?
- answer: it depends on what you want to accomplish
22. Measures
- primary determinant of measures is the purpose of the comparison
- process improvement
- comparison
- rankings
- prominence
- number and type of measures will depend on program or process
- will typically have multiple measures
- best-in-class will generally not be dominant on all measures
- identification of best-in-class is difficult
23. Who Do I Compare To?
- identification of class
- peers: similar institutions
- differ by program
- differ by process
- requires insight and knowledge; no reference lists generally available
- comparables
- similar-sized operations in similar-sized institutions
- best-in-class
- strong reputation
24. What Process to Use?
- partner approach is good for non-competitors
- consortium is preferred approach for process
improvement benchmarking
25. Getting Benchmark Data
- published data
- reports
- websites
- information sharing
- establish relationship with benchmarks
26. UCF Graduate Programs of Prominence
- strategic initiative: programs of prominence
- who determines who is prominent?
- what are the key measures used for this judgment?
- what actions are anticipated when rankings are known?
- new strategies
- improve marketing?
- improve accomplishments?
- how to identify candidate programs?
27. Identifying Programs
- Deans' recommendations
- selected programs
- Computer Science (MS, PhD)
- Counselor Education (MS, PhD)
- Criminal Justice (MS)
- Environmental Engineering (MS, PhD)
- Applied Experimental and Human Factors Psychology (MS, PhD)
- K-8 Math/Science Education (MS)
- Nursing (MS)
- Optics (MS, PhD)
28. Process
- led by Vice Provost and Dean of Graduate Studies
- UAPS provided guidance and technical support
- initial meeting to define terms, set goals, and evoke commitment
- agreement on common measures
- bi-weekly progress meetings
- templates
- report
- data formats
- Dean sign-offs on approach, measures, results
- final report
- follow-up meetings to develop action plans
29. Candidate Benchmark Institutions
- types of institutions
- best in class
- top institutions
- unique institutions
- peer institutions
- direct competitors
- where to find them
- general knowledge within the discipline
- rankings in discipline-specific association journals
- US News discipline rankings
30. Benchmark Institutions
- Counselor Education
- U Minnesota
- Indiana U
- UNC Greensboro
- U of MD College Park
- U South Carolina
- Kent State
- Portland State
- Criminal Justice
- U Louisville
- Michigan State
- SUNY Albany
- Cal State LB
- Rutgers U
- Georgia State
- U Cincinnati
- UNC Charlotte
- Optics
- Stanford U
- K-8 Math/Science Ed
- UC Berkeley
- U Wisconsin Madison
- Ohio State
- Clemson
- Oregon State
- Hofstra
- George Mason
- San Diego State
- 5 others didn't respond
- Human Factors
- U Ill. Urbana Champaign
- George Mason U
- Georgia Tech
- U Cincinnati
- New Mexico State
- NC State
- Wright State
- Computer Science
- Environmental Engineering
- Stanford U
- UC Berkeley
- Georgia Tech
- UT Austin
- U Michigan
- U Ill. Urbana Champaign
- Cal Tech
- U Florida
- Virginia Tech
- NC State
- Nursing
- U Washington
- UNC Chapel Hill
- Ohio State
- U Kansas
- U Kentucky
- Arizona State
- U Florida
(Color key from the original slide: red = top tier, pink = second tier, green = peers, black = no delineation)
31. What Measures?
- both program characteristics and performance measures
- national studies
- National Research Council
- US News and World Report
- TheCenter Report
- discipline-specific studies (e.g., American Association of Colleges of Nursing)
- varies by program
- set of core measures for all programs
- discipline-specific measures
- looked at both raw data and ratios
32. Sample Benchmark Measures
- faculty characteristics
- # of faculty
- # of society fellows
- # of national awards
- # of publications
- # of faculty publishing
- # of faculty with research support
- amount of external and federal funding
- student characteristics
- # of students
- # of minority students
- # of international students
- GRE scores
- # of students supported (GTAs, GRAs)
- # of national fellowships (other fellowships)
- program characteristics
- # of degrees awarded
- amount of lab space
33. Where's the Data?
- if program is general enough
- national studies
- discipline-specific studies
- number of graduates from IPEDS
- if program is more narrowly defined (e.g., environmental engineering)
- more difficult to find data
- program data is grouped with other programs (e.g., civil engineering) or departments (e.g., optics)
- create consortia or partnerships
- request data directly from colleagues
34. How To Get the Data?
- templates
35. How To Get the Data?
- department representative contacted colleagues
- sent template (with program-specific questions) via email
- followed up with multiple phone calls
- took 2-3 months to gather data
- not all benchmark programs cooperated
- attempt to fill in missing pieces
- web of science for publications (labor intensive)
- search discipline-specific journals
- search university web pages
- follow-up with colleagues
36. How To Organize The Data?
37. Comparison Methods
- look at summary data to develop impressions of where programs ranked
- analytic approaches
- dominance ranking
- primary method used
- data envelopment analysis (DEA)
- hierarchical weight-and-rate approach
38. Dominance Ranking
- based on an approach used in TheCenter's annual report, The Top American Research Universities
- classification of universities into groups based upon quality indicators
- method relies on counts of the number of times that a university is included in the top 25 on a given measure or in the second group (26-50)
- number of counts of those occurrences in the first tier or second tier are used to group or rank the institutions
- see http://thecenter.ufl.edu/
39. Dominance Ranking
- UCF graduate programs are compared to 6-10 other graduate programs
- comparable approach is to rank order the programs for each measure and use that ranking to determine in which tier each program falls
- Tier 1: ranked number one or two
- Tier 2: ranked number three or four
- Tier 3: ranked below number four
- uses counts of those occurrences, ranking first by the number of counts in Tier 1, then by the counts in Tier 2, and then by the counts in Tier 3 (a small sketch of this tier-count ranking follows below)
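A minimal Python sketch of the tier-count dominance ranking just described, for illustration only; the program names, measures, and data values below are hypothetical, not the figures collected in the UCF study.

```python
# Tier-count dominance ranking: for each measure, rank the programs and
# assign Tier 1 (rank 1-2), Tier 2 (rank 3-4), or Tier 3 (rank > 4 or
# missing data), then order programs by their tier counts.
# All names and numbers are made up for illustration.

programs = {
    "UCF":    {"publications": 120, "external_funding": 4.2, "phd_degrees": 10},
    "Peer A": {"publications": 210, "external_funding": 6.8, "phd_degrees": 18},
    "Peer B": {"publications": 95,  "external_funding": 3.1, "phd_degrees": 12},
    "Peer C": {"publications": 160, "external_funding": None, "phd_degrees": 15},  # missing data
}
measures = ["publications", "external_funding", "phd_degrees"]

def tier_counts(programs, measures):
    counts = {p: [0, 0, 0] for p in programs}  # [Tier 1, Tier 2, Tier 3]
    for m in measures:
        # Programs with missing data are not ranked on that measure (treated as Tier 3).
        ranked = sorted((p for p in programs if programs[p][m] is not None),
                        key=lambda p: programs[p][m], reverse=True)
        for p in programs:
            if p not in ranked:
                counts[p][2] += 1
            elif ranked.index(p) < 2:
                counts[p][0] += 1
            elif ranked.index(p) < 4:
                counts[p][1] += 1
            else:
                counts[p][2] += 1
    return counts

counts = tier_counts(programs, measures)
# Rank first by Tier 1 counts, then by Tier 2 counts (Tier 3 is implied by the rest).
for p in sorted(counts, key=lambda p: (counts[p][0], counts[p][1]), reverse=True):
    print(p, counts[p])
```

Note how the hypothetical "Peer C" row illustrates the missing-data caveat discussed on the next slide: an unranked measure counts the same as a Tier 3 finish.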
40. Dominance Ranking
- data acquisition and limitations
- methodology depends on having consistent data
- serious limitation associated with missing data
- missing data results in a university not being ranked on those given measures (equivalent to being in Tier 3)
- for an otherwise highly ranked university, missing data will necessarily lower its rank
- assumes that each measure is equally important
- could conduct the analysis using only those measures for which complete data are available
- the overall ranking needs to be used with care
41. Example: Faculty Summary
42. Example: Student Summary
43. Example: Program Summary
44. Data Envelopment Analysis
- multi-dimensional approach
- analyzes inputs and outputs
- assesses overall effectiveness
- advantages
- assigns mathematically optimal weights
- simultaneous comparisons of performance measures
- calculates distance from best-practice organizations
- efficiency
- weighted sum of outputs (more is better) divided by weighted sum of inputs (less is better)
- find the best set of weights to maximize efficiency (best possible case); use Solver in Excel (linear program); a sketch of the same calculation follows below
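The slide points to Excel's Solver for this linear program; as a rough, illustrative equivalent, the sketch below sets up the same kind of weight-maximizing efficiency calculation (a standard CCR-style DEA multiplier model) in Python with scipy.optimize.linprog. The choice of input and output measures and all data values are assumptions for illustration.

```python
# Illustrative DEA efficiency calculation (CCR multiplier form) for each program.
# Inputs (less is better) and outputs (more is better) are hypothetical.
import numpy as np
from scipy.optimize import linprog

inputs  = np.array([[20, 1.5],    # e.g., faculty count, funding ($M) per program
                    [35, 3.0],
                    [25, 2.0]])
outputs = np.array([[60, 120],    # e.g., degrees awarded, publications per program
                    [90, 260],
                    [80, 150]])

def dea_efficiency(k, inputs, outputs):
    """Maximize weighted outputs of program k, with its weighted inputs fixed
    at 1 and no program allowed an efficiency ratio above 1."""
    n, m = inputs.shape
    s = outputs.shape[1]
    c = np.concatenate([-outputs[k], np.zeros(m)])      # maximize u . y_k
    A_eq = [np.concatenate([np.zeros(s), inputs[k]])]   # v . x_k = 1
    A_ub = np.hstack([outputs, -inputs])                 # u . y_j - v . x_j <= 0
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n), A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m), method="highs")
    return -res.fun                                      # efficiency in (0, 1]

for k in range(len(inputs)):
    print(f"program {k}: efficiency = {dea_efficiency(k, inputs, outputs):.3f}")
```

A score of 1.0 marks a best-practice (efficient) program; lower scores measure the distance from that frontier, echoing the "distance from best-practice organizations" bullet above.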
45. Sample Excel DEA Output
46. Weight and Rate Approach
- identify key benchmark measures
- create hierarchy to group similar dimensions
- develop weights for each measure to determine relative importance
- use any decision analysis method
- pairwise comparisons: Analytic Hierarchy Process (AHP)
- use value of measure or separate evaluation to rate institution on the measure
- compute overall score to rank institutions
47. Rating Hierarchy
- determine weight for relative importance of measures
- determine rating of individual measures
- compute overall score for institution (a numeric sketch follows below)
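A small numeric sketch of the weight-and-rate computation, again for illustration only; the pairwise comparison matrix, measure names, and 0-10 ratings are assumptions, not values from the UCF analysis.

```python
# Weight-and-rate sketch: derive measure weights from an AHP-style pairwise
# comparison matrix, then compute each institution's weighted overall score.
# All numbers are hypothetical.
import numpy as np

measures = ["faculty productivity", "external funding", "student quality"]

# pairwise[i, j] = judged importance of measure i relative to measure j
pairwise = np.array([[1.0, 3.0, 5.0],
                     [1/3, 1.0, 2.0],
                     [1/5, 1/2, 1.0]])

# Approximate AHP priorities via the normalized geometric mean of each row.
row_geo_means = pairwise.prod(axis=1) ** (1.0 / pairwise.shape[1])
weights = row_geo_means / row_geo_means.sum()

# Ratings of each institution on each measure (0-10 scale, illustrative).
ratings = {
    "UCF":    [6, 5, 7],
    "Peer A": [8, 9, 6],
    "Peer B": [5, 4, 8],
}

# Overall score = weighted sum of ratings; sort to rank the institutions.
scores = {name: float(np.dot(weights, r)) for name, r in ratings.items()}
for name, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {score:.2f}")
```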
48. Results
- programs used the results of the dominance rankings to identify which areas were competitive with other top institutions (e.g., faculty productivity) and which areas needed improvement (e.g., number of faculty)
- programs submitted a report which included
- programs benchmarked
- measures used
- results of the dominance ranking
- plan of action
- review held with Vice Provost of Graduate Studies
to further refine action plans
49. Action Plans
- identified what was needed to elevate the program to prominence
- examples
- # of additional faculty needed
- necessary increases in faculty productivity
- revise recruitment strategies
- market the program
- increase in lab space
- increase financial support for students
- identified areas where additional data were needed
- breakdown of master's vs. doctoral
- benchmark clinical costs
50. How Will the Results Be Used?
- assist programs to get the help they need
- marketing
- recruiting
- provides documentation and support for requesting additional funds and university specials
- strategic plan provides justification for additional funding and support for those programs identified as programs at or near national or international prominence
51. Resources
- Internal sources
- IR: http://pegasus.cc.ucf.edu/~irps/character/current.html
- OEAS: http://www.oeas.ucf.edu/benchmarking.htm
- UAPS: http://www.uaps.ucf.edu/benchmarking.html
- Websites to Universities
- IR Offices: http://airweb.org/links/offices.cfm
- Florida Colleges and Universities: http://iea.fau.edu/fair/flacol.htm
- Florida Colleges and Universities IR Offices: http://iea.fau.edu/fair/flair.htm
- Southern Association IR Offices: http://sair.org/Resources/Links.htm#SAIR%20IR%20Office%20Web%20Sites
- Coalition of Urban and Metropolitan Universities: http://cumu.uc.iupui.edu
52. Resources
- Websites to Industry Resources
- Higher Education Associations: http://iea.fau.edu/fair/edasoc.htm
- Higher Education Research Centers: http://iea.fau.edu/fair/edres.htm
- Institutional Research Internet Resources: http://www.airweb3.org/air-new/page.asp?page=21
- http://airweb.org/links/linkmap.html
- The Learning Alliance for Higher Education: http://www.thelearningalliance.info/index.php
- National Center for Education Statistics: http://www.nces.ed.gov/
- National Organizations: http://oeas.ucf.edu/related_links.htm
53. Resources
- Websites to Data
- Common Data Sets: http://airweb.org/links/cds.cfm
- Census Data: http://airweb.org/links/census.cfm
- Data Warehousing: http://airweb.org/links/datawarehouse.cfm
- Enrollment Statistics: http://airweb.org/links/enroll.cfm
- Environmental Scanning: http://airweb.org/links/scanning.cfm
- Peer Comparison Data: http://airweb.org/links/peers.cfm
- Performance Indicators: http://airweb.org/links/indicators.cfm
- Statistics/Research Methods: http://airweb.org/links/stats.cfm
- Southern Universities Common Data Sets: http://sair.org/Resources/Links.htm#SAIR%20School%20Common%20Datasets
- Resource Identification for Programs: http://oeas.ucf.edu/SourceID.html
54. Resources
- Websites to Data
- Resource Sheet identifying available data elements: http://oeas.ucf.edu/InstitutionResourceSheet.html
- Variables Available About Specific Programs: http://oeas.ucf.edu/VariablesAvailableAboutSpecificPrograms.html
- Population Characteristics: http://site.conway.com/ez/
- Student Characteristics: http://nces.ed.gov/ipeds/
- http://www.usnews.com/usnews/edu/college/coworks.htm
- Faculty Characteristics: http://nces.ed.gov/ipeds/
- Financial Characteristics: http://nces.ed.gov/ipeds/
- Research Characteristics: http://www.nsf.gov
- http://caspar.nsf.gov/cgi-bin/WebIC.exe?template=nsf/srs/webcasp/start.wi
- List of Published Rankings: http://oeas.ucf.edu/PublishedRankings.html
- List of Program Rankings: http://oeas.ucf.edu/ProgramRankings.html
55. Resources
- Websites to Benchmarking Literature
- American Productivity and Quality Center: http://www.apqc.org/best/
- Ebenchmarking: http://ebenchmarking.com/ (note: scroll down the page for a list of additional resources, including industry-specific ones)
- Consortium for Higher Education Benchmarking Analysis: http://www.cheba.com/
- National AAU Peer Benchmarking for Quality: http://www.ir.ufl.edu/compare/intro.htm
56. Questions ???
- Contacts
- Dr. Robert L. Armacost
- Director, University Analysis and Planning Support
- University of Central Florida
- 12424 Research Parkway, Suite 215
- Orlando, FL 32826-3207
- 407-882-0286
- armacost@mail.ucf.edu
- http://uaps.ucf.edu
- Ms. Alicia L. Wilson
- Assistant Director, University Analysis and Planning Support
- University of Central Florida
- 12424 Research Parkway, Suite 215
- Orlando, FL 32826-3207
- 407-882-0287
- awilson@mail.ucf.edu
- http://uaps.ucf.edu