Title: Integrating Performance Measures into University Endeavor
1. Integrating Performance Measures into University Endeavor
- Victor M. H. Borden, Ph.D.
- Associate Vice President
- University Planning, Institutional Research, and Accountability (IU)
- Associate Professor of Psychology (IUPUI)
2. Becoming an Evidence-Driven Learning Organization
Or
- Victor M. H. Borden, Ph.D.
- Associate Vice President
- University Planning, Institutional Research, and Accountability (IU)
- Associate Professor of Psychology (IUPUI)
3. How I Learned to Stop Worrying and Love Performance Measures
Or
- Victor M. H. Borden, Ph.D.
- Associate Vice President
- University Planning, Institutional Research, and Accountability (IU)
- Associate Professor of Psychology (IUPUI)
4. If this were a simple matter, you would have figured it out long ago and I wouldn't be here. Do not expect my explanations to be simple nor my advice to be straightforward. This will be more like a graduate-level seminar than an introductory course.
5. The Institutional Research Credo
- I realize that I will not succeed in answering all of your questions. Indeed, I will not answer any of them completely. The answers I provide will only serve to raise a whole new set of questions that lead to more problems, some of which you weren't aware of in the first place. When my work is complete, you will be as confused as ever, but hopefully, you will be confused on a higher level and about more important things.
6. Why Not Data-Driven?
- Data, per se, are not what we need
7. (No Transcript)
8. (No Transcript)
9. If Not Data-Driven, Then What?
- Evidence-based practice to decide
  - What to do
  - How best to do it
  - If it is working as desired
- So that we can learn from what we do and improve
- We want to be part of a Learning Organization
10. Learning Organizations
- "...organizations where people continually expand their capacity to create the results they truly desire, where new and expansive patterns of thinking are nurtured, where collective aspiration is set free, and where people are continually learning to see the whole together." (Senge, 1990)
11. Learning Organizations
- "...are characterized by total employee involvement in a process of collaboratively conducted, collectively accountable change directed towards shared values or principles." (Watkins and Marsick, 1992)
12. Overview
- Lessons I've learned (the hard way) about developing university performance measures
- Performance measures as the tip of the evidence-based iceberg
- Going below the surface
- Applying an organizational learning lens
- Some implications and related thoughts
13. Lessons Learned
- Early lessons on measurement theory
- 1994 NDIR volume
- Measuring Institutional Performance Outcomes (APQC-MIPO)
- Developing campus PIs to link planning, budgeting, evaluation, and improvement
- Taking it to the next level
14. Measurement Theory
- Inductive-deductive cycle
15. Measurement Theory
- Validity
  - Warranted assertion (Dewey)
  - Degree to which the measure accurately represents the concept (what you are attempting to measure)
    - Size of a person (weight, height, circumference, body mass)
    - Quality of instruction (course ratings, peer review, student learning)?
- Reliability
  - Degree to which the measure consistently represents the concept
  - Course ratings taken mid-term vs. end-of-term (illustrated in the sketch below)
- "Unless very careful attention is paid to one's theoretical assumptions and conceptual apparatus, no array of statistical techniques will suffice." (Blalock, 1982)
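To make the reliability point concrete, here is a minimal sketch (hypothetical data, my illustration rather than anything from the talk) that treats mid-term and end-of-term ratings of the same course sections as a test-retest pair and estimates reliability as their correlation:

```python
# Illustrative sketch only: ratings are invented section means.
# Test-retest reliability estimated as the Pearson correlation between
# mid-term and end-of-term course ratings for the same sections.
from statistics import correlation  # Python 3.10+

midterm_ratings = [4.1, 3.6, 4.4, 2.9, 3.8, 4.0, 3.2]
endterm_ratings = [4.0, 3.4, 4.5, 3.1, 3.7, 4.2, 3.0]

r = correlation(midterm_ratings, endterm_ratings)
print(f"Test-retest reliability estimate: r = {r:.2f}")  # high r suggests stable measurement
```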
16. 1994 NDIR Volume
- Using Performance Indicators to Guide Strategic Decision Making (Borden and Banta, Eds.)
17. Lessons
- Borden and Bottrill: Where you stand on PIs depends on where you sit
- Ewell and Jones: Think before you count
- Jongbloed and Westerheijden (Europe): PIs out, quality assurance in
- Dooris and Teeter (TQM): PIs are fine, if P stands for Process
- Dolence and Norris: KPIs are the fuel of a strategic decision engine
- DeHayes and Lovrinic (ABC): Show me the money, and what you use it for
18. Lessons (continued)
- Banta and Borden: Criteria for effective PIs
  - Start with purpose
  - Align throughout organization
  - Align across input, process, output
  - Coordinate a variety of methods
  - Use in decision making
19. Measuring Institutional Performance Outcomes
- An American Productivity and Quality Center (APQC) benchmarking study
20. APQC MIPO Findings
- The best institutional performance measures communicate the institution's core values
- Good institutional performance measures are carefully chosen, reviewed frequently, and point to action to be taken on results
- External requirements and pressures can be extremely useful as starting points for developing institutional performance measurement systems
- Performance measures are best used as problem detectors to identify areas for management attention and further exploration
- Clear linkages between performance measures and resource allocation are critical, but the best linkages are indirect (and non-punitive)
21. MIPO Findings (continued)
- Performance measures must be publicly available, visible, and consistent across the organization
- Performance measures are best considered in the context of a wider transformation of organizational culture
- Organizational cultures supportive of performance measures take time to develop, require considerable socialization of the organization's members, and are enhanced by stable leadership
- Performance measures change the role of managers and the ways in which they manage
22. MIPO: Boiling It Down
- You cannot lead with performance measures
- Performance measures emerge from a broader culture of evidence; that is, they are part of something bigger
23. E.g.: PIs @ IUPUI
www.iport.iupui.edu
24. (No Transcript)
25. (No Transcript)
26. (No Transcript)
27. Taking It to the Next Level: Accountability at Indiana University
- Articulating and attaining strategic goals and objectives
28. Audiences
- Board of Trustees
  - Most comprehensive, University-wide view
- Campus accreditors and (prospective) partners
  - Campus-specific objectives and indicators
- Targeted packaging for
  - Media; legislators; alumni; current and prospective students and their parents; research agencies and collaborators
29. Purposes
- Position IU strategically
- Improve the effectiveness and quality of programs and services
- Provide a common framework to align efforts across campuses
- Communicate a clear and consistent message about IU's broad goals
- Enhance IU's image
- Define and document IU's contributions to the state, students, and communities
- Demonstrate integrity in accounting for the use of public and private resources
30. Principles
- Mission-centered
- Research-driven
- Transparency
- Inclusive dimensions of excellence and quality
- Empowerment and responsibility
- Influenced by best practices
  - National Commission on Accountability in Higher Education
31. Framework
- University-wide strategic goals and core performance indicators
- Campus performance objectives and indicators derived from mission, aligned to university goals and core indicators
- Explicit link to administrative area goals and objectives
- Annual performance reports and reviews
  - University and campuses
32. University Goals and Core Indicator Categories

Advance University Distinction and Distinctiveness
- Rankings and recognitions
- Focused areas of distinction
- Centers of Excellence
- Overall campus quality

Enhance Academic Program Quality
- Quality of faculty
- Program accreditation and review
- Teaching and learning development
- Information/technology resources
- Physical resources
- Program demand and delivery

Improve Student Achievement and Success
- Preparation and support
- Access and affordability
- Student engagement
- Progress
- Outcomes

Expand the Scope and Impact of Research and Creative Activities
- Funding
- Research collaborations
- Faculty participation/productivity
- Space and equipment
- Academic impact
- Practical impact

Advancing Indiana
- Economic development and impact
- Cultural development and impact
- Educational development
- Indiana professional practice: preparation and service
- Civic engagement

Increase Operational Efficiency and Effectiveness
- Finances and budgeting
- Enrollment
- Leadership development
- Administrative overhead
- Quality of administrative services to faculty/staff/students
33. Limitations of Measures/Metrics
- Inherently imperfect
- Overly simplistic
- "Not everything that counts can be counted, and not everything that can be counted counts." (Albert Einstein)
34. Accommodating the Limitations
- "An imprecise answer to the right question is much better than a precise answer to the wrong question." (paraphrasing John Tukey)
- Triangulation
  - Using multiple, convergent measures to better reflect the underlying concept (see the sketch below)
- Performance measures as the tip of the evidence-based iceberg
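One hedged way to operationalize triangulation is to standardize several convergent measures and average them, so that no single imperfect metric dominates; the measure names and values below are invented for illustration.

```python
# Illustrative sketch: triangulating "quality of instruction" from three
# imperfect measures by averaging z-scores. All names and data are hypothetical.
from statistics import mean, stdev

programs = ["A", "B", "C", "D"]
measures = {
    "course_ratings": [4.2, 3.5, 3.9, 4.4],
    "peer_review":    [3.8, 3.2, 4.1, 4.0],
    "learning_gains": [0.40, 0.25, 0.35, 0.45],
}

def zscores(xs):
    m, s = mean(xs), stdev(xs)
    return [(x - m) / s for x in xs]

# Standardize each measure, then average across measures for each program.
standardized = [zscores(vals) for vals in measures.values()]
for prog, scores in zip(programs, zip(*standardized)):
    print(f"Program {prog}: composite z = {mean(scores):+.2f}")
```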
35. Performance Measures as the Tip of the Evidence-Based Practice Iceberg
[Figure: an iceberg with performance measures at the tip and evidence-based practice below the surface, built on a Plan → Implement → Assess → Improve cycle, with vertical (hierarchical) alignment and horizontal (cross-unit) alignment]
36. Evidence-Based Practice
- Commonly used in the clinical domain
- Validity derived from rigorous research conducted by others and believed to generalize to other settings
- For university endeavor, there are limits to generalizability across settings
- Focus shifts to more continuous use of process-generated data, using less rigorous methods to monitor, reflect, and adjust
37. Methods of Evidence-Based Practice
- The many faces of evidence-based practice
  - Student learning outcomes assessment
  - Program evaluation
  - Program review
  - Quality improvement
  - Balanced scorecard
  - Benchmarking
- The role of collaborative inquiry
38. The Evaluation Cycle
Adapted from Norman Jackson
1. THINK ABOUT ISSUES
2. ENGAGE WITH THE PROBLEM
3. DEVELOP RESOURCES/STRATEGIES TO IMPROVE
4. IMPLEMENT INTERVENTIONS (experiment)
5. EVALUATE IMPACT (did it work as I intended? how did people respond? what were the results?)
6. PLAN TO IMPROVE
39. The Assessment Matrix
40. (No Transcript)
41. The Support Unit Matrix
42. Quality Improvement Models
- Advantages
  - Focus on process provides best chances for identifying points of improvement
  - Collaborative teams empower staff and help improve communication across units
  - Formulaic method and external staff support help guide and keep on track
- Sample methods
  - Penn State's Fast Track
  - University of Wisconsin's Accelerated Improvement
43. PSU Fast Track
44. UWisc Accelerated Improvement
http://www.wisc.edu/improve/improvement/accel.html
- Define: goals and measures of success; document process; understand customer needs; check/refine goals
- Design: develop potential solutions; analyze solutions/options; finalize solution; develop implementation plan
- Implement: inform affected people; conduct training, if needed; execute action plans with timeline
- Follow-up: collect data to track improvement; review and refine process changes; issue final report with results
45. Program Review
- Program self-study, site visit by peers
- Common method for academic programs
- Increasing use for administrative programs
- Fits well with accreditation framework
- Guidelines shape tone and tenor
  - Content standards
  - Review team composition
- Flexibility accommodates range of inquiry orientations
46. Limits of Program Review
- Expensive and time-consuming
- Can be done with little participation
  - Or with a lot
- Results not always directly useful for change
  - Memorandum of understanding helpful
- Episodic nature not responsive to changing environment
47. Balanced Scorecard (BSC)
- Kaplan and Norton propose a business model with four perspectives
  - Financial performance
  - Customer service and satisfaction
  - Process effectiveness and efficiency
  - Organizational learning
48. BSC in Higher Education
- Ruben (1999) (sketched as a simple data structure below)
  - Teaching/Learning
    - Programs/courses, student outcomes
  - Service/Outreach
    - University, profession, alumni, state, prospective students, families, employers
  - Scholarship/Research
    - Productivity/impact
  - Workplace satisfaction
    - Faculty/staff
  - Financial
    - Revenues/expenditures
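To make the scorecard's shape concrete, here is a toy sketch of how such perspectives and indicators might be organized as data; the indicator names are placeholders I supplied, not Ruben's official list.

```python
# Toy sketch: a Ruben-style higher-ed scorecard as a simple data structure.
# Indicator names are illustrative placeholders, not from Ruben (1999).
scorecard = {
    "Teaching/Learning":      ["program demand", "student outcomes"],
    "Service/Outreach":       ["alumni engagement", "employer partnerships"],
    "Scholarship/Research":   ["productivity", "impact"],
    "Workplace satisfaction": ["faculty survey", "staff survey"],
    "Financial":              ["revenues", "expenditures"],
}

for perspective, indicators in scorecard.items():
    print(f"{perspective}: {', '.join(indicators)}")
```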
49. (No Transcript)
50. (No Transcript)
51. Benchmarking
- Best practices in organizations sharing similar internal work procedures
- HE focus often on peer or aspirational institutions (see the percentile sketch below)
- NACUBO study searched for measures
- APQC introduces qualitative benchmarking to higher education
  - Measuring institutional performance outcomes
  - Electronically supported student services
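A minimal sketch of the quantitative side of peer benchmarking, with entirely hypothetical peer values: locate one institution's value within its comparison group.

```python
# Illustrative sketch: percentile rank of one institution's graduation rate
# within a hypothetical peer group (all values are invented).
peer_rates = [52.0, 61.5, 58.0, 66.2, 49.8, 63.0, 70.1, 55.4]
our_rate = 60.0

below = sum(1 for rate in peer_rates if rate < our_rate)
percentile = 100 * below / len(peer_rates)
print(f"Our rate ({our_rate}%) exceeds {percentile:.0f}% of the peer group")
```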
52. (No Transcript)
53. (No Transcript)
54. More Complex Models
- The Evaluation Center
  - Stufflebeam, Western Michigan University
  - http://www.wmich.edu/evalctr/checklists/
- CIPP Model
- Constructivist Evaluation
- Deliberative Democratic Evaluation
- Key Evaluation Checklist
- Qualitative Evaluation
- Utilization-Focused Evaluation
55. Limits of Complex Models
- Too complex and expensive to be practical
- They require an "evaluation unit as a staff operation at a high level of the organization in order to help insulate the unit from inappropriate internal influences and enhance its influence on decision making."
Daniel J. Stufflebeam, http://www.wmich.edu/evalctr/checklists/institutionalizingeval.htm
56. Collaborative Action Inquiry
- Continuous cycle of data collection → data analysis → data feedback → action plans → data collection (rendered schematically in the sketch below)
- Stakeholder empowerment through active and ongoing participation
- Data feedback meetings promote collaboration, dialogue, and collective analysis
- Active learning and discovery fostered by critical reflection process
- Data-driven action plans developed; research linked to action
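The cycle above can be read as a loop. The sketch below is only a schematic rendering; each placeholder function stands in for what is, in practice, a collaborative human activity.

```python
# Schematic only: the collaborative action inquiry cycle, with placeholder
# functions standing in for genuinely collaborative organizational steps.
def collect_data(plans):        return f"evidence for [{plans}]"
def analyze(data):              return f"findings from [{data}]"
def feedback_meeting(findings): return f"shared interpretation of [{findings}]"
def plan_actions(interp):       return f"next steps given [{interp}]"

plans = "initial questions"
for cycle in range(3):  # in practice the cycle is continuous, not three rounds
    plans = plan_actions(feedback_meeting(analyze(collect_data(plans))))
    print(f"Cycle {cycle + 1} complete; plans feed the next data collection")
```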
57. Linking Research and Action
- Who...
  - does what?
  - decides what actions are taken?
  - is responsible for effective implementation?
  - can devise appropriate evaluation protocols?
  - has access to, or can collect, appropriate evidence?
  - reviews the results and decides what to do?
- What can be done to get these people to work together and in concert?
58. A Learning Paradigm
- Typical data-driven focus supposes a rational world
- Learning incorporates uncertainty, ambiguity, and multiple styles
- Individual learning and organizational learning are compatible concepts
- Evidence-based practice is compatible with a learning approach
59. Single- and Double-Loop Learning
- Argyris and Schön
- Learning is the detection and correction of error (unintended consequences)
- Governing variables are those things we feel are important to keep within limits
- Action strategy is what we do or plan to do to keep the governing variables within limits
- Consequences are the intended and unintended outputs and outcomes
  - Intended: confirm our theory-in-use
  - Unintended: suggest error in our theory-in-use
60. Single-Loop Learning
- Governing variables not called into question
- Adjustments made to action strategies at best
- Defense mechanisms can readily arise to maintain single-loop learning
[Diagram: Governing Variables → Action Strategies → Consequences, with feedback only to Action Strategies]
61. Double-Loop Learning
- Questioning the role of the framing and learning systems which underlie actual goals and strategies (a thermostat analogy is sketched below)
- Reflection is fundamental
  - Basic assumptions are confronted
  - Hypotheses publicly tested
  - Falsification is sought
  - Ego is laid aside
[Diagram: Governing Variables → Action Strategies → Consequences, with feedback to both Action Strategies and Governing Variables]
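By way of analogy (my illustration, not Argyris and Schön's), a thermostat is single-loop when it only switches the heater to hold a fixed set point, and double-loop when persistent unintended consequences trigger questioning of the set point itself:

```python
# Analogy sketch (not from the talk): single- vs. double-loop learning as a
# thermostat. The set point is the governing variable; switching the heater
# is the action strategy; temperature readings are the consequences.
set_point = 22.0                      # governing variable
readings = [21.0, 26.0, 30.0, 31.0]   # observed consequences

for temp in readings:
    heater_on = temp < set_point      # single loop: adjust the action strategy
    print(f"temp={temp}: heater {'on' if heater_on else 'off'} (set point {set_point})")
    if abs(temp - set_point) > 8:     # persistent unintended outcome
        set_point = 26.0              # double loop: revise the governing variable itself
        print(f"  double loop: set point called into question, revised to {set_point}")
```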
62. Model I and II Organizational Learning
- Single- and double-loop learning at the organizational level
- Model I: Organizational members subscribe to a common theory-in-use
  - Organizational policies and practices inhibit change
- Model II: Governing values, policies, and practices promote double-loop learning
63. John Seely Brown and Paul Duguid
- The Social Life of Information (2000), Harvard Business School Press
- "Organizational Learning and Communities-of-Practice: Toward a Unified View of Working, Learning, and Innovation." (1991) Organization Science, 2(1), 40-57.
64. Learning To Be / Know How
- Based on collaborative practice
  - Communities of practice
- Knowledge as inseparable from the knower
- Evidence from a variety of sources, including practitioner experience
- Sharing interpretations as process
- Common priorities and strategies as output
65. Learning is Good
- We promote (lifelong) learning for students
- We seek to contribute to the creation of knowledge within our disciplines and professions
- What about in our practice as...
  - Classroom teachers
  - Conferrers of degree credentials
  - Managers and administrators
  - Support staff
66. The Learning/Performance Measure Conundrum
- If our general objective is to collectively learn how to do our work better, then we must accept that our current thinking, practices, structures, etc., need to change
- Our current best thinking about what measures reflect progress toward desired changes may itself change through the learning process
- We should not be rigid about our performance measures, but rather allow our evidence-based collaborative learning efforts to guide their evolution
67. Implications for Faculty/Staff/Organizational Development
- There are many viable ways to integrate inquiry into organizational practices
- Administrative support focus may need to shift from information provision toward collaborative inquiry
- Someone needs to focus on how this all fits together
  - The institutional portfolio provides one such mechanism
68. Implications for Information Use
- Data sources
- Types of needs
- Types of users
- Sources of information
- Tools for user needs
69. Data Sources
- Sources of evidence
  - Documented
  - Provider/practitioner experience
  - User/client experience
  - Contextual
- Derived from the institution's operational information systems
  - Student, human resources, finance
  - Space, program inventory, courseware
- Surveys
  - Students, faculty, staff, prospects, community
- External data sources
  - Federal and state (K-16) education data, national efforts (CDS, rewards and recognitions, media)
  - Census, labor, workforce development, licensing boards
70. Types of Information Needs
- Operational
  - Directly support the ongoing operation of a system
  - Formatted presentations of transactional data
  - Often use data from a single operational domain
- Tactical
  - Monitor and respond quickly to a variety of short-term situations
  - Typically more aggregate (less granular) than operational reports
  - Includes both recurrent and ad hoc information needs
  - Often requires merging data from multiple operational domains as well as data from non-operational sources (see the sketch after this list)
- Strategic
  - Focuses on higher-level policy and practice issues, often with longer timeframes
  - Often requires more significant analysis of institutional, survey, and external data sources
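As the tactical bullets note, such reports often join data across operational domains. A minimal sketch, assuming two hypothetical extracts (the table and column names are invented):

```python
# Minimal sketch: merging two hypothetical operational extracts into one
# tactical view by department. All names and figures are invented.
import pandas as pd

credit_hours = pd.DataFrame({
    "dept": ["BIOL", "HIST", "MATH"],
    "sch":  [12400, 6100, 9800],               # student credit hours
})
payroll = pd.DataFrame({
    "dept":       ["BIOL", "HIST", "MATH"],
    "instr_cost": [1_850_000, 920_000, 1_200_000],
})

tactical = credit_hours.merge(payroll, on="dept")  # join across domains
tactical["cost_per_sch"] = (tactical["instr_cost"] / tactical["sch"]).round(2)
print(tactical)
```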
71. User Roles
- Casual
  - Occasional use that demands relatively little technical expertise
- Recurrent
  - More frequent use, but modest technical expertise OR insufficient time to employ technical skills
- Power
  - Modest to frequent use, with capacity for using more complex technical systems
- Individuals may occupy different roles at different times
72. Information Needs and Users
Matrix of type of use (rows) by type of user (columns):
- Operational: pre-packaged operational reports (casual); report modules with parameter choices (recurrent); ODBC access to data warehouse tables (power)
- Tactical and Strategic: research briefs and analyses (casual); web-based report generators (recurrent); OLAP tools (power)
73. Implications for IT
- Analytic data warehouse is essential, but...
- Think more broadly about data sources
  - Not just the enterprise system as we now know it
  - Data from courseware platform
  - Mechanisms for collecting the "droppings" of other important activities
    - Faculty vitae and annual reports
    - Portfolios of faculty and student work
    - Civic engagement inventory
- Access/reporting technology should focus on enabling value-added resellers to deliver to a broad range of users (a minimal ODBC sketch follows)
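For the power-user pattern in the matrix above (ODBC access to warehouse tables), a minimal sketch follows; the DSN, table, and column names are hypothetical placeholders, not real IU systems.

```python
# Minimal sketch: querying an analytic warehouse over ODBC.
# "campus_dw", "enrollment_fact", and the columns are invented placeholders;
# a configured ODBC data source and driver are assumed.
import pyodbc

conn = pyodbc.connect("DSN=campus_dw")
cursor = conn.cursor()
cursor.execute(
    "SELECT term, SUM(credit_hours) AS sch "
    "FROM enrollment_fact GROUP BY term ORDER BY term"
)
for term, sch in cursor.fetchall():
    print(term, sch)
conn.close()
```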
74. Responsibility-Centered Budgeting
- Similar to Churchill's opinion of democracy
  - "It has been said that democracy is the worst form of government except all the others that have been tried."
- Concerns about changing to RCB
  - It changes everything, and yet nothing really changes
  - "I have known a great many troubles, but most of them never happened." (Mark Twain)
75. Parting Thought
- "It is good to have an end to journey towards; but it is the journey that matters, in the end." (Ursula Le Guin)