Title: Program Assessment System Design to Enhance Collaboration and Improvement
1. Program Assessment System Design to Enhance Collaboration and Improvement
- Robert L. Armacost
- Higher Education Assessment and Planning Technologies
- Julia Pet-Armacost
- Assistant Vice President, Information, Analysis, and Assessment
- University of Central Florida
- NCCI Seventh Annual National Conference
- July 6-8, 2006
Presentation available at http://iaa.ucf.edu
2. Overview
- Role of collaboration and assessment
- Focus on a system
- Management objectives for assessment
- Program assessment process design
- Quality assurance of the assessment process
- Scheduling to systematize assessment
- System support requirements
- Summary of program assessment system characteristics
3. The University of Central Florida Stands for Opportunity
- Established in 1963 (first classes in 1968), Metropolitan Research University
- Grown from 1,948 to 45,000 students in 37 years
- 38,000 undergrads and 7,000 grads
- Ten colleges
- 12 regional campus sites
- 7th largest public university in U.S.
- 89% of lower division and 67% of upper division students are full-time
- Carnegie classification
- Undergraduate: Professions plus arts and sciences, high graduate coexistence
- Graduate: Comprehensive doctoral (no medical); medical school approved
- 92 Bachelor's, 94 Master's, 3 Specialist, and 25 PhD programs
- Largest undergraduate enrollment in state
- Approximately 1,200 full-time faculty; 9,000 total employees
4. Collaborative Solutions
- Plan
- Implement
- Assess
- Lead
5. Opportunities for Improvement
- Strategic planning, benchmarking, program review, focused initiatives
- Program assessment: Continuous Quality Improvement
6. Systems Supporting Program Improvement
Three linked systems: Program Assessment, Unit and Program Reviews, and Strategic Planning
- Linkages
- Share data and information
- Inform budget process
- Differences
- Different cycles
- Additional data elements
- Different purposes
- Continuous improvement
- Evaluation
- Planning
7. Program Assessment for Continuous Quality Improvement
- It is a formative evaluation process designed to support program improvement
- It is continuous
- It is focused on improvement
- Student learning
- Student development
- The institution and its people
8. Important to Clearly Separate Formative from Summative Uses
- If assessment is being used for improvement purposes
- Do not use the assessment measure targets to judge or grade the quality of the program or operation
- Do not punish programs for not making their targets
- Provide rewards for having an excellent assessment PROCESS
- Provide rewards for conducting assessments and using the results to improve
- Use different terms to distinguish
- Assessment: formative evaluation
- Accountability and evaluation: summative evaluation
9. Effective Program Assessment Should Answer These Questions
- What are you trying to accomplish?
- How well are you doing it?
- How, using the answers to the first two questions, can you improve what you are doing?
- What and how does a program contribute to the development and growth of its students and/or the support of its customers?
- How can student learning be improved?
10. Student Learning is Complex
[Diagram: student learning at the center, surrounded by the many people who shape it: faculty, tutors, peers, lab assistants, teaching assistants, advisers, support staff, coaches, library staff, mentors, resident assistants, and spiritual leaders]
11. Educational Outcomes
(adapted from a presentation by Dr. Gloria Rogers)
12. Which Programs Within the Institution Should Do It?
- Educational programs should conduct formative assessments of student learning and of research and service
- All levels: Associate's, Bachelor's, Master's, Doctoral
- All disciplines and special programs (e.g., General Education)
- Administrative and educational support services should conduct formative assessment of their operations, processes, and programs
- Admissions, student support offices, administrative support offices, budget offices, computer technology support office
13. Critical Role of a Program Assessment System
- Culture determines the way we do business
- Creating a new assessment culture
- New model for the way we do business
- Existing processes constitute a system
- New culture requires a new system
- Well-thought-out design
- Necessary elements
- Phased implementation
14. Evidence of Program Assessment SUCCESS
- Sincerity means people trust the process
- Usefulness means the process helps people
- Clarity means people understand the process
- Commitment means people believe the process works to their advantage and leaders support the process
- Enthusiasm means the people want to do it
- Systemic and Sustainable means everyone is continuing to use it
- Support means people are not on their own
15. Program Assessment System Features
- Need easily understood system
- Manageable pieces
- Minimize administrative workload for participants
- Make it easy to submit assessment plans and results
- Make it easy to conduct reviews
- Make it easy to conduct assessments
- Produce useful results
16. Essential Program Assessment System Elements
- Management objectives for program assessment system
- Process design: content, focus, and mechanics of the program assessment process
- Quality assurance (QA) process
- Schedules and timelines for program assessment
- Support and documentation
17. Management Objectives for Program Assessment System
- Determine purpose of management structures
- Control or coordination
- Determine focus of management activities
- Assessment outcomes or process
- Determine management involvement level
- Participation or oversight
- Determine relationships of management structures
- Centralized or decentralized
18. Essential Program Assessment System Elements
- Management objectives for program assessment system
- Process design: content, focus, and mechanics of the program assessment process
- Quality assurance (QA) process
- Schedules and timelines for program assessment
- Support and documentation
19. Mechanics of Assessment
- Assessment is a continuous improvement process
- To improve, you need to know where you are today and where you would like to go
- Mission (purpose)
- Vision (where you would like to go)
- Goals (steps to getting where you would like to be)
- Objectives or outcomes (what you need to achieve in order to get there)
- Measures (how well you are currently doing)
- To improve, you need to take action
- Analyze your program or operations to determine changes
- Plan the changes
- Take action
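The planning elements listed above (mission, vision, goals, objectives or outcomes, and measures) form a natural record structure. A minimal sketch, with hypothetical field names (this is illustrative, not UCF's actual data model):

```python
# Illustrative sketch of an assessment-plan record; all names here are
# assumptions for illustration, not taken from UCF's system.
from dataclasses import dataclass, field

@dataclass
class Measure:
    description: str  # how well you are currently doing
    target: str       # the level of performance you are aiming for

@dataclass
class AssessmentPlan:
    mission: str                                         # purpose
    vision: str                                          # where you would like to go
    goals: list[str] = field(default_factory=list)       # steps to get there
    outcomes: list[str] = field(default_factory=list)    # what you must achieve
    measures: list[Measure] = field(default_factory=list)
```

Capturing a plan as one record like this makes it straightforward to check, before submission, that every outcome has at least one associated measure.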
20. Select a Model
- Popular model: Nichols 5-step approach
- A Practitioner's Handbook for Institutional Effectiveness and Student Outcomes Assessment Implementation, James O. Nichols, Agathon Press, 1995
- Necessary to implement a consistent process across the institution
- Eliminates ambiguity
- Makes planning easier
- Makes training easier
- Standardizes the documentation
- Easier for evaluators to examine the documentation
- Helps increase the comfort level
- Level of flexibility depends on maturity of the process
21. And Then Implement
- Who is required to conduct assessment?
- Academic departments, academic programs, divisions
- What do you assess?
- Student outcomes, academic processes, student services
- When do you conduct assessment?
- Triennially, annually, each semester, monthly
- Where?
- Main campus, regional campuses
- Why?
- Scope of use
- How?
- Portfolios, surveys, institutional data, standardized tests
22. Design to Close the Loop
- Common characteristics of assessment models to close the loop
- Develop assessment plan and measures for future period (Plan)
- Collect data and analyze to produce results for previous period (Do)
- Use results to determine what needs to be improved (Check)
- Make changes and measure the effects in a future period (Act)
- Need to distinguish between the assessment planning phase and the assessment results phase
- Select serial or parallel approach
23. Serial Assessment Approach
- If you want a program or process to be measured for a full year, it takes three years to complete one cycle
- Plan for the assessment year
- Measure and analyze data at the end of the assessment year (results) and identify actions
- Act (implement changes) in the year following the assessment year
- Results would be reported every three years
24. Parallel Assessment Approach
- If you want measures completed every year, run overlapping cycles: each year you plan for the next year, measure and report results for the current year, and act on results from a prior year
- Communicating this concept is a major challenge
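The contrast between the serial and parallel approaches can be sketched as a scheduling exercise. This is a minimal sketch, assuming three one-year phases (Plan, Measure, Act); the function names are illustrative, not part of any assessment system:

```python
# Hypothetical sketch contrasting the serial and parallel approaches
# described above; phase labels and function names are assumptions.

def serial_schedule(start_year: int, n_years: int) -> dict[int, list[str]]:
    """Serial: one cycle spans three years (Plan -> Measure -> Act),
    so results are produced only every third year."""
    phases = ["Plan", "Measure", "Act"]
    return {start_year + i: [phases[i % 3]] for i in range(n_years)}

def parallel_schedule(start_year: int, n_years: int) -> dict[int, list[str]]:
    """Parallel: a new cycle begins every year, so in steady state each
    year carries one phase of each of three overlapping cycles."""
    schedule: dict[int, list[str]] = {start_year + i: [] for i in range(n_years)}
    for cycle_start in range(n_years):  # a new cycle starts each year
        for offset, phase in enumerate(["Plan", "Measure", "Act"]):
            year = start_year + cycle_start + offset
            if year in schedule:
                schedule[year].append(phase)
    return schedule
```

In the serial schedule each year carries a single phase of a single cycle; in the parallel schedule, once steady state is reached, every year simultaneously plans the next cycle, measures the current one, and acts on the prior one, which is exactly the communication challenge the slide notes.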
25. Essential Program Assessment System Elements
- Management objectives for program assessment system
- Process design: content, focus, and mechanics of the program assessment process
- Quality assurance (QA) process
- Schedules and timelines for program assessment
- Support and documentation
26. Quality Assurance of the Assessment Process
- Quality assurance (QA) is needed to ensure that the units and programs are following the process and doing it well
- QA may provide one or more of the following
- Leadership of the assessment effort
- Management of the assessment process
- Monitoring function
- Feedback loop to improve the process
- Training
- Support
- Consultations
27. General QA Principles
- Ideally you build in quality through
- Training
- Defined process
- Consultations
- Support
- At a minimum you need to assess the assessment process
- Inspection or sampling process
- Feedback to the unit or program
28. Implementing QA
- Setting goals for QA: you cannot ask for complete, correct, and meaningful plans all at once
- Stage 1: Did everyone turn in documentation with all of the required pieces?
- Stage 2: Are the pieces there and are they done correctly?
- Stage 3: Are the pieces there, done correctly, and are they meaningful?
- You need to determine what you will monitor
- Examine the final product only
- Examine each piece of the process
- Examine the product at the end of each major step of the process (assessment plan development, analysis of results and planned changes)
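The three QA stages above can be read as progressively stricter checks on the same submission. A minimal sketch, with hypothetical required pieces and a crude stand-in for "done correctly" (non-empty); real review criteria would be richer:

```python
# Illustrative sketch of the staged QA checks described above; the
# required pieces and criteria are assumptions, not UCF's review form.

REQUIRED_PIECES = ["mission", "outcomes", "measures", "results", "use_of_results"]

def qa_stage_1(plan: dict) -> bool:
    """Stage 1: was documentation turned in with all required pieces?"""
    return all(piece in plan for piece in REQUIRED_PIECES)

def qa_stage_2(plan: dict) -> bool:
    """Stage 2: are the pieces there and done correctly?
    ('Correct' is crudely approximated here as non-empty.)"""
    return qa_stage_1(plan) and all(plan[p] for p in REQUIRED_PIECES)

def qa_stage_3(plan: dict, reviewer_judges_meaningful) -> bool:
    """Stage 3: pieces present, correct, and judged meaningful by a reviewer."""
    return qa_stage_2(plan) and reviewer_judges_meaningful(plan)
```

Staging the checks this way mirrors the slide's point: early in the rollout you only verify completeness, and only once the process matures do you judge correctness and meaningfulness.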
29. Implementing a QA Process
- Determine how you will monitor
- Inspection or sampling
- Choice of method depends on resources and
- QA process chosen (checklist or other)
- What you monitor (every piece, each major step)
- Number of programs and units in the institution
- Determine how to obtain the documentation
- Submission or on-site inspection
- Determine the schedule for the reviews
- Needs to take into account
- Parallel or serial assessment process
- What you will monitor
- Type of feedback (evaluation or assessment)
30. Essential Program Assessment System Elements
- Management objectives for program assessment system
- Process design: content, focus, and mechanics of the program assessment process
- Quality assurance (QA) process
- Schedules and timelines for program assessment
- Support and documentation
31. Scheduling Program Assessment
- Frequency
- How often will assessments be conducted? How often will they be reviewed?
- Assessment period: over what period will the measurements be taken? Allow flexibility for academic vs. administrative units
- Annual assessment (recommended)
- Develop assessment plan for next year
- Measure and analyze the results for the current year
- Act on the results from a prior year and implement changes
- Submission and review schedule
- Annual
- Plan for next year and results from past year submitted together
- Semi-annual
- Two separate submissions
- Review of results informs plan preparation
32. Implementation of QA on a Serial Assessment Process
[Timeline: for the 2002-2003 assessment year, Plan in 2001-2002, Measure in 2002-2003, and Act in 2003-2004; for the 2005-2006 assessment year, Plan in 2004-2005, Measure in 2005-2006, and Act in 2006-2007]
33. QA Implementation on a Parallel Assessment Process (Dual Submission)
34. QA Implementation on a Parallel Assessment Process (Single Submission)
- Report on results from previous year, planned use of results in current year, actual use of results in previous year, and assessment plan for current year
35. Program Assessment System
36. Program Assessment Organization
- University Assessment Committee (UAC)
- Ensures that the process is working correctly
- Coordinates quality assurance (QA) and serves a management review function
- Overall review of Divisional Review Committee recommendations
- Divisional Review Committee
- Responsible for QA at the division/college level
- Conducts specified QA reviews of own division/college programs
- Reports QA recommendations to the UAC
37. QA Organization
38. Essential Program Assessment Process Elements
- Management objectives for program assessment system
- Process design: content, focus, and mechanics of the program assessment process
- Quality assurance (QA) process
- Schedules and timelines for program assessment
- Support and documentation
39. Support for Whom? By Whom?
- Support for the doers
- Training and consultations
- Surveys and assessment instruments
- Analysis of processes
- Assessment clinics
- Support for the QA authority function
- Administrative
- Communication
- Training
- Technical
40. Creating Assessment Support
- Identify areas of existing or potential support
- Institutional Research: surveys
- College of Education: conduct assessment training
- Establish partnerships
- Establish communications and information exchanges
- Find ways to reduce workload
- Templates
- Web-enabled systems
- Survey support
41. Organization of Support Function
- Level of resources
- Location of resources
- Person
- Office
- Dispersed throughout the organization
- Assessment responsibility
- Assigned
- Assumed
- Support infrastructure
- Both assessment and technology expertise
42. Training is Essential
- More than a one-time event
- Repeated and reinforced
- How to let people know how or what to do
- Training workshops
- Individual consultations
- Train-the-trainer approach
- Written instructions and guidelines
- Feedback from reviews of submitted materials
- Open information
- All assessment plans accessible via the web
- Only the best ones accessible
43. Recognition
- Celebrate assessment success
- Assessment fairs
- Show off good work
- Support personal development
- Training
- Courses
- Assessment conferences
- Best practice recognition or awards
44. Assessment Documentation
- Documentation of the assessment process is required for reaffirmation of accreditation
- Documentation is needed for the QA function to do its job: improve the assessment process
- Challenges
- Making it meaningful
- Balance between too much and too little
- Making it easy
- Templates
- Making it accessible
- QA function
- Making it flexible to accommodate innovative approaches
45. Considerations When Developing a Documentation System
- Assessment process
- What elements need to be documented?
- Quality assurance
- Will you have a QA process and will you use the web?
- Timelines
- When do individuals do the submissions and reviews?
- What type of history needs to be maintained?
- Access
- Who gets access to what elements?
- How do you maintain security?
46. Choice of System
- Must match your assessment process
- Documents the right elements
- Matches your assessment timelines
- Must match your QA needs
- The evaluation criteria must match your QA needs
- The review forms must match the stage of the QA process
- Must match your technology
- Database
- Operating systems
- Browsers
- Must match your users' needs and capabilities
47. Technology Enablers
- Web-based systems can strongly influence development of a quality enhancement culture
- User-friendly systems ease the workload
- Web-based systems facilitate routine participation
- Web-based systems become the ordinary way we do business
- Web-based systems require an underlying design for the program assessment processes
48. Web-based Technologies Can
- Assist in submission process
- Assessment plans easily retrieved and revised via the web
- Assessment results submitted via the web
- Assist in review processes
- Reviewers can easily access the plans and results
- The reviews of plans and results can be submitted via the web
- Assist in retrieval of assessment information
- Assessment plans
- Assessment results
- Assessment reviews
- Surveys and other assessment data
49. UCF Web-based Program Assessment System
http://iaaweb.ucf.edu/oeas/phase2/view_plans_results.asp
50. UCF Web-based Program Assessment System
51. Web Template: WEAVEonline
52. Program Assessment System to Enhance Collaboration
- Create easily understood program assessment system
- Clearly defined content, schedule, and review process
- Develop manageable pieces
- Assessment plan and assessment results
- Minimize administrative workload for participants
- Easy on-line submissions and reviews
- Institutional surveys supporting assessment plans
- Training and assistance
- Produce useful results
- Improve student learning outcomes
- Close the loop and share best practices
53. Questions???
- Contacts
- Dr. Julia Pet-Armacost
- Assistant Vice President, Information, Analysis, and Assessment
- University of Central Florida
- Millican Hall 377
- P. O. Box 163200
- Orlando, FL 32816-3200
- 407-882-0276
- jpetarma_at_mail.ucf.edu
- http://iaa.ucf.edu
- Dr. Robert L. Armacost
- Higher Education Assessment and Planning Technologies
- 602 Shorewood Drive, Suite 402
- Cape Canaveral, FL 32920-5082
- 321-223-8158
- armacost_at_mail.ucf.edu