1 EVALUATION ACCOUNTABILITY SYSTEM for EXTENSION (EASE)
SAES/ARD Research Directors Workshop, Coeur d'Alene, Idaho, September 26, 2001
Howard Ladewig, University of Florida
2 Organizational Accountability
- Provide stakeholders with a clear understanding of what is being achieved with the funds that are being spent
3 Accountability Legislation
- GPRA of 1993: performance-based budgeting
- FAIR Act of 1996: state-of-the-art information systems
- AREERA of 1998: accountability reporting system
- State and county legislation
4 Summary of Legislative Requirements
- Establish performance goals
- Project activities to support each goal
- Provide an annual progress report that compares projected to actual performance
- Merge fiscal accountability and program evaluation
- Use a state-of-the-art information technology reporting system
5 Implications for Cooperative Extension
- The Secretary of Agriculture will use a DBMS to monitor and evaluate Extension and research activities
- Nationally, Extension does not have a DBMS
- Most states have a DBMS for individual performance but not for program performance
- No common database design among states
- Perceptions toward reporting systems
6 Faculty Perceptions Toward Extension Reporting System
7 Administrative Perceptions Toward Extension Reporting System
8 The Information Problem
- Widely dispersed information resources
- Disparate formats and incompatible tools
- Multiple sources with little integration
- Limited, unfriendly tools for user access
- Limited use of current information resources for strategic planning, decision making, and performance assessment
9 A PLATFORM FOR LINKING REE DATA SYSTEMS
[Diagram: REEIS integrates Research, Education, and Extension (EASE) data systems, combining program information, fiscal information, cost and impact information, policy information, and statistics/data.]
10 Components of an Accountability System
- For what should Extension be held accountable?
- Accuracy and consistency of evidence?
- Window of time available for response?
11 For What Should Extension Be Held Accountable?
- Relevance
- Resources
- Quality
- Accomplishments
- Impacts on Community
12 Accuracy and Consistency of Accountability Data?
- Systematic monitoring of inputs/outputs
- Testimonials of change (success stories)
- Impact statements (case studies)
- Time series (before/after measures of change for a county program)
- Evaluation research design (controls)
- Budget analysis (cost-benefit analysis)
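The last evidence type above, budget analysis, reduces to a benefit-cost comparison. A minimal sketch, using entirely hypothetical figures for one county program (neither number comes from the slides):

```java
public class BenefitCost {
    public static void main(String[] args) {
        // Hypothetical figures for a single county program (illustrative only).
        double programCost = 50_000.0;       // assumed annual delivery cost
        double estimatedBenefit = 175_000.0; // assumed valued participant outcomes
        // A ratio above 1.0 indicates benefits exceed costs.
        double ratio = estimatedBenefit / programCost;
        System.out.println("benefit-cost ratio: " + ratio);
    }
}
```

The hard part in practice is not the arithmetic but agreeing on how participant outcomes are valued in dollars, which is why the slide pairs this method with the stronger research designs listed above it.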
13 Window of Time for Response?
- Individual reports -- annually
  - Big Chief tablet and No. 2 pencil
  - Word processor (text file)
  - Database management information system
- Organization report on demand
  - Aggregation (key-word search)
  - Use of discrete and meaningful data
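The "aggregation (key-word search)" idea on this slide can be sketched in a few lines: filter free-text report entries by a keyword and count the matches. The sample entries are invented for illustration:

```java
import java.util.List;
import java.util.stream.Collectors;

public class KeywordAggregate {
    public static void main(String[] args) {
        // Hypothetical individual report entries (illustrative only).
        List<String> reports = List.of(
            "4-H livestock judging workshop, 30 youth",
            "Master Gardener training, 12 volunteers",
            "4-H robotics club, 18 youth");
        // Aggregate entries matching a keyword, case-insensitively.
        List<String> hits = reports.stream()
            .filter(r -> r.toLowerCase().contains("4-h"))
            .collect(Collectors.toList());
        System.out.println("4-H entries: " + hits.size());
    }
}
```

Keyword matching over free text is exactly why the slide closes with "discrete and meaningful data": coded fields aggregate reliably on demand, whereas text search misses entries that describe the same program in different words.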
14 Goals of EASE: To Provide
- A clear and concise definition of programs and priorities
- An inventory of programs by Extension Classification System
- Outcome-based performance measurement
- Program benefits to participants and the public
- Efficiency and productivity measures of Extension programs
15 BARRIERS
- Political process
- Extension budget accounts for people, not programs
- Database designs have focused on individual performance; no commonality
- Few standard measures of outcomes
- Past reporting requirements have been of limited value to states and counties
16 Technical Support for EASE
- Java-based object-oriented database (Objectivity) on a Unix server
- Architecture is platform independent
- Input/linkage (use of XML)
  - Web-based interface (anywhere/anytime)
  - Batch files
  - Network (WWW and FTP)
  - Removable storage (diskette, CD-ROM)
  - Manual input (cooperative hand entry of data)
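The slides name XML as the input/linkage format but do not specify a schema, so the element names below are purely illustrative. A minimal sketch of parsing one hypothetical program-report record with the Java standard library:

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;

public class ProgramRecord {
    // Hypothetical EASE report record; element names are assumptions, not the real schema.
    static final String SAMPLE =
        "<programReport>" +
        "<programCode>4-H-YOUTH</programCode>" +
        "<outcomeMeasure>participants_certified</outcomeMeasure>" +
        "<value>42</value>" +
        "</programReport>";

    public static void main(String[] args) throws Exception {
        // Parse the XML string into a DOM document.
        Document doc = DocumentBuilderFactory.newInstance()
            .newDocumentBuilder()
            .parse(new ByteArrayInputStream(SAMPLE.getBytes(StandardCharsets.UTF_8)));
        Element root = doc.getDocumentElement();
        String code = root.getElementsByTagName("programCode").item(0).getTextContent();
        int value = Integer.parseInt(
            root.getElementsByTagName("value").item(0).getTextContent());
        System.out.println(code + "=" + value);
    }
}
```

Because XML is plain text, the same record could arrive by any of the channels listed above (web form, batch file, FTP, or diskette) and be loaded through one parsing path, which is the point of a common input format.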
17 Where to from Here?
- Major program descriptors
  - Key program codes
  - Best management practices (BMPs)
  - Program/budget level
- Develop standard outcome measures
- Link EASE to CRIS projects
- Test the EASE DBMS
- Prototype available in December 2001