Kansas City Regional Performance Measurement Pilot Project - PowerPoint PPT Presentation

Provided by: deankat

Transcript and Presenter's Notes

Title: Kansas City Regional Performance Measurement Pilot Project

1
Kansas City Regional Performance
Measurement Pilot Project
  • A Partnership of
  • Eleven Cities
  • MARC
  • The Abrahams Group

2
AGENDA
  • Participants
  • Objectives
  • Timeline
  • Management Framework
  • Standards
  • Lessons Learned
  • Conclusions

3
PROJECT OBJECTIVES
  • Identify two public works services
  • Train participants in the concept and use of
    performance measurement
  • Develop performance measures and collect
    comparative data
  • Assess the practicality of collecting comparable
    data and using that data
  • Assess the overall merits of a multiple
    jurisdiction approach to develop and use
    performance information

4
PARTICIPATING CITIES
  • Grandview
  • Kansas City
  • Lawrence
  • Leawood
  • Lee's Summit
  • Lenexa
  • Liberty
  • Olathe
  • Overland Park
  • Shawnee
  • Unified Government

5
OTHER REGIONAL PROJECTS
  • North Carolina
  • South Carolina
  • Tennessee
  • Greater NW Chicago Suburbs
  • Greater Hartford Area

6
PROJECT TIMELINE
7
PERFORMANCE MANAGEMENT
  • Influencing factors
  • The concept of performance based management
  • Performance measurement
  • Managing for Results
  • Where do measurement and benchmarking fit within
    a performance management concept?

8
PERFORMANCE MANAGEMENT INFLUENCING FACTORS
  • GFOA
  • ICMA
  • GASB
  • APWA

9
GFOA
  • NACSLB
  • Budget and Management Standing Committee
  • New Recommended Practices, points of emphasis and
    revised training programs
  • Performance Management: using performance
    information for management decision making
  • Costing government services

10
GFOA New Recommended Practices
  • Costing
  • The cost of providing government services should
    be determined using full costing of direct,
    indirect, and overhead costs
  • Performance Management
  • Using performance measures in management decision
    making
  • Linking planning, performance measures, costing
    and budgeting

11
ICMA
  • Comparative Performance Measurement project
  • Always emphasized performance measurement
  • Best practices
  • Using performance information for decision making

12
GASB's SEA Research Project
  • Service: data reported on the basis of service
    delivery (programs and activities), not
    organizations (departments, divisions), and the
    extent to which goals and objectives have been
    achieved
  • Efforts: resources, i.e., the amount of money
    that the service costs, on either a budget or
    actual basis
  • Accomplishments: results, reported as outputs
    (the number of units provided) or outcomes
    (impact on the customer)
  • Relating Efforts to Accomplishments: efficiency,
    typically reported as the cost per output or
    outcome
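
GASB's "relating efforts to accomplishments" notion above boils down to dividing cost by output. A minimal sketch in Python, with made-up figures (illustrative only, not data from the pilot):

```python
# Hypothetical street-sweeping figures for two cities (invented numbers,
# not actual data collected by the pilot project).
services = {
    "City A": {"cost": 180_000.0, "output_units": 4_500},  # curb-miles swept
    "City B": {"cost": 240_000.0, "output_units": 5_000},
}

for city, d in services.items():
    efficiency = d["cost"] / d["output_units"]  # cost per output unit
    print(f"{city}: ${efficiency:.2f} per curb-mile swept")
```

The same division works for outcome measures: replace the output count with the number of outcomes achieved to get cost per outcome.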

13
GASB's Suggested Criteria
  • Reported performance information should be
    management's representation of performance.
  • The Report of Performance Information should
    provide a basis for understanding the entity's
    accomplishment of goals and objectives that have
    potentially significant decision-making or
    accountability implications. (6 criteria)
  • Performance Information to Report should assist
    in communicating the degree to which programs,
    services, and strategies have contributed to
    achieving stated goals and objectives. (8
    criteria)
  • Communication of Performance Information - A
    reasonably informed interested citizen or other
    user should be likely to learn about the
    availability of reports on performance, and
    should be able to access, understand, and use
    reported performance information. (3 criteria)

14
GASB's Suggested Criteria
  • Reported performance information should be linked
    to resources (input measures) provided and to the
    costs of services as that information is
    presented in the budget document and the annual
    financial report, respectively. Performance
    information that relates cost to outputs
    (efficiency measures) should be developed and
    reported.
  • Performance reported should be relevant to the
    goals and objectives (outcome and output
    measures)
  • Citizen perceptions of the quality and results of
    services should be reported (service quality
    measures)
  • Key measures of outcomes that provide a basis for
    assessing results achieved should be included in
    the report for major and critical programs and
    services.

15
APWA
  • Performance measurement
  • Performance management
  • Public works performance management
  • Creating Accountability and Increasing
    Performance (Bill Cook)
  • New Times Demand New Tools: Applying Performance
    Management to Public Works Services (APWA
    Satellite Video Conference)
  • Top Ten Performance Measures for Fleet Managers
  • Performance Measures for Public Works WORKOUT:
    How Do You Measure Up?
  • http://www.apwa.net/bookstore/

16
The Planets are Aligning
  • ICMA and APWA
  • Performance measurement
  • Performance management
  • GFOA
  • Budget and Strategic Planning
  • With X, I can get Y Results!
  • GASB
  • Accounting and Reporting
  • With X, I got Y Results!

17
Can Performance Measures Work?
  • Yes, when
  • Performance measures are associated with a system
    of Managing for Results
  • The decision makers that will use them are
    prepared to do so

18
Managing for Results
(Cycle diagram) The slide arranges these elements in a cycle:
  • Strategic Planning
  • Program/Activity Planning
  • Report Results
  • Evaluate Results
  • Measure for Results
  • Align Systems
  • Manage Work Processes
  • Budget for Results
  • Accountability: cultural and structural change
19
STANDARDS
  • Standards facilitate comparative analyses
  • Baseline (past performance)
  • Negotiated
  • Customer/stakeholder requirements
  • Benchmarks
    • Industry
    • KC Metro (average of community benchmarks)

20
LESSONS LEARNED
  • Collecting Consistent Data Across Jurisdictions
    Is Difficult
  • It's harder than it looks
  • Example: miles swept (measured by odometer,
    map, GPS, or mph/hrs)
  • Financial and work management systems are not set
    up to collect data
  • Too many compromises have to be made to try to
    get consistent data

21
LESSONS LEARNED
  • Inconsistent Data Leads to Incomparable Measures
    Across Jurisdictions
  • Incomparable data means that a regional standard
    cannot be developed
  • It is difficult to use the data for a best
    practices analysis
  • Local governments lose confidence in the process
  • Local governments are reluctant to share the
    results or make them public

22
LESSONS LEARNED
  • Performance measurement requires a top to bottom
    staff commitment over an extended period of time.
  • Managers, supervisors, and line staff need to
    understand both why and how
  • Training is essential
  • Should see the data used
  • Commitment must be institutionalized
  • Change in staff over extended period

23
LESSONS LEARNED
  • Performance measures are more effective when used
    internally over time and when linked to broader
    programs and goal achievement.
  • It is most important that local governments use
    measurements over time
  • But still need an outside standard
  • Performance measures should address programs
    instead of activities
  • Performance measures should address program goals

24
LESSONS LEARNED
  • Benchmarking is a learning experience
  • It should not matter if the comparison makes you
    look good or bad initially
  • What is important is that you have done the
    comparative analysis, have learned from it, and
    have applied a better or best practice to reduce
    cost, increase effectiveness, or both

25
LESSONS LEARNED
  • The key is asking the right questions
  • How do your inputs, outputs, efficiency, service
    quality and outcomes compare?
  • Are your outputs constant over time? Increasing?
    Decreasing?
  • Are your unit costs or productivity increasing,
    decreasing, or remaining constant over time?
  • How about the comparisons of service quality and
    outcomes over time?
  • What did you learn from the comparison? How did
    you compare to the standard? Is this comparison
    useful?

26
LESSONS LEARNED
  • The key is asking the right questions (continued)
  • Did one or more of the cities provide street
    sweeping or hot asphalt repair more efficiently
    and/or more effectively? If so, why? What can
    you learn from this comparison? Can you adopt a
    new, better or best practice that will help you?
  • To what extent have you achieved your goals and
    objectives? Why or why not?

27
LESSONS LEARNED
  • The best comparative may be yourself over time
  • For these two activities, baseline (past
    performance) proved to be the most acceptable
    standard
  • Same service definition and explanatory factors
  • Same data collection procedures
  • Same cost basis
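
The "yourself over time" baseline standard above can be sketched as a simple trend check. The yearly unit costs below are invented for illustration (not figures from the pilot):

```python
# Hypothetical unit costs (per curb-mile swept) for one city over four years,
# collected with the same service definition, procedures, and cost basis.
baseline = {2000: 42.10, 2001: 41.30, 2002: 43.75, 2003: 40.90}

years = sorted(baseline)
first, last = baseline[years[0]], baseline[years[-1]]
change_pct = (last - first) / first * 100  # percent change against baseline
print(f"Unit cost changed {change_pct:+.1f}% from {years[0]} to {years[-1]}")
```

Because the city compares itself to its own past performance, year-to-year changes are meaningful even when cross-jurisdiction data turn out to be incomparable.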

28
CONCLUSIONS
  • Valuable information about developing and using
    performance information
  • Close examination of operations of two activities
    internally and comparatively
  • Valuable information to establish a baseline of
    results
  • Recommended approach: continue at the program
    or department level

29
PARTICIPANT COMMENTS
  • Lessons Learned
  • Future Role of Performance Measurement

30
For More Information Contact
  • Dean Katerdahl
  • Mid-America Regional Council
  • E-mail: Deank@marc.org
  • www.marc.org/performance/home.htm
  • Mark D. Abrahams
  • President, The Abrahams Group
  • E-mail: Bettergov@aol.com
  • www.theabrahamsgroup.com
  • Sheila Shockey
  • President, Shockey Consulting Services, LLC
  • E-mail: shockey@planetkc.com