1
Performance Management Presentation
  • Team Members:
  • Ronald Wilson, Team Leader
  • Gerald Hines, Fred Khoshbin, Cyrena Simons
  • ORF
  • National Institutes of Health
  • January 15, 2004

2
Table of Contents
  • Main Presentation
  • PMP Template
  • Customer Perspective
  • Internal Business Process Perspective
  • Learning and Growth Perspective
  • Financial Perspective
  • Conclusions and Recommendations
  • Appendix

3
(No Transcript)
4
Relationship Among Performance Objectives
5
Customer Perspective
6
Customer Perspective (cont.)
  • C1 Increase customer satisfaction
    Measure: Overall average rating for Customized ORS Customer Scorecard
    FY03 Target: Average rating of 6.75 (on a scale of 1.0 to 10.0)
    FY04 Target: Equal to or greater than FY03 result
    FY05 Target: Equal to or greater than FY04 result
  • C2 Respond consistently and reliably to customers
    Measure: Percent of Customized Scorecard survey respondents indicating satisfaction with the consistency and reliability of service received from the service group
    FY03 Target: Average rating of 6.80 (on a scale of 1.0 to 10.0)
    FY04 Target: Equal to or greater than FY03 result
    FY05 Target: Equal to or greater than FY04 result
  • C3 Equitably and impartially serve all ICs
    Measure: Percent of Customized Scorecard survey respondents indicating satisfaction with access to the planning process regardless of outcome of their individual request
    FY03 Target: Average rating of 7.05 (on a scale of 1.0 to 10.0)
    FY04 Target: Equal to or greater than FY03 result
    FY05 Target: Equal to or greater than FY04 result
  • C4 Anticipate customers' needs
    Measure: Percent of Customized Scorecard survey respondents indicating service group anticipated their needs and assisted them in incorporating respondents' facilities issues into the planning process
    FY03 Target: Average rating of 6.50 (on a scale of 1.0 to 10.0)
    FY04 Target: Equal to or greater than FY03 result
    FY05 Target: Equal to or greater than FY04 result
  • Initiative (all objectives): Repeat customer survey regularly
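To make the measure definitions concrete, the sketch below shows one way the C1-C4 figures could be computed from individual survey responses. The field names, the yes/no satisfaction flags, and the sample data are illustrative assumptions; the deck defines only the measures themselves.

```python
# Sketch of the C1-C4 scorecard computations. Field names, the yes/no
# satisfaction flags, and the sample data are hypothetical; the deck
# defines only the measures.
from statistics import mean

responses = [
    {"overall": 7.0, "consistent": True,  "access": True,  "anticipated": False},
    {"overall": 6.5, "consistent": True,  "access": False, "anticipated": True},
    {"overall": 8.0, "consistent": False, "access": True,  "anticipated": True},
]

def pct(flags):
    """Share of True values, as a percentage."""
    return 100.0 * sum(flags) / len(flags)

c1 = mean(r["overall"] for r in responses)       # average rating on the 1.0-10.0 scale
c2 = pct([r["consistent"] for r in responses])   # % satisfied with consistency/reliability
c3 = pct([r["access"] for r in responses])       # % satisfied with access to the process
c4 = pct([r["anticipated"] for r in responses])  # % saying needs were anticipated

print(f"C1 = {c1:.2f} (FY03 target 6.75); C2 = {c2:.0f}%, C3 = {c3:.0f}%, C4 = {c4:.0f}%")
```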
7
Customer Survey Results
  • In the winter of 2002-2003, a customer survey of Institute Directors and Executive Officers was undertaken to gauge customer satisfaction with the services of the then Office of Facilities Planning.
  • Customers were asked to rate services provided by the Office of Facilities Planning according to a planning continuum: long-, medium-, and short-range.
  • Customer service targets for FY03-FY05, illustrated in the slides that follow, have been derived from scores attained on this customer survey.
  • Complete survey results, exclusive of full-text comments, can be found in the Appendix.

8
Summary FY03 Customer Satisfaction Ratings of
Facilities Planning Services
[Chart: summary FY03 ratings on a scale from Unsatisfactory to Outstanding; N = 21]
9
Long-Range Planning Service Ratings by FY
[Chart: mean ratings by FY on a scale from Unsatisfactory to Outstanding; M = 7.14, 7.13, 7.57; FY03 N = 21, FY02 N = 12]
10
Mid-Range Planning Service Ratings by FY
[Chart: mean ratings by FY on a scale from Unsatisfactory to Outstanding; M = 7.53, 7.47, 7.56; FY03 N = 21, FY02 N = 12]
11
Short-Range Planning Service Ratings by FY
[Chart: mean ratings by FY on a scale from Unsatisfactory to Outstanding; M = 7.43, 6.91, 7.27 (Timeliness labeled); FY03 N = 21, FY02 N = 12]
12
FY03 Long-Range Planning Service Ratings by
Position
[Chart: mean ratings by position on a scale from Unsatisfactory to Outstanding; N = 6 and N = 11]
Note: Differences are not statistically significant.
13
FY03 Mid-Range Planning Service Ratings by
Position
[Chart: mean ratings by position on a scale from Unsatisfactory to Outstanding; N = 6 and N = 11]
Note: Differences are not statistically significant.
14
FY03 Short-Range Planning Service Ratings by
Position
[Chart: mean ratings by position on a scale from Unsatisfactory to Outstanding; N = 6 and N = 11]
Note: Differences are not statistically significant.
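The deck does not say how OQM tested these group differences. One conventional choice for comparing mean ratings from two small groups is Welch's two-sample t-test, sketched below with invented ratings; only the group sizes (N = 6 and N = 11) come from the slides.

```python
# Welch's two-sample t-test on two groups of ratings. The ratings are
# invented for illustration; only the group sizes (N = 6, N = 11) come
# from the slides, and the deck does not name the test OQM used.
from scipy import stats

icd_sd = [7.0, 6.5, 8.0, 7.5, 6.0, 7.0]                        # N = 6
eo = [7.5, 8.0, 6.5, 7.0, 7.5, 8.5, 6.0, 7.0, 7.5, 8.0, 6.5]   # N = 11

t_stat, p_value = stats.ttest_ind(icd_sd, eo, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")  # p > 0.05 -> difference not significant
```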
15
Summary
  • Three-fourths of respondents were from the
    Bethesda campus
  • The majority of the respondents were EOs
  • Highest Long-Range Planning ratings of
    satisfaction for responsiveness, availability,
    and competence
  • Lowest ratings for handling of problems and
    quality
  • Highest Mid-Range Planning ratings of
    satisfaction for availability, competence, and
    responsiveness
  • Lowest ratings for handling of problems and
    timeliness
  • Highest Short-Range Planning ratings of satisfaction for competence, quality, responsiveness, and availability
  • Lowest ratings for handling of problems and
    timeliness

16
Summary
  • Top three FY03 ratings of satisfaction for all
    phases of planning were competence,
    responsiveness, and availability
  • Lowest FY03 ratings of satisfaction for all
    phases of planning were in the area of handling
    of problems
  • FY03 Long-Range Planning ratings lower than the
    other phases
  • Comparison of ratings between ICDs/SDs and EOs did not indicate large differences in perceptions
  • EO perceptions more positive than ICDs/SDs
  • Differences are NOT statistically significant

17
Summary
  • Comments indicate:
  • Need clearer definition of role responsibilities
    in ORS/ORF
  • More coordination among DFP/DES/Leasing
  • Seamless process from planning to actual space
    acquisition
  • There are perceptions of inequity among ICs
  • ORS/ORF needs to develop a better understanding
    of billing procedures and rules within the ICs

18
Customer Perspective: Planned Actions
  • The Division of Facilities Planning is currently surveying its customers for FY04 as part of the annual Buildings and Space Planning process. It has modified last year's customer survey to incorporate the appropriate discrete services for which it is responsible and revised the survey questions to better align with customer objectives.

19
Internal Business Process Perspective
20
Internal Business Process Perspective
  • IB1 Improve efficiency and effectiveness of planning services
    Measure:
    A. Project Reviews: percent of submittal deadlines met, such as Program of Requirements, Pre-Programming Documents, Design Drawings, and Environmental Reviews
    B. DFP Planning Studies: percent of planning milestones met, for Master Plans, Strategic Facilities Plan, and Site Feasibility Studies
    C. Short-term planning requests: percent of planning milestones met for SJDs and for Site Selection Requests
    FY03 Target: Identify categories of work process outputs for planning services provided, and develop a computer-based system for monitoring outputs on a quarterly basis
    FY04 Target: Establish baseline work process outputs and performance measures by planning service
    FY05 Target: TBD
    Initiatives: Construct a planning milestone chart; develop a planning and programming tool to assist NIH in assessing project feasibility at the early stages of project development; develop a tracking mechanism in the SJD log to track the progress of SJDs from the original request to the initial presentation of the request to the SRB (sketched below)
  • IB2 Proactively engage customers and stakeholders and improve their awareness
    Measure: Percent of meetings attended, and number of issues identified and strategies that resulted from proactive contacts with customers, by source and type of meetings
    FY03 Target: 5 contacts per year per customer (i.e., ICs, OD, EOs, etc.), rated by type (0 = not useful; 1 = useful, collected data; 2 = identified issues/developed strategies to proactively support customers; 3 = educated customers on process and limitations)
    FY04 Target: TBD
    FY05 Target: TBD
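One IB1 initiative, referenced above, is a tracking mechanism in the SJD log covering each SJD from the original request to its initial SRB presentation. A minimal sketch of such a log record follows; the record layout, field names, and 90-day deadline are assumptions, since the deck does not describe the log's format.

```python
# Minimal sketch of an SJD-log record tracking a request from receipt
# to its initial SRB presentation. Field names and the 90-day deadline
# are assumptions; the deck does not specify the log format.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class SJDRecord:
    sjd_id: str
    requested: date                           # original request received
    presented_to_srb: Optional[date] = None   # initial SRB presentation, if held

    def milestone_met(self, deadline_days: int = 90) -> Optional[bool]:
        """True if the SJD reached the SRB within the deadline; None while pending."""
        if self.presented_to_srb is None:
            return None
        return (self.presented_to_srb - self.requested).days <= deadline_days

log = [
    SJDRecord("SJD-04-001", date(2004, 1, 5), date(2004, 3, 1)),
    SJDRecord("SJD-04-002", date(2004, 2, 10)),   # still pending
]
outcomes = [r.milestone_met() for r in log if r.milestone_met() is not None]
if outcomes:
    print(f"Percent of SJD milestones met: {100 * sum(outcomes) / len(outcomes):.0f}%")
```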
21
Internal Business Process Perspective: Planned Actions
  • In the coming year, the Division will:
  • Construct a planning milestone chart to track its
    processes
  • Develop a planning and programming tool to assist
    NIH in assessing project feasibility at the early
    stages of project development
  • Develop a tracking mechanism in the SJD log to
    track the progress of SJDs from the original
    request to the initial presentation of the
    request to the SRB

22
Learning and Growth Perspective
23
Learning and Growth Perspective
  • LG1 Renew technical and non-technical skills
    Measures: L1a: Percent of employees with current and pertinent training plans; L1b: Percent of training plans that are fully executed
    FY03 Target: 25% of employees have completed planning-related training (Actual: 27% of employees took coursework)
    FY04 Target: 80% of employees have training plans and implementation is in process
    FY05 Target: 100% of employees have completed all mandatory training plan requirements
    Initiative: Complete an office-wide training needs assessment; each staff member to develop a training plan
  • LG2 Complete benchmarking plan to become expert planners
    Measure: L2: Percent of benchmarking plan completed
    FY03 Target: 20% of benchmarking plan complete
    FY04 Target: 100% of benchmarking plan completed
    FY05 Target: Implement selected "best practices"
    Initiative: Assemble benchmarking group and survey group members
  • LG3 Enhance the structure of planning information for easy reference and retrieval and support the adoption of "best practices"
    Measure: L3: Percent of planning communication tools in place and "best practices" adopted
    FY03 Target: 20% of planning information technology plan complete
    FY04 Target: 100% of planning information technology plan completed
    FY05 Target: TBD
    Initiative: Assemble benchmarking group and survey group members (same as LG2)
  • LG4 Improve collaborative knowledge sharing
    Measure: L4: Number of information exchanges attended by 50% or more of staff
    FY03 Target: 4 Information Exchanges held during the year
    Initiative: Dedicate time in the monthly Information Exchange to discuss ongoing technology needs
24
Learning and Growth Perspective: Actions Planned and Underway
  • Objective LG1: A training needs assessment has been conducted, and each staff member is developing a training plan.
  • Objective LG2: A benchmarking survey is underway. In order to identify best practices, the Division has invited 9 federal installations within the Capital Region and/or DHHS, as well as 8 research universities and one hospital system, to participate in a benchmarking project. To date, from among this group, 9 planning officers have confirmed their willingness to participate. Survey responses are expected by the end of January 2004. Follow-up plans include interviews with some of the participants, and another survey next year to (1) include other types of organizations, and (2) expand the topics for benchmarking.
  • Objective LG3, Part 1: Working with IT, the Division is developing a tracking tool for its customers to measure the frequency and effectiveness of its meetings and the success of its outreach efforts.
  • Objective LG3, Part 2: Actions are the same as for Objective LG2.
  • Objective LG4: As part of its monthly Information Exchange, the Division is dedicating time to identifying its technology needs. Four Information Exchanges were held last year.

25
Financial Perspective
26
Financial Perspective (cont.): Mandatory to Report on Unit Cost Objective
  • F1 Minimize planning unit costs
    Measures:
    F1: DFP unit cost for master planning services, measured per planning activity
    F2: DFP unit cost for preparing/reviewing environmental documents, measured per environmental document
    F3: DFP unit cost of strategic facilities planning, measured per capita (census population)
    F4: DFP unit cost to coordinate community input, measured per hour spent in coordination
    F5: DFP unit cost of B&S (Buildings and Space) planning services, per B&S Plan report
    F6: DFP unit cost to manage SJD process, per square foot of space requested
    FY03 Targets: Establish baseline master planning unit costs; baseline planning activity unit costs; baseline environmental planning unit costs; baseline community coordination costs; baseline B&S Planning report unit costs; and baseline short-range planning costs per square foot of SJD space requested
    FY04 Target: TBD for all measures
    FY05 Target: TBD for all measures
    Initiative: For all measures, develop a computer-based system for tracking planning costs on a quarterly basis after implementation of the CAN system
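Each F-measure reduces to the same ratio: the total cost of a planning service divided by its units of output. A worked sketch with placeholder figures (the deck reports no actual cost data on this slide):

```python
# Unit cost = total cost of a service / units of output. The dollar
# figures and unit counts below are placeholders, not DFP data.
def unit_cost(total_cost: float, units: float) -> float:
    return total_cost / units

# F2-style: cost per environmental document prepared/reviewed
print(unit_cost(150_000.00, 12))                 # 12500.0 per document

# F6-style: SJD management cost per square foot of space requested
print(round(unit_cost(80_000.00, 250_000), 2))   # 0.32 per sq ft
```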
27
Financial Perspective: Actual costs for discrete services
28
Financial Perspective: Planned Actions
  • The Division will reexamine its unit cost
    measures and establish new baseline unit costs
    from FY02 and FY03 actual cost data. FY04 costs
    will be monitored quarterly when the new CAN
    system is implemented.

29
Conclusions
30
Conclusions from PMP
  • The Balanced Scorecard approach has helped DFP
    better relate our day-to-day planning activities
    to long-term ORF and DFP planning goals.
  • Also, the process has helped highlight how important satisfying customer requirements (some of which customers may be unaware of) is to the work of the Division. Getting in front of customers' needs, and thus guiding them through options before they are caught short and left with no alternatives, has gained more appreciation. The process has resulted in a greater recognition of the value of working with our customers early in the facility and space planning processes, enabling us to spot problems well in advance and help customers plan through to a solution(s) that meets their needs and those of NIH.
  • Working through the Learning and Growth perspective pushed our thinking outward to begin to focus on essential technical support and training needs. It's doubtful we would have pushed as far without the PMP process.
  • In summary, the process has helped sharpen our insights, deepened our understanding of what we are all about, and focused us on what we need to do to achieve the Division's and ORF's strategic goals.
  • The major initiatives for FY04 will be to revise the Customer Survey to track the Division's progress in addressing prior-year customer concerns and to complete the benchmarking survey.

31
Appendix: Measures
  • Customer Perspective
  • C1: Overall average rating for Customized ORS Customer Scorecard
  • C2: Percent of Customized Scorecard survey respondents indicating satisfaction with the consistency and reliability of service received from the service group
  • C3: Percent of Customized Scorecard survey respondents indicating satisfaction with access to the planning process regardless of outcome of their individual request
  • C4: Percent of Customized Scorecard survey respondents indicating service group anticipated their needs and assisted them in incorporating respondents' facilities issues into the planning process
  • Internal Business Process Perspective
  • IB1:
    A. Project Reviews: percent of submittal deadlines met, such as Program of Requirements, Pre-Programming Documents, Design Drawings, and Environmental Reviews
    B. DFP Planning Studies: percent of planning milestones met, for Master Plans, Strategic Facilities Plan, and Site Feasibility Studies
    C. Short-term planning requests: percent of planning milestones met for SJDs, and percent of planning milestones met for Site Selection Requests

32
Appendix (contd.)
  • Learning and Growth Perspective
  • L1a: Percent of employees with current and pertinent training plans
  • L1b: Percent of training plans that are fully executed
  • L2: Percent of benchmarking plan completed
  • L3: Percent of planning communication tools in place and "best practices" adopted
  • L4: Number of information exchanges attended by 50% or more of staff
  • Financial Perspective
  • F1: DFP unit cost for master planning services per planning activity
  • F2: DFP unit cost of B&S planning services per B&S Plan report
  • F3: DFP unit cost to manage SJD process per square foot of space requested

33
Appendix
  • Results from the FY03 ORS Customer Scorecard for NIH Facilities Planning
  • Prepared by:
  • Amy Culbertson and Joe Wolski
  • Office of Quality Management
  • 16 April 2003

34
Methodology
  • OQM was contacted by OFP early in FY03 to discuss customer assessment methodology
  • Desire to establish a system that is an integral component of annual Building and Space planning meetings with ICs
  • Cycle of meetings typically occurs early in the FY
  • Discussed concern that a customer survey had just been completed in Sept FY02
  • Designed new FY03 survey to address three components of the planning process:
  • Long-range
  • Medium-range
  • Short-range
  • Modified the administration process based on lessons learned from the Sept FY02 process:
  • Carefully tracked who responded to survey
  • Sent follow-up emails to increase response rate

35
Methodology
  • Surveys administered via email in the December 2002 - January 2003 time frame
  • Sent to IC Directors (ICDs), Scientific Directors (SDs), and Executive Officers (EOs)
  • Comments received from some EOs indicated displeasure at the survey going to ICDs/SDs
  • Reminder sent in late January
  • Received substantial number of surveys after reminder was sent
  • Gathered, tracked, entered, and analyzed data
  • Integrated responses from Sept FY02 survey as appropriate

36
Survey Distribution
FY03 Administration
  • Number of surveys distributed: 70
  • Number of respondents: 21
  • Response rate: 30%
  • Number of ICs receiving survey: 27
  • Number of ICs with at least one response: 14
  • IC response rate: 52%
FY02 Administration
  • Number of surveys distributed: 85
  • Number of respondents: 12
  • Response rate: 18%
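As a quick arithmetic check of the FY03 figures above, each rate is simply respondents divided by surveys (or ICs) in scope:

```python
# Worked check of the FY03 rates in the table above.
distributed, respondents = 70, 21
ics_surveyed, ics_responding = 27, 14

print(f"Response rate: {100 * respondents / distributed:.0f}%")        # 30%
print(f"IC response rate: {100 * ics_responding / ics_surveyed:.0f}%") # 52%
```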
37
FY03 Respondents by Location
[Chart: FY03 respondents by location; N = 20]
Note: Multiple responses allowed; 1 respondent skipped this question.
38
FY03 Respondents by Primary Mission
[Chart: FY03 respondents by primary mission; N = 20]
Note: 1 respondent skipped this question.
39
FY03 Respondents by Position
[Chart: FY03 respondents by position; N = 21]
Note: Feedback from respondents indicated that 6 surveys were filled out by the EO on behalf of the IC Director.
40
Summary FY03 Customer Satisfaction Ratings of
Facilities Planning Services
[Chart: summary FY03 ratings on a scale from Unsatisfactory to Outstanding; N = 21]
41
Long-Range Planning Service Ratings by FY
[Chart: mean ratings by FY on a scale from Unsatisfactory to Outstanding; M = 7.14, 7.13, 7.57; FY03 N = 21, FY02 N = 12]
42
Mid-Range Planning Service Ratings by FY
[Chart: mean ratings by FY on a scale from Unsatisfactory to Outstanding; M = 7.53, 7.47, 7.56; FY03 N = 21, FY02 N = 12]
43
Short-Range Planning Service Ratings by FY
[Chart: mean ratings by FY on a scale from Unsatisfactory to Outstanding; M = 7.43, 6.91, 7.27; FY03 N = 21, FY02 N = 12]
44
FY03 Long-Range Planning Service Ratings by
Position
[Chart: mean ratings by position on a scale from Unsatisfactory to Outstanding; N = 6 and N = 11]
Note: Differences are not statistically significant.
45
FY03 Mid-Range Planning Service Ratings by
Position
[Chart: mean ratings by position on a scale from Unsatisfactory to Outstanding; N = 6 and N = 11]
Note: Differences are not statistically significant.
46
FY03 Short-Range Planning Service Ratings by
Position
[Chart: mean ratings by position on a scale from Unsatisfactory to Outstanding; N = 6 and N = 11]
Note: Differences are not statistically significant.
47
Do the facilities planning services in ORS support your Institute's mission planning efforts?
[Chart: responses; FY03 N = 21, FY02 N = 12]
48
Do you understand how to get your Institute's needs into the strategic facilities planning process?
[Chart: responses; FY03 N = 21, FY02 N = 12]
49
Does the current process work effectively for
your Institute to acquire the space you need?
[Chart: responses; FY03 N = 21, FY02 N = 12]
50
Themes from Comments on Suggestions to Improve
the Facilities Planning Process
  • Need clear definition of role responsibilities in
    ORS and seamless service from planning to
    acquiring the space
  • Among planners, DES, and real-estate leasing
  • Lack of communication between DES, Real Estate
    Leasing, and OBSF
  • Planning and SJD work well, but leasing needs
    more people
  • Space decision-making process does not consider the implications of leasing widely dispersed off-campus sites that NIH corporate infrastructure must support
  • There are perceptions of inequity among ICs for
    space
  • Method of providing space to newer ICs needs
    to be improved
  • Central Service facilities are given a low priority; needs are not met
  • ORS needs to have a better understanding of
    billing procedures and rules
  • Get clarification from OBSF regarding
    procedures/rules for TIA
  • ICs do not have full knowledge of associated
    costs of leased space
  • Inequities in the way ICs are billed for space by ORS are not being corrected by ORS
  • IC was billed even though the space was unusable
  • ICs would like more information on the BF budget
  • ICs would like more independence in managing and
    renovating their leased space

Note: Includes themes from both FY02 and FY03 comments.
51
Summary
  • Three-fourths of respondents were from the Bethesda campus
  • The majority of the respondents were EOs
  • Highest Long-Range Planning ratings of
    satisfaction for responsiveness, availability,
    and competence
  • Lowest ratings for handling of problems and
    quality
  • Highest Mid-Range Planning ratings of
    satisfaction for availability, competence, and
    responsiveness
  • Lowest ratings for handling of problems and
    timeliness
  • Highest Short-Range Planning ratings of satisfaction for competence, quality, responsiveness, and availability
  • Lowest ratings for handling of problems and
    timeliness

52
Summary
  • Top three FY03 ratings of satisfaction for all
    phases of planning were competence,
    responsiveness, and availability
  • Lowest FY03 ratings of satisfaction for all
    phases of planning were handling of problems
  • FY03 long-range planning ratings lower than the
    other phases
  • Comparison of ratings between ICDs/SDs and EOs did not indicate large differences in perceptions
  • EO perceptions more positive than ICDs/SDs
  • Differences are NOT statistically significant

53
Summary
  • Comments indicate:
  • Need clearer definition of role responsibilities
    in ORS
  • More coordination among OFP/DES/Leasing
  • Seamless process from planning to actual space
    acquisition
  • There are perceptions of inequity among ICs
  • ORS needs to have a better understanding of
    billing procedures and rules