Transcript and Presenter's Notes

Title: Dr. Frank A. Perry


1
Integrating Metrics into IT Projects During
Development and Operations
  • Dr. Frank A. Perry & Robert S. McKeeman
  • February 25, 2002

2
Why Measure?
  • Modern Emphasis Derives Significantly From W.
    Edwards Deming
  • No -- This Is Not A Discussion on TQM
  • Deming's History [1]
  • Early Career (prior to 1939): Statistics and
    decision theory for USDA
  • Middle Career (1939-1950): Statistics for
    Industrial Quality Control
  • Late Career (1950-1992): Statistical Methods
    in Industrial Production and Management --
    the 14 Points

You can't control what you don't measure
You get what you decide to measure reported back
to you
Measure and improve
1. Tortorella, M. J., "The Three Careers of W.
Edwards Deming," SIAM News, July 1995,
http://www.deming.org/theman/articles/threecareers01.htm
3
Why Measure?
  • "Every high performing organization can measure
    and improve" -- Dr. Thomas Garthwaite
  • VA Clinical Care Metrics have proven to improve
    the quality of patient care for Veterans

4
Why Measure?
  • Provide insights into important management issues
    as identified by Enterprise Goals.
  • Impact business decisions made by the Enterprise.
  • The Benefit and Value of doing Measurement come
    from the Decisions and Actions taken in response
    to analysis of the data, not from the collection
    of the data.

-- Software Engineering Institute Measurement and
Analysis Initiative, November 2001
5
Why Measure?
Help Keep Projects Inside the Box
6
Support VA Measures
http://www.va.gov/budget/plan/index.htm
7
Measurement 101: Progress, Performance, and
Analysis Metrics
  • Progress Metrics: Track Execution of Defined
    Tasks
  • Most appropriate during development and
    deployment stages of IT projects.
  • Successful task completion does not imply
    successful goal completion.
  • Examples
  • Percent completed versus percent scheduled /
    expected, e.g., for installations, Cyber Security
    policy compliance, etc.
  • Earned Value Measurement (EVM) (see the sketch
    after this list)
  • Rigorous, but implies sophisticated project
    tracking
  • Should be applied to major projects; a
    lightweight version for minor projects
  • Budgeted Cost of Work Scheduled (BCWS)
  • Budgeted Cost of Work Performed (BCWP)
  • Actual Cost of Work Scheduled (ACWS)
  • Actual Cost of Work Performed (ACWP)
  • Schedule Performance Index (SPI) = BCWP / BCWS
  • Schedule Variance (SV) = BCWP - BCWS
  • Cost Performance Index (CPI) = BCWP / ACWP
  • Cost Variance (CV) = BCWP - ACWP
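A minimal sketch (not from the briefing) of computing
these earned-value indices; the dollar figures are
hypothetical:

  # Minimal EVM sketch. BCWS, BCWP, and ACWP follow the standard
  # earned-value definitions listed on this slide; the project
  # figures below are hypothetical.

  def evm_indices(bcws: float, bcwp: float, acwp: float) -> dict:
      """Core earned-value indices and variances."""
      return {
          "SPI": bcwp / bcws,  # < 1.0 means behind schedule
          "SV": bcwp - bcws,   # negative means behind schedule
          "CPI": bcwp / acwp,  # < 1.0 means over budget
          "CV": bcwp - acwp,   # negative means over budget
      }

  # Example: $100K of work scheduled to date, $90K of work
  # performed, at an actual cost of $110K.
  for name, value in evm_indices(100_000, 90_000, 110_000).items():
      print(f"{name}: {value:,.2f}")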

8
Measurement 101 (or maybe 202): EVM Alternative
Representations
9
Measurement 101: Progress, Performance, and
Analysis Metrics
  • Performance Metrics: Track success in meeting
    defined objectives
  • Appropriate for application to developmental
    phases; implies goals / thresholds are defined
    along with a formal measurement approach
  • Appropriate for application on a recurring basis
    in-service (goals / thresholds defined)
  • Effectiveness (mission accomplishment) &
    suitability (training, documentation,
    availability / maintainability, etc.)
  • Examples (a goal / threshold check is sketched
    after this list)
  • Operational Availability (Ao) -- the "killer"
    metric to assess end-user access to a capability
  • Technical Performance Parameters: Round Trip
    Time (RTT), Packet Loss (PL), Bandwidth (BW)
    Usage, Jitter, etc.
  • Cyber Security: success rate in detecting Red
    Team intrusions
  • Customer satisfaction (subjective, but repeatable
    if done correctly)
  • Mission accomplishment
  • Training & documentation adequacy
  • Return On Investment (ROI) (verify achievement of
    projections)
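Performance metrics of this kind reduce to comparing
measured values against predefined goal / threshold
pairs. A small sketch; the metric names, limits, and
measurements are all hypothetical:

  # Hypothetical goal / threshold check for technical performance
  # parameters. Lower is better for each metric; all numbers are
  # illustrative, not from the briefing.

  THRESHOLDS = {
      "RTT_ms": (50.0, 100.0),        # (goal, threshold)
      "packet_loss_pct": (0.1, 1.0),
      "jitter_ms": (5.0, 20.0),
  }

  def assess(measured: dict) -> None:
      for metric, value in measured.items():
          goal, threshold = THRESHOLDS[metric]
          if value <= goal:
              status = "meets goal"
          elif value <= threshold:
              status = "meets threshold only"
          else:
              status = "below threshold -- remediation required"
          print(f"{metric}: {value} ({status})")

  assess({"RTT_ms": 72.0, "packet_loss_pct": 0.05, "jitter_ms": 25.0})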

10
Measurement 101: Progress, Performance, and
Analysis Metrics
  • Analysis Metrics: Gain Understanding of Data /
    Indicators, or Diagnose Issues
  • Most appropriate for application within a task
  • Helps in judging progress / performance and, if
    not adequate, why not
  • Often not necessary if things are working well
  • Examples
  • Earned Value Measurement: Cost / Schedule
    Variance (or Performance Index) Analysis
  • Failure Modes, Effects, and Criticality Analysis
    (FMECA)
  • If Ao proves not to be adequate, assess Mean Time
    Between Failure (MTBF), Mean Time To Repair
    (MTTR), Mean Logistics Delay Time (MLDT), high
    failure components, etc. (see the Ao decomposition
    sketch after this list)
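That diagnosis rests on a commonly used decomposition
of operational availability, Ao = MTBF / (MTBF + MTTR
+ MLDT). A small sketch, with hypothetical hours:

  # Operational availability in a commonly used form:
  #   Ao = MTBF / (MTBF + MTTR + MLDT)
  # Both repair time and logistics delay count as downtime.
  # The hours below are hypothetical.

  def operational_availability(mtbf: float, mttr: float, mldt: float) -> float:
      return mtbf / (mtbf + mttr + mldt)

  ao = operational_availability(mtbf=500.0, mttr=4.0, mldt=20.0)
  print(f"Ao = {ao:.4f}")  # ~0.9542 -- logistics delay dominates downtime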

11
Measurement 101: Progress, Performance, and
Analysis Metrics
12
Zachman Enterprise Architecture Framework
Metrics
13
Integrated Process Flow For VA IT Projects
[Flow diagram spanning the Planning and Funding
phases; recoverable labels below]
  • Step 0: Define One Page Mission Statement
  • Step 1: Concept Definition -- Develop Abbreviated
    CIP
  • Step 2: Concept Development -- Develop 300B
    Exhibit
  • Step 3: System Definition -- Prototype and System
    Design
  • Step 4: System Life Cycle Development and Testing
  • Step 5: Production and Deployment
Zachman Enterprise Architecture Framework Cells
addressed as the steps proceed:
  • Address Row One of the Framework, plus T2 Master
    Schedule, N2 Business Logistics System, N3
    Distributed Systems Architecture
  • Revalidate Row One of the Framework, plus address
    Row 2 of the Framework, plus initiate Row 3 of
    the Framework
  • Complete Row 3 of the Framework, plus complete
    Row 4 of the Framework
  • Complete Row 5 of the Framework, plus complete
    Row 6 of the Framework
14
Project Management Oversight: Milestone 0
Review Criteria
  • Acquisition strategy?
  • Schedule?
  • ROM life cycle cost?
  • Critical Success Factors?
  • Major Risks?
  • What is required for your project?
  • Reports, studies & analyses
  • Meetings & reviews
  • Etc.
  • What problem are you solving?
  • Mapping to VAPG?
  • Project's architecture?
  • Milestone 0 Zachman cells?
  • Technical approach?
  • Organization?
  • Project manager assigned?
  • Workload distribution?
  • Government?
  • Industry?

Basic questions to warrant approving project
initiation
15
Project Management Oversight: Milestone I
Review Criteria
  • Project description?
  • Concept of operations?
  • Project requirements?
  • Project's architecture?
  • Milestone I Zachman cells?
  • Updated technical approach?
  • Organization?
  • Personnel assigned?
  • Acquisition plan?
  • Detailed schedule defined?
  • Detailed life cycle cost?
  • 300B Budget exhibit?
  • Return on Investment (ROI)?
  • Analysis of Alternatives?
  • Cost Benefit Analysis (CBA)?
  • Risk management plan?
  • Fielding strategy?
  • Operational support strategy?
  • Training, documentation, etc.

Basic questions to warrant approving
pilot/prototype
16
Project Management Oversight: Milestone II
Review Criteria
  • Project description?
  • Final concept of operations?
  • Project requirements frozen?
  • Project's architecture?
  • Milestone II Zachman cells?
  • Final technical approach?
  • Updated acquisition plan?
  • Updated schedule defined?
  • Prototype results?
  • User view of prototype?
  • Project performance metrics?
  • Cost & schedule performance?
  • Finalized goals & thresholds for critical success
    factors / KPPs?
  • Updated ROI & CBA?
  • Updated risk mgmt plan?
  • Updated fielding & operational support strategy?
  • Integrated logistics support plan?
  • Cyber security plan approved?

Basic questions to warrant approving full
acquisition
17
Project Management Oversight: Milestone III
Review Criteria
  • Project description
  • Concept of operations
  • Functional & technical requirements
  • Project's architecture?
  • Milestone III Zachman cells?
  • Production results?
  • Development testing?
  • Security certification?
  • Operational testing?
  • Effective?
  • Suitable?
  • Project performance metrics achieved?
  • Cost & schedule performance?
  • Goals & thresholds for KPPs?
  • Fielding plan/schedule?
  • Operational support plan?
  • Training ready to deploy?
  • Documentation ready to deploy?
  • Maintenance ready to deploy?
  • Procurement and life cycle support fully funded?

Basic questions to warrant approving project
deployment
18
Project Management Oversight: Milestone IV
Review Criteria
  • Project description
  • Concept of operations
  • Functional & technical requirements
  • Project's architecture?
  • Current & To-Be EA compliant?
  • Fielding plan/schedule execution?
  • Project performance metrics achieved?
  • Cost, schedule, performance, ROI & security?
  • Goals & thresholds?
  • Operational support
  • Training adequacy?
  • Documentation adequacy?
  • Maintenance effectiveness?
  • Project still fully funded?
  • Are changes required?
  • New requirements?
  • Correction of defects?
  • Enhancements?
  • What reports, studies or analyses are required?
  • Update to AOA, ROI & CBA?

Basic questions to assess in-service project
effectiveness
19
Metrics For Project Management and Oversight
  • Milestone 0: Project Initiation
  • Early, top-level statement of progress &
    performance metrics to be employed
  • Progress: How do you plan to track & report
    progress?
  • Performance: What defines success?
  • Milestone I: Prototype Implementation
  • Refine MS 0 discussion of progress & performance
    metrics
  • Progress: EVM / cost & schedule reporting
  • Performance: Initial estimates of goals &
    thresholds for Ao and other Key Performance
    Parameters (KPPs) as appropriate
  • ROI quantification -- to be used for later
    in-service validation (a sketch follows this list)
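A sketch of ROI quantification under the common
definition ROI = (benefits - costs) / costs; all
figures and milestone labels are hypothetical:

  # ROI quantification sketch, using the common definition
  #   ROI = (benefits - costs) / costs
  # All dollar figures are hypothetical.

  def roi(benefits: float, costs: float) -> float:
      return (benefits - costs) / costs

  projected = roi(benefits=2_400_000, costs=1_500_000)  # MS I projection
  actual = roi(benefits=1_900_000, costs=1_600_000)     # in-service measurement
  print(f"Projected ROI: {projected:.0%}, actual ROI: {actual:.0%}")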

Guidelines to be tailored as appropriate for each
project, subject to PDA agreement
20
Metrics For Project Management and Oversight
  • Milestone II: Full Acquisition
  • Progress
  • EVM / cost & schedule reporting
  • Performance
  • Pilot / Prototype Results
  • Final goals and thresholds for Key Performance
    Parameters (KPPs)
  • Plans for verification of technical KPPs by
    formal developmental T&E, and for suitability and
    effectiveness KPPs by formal independent user T&E
    prior to MS III
  • Demonstrated successful performance at or above
    threshold required for MS III deployment decision
  • Plans for performance measurement & reporting on
    a recurring basis in-service
  • Update ROI quantification -- to be used for
    later in-service validation

Guidelines to be tailored as appropriate for each
project, subject to PDA agreement
21
Metrics For Project Management and Oversight
  • Milestone III: Deployment
  • Progress
  • Planned versus actual deployment measurement
    approach
  • EVM / cost & schedule reporting
  • Performance
  • T&E results versus final goals and thresholds for
    KPPs
  • Verified by formal developmental T&E and
    independent end-user T&E
  • Remediation / risk mitigation plans for any
    thresholds missed (assuming the PM still asserts
    it is prudent to deploy)
  • Plans for performance parameters to be measured &
    reported on a recurring basis in-service
  • ROI
  • Plans for in-service measurements to verify
    projected ROI

Guidelines to be tailored as appropriate for each
project, subject to PDA agreement
22
Metrics For Project Management and Oversight
  • Milestone IV: Post Implementation / In-Service
  • Progress
  • Planned versus actual deployment achieved
  • Overall cost & schedule / EVM performance
    achieved during the project
  • Performance
  • Performance parameter results measured on a
    recurring basis in-service
  • T&E results versus final goals and thresholds for
    selected performance parameters
  • Results of remediation / risk mitigation plans
    for any thresholds missed (when a deployment
    decision was made with outstanding performance
    issues)
  • ROI
  • In-service measurements of actual ROI versus
    projected ROI

Guidelines to be tailored as appropriate for each
project, subject to PDA agreement
23
OK, We'll Measure!! Problem Solved?
  • Many assume IT / software metrics is a solved
    problem!
  • Data suggests that 80% of measurement programs in
    IT / software development fail [1]

24
Why Measurement Efforts Fail
  • Primary reasons for failure [1,2]
  • Not tied to business goals
  • Irrelevant or not understood by key players
  • Perceived to be unfair / resisted
  • Motivated wrong behavior
  • Expensive / cumbersome
  • No action based on the numbers
  • No sustained management sponsorship / involvement
1. Goethert, W. & Hayes, W., "Experiences in
Implementing Measurement Programs," CMU/SEI Tech
Note 2001-TN-026, November 2001,
http://www.sei.cmu.edu/pub/documents/01.reports/pdf/01tn026.pdf
2. Howard Rubin Associates Inc., "The Making
Measurements Happen Workshop," Proceedings of the
3rd International Conference on Applications of
Software Measurement, La Jolla, CA, November
15-19, 1992

25
VA Departmental Performance Plan
  • Performance Plan for FY 2002 contains top level
    performance measures linked to Department goals
    and business objectives
  • Many performance measures directly supported by
    IT
  • Many with non-trivial performance deltas between
    current state and goal
  • Objective should be to tie IT performance
    parameters to higher level business objectives
    and metrics
  • Make project KPPs directly and obviously
    meaningful to the business operations staff

26
One Process For Defining A Measurement Program
27
Metrics Breakout Session
  • Objectives
  • Performance metric definition exercise
  • Definition of candidate performance metrics for
    future project use
  • Linkage to project Milestone Decision process
  • Linkage of IT project metrics to departmental
    business goals and performance measures

28
Summary
  • We need to implement appropriately defined and
    sized progress and performance metrics in our IT
    projects
  • During development, deployment, and in-service
    employment
  • Performance metrics need to be directly tied to
    Departmental business objectives and performance
    metrics wherever possible
  • Use Department Performance Plan for FY 2002 as a
    starting point

29
Integrating Metrics into IT Projects During
Development and Operations
  • Dr. Frank A. Perry & Robert S. McKeeman
  • February 25, 2002