Source Selection Improvements

Transcript and Presenter's Notes
1
Source Selection Improvements
  • Edward C. Martin
  • ASC Acquisition Center of Excellence (ACE)
  • 10 April 2007

2
Source Selection Improvement Team Charter
  • Examine the DAPA report's major findings; recommend
    performance improvements and implementation
    criteria and their implications for the Air Force
    source selection process.
  • Develop recommendations, proposed strategy,
    process changes, procedures, techniques and tools
    aimed primarily at the systems development and
    demonstration contracts for Acquisition Category
    (ACAT) programs.
  • Review the FAR, DFARS and AFFARS to ascertain which
    policy, process or procedural aspects might
    conflict with team recommendations.
  • Canvass all Air Force acquisition organizations
    to identify any proposed source selection
    additions, changes, deletions, improvements or
    suggestions for modification of the FAR, DFARS
    and AFFARS beyond those envisioned in the DAPA
    report. Consider all and identify potential
    recommendations.
  • Determine what training, education and/or
    orientation may be necessary to institutionalize
    the proposed strategy and recommended source
    selection procedural changes.

3
DAPA Report, January 2006, Page 45
"A 'Conspiracy of Hope' that introduces
instability at the very beginning of acquisition
programs. The Conspiracy of Hope occurs when
industry is encouraged to propose unrealistic
cost estimates, optimistic performance, and
understated technical risk during the acquisition
solicitation process, and the Department is
encouraged to accept these proposals as the
foundation for program baselines. The
Conspiracy of Hope is reinforced by the
cost-plus environment in our current acquisition
strategies, which encourages industry to be overly
optimistic in their bids by imposing little or no
financial risk on those who submit such bids."
4
DAPA Report - Source Selection Recommendations
  • Create an environment of open communication to
    ensure that industry understands government
    requirements and government understands industry
    capabilities and limitations.
  • Include industry in development of program
    acquisition strategies for each phase of the
    process, and the acquisition and source selection
    plans for each competitive source selection.
  • Ensure traceability of requirements from the
    program to the acquisition strategy, to the
    acquisition plan, to the instructions to
    offerors, to the evaluation factors for award, to
    the contract incentive provisions and program
    controls, and to the selection of performance
    evaluation metrics.
  • Standardize the content of the Concept
    Development and Demonstration phase competitive
    prototype contracts to include conducting an
    initial baseline review and preliminary design
    review for the contractor's proposed System
    Design and Development program.
  • Eliminate the requirement to share with all
    competitors, prior to issuance of the final
    request for proposal, the questions or
    information submitted and answers provided to a
    single competitor.
  • Incorporate existing scheduled contractor
    technical or program reviews as proposal
    elements, to the maximum extent possible.

5
DAPA Report - Source Selection Recommendations
  • Require oral presentations of proposals during
    source selection and encourage open exchanges
    between evaluators and industry, not limited to
    clarification only, during these presentations.
  • Use an affordability assessment based upon an
    industry and government-agreed high confidence
    cost as the principal factor in competitive range
    determination during source selection. Once a
    competitor's government-and-industry agreed-to
    development cost is determined to be affordable,
    and thus the competitor is determined to be
    within the competitive range, no other
    consideration will be given to the development
    cost, other than cost realism, during subsequent
    competitive source selection evaluations for
    Concept Development and Demonstration and System
    Design and Development contracts.
  • Stress the critical nature of risk mitigation and
    completeness of data supporting offerors' claims
    as a heavily weighted evaluation factor for
    award.
  • Establish performance and schedule confidence, as
    well as management confidence (including
    subcontractor selection and management), as
    primary evaluation factors for award of Concept
    Development and Demonstration and System Design
    and Development contracts.
  • Set target cost for cost-type concept development
    and system design and development contracts at
    the Cost Analysis Improvement Group estimate,
    identifying the difference between proposed and
    target cost as management reserve, aggressively
    incentivizing cost performance, and penalizing
    cost growth over target.

6
DAPA Report Guiding Principles
  • Adopt a risk-based source selection process.
  • For development contracts, proposal cost should
    be replaced by industry and government agreement
    on a high confidence cost estimate.
  • Base SSA competitive range determinations upon
    high confidence costs which are considered
    affordable.
  • Create acquisition strategies that reflect
    restructuring source selection competitions for
    ACAT I and II programs to significantly shorten
    their length and base their results on system
    risk and management performance instead of cost.

7
Some Recommended Changes Implementing DAPA
Suggestions
8
Risk-Based Source Selection
  • SSIT Opinion - the Air Force already does
    risk-based source selection ... but we can
    improve our approach.
  • Establish Cost/Price Risk as a mandatory,
    separate and more important evaluation factor for
    ACAT System Development and Demonstration (SDD)
    only.
  • Make Proposal Risk more important in the
    evaluation scheme by establishing it as a
    co-equal rating to the current color rating for
    the Mission Capability factor or sub-factors.
  • Establish a new proposal risk rating category of
    "Unacceptable." Assigning this proposal risk
    rating would render a proposal not awardable.

9
Risk-Based Source Selection - Cost/Price Risk
Factor
  • Establish the current cost/price risk rating
    (part of the cost/price factor) as a separate,
    stand-alone evaluation factor.
  • This evaluation factor shall be used only for
    Acquisition Category (ACAT) System Development
    and Demonstration (SDD) phase programs that use
    (1) either a Cost Reimbursement or Fixed-Price
    Incentive type contract structure and (2) a
    separate evaluation factor that considers
    government Most Probable Cost (MPC). The
    Cost/Price Risk rating assesses the degree to
    which an offeror's cost proposal for the intended
    contract compares with the government's computed
    MPC. Cost/Price Risk shall be a significant
    evaluation factor.
  • The purpose of this risk rating is to select
    offerors who propose rational and reasonable
    offers for the work to be accomplished, to
    communicate to those offerors the Air Force's
    deep concern for the risk to our programs
    associated with overly optimistic or unrealistic
    cost or price proposals, and to clearly convey
    to them that submitting unrealistic costs or
    prices may put an offeror at risk of
    non-selection.

10
Risk-Based Source Selection - Cost/Price Risk
Factor
  • Better define what the cost/price risk rating is
    supposed to measure.
  • The cost/price risk evaluation assesses the
    degree to which an offeror's cost proposal for
    the contract line items to be included in the
    intended contract, and associated options if
    evaluated, compares with the government most
    probable cost (MPC) for the same items.

11
Risk-Based Source Selection - Cost/Price Risk
Factor
  • Establish a new table with separate and distinct
    cost/price risk rating definitions that emphasize
    risk quantification in terms of the degree to
    which an offeror's proposed prices compare to our
    estimated probable cost. (An illustrative sketch
    of this mapping follows the definitions below.)
  • High - A significant difference exists between
    the offeror's proposed cost/price and the
    government most probable cost, which may
    substantially impact the program's probability of
    success. Cost growth is likely to occur, as
    indicated by this difference and/or other
    anomalies related to cost/price.
  • Moderate - Some difference exists between the
    offeror's proposed cost/price and the government
    most probable cost, which may somewhat impact the
    program's probability of success. Cost growth
    may occur, as indicated by this difference and/or
    other anomalies related to cost/price.
  • Low - Little difference exists between the
    offeror's proposed cost/price and the government
    most probable cost, causing minor or even
    negligible impact to the program's probability of
    success. Cost growth is unlikely to occur, as
    indicated by this difference and/or other
    anomalies related to cost/price, and any impact
    is manageable.
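The scale above can be read as a threshold mapping on the gap between the proposed cost/price and the government MPC. A minimal Python sketch of that reading follows; the slides define High/Moderate/Low only qualitatively, so the percentage cutoffs and the function name below are hypothetical assumptions, not policy.

```python
# Illustrative sketch only: the slides give no numeric thresholds, so the
# percentage cutoffs below are hypothetical assumptions, not policy.

def cost_price_risk_rating(proposed: float, government_mpc: float) -> str:
    """Map the gap between an offeror's proposed cost/price and the
    government Most Probable Cost (MPC) to a qualitative risk rating."""
    if government_mpc <= 0:
        raise ValueError("MPC must be positive")
    gap = abs(proposed - government_mpc) / government_mpc  # relative difference
    if gap >= 0.20:   # hypothetical cutoff for a "significant difference"
        return "High"
    if gap >= 0.10:   # hypothetical cutoff for "some difference"
        return "Moderate"
    return "Low"      # "little difference"

# A bid 15% under the government MPC would rate Moderate under these cutoffs.
print(cost_price_risk_rating(proposed=85.0, government_mpc=100.0))
```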

12
Risk-Based Source Selection - Make Proposal Risk
More Important
  • The Mission Capability factor or subfactors (if
    subfactors are established, the mission
    capability factor shall be rated at the subfactor
    level and an overall factor-level rating is not
    assigned), when established, shall receive two
    separate and distinct ratings: a requirement
    assessment that reflects the degree to which the
    Mission Capability Factor or Subfactor exceeds,
    meets or does not meet the minimum performance or
    capability requirement (Color Rating), and a
    mission capability proposal risk rating that
    assesses the degree to which an offeror's
    proposed approach to achieving the Mission
    Capability Factor or Subfactor may involve risk
    of disruption of schedule, increased cost or
    degradation of performance. These two ratings
    are presented together and are of equal impact
    for the rating of each Mission Capability Factor
    or Subfactor. (See the sketch after this list.)
  • The mission capability requirement rating focuses
    on the strengths, deficiencies, uncertainties,
    and any notable aspects in the offeror's
    proposal.
  • The mission capability proposal risk rating
    focuses on the weaknesses associated with an
    offeror's proposed approach. Assessment of
    mission capability proposal risk considers
    potential for disruption of schedule, increased
    cost, or degradation of performance, the need for
    increased government oversight, and the
    likelihood of unsuccessful contract performance.
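A minimal sketch of the co-equal dual rating, assuming a simple pair structure per subfactor. The awardability rule reflects slide 8; the full Blue/Green/Yellow/Red color scale is an assumption (these slides name only Green and Yellow explicitly), and the class and field names are illustrative.

```python
# Sketch of the co-equal dual rating described above: each Mission
# Capability subfactor carries both a color rating and a proposal risk
# rating of equal impact. Names and the example values are illustrative.
from dataclasses import dataclass

@dataclass
class SubfactorRating:
    color: str          # requirement assessment, e.g. "Green" (Satisfactory)
    proposal_risk: str  # "Low", "Moderate", "High", or "Unacceptable"

    def awardable(self) -> bool:
        # Per slide 8: an "Unacceptable" proposal risk rating renders the
        # proposal not awardable.
        return self.proposal_risk != "Unacceptable"

# The two ratings are presented together, one pair per subfactor.
mission_capability = {
    "Subfactor 1": SubfactorRating(color="Green", proposal_risk="Low"),
    "Subfactor 2": SubfactorRating(color="Yellow", proposal_risk="Moderate"),
}
print(all(r.awardable() for r in mission_capability.values()))  # True
```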

13
Risk-Based Source Selection - Proposal Risk
Ratings
  • Unacceptable - The existence of a significant
    weakness (or combination of significant
    weaknesses) that is likely to cause extreme
    disruption of schedule, drastically increased
    cost or severely degraded performance. Proposals
    with an unacceptable rating are not awardable.
    (New)
  • High - Likely to cause significant disruption of
    schedule, increased cost or degradation of
    performance. Extraordinary contractor emphasis
    and rigorous government monitoring may be able to
    overcome difficulties.
  • Moderate - Can potentially cause disruption of
    schedule, increased cost, or degradation of
    performance. Special contractor emphasis and
    close government monitoring will likely be able
    to overcome difficulties.
  • Low - Has little potential to cause disruption
    of schedule, increased cost or degradation of
    performance. Normal contractor effort and normal
    government monitoring will likely be able to
    overcome any difficulties.
  • Used when a rating is at the upper boundary of
    a category but not strong enough to merit the
    next higher rating. (New feature)

14
Source Selection Evaluation
[Diagram: evaluation structure. The Mission Capability factor contains
Subfactors 1-3, each receiving a Color Rating paired with a Proposal Risk
rating. The Performance Confidence Rating is evaluated at the mission
capability subfactor and cost/price factor and assessed at the factor level.
Cost/Price: this factor may require a risk assessment as described in
Paragraph 5.5.4.]
15
High Confidence Cost Estimates
  • Utilize most probable cost in ACAT SDD source
    selections as the cost/price evaluation factor.
  • Ensuring cost realism in source selection is key
    to changing the "Conspiracy of Hope."
  • Early in the solicitation development phase and
    during the source selection process, conduct
    open discussions between government and
    offeror cost estimators concerning the methods of
    estimating costs for the program.
  • Before RFP release, this discourse must include
    the exchange of ideas and information concerning
    estimating methods.
  • Ensure meaningful discussions occur regarding
    cost estimate issues and differences during
    source selection.
  • In ACAT source selections, ensure that the
    government most probable cost estimate is
    verified by a certified cost estimator.

16
High Confidence Cost Estimates - Proposed
Regulatory Changes
  • Discussions with offerors on Most Probable Cost
    Analysis (Mandatory for ACAT SDD Awards). Ensure
    that, as early as practical in the solicitation
    development for ACAT programs entering System
    Development and Demonstration that will utilize
    either a Cost Reimbursement or Fixed-Price
    Incentive type contract structure, detailed
    discussions occur between the government and
    potential offerors regarding cost estimates and
    the methods of estimating costs for the program.
  • Government Most Probable Cost (MPC). The MPC
    estimate is the government's estimate of the
    costs to acquire specified goods and/or services.
    This estimate includes not only those costs that
    will be included as part of the contract, but may
    include any other costs that will be incurred by
    the government in the performance of the
    acquisition program. The MPC must also include
    an analysis of the uncertainties inherent in any
    acquisition, from those related to the cost
    estimating methods chosen to those associated
    with the technical and programmatic assumptions
    of the program. This is because most probable
    cost is defined as the most frequently occurring
    value (or mode) resulting from the cost
    uncertainty analysis. All offerors shall be
    required by the solicitation instructions to
    conduct their own uncertainty analysis and
    include its results in their proposal, along with
    the range (or distribution) of possible costs
    developed using statistical techniques. (A
    simulation sketch follows this list.)
  • When the evaluation will include a cost/price
    risk evaluation, the Source Selection Evaluation
    Team, through the contracting officer, shall
    indicate to, or discuss with, each offeror in the
    competitive range the differences between the
    offeror's proposed costs or prices and the
    government most probable cost for the same items,
    with the goal of understanding these differences
    to the maximum extent possible. Conduct these
    discussions promptly so that each offeror has
    ample time to understand the government's
    perceptions of cost differences and may adjust
    its cost or price proposal accordingly before or
    with submission of the Final Proposal Revision.
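Since the slide defines MPC as the mode of a cost uncertainty analysis, a small Monte Carlo simulation makes the idea concrete. This is a minimal sketch under stated assumptions: the work-breakdown elements, triangular distributions, and dollar figures are hypothetical illustrations, not an official estimating method.

```python
# Minimal cost uncertainty sketch: draw total program cost many times from
# per-element distributions, then report the mode (the MPC, per the slide)
# and the range of possible costs. All numbers here are hypothetical.
import random

random.seed(1)

# Hypothetical work-breakdown elements: (low, most likely, high) cost in $M.
elements = {
    "air_vehicle": (400.0, 520.0, 780.0),
    "software":    (150.0, 210.0, 400.0),
    "test_eval":   (60.0,  80.0,  140.0),
}

def simulate_total_cost() -> float:
    """Draw one total program cost, one triangular draw per element."""
    return sum(random.triangular(lo, hi, ml) for lo, ml, hi in elements.values())

trials = sorted(simulate_total_cost() for _ in range(100_000))

# Estimate the mode by finding the densest $10M-wide bin.
bin_width = 10.0
counts = {}
for t in trials:
    b = int(t // bin_width)
    counts[b] = counts.get(b, 0) + 1
mpc = (max(counts, key=counts.get) + 0.5) * bin_width

print(f"Most Probable Cost (mode): about ${mpc:.0f}M")
print(f"Range of possible costs: ${trials[0]:.0f}M to ${trials[-1]:.0f}M")
```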

17
Requirements Traceability
  • DAPA identified a need to ensure traceability of
    requirements throughout all solicitation
    documents.
  • Require that all competitive RFPs for
    acquisitions over $10M utilize the traceability
    matrix process to ensure traceability.
  • Make the traceability matrix a mandatory chart
    for the ASP template. Program offices must be
    prepared to explain during the ASP, using the
    matrix, how the results of their risk assessment
    were incorporated into their strategy, source
    selection criteria, and ultimately their
    incentive approach.
  • Utilize the traceability matrix process as a key
    planning element and building block throughout
    all stages of acquisition strategy and
    solicitation development.
  • Publish our traceability matrix in the RFP and
    require Industry to add to it to show
    traceability of their proposal submission to our
    solicitation.

18
Sample Traceability Matrix
Legend: C = Critical; S = Serious; Mo = Moderate; Mi = Minor; N = Negligible;
Con = Consequence (importance); Prob = Probability; Rat = Rating
(High, Medium, or Low)
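The matrix itself is not reproduced in the transcript, but its shape can be inferred from the legend above and the DAPA traceability chain (requirement, strategy, evaluation factor, incentive). The sketch below is a hypothetical row structure; the field names and example values are illustrative assumptions, not the official template.

```python
# Hypothetical sketch of one traceability-matrix row, assuming fields implied
# by the legend (Consequence, Probability, Rating) and by the traceability
# chain: requirement -> strategy -> evaluation factor -> incentive.
from dataclasses import dataclass

@dataclass
class TraceabilityRow:
    requirement: str           # program requirement being traced
    acquisition_strategy: str  # how the strategy addresses it
    evaluation_factor: str     # source selection criterion it maps to
    incentive: str             # contract incentive provision it drives
    consequence: str           # C/S/Mo/Mi/N (importance if not met)
    probability: str           # likelihood the risk is realized
    rating: str                # overall risk: High, Medium, or Low

# Example row with invented values, for illustration only.
row = TraceabilityRow(
    requirement="Radar detection range meets threshold",
    acquisition_strategy="Competitive prototyping in SDD",
    evaluation_factor="Mission Capability Subfactor 2",
    incentive="Award fee tied to demonstrated performance",
    consequence="S",       # Serious
    probability="Medium",
    rating="Medium",
)
print(row)
```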
19
Other Recommended Changes
20
Standard Process
  • Adopt a standard Air Force Source Selection
    Process
  • Require its use in all source selections
  • Develop new top-level electronic source selection
    guide
  • Adopted process forms the basis for all other
    current and to-be source selection sub-process
    guides

21
Source Selection Improvements - Field Inputs
  • SAF/AQC email to all AF acquisition organizations
  • Identify improvements beyond those in DAPA
  • Received 158 suggested changes
  • Team recommended 91 for potential implementation
  • Categories
  • - Evaluation Factors
  • - Source Selection Advisory Council
  • - Source Selection Evaluation Team
  • - Source Selection Plan
  • - Definitions
  • - Color Ratings
  • - SSET Staffing
  • - Past Performance
  • - Pre-Source Selection Process

  • - Cost
  • - Cost Price Risk
  • - AFPEO/CM Issue
  • - Proposal Risk
  • - Documentation
  • - LPTA/PPT
  • - Discussion Process
  • - Training
  • - DAPA Associated
  • - Miscellaneous
22
Highlights
  • Streamline Source Selection Plan requirements
  • Emphasize SSET staffing
  • Develop Evaluation Criteria Guide
  • Define "uncertainty"
  • Add and define "notable aspect"
  • Reduce Past Performance ratings from six to four
  • Add guidance on relevancy ratings
  • Provide flexibility on ranking of Past
    Performance factor
  • Additional guidance on conducting discussions
  • FAR deviation permitting multi-step evaluation
    process for LPTA and PPT

23
Selected Examples
  • "Notable Aspect" is a positive attribute of a
    proposal that is note worthy or may present some
    value to the government but does not exceed a
    specified performance or capability requirement,
    and thus cannot merit a strength."
  • An example of a notable aspect may be when one of
    the evaluation factors requires the submission of
    a comprehensive plan and the offerors submission
    is so comprehensive as to offer other uses or
    reduces the expected burden to the government.
    Another example might be when an offerors
    approach not only meets the requirement, but
    their inherent process involves an automated
    approach that offers a significant advantage to
    the Government.
  • A notable aspect only contributes to a Green
    (Satisfactory) rating.

24
Selected Examples
  • "Uncertainty" is a doubt regarding whether
    aspect of the proposal meets or potentially
    exceeds a material performance or capability
    requirement. It requires additional information
    from the offeror to further explain the proposal
    before the evaluator can complete his/her review
    and analysis and should generate the issuance of
    an EN. Proposals including an uncertainty about
    meeting a material performance or capability
    requirement will normally be rated yellow.
    Proposals including an uncertainty about
    potentially exceeding a material performance or
    capability requirement will normally be rated
    green.
  • "Deficiency" is a material failure of a proposal
    to meet a government requirement or a combination
    of significant weaknesses in a proposal that
    increases the risk of unsuccessful contract
    performance to an unacceptable level.
  • This is the current FAR definition.

25
Selected Examples - Past Performance
  • Permit SSAs greater flexibility in determining
    the importance of the past performance factor
  • Current - Past performance may be established as
    the most important evaluation factor and shall be
    at least as important as the most important
    non-cost factor.
  • Future - In all cases, when utilized, the past
    performance factor must be a significant
    evaluation criterion.

26
Selected Examples - Past Performance
  • Proposed Performance Confidence Ratings.
  • SUBSTANTIAL CONFIDENCE - Based on the offeror's
    performance record, the government has a high
    expectation that the offeror will successfully
    perform the required effort.
  • SATISFACTORY CONFIDENCE - Based on the offeror's
    performance record, the government has an
    adequate expectation that the offeror will
    successfully perform the required effort.
  • UNKNOWN CONFIDENCE - No performance record is
    identifiable, or the offeror's performance record
    is so limited that no confidence assessment
    rating can reasonably be assigned.
  • Offerors without a record of relevant past
    performance, for whom information on past
    performance is not available, or whose
    performance record is so limited that no
    confidence assessment rating can reasonably be
    assigned, will not be evaluated favorably or
    unfavorably.
  • LITTLE OR NO CONFIDENCE - Based on the offeror's
    performance record, substantial doubt exists that
    the offeror will successfully perform the
    required effort.
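A small sketch encoding the proposed four-level scale; the enum values paraphrase the definitions above, and the neutrality check is an illustrative reading of the rule that an unknown record is rated neither favorably nor unfavorably, not official evaluation logic.

```python
# Sketch of the proposed four-level performance confidence scale. The
# neutrality rule below paraphrases the slide: an UNKNOWN record is not
# evaluated favorably or unfavorably. Names are illustrative only.
from enum import Enum

class Confidence(Enum):
    SUBSTANTIAL = "high expectation of successful performance"
    SATISFACTORY = "adequate expectation of successful performance"
    UNKNOWN = "no or too limited a record to assign a rating"
    LITTLE_OR_NO = "substantial doubt of successful performance"

def weighs_against_offeror(rating: Confidence) -> bool:
    """Only LITTLE_OR_NO counts against an offeror; UNKNOWN is neutral."""
    return rating is Confidence.LITTLE_OR_NO

print(weighs_against_offeror(Confidence.UNKNOWN))  # False: neutral by rule
```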

27
Establish Future Teams
  • Alternative methods to evaluate Past Performance
  • Streamlined approaches for source selections
    anticipating a large number of proposals or many
    major subcontractors, or for less complex
    selections where resources are an issue
  • Selection of a standard AF Electronic Source
    Selection Tool
  • One tool with a disciplined source selection
    process built in
  • AF funded
  • Consolidate electronic source selection guides
  • Develop top-level source selection process guide
  • Using Source Selection Process
  • Links to guides/chapters on specific steps (e.g.,
    SSP, Sec LM, PAR, SSDD)

28
Source Selection Training Team (SSTT)
  • SAF ACE and SAF/AQC collaborating to develop
    standardized source selection training
  • Mandatory for all source selections > $100M.
  • Anticipate electronic access to all training
    material.
  • Training Delivery
  • At Centers with a resident ACE, training provided
    by ACE cadre.
  • For operational commands or locations without
    resident ACE, training provided by SAF ACE or a
    qualified regional ACE.
  • SSTT will incorporate policy and process changes
    into training as they occur.

29
What's Next?
  • SAF/AQC review of recommendations
  • Preparing to vet concepts and language with
    MAJCOM/DRU acquisition community
  • Initiate AFFARS and Mandatory Procedures changes
    for accepted recommendations
  • Determine implementation plan
  • Update required source selection guides
  • Create new source selection guides
  • Develop traceability matrix process and tools
  • Develop and implement training (SSTT)

30
Questions?
31
BACK-UP SLIDES
32
SSIT Team Composition
  • Ed Martin ASC/AE Team Co-lead
  • Jackie Leitzel AAC/PK Team Co-lead
  • Dan Fulmer ASC/AE
  • Ann Marie Telepak HQ AFMC/PKP
  • Jay Jordan AFCAA
  • John Kreiger DAU
  • Pam Schwenke SAF ACE
  • Paul Commeau ESC/ACE
  • Rick Andreoli ESC ACE
  • Greg Snyder SAF/AQCP
  • Sandy Haberlin OUSD(ATL)(DPAP)
  • Dave Chaston SAF/AQRE
  • Robert Graham SMC/AXDP
  • Abby Horwitz SAF/GCQ
  • John Miller SAF/ACPO
  • Mike Foley SAF/ACPO
  • Kathy Boockholdt AFPEO/CM and SAF ACE
  • Steve Felosa WV-CRET

33 - 38
Source Selection Process Flow
[Slides 33 through 38: source selection process flow diagrams; the graphics
are not captured in this transcript.]