Report of Architecture and Product Working Group

Transcript and Presenter's Notes

1
Report of Architecture and Product Working Group
  • ICM Workshop
  • Washington, DC
  • July 17, 2008

2
Working Group Members
  • J. D. Baker, BAE Systems
  • A. Winsor Brown, USC-CSSE
  • Karl Brunson, Lockheed Martin
  • Paul Croll, CSC
  • Thomas Knott, OSD
  • Art Pyster, Stevens
  • Paul Russell, Aerospace
  • Robert Schwenk, Army ASA(ALT)
  • J. Bruce Walker, SAF/AQRE
  • Lee Zhou, Boeing

3
Working Group Charter
  • Identify and prioritize the most important issues
    associated with Architecture and Products
    (engineering artifacts) for the Incremental
    Commitment Model (ICM) and Competitive
    Prototyping (CP)
  • Suggest OSD initiatives and other actions to
    address those issues

4
Definition of Architecture
  • IEEE 1471: the fundamental organization of a
    system embodied in its components, their
    relationships to each other, and to the
    environment, and the principles guiding its
    design and evolution.
  • Don Firesmith (from the SEI): the set of all the
    most important, pervasive, higher-level strategic
    decisions, inventions, engineering trade-offs,
    and assumptions (DIETAs), and their associated
    rationales concerning how the system meets its
    allocated and derived product and process
    requirements.

The Firesmith definition is the more useful one for
CP and the ICM.
5
Focus
  • Because CP is conducted to reduce risk, and the
    ICM is a risk-driven life cycle model, we focused
    on how to use Architecture and Product to
    understand, manage, and reduce risk.
  • As defined by Firesmith, the architecture
    includes many DIETAs and their rationale, not
    just the risky ones.
  • For CP and anchor points in the ICM, we will
    focus on risky DIETAs, i.e., DIETAs with weak
    rationale that, if wrong, could have a
    significant negative impact on program cost,
    schedule, or performance.
  • Strong rationale is based on objective evidence;
    weak rationale is based on assertion and opinion.

6
System Architecting Paradigm
  • Three activities should happen concurrently and
    iteratively:
  • Systems and software engineers establish the most
    critical requirements/objectives, including
    those for the ilities
  • Systems and software architects develop a system
    and software architecture that the architects
    believe will simultaneously support all critical
    requirements/objectives
  • Engineers evaluate the architecture for how well
    it really supports the critical
    requirements/objectives, creating substantiating
    evidence for the architecture or identifying
    weaknesses in it
  • Today, it is common for any of these activities
    to be shortchanged, especially the third.

7
Types of Evidence
  • Analytic models
  • Scenario-based execution of prototypes
  • Scenario-based execution of simulations
  • Benchmarking
  • Appeal to historical analogy (we have done
    something similar several times before)
  • Architecture Quality Cases (analogous to safety
    cases) with claims, arguments, and evidence (a
    sketch of this structure follows this list)
  • Process execution results, such as test results
    from early software builds
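
The following is a minimal, hypothetical sketch (not part of the
working group report) of how an Architecture Quality Case might
record its claims, arguments, and evidence, applying the slide 5
distinction that strong rationale rests on objective evidence while
weak rationale rests on assertion and opinion. All class and field
names below are illustrative assumptions.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Evidence:
    kind: str        # e.g., "analytic model", "prototype run", "benchmark"
    summary: str     # what was measured or demonstrated
    objective: bool  # True if based on measurement, False if assertion/opinion

@dataclass
class QualityClaim:
    claim: str                   # e.g., a stated availability objective is met
    argument: str                # why the evidence supports the claim
    evidence: List[Evidence] = field(default_factory=list)

    def rationale_strength(self) -> str:
        # Strong rationale is backed by at least one piece of objective
        # evidence; otherwise it rests only on assertion or opinion.
        return "strong" if any(e.objective for e in self.evidence) else "weak"

# Illustrative usage (values are invented for the example)
case = QualityClaim(
    claim="The architecture supports the 99.9% availability objective",
    argument="Measured failover latency in the prototype stays within budget",
    evidence=[Evidence("prototype run", "scenario-based failover test", True)],
)
print(case.rationale_strength())  # -> "strong"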

8
CP/ICM Issues and Actions (unordered)
  • Architectures expressed using DoDAF typically do
    not include all of the DIETAs in sufficient
    detail to support rigorous evaluation.
  • Action: Develop architectural representation
    guidance requiring DIETAs to be developed in
    sufficient detail to support rigorous evaluation.
    For example, DoDAF architectures typically don't
    contain enough information to perform safety case
    analyses or to understand the security properties
    of the system.
  • The ilities are often understated in the
    requirements/objectives, yet are often a key
    source of problems later in system development.
    An architectural view for each of the relevant
    quality characteristics is required.
  • Action: Develop guidance requiring the ilities to
    be sufficiently documented and articulating what
    "sufficient" means.
  • Action: Research how to present sufficient
    information in the views to support adequate
    evaluation.

9
Examples of Quality Characteristics
  • Efficiency
  • Completeness
  • Correctness
  • Security
  • Compatibility
  • Interoperability
  • Maintainability
  • Expandability
  • Testability
  • Portability
  • Hardware Independence
  • Software Independence
  • Installability
  • Reusability
  • Reliability
  • Error Tolerance
  • Availability
  • Usability
  • Understandability
  • Ease of Learning
  • Operability
  • Communicativeness
  • Survivability
  • Flexibility

10
CP/ICM Issues and Actions (unordered)
  • Architectures often do not state the rationale
    (evidence) for their DIETAs in sufficient detail
    to understand which ones are particularly risky.
  • Action: Develop guidance requiring the rationale
    for DIETAs to be stated in sufficient detail and
    articulating what "sufficient" means.
  • There is no guidance for what evidence is
    adequate for any given situation or how that
    evidence should be presented (analogous to the
    problem of knowing when you have tested enough).
    How much prototyping is enough? How much
    evidence is enough?
  • Action: Conduct research on how much prototyping
    and evidence is enough, and then document the
    research results in guidance.
  • Action: Engage Chris Powell on his dissertation
    research, based upon his assessment of ACAT 1D
    program architectures since July 2004.

11
CP/ICM Issues and Actions (unordered)
  • Government program offices are probably not
    staffed with enough people who have the skills to
    request the correct evidence from the supplier
    and to evaluate that evidence when the supplier
    provides it. Government offices should not
    request evidence unless they are able to evaluate
    it.
  • Action: Consider forming an architecture
    assessment team (and other types of assessment
    teams) at the OSD level that would be a resource
    available to interested programs.
  • Since competing suppliers will have different
    architectures, the architectures will have
    different risk profiles and therefore require
    different evidence. Who decides what evidence
    will be provided? The government? The supplier?
    How will the government fairly evaluate competing
    prototypes when presented with different types of
    evidence?
  • Action: Investigate the legal and contractual
    implications of specific evidence requirements.

12
CP/ICM Issues and Actions (unordered)
  • A competition should involve regular submission
    of evidence, not just a single submission at the
    end of the competition. Can suppliers fix
    problems along the way and resubmit stronger
    evidence? It would seem to be in the government's
    best interest to allow this, but doing so could
    be construed as unfair by some competitors.
  • Action: Investigate the legal and contractual
    implications of requesting regular submission of
    evidence, and propose ways to enable it.
  • Creating evidence often depends on exercising
    scenarios, which are extremely difficult to
    generate in sufficient number and diversity to
    uncover weak DIETAs, especially for a system of
    systems (SoS).
  • Action: Research how to generate an adequate and
    diverse set of scenarios, especially for an SoS,
    or investigate alternative approaches to
    developing scenarios.

13
ICM Issues and Actions (unordered)
  • Providing evidence for an SoS at regular
    milestones is especially challenging because the
    evidence provided by the individual system
    elements may not be available when originally
    expected. Understanding impact analysis across
    elements when something changes is challenging.
  • Action: Research how to perform impact analysis
    across elements and how to respond to breakage in
    synchronization across them.
  • As development progresses from milestone to
    milestone, new evidence reconfirming key DIETAs
    is needed. There is no guidance on what that
    evidence should be or how often it should be
    collected.
  • Action: Research what evidence is required to
    reconfirm key DIETAs and then document the
    approaches in guidance.

14
ICM Issues and Actions (unordered)
  • Program offices are inherently biased when it
    comes to evaluating evidence that a supplier is
    making sufficient progress to pass a milestone.
    Having independent, non-advocate reviews of the
    evidence eliminates that problem, but such
    reviews can be expensive and difficult to staff.
  • Action: Investigate the cost and feasibility of
    independent non-advocate reviews versus the cost
    of the inadequate review that results from not
    using independent reviewers.

15
Value and Ease of Implementing Actions
16
Value and Ease of Implementing Actions