Overview of Current Progress with DO-178C

1
Overview of Current Progress with DO-178C
  • Ibrahim Habli

2
Outline
  • Overview of DO-178B
  • DO-178C Committee and Scope
  • DO-178C Supplements
  • Tool Qualification
  • Model-based Development and Verification
  • Formal Methods
  • Your Input

3
Don't Kill the Messenger
4
Overview of DO-178B
5
DO-178B Purpose
  • Guidelines for the production of software for
    airborne systems and equipment that performs its
    intended function with a level of confidence in
    safety that complies with airworthiness
    requirements
  • Objectives for software lifecycle processes.
  • Descriptions of activities and design
    considerations for achieving those objectives.
  • Descriptions of the evidence that indicates that
    the objectives have been satisfied.

6
Assurance vs. Objectives
  • Level A (66 objectives) adds, over Level B:
    • MC/DC coverage
    • More independence
    • Source-to-object code traceability
  • Level B (65 objectives) adds, over Level C:
    • Artefact compatibility
    • Verifiability
    • Independence
    • Decision coverage
    • Transition
  • Level C (57 objectives) adds, over Level D:
    • More planning
    • Verification of requirements, design, and integration processes
    • Testing of low-level requirements
    • Verification of test plans, procedures, and results
    • Low-level requirements coverage
    • Statement coverage
    • Data and control coupling coverage
  • Level D (28 objectives):
    • Planning
    • Configuration management
    • Quality assurance
    • High-level requirements coverage
    • High-level requirements robustness
    • Code-target compatibility
    • Certification liaison
    • Tool qualification
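The coverage ladder above (statement, then decision, then MC/DC) can be illustrated with a small sketch. MC/DC requires showing that each condition in a decision independently affects the decision's outcome. The example decision `a and (b or c)` and both function names below are illustrative assumptions, not taken from DO-178B:

```python
from itertools import product

def decision(a: bool, b: bool, c: bool) -> bool:
    """Hypothetical example decision with three conditions."""
    return a and (b or c)

def mcdc_pairs(condition_index: int):
    """Find input pairs demonstrating the core MC/DC criterion for one
    condition: the two inputs differ only in that condition, yet the
    decision outcome differs (the condition independently affects it).
    """
    pairs = []
    for inputs in product([False, True], repeat=3):
        flipped = list(inputs)
        flipped[condition_index] = not flipped[condition_index]
        flipped = tuple(flipped)
        if decision(*inputs) != decision(*flipped):
            pairs.append((inputs, flipped))
    return pairs
```

For a three-condition decision like this, decision coverage needs only two tests (one true and one false outcome), while MC/DC typically needs n + 1 = 4, one reason it is reserved for Level A.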

7
Example Objectives: Testing of Outputs of the
Integration Process
8
[Diagram: DO-178B development artifacts and the verification objectives applied between them]
  • System Requirements -> High-Level Requirements (development objectives A-2: 1, 2)
    • Reviews/analyses: A-3.1 Compliance, A-3.2 Accuracy & Consistency, A-3.3 HW Compatibility, A-3.4 Verifiability, A-3.5 Conformance, A-3.6 Traceability, A-3.7 Algorithm Accuracy
  • High-Level Requirements -> Software Architecture and Low-Level Requirements (A-2: 3, 4, 5)
    • Low-Level Requirements: A-4.1 Compliance, A-4.2 Accuracy & Consistency, A-4.3 HW Compatibility, A-4.4 Verifiability, A-4.5 Conformance, A-4.6 Traceability, A-4.7 Algorithm Accuracy
    • Software Architecture: A-4.8 Architecture Compatibility, A-4.9 Consistency, A-4.10 HW Compatibility, A-4.11 Verifiability, A-4.12 Conformance, A-4.13 Partition Integrity
  • Low-Level Requirements -> Source Code (A-2: 6)
    • A-5.1 Compliance, A-5.2 Compliance (architecture), A-5.3 Verifiability, A-5.4 Conformance, A-5.5 Traceability, A-5.6 Accuracy & Consistency
  • Source Code -> Executable Object Code (A-2: 7)
    • A-5.7 Complete & Correct, A-6.5 Compatible with Target
  • Testing of Executable Object Code: A-6.1 Compliance and A-6.2 Robustness (against high-level requirements), A-6.3 Compliance and A-6.4 Robustness (against low-level requirements), A-7.3-4 Functional Coverage (test), A-7.5-7 Structural Coverage (test)
Legend: compliance with requirements; conformance with standards.
9
Software Lifecycle Data
  • Planning
  • Plan for Software Aspects of Certification
  • Software Development Plan
  • Software Verification Plan
  • Software Configuration Management Plan
  • Software Quality Assurance Plan
  • Standards
  • Software Requirements Standards
  • Software Design Standards
  • Software Code Standards
  • Development
  • Software Requirements Data
  • Design Description
  • Source Code
  • Executable Object Code
  • Verification
  • Software Verification Cases and Procedures
  • Software Verification Results
  • Configuration Management
  • Software Life Cycle Environment Configuration
    Index
  • Software Configuration Index
  • Problem Reports
  • Software Configuration Management Records
  • Quality Assurance
  • Software Quality Assurance Records
  • Software Accomplishment Summary

The minimum software lifecycle data submitted to
the certification authority is the Plan for Software
Aspects of Certification, the Software Configuration
Index, and the Software Accomplishment Summary.
10
Problems with DO-178B
  • "The FAA is not keeping pace with rapid advances
    in software technology, thereby delaying the use
    of potentially cost-saving technology."
  • "DO-178B requires non-traditional software
    engineering practices, such as MC/DC and tool
    qualification, without providing sufficient
    background information." (NASA SSAC)
  • "There is a poor correlation between standards
    and observed hazardous failure rates simply
    because the standards do not address safety
    issues." (McDermid)
  • DALs are not well understood: how they are derived
    and how they are applied. (Redmill; McDermid & Pumfrey)

11
EUROCAE WG-71 / RTCA SC-205: DO-178B Update (DO-178C)
12
Operations Plan
  • This plan is to be used as a mechanism for
    operating the joint EUROCAE WG71 and RTCA SC205
    committees

13
RTCA / EUROCAE and SC-205 / WG-71
[Diagram: RTCA / EUROCAE above the joint SC-205 / WG-71 committee, its Executive Committee, and its Sub-groups.]
14
WG-71/SC-205 Structure
[Diagram: Joint Committee WG-71/SC-205, with an Executive Committee and Sub-Groups beneath it.]
The plenary meets twice per year: March in the US,
September in Europe.
15
Sub-Groups
  • SG1: SC/WG Document Integration
  • SG2: Issues & Rationale
  • SG3: Tool Qualification
  • SG4: Model-Based Design & Verification
  • SG5: Object-Oriented Technology
  • SG6: Formal Methods
  • SG7: Safety-Related Considerations

16
[Diagram: document family. The issue list feeds
DO-178C/ED-12C and DO-278A/ED-109A. The DO-178C/ED-12C
interface specification for supplements and rationale
documents links the core documents to Supplements A,
B, ... N. DO-248B/ED-94B (FAQ/DP/Rationale) is updated
to DO-248C/ED-94C.]
17
Major Milestones and Deliverables
18
High-Level Process Flow
  • Issues raised against the documents go to SG2
  • SG2 assigns each issue to an SGx
  • SGx works the issue
  • Sometimes no text change
  • Sometimes text changes
  • Results are closed with the author of the issue
  • SGx brings the text to plenary
  • Plenary approves the text by consensus

19
Plenary Text Approval Process
  • Text for DO-178C/ED-12C, etc.
  • Text must be submitted via an Information Paper
    (the process is defined in IP2)
  • The sub-group must have consensus on the text
  • Consensus
  • A general agreement or accord. The approval may
    involve compromise that permits approval on the
    basis of "I can live with it."

20
SC-205 / WG-71 Objectives
  • To promote safe implementation of aeronautical
    software.
  • To provide clear and consistent ties with the
    systems and safety processes.
  • To address emerging software trends and
    technologies.
  • To implement an approach that can change with the
    technology.

21
Some Conditions
  • Maintain the current objective-based approach for
    software assurance.
  • Maintain the technology independent nature of the
    DO-178B objectives.
  • Modifications to DO-178B/ED-12B should
  • Strive to minimize changes to the existing text
    (i.e., objectives, activities, software levels,
    and document structure)
  • Consider the economic impact relative to system
    certification without compromising system safety
  • Address clear errors or inconsistencies in
    DO-178B/ED-12B
  • Fill any clear gaps in DO-178B/ED-12B
  • Meet a documented need with a defined assurance
    benefit.

22
Terminology
  • "Guidance" text is text that is to be followed by
    the reader or user of the document.
  • "Guideline" text is text that is simply
    informative and may or may not be followed by the
    reader or user at their discretion. Informative
    guideline text can be helpful to the user.
    DO-178B/ED-12B text is typically guidance text
    (except where specifically noted), whereas
    DO-248B/ED-94B text is only informative
    guideline material.

23
Developing Guidance Text
  • Consider all aviation-safety-related material as
    appropriate
  • E.g. DO-178B/ED-12B, DO-278/ED-109, or
    DO-248/ED-94.
  • Papers published by the authorities, including
    joint-authority CAST (Certification Authorities
    Software Team) papers, and authority publications
    such as FAA Order 8110.49, Eurocontrol ESARR6,
    and SW1

24
Supplements
  • A supplement is guidance used in conjunction with
    DO-178C/ED-12C that addresses the unique nature
    of a specific technology or a specific method. A
    supplement adds, deletes or otherwise modifies
    objectives, activities, explanatory text, and
    software life cycle data in DO-178C/ED-12C.

25
SG 3 Tool Qualification
  • Tool qualification in DO-178B
  • Proposed changes in DO-178C

26
Types of Tools in DO-178B
  • Verification Tools
  • Cannot introduce errors, but may fail to detect
    them
  • Development Tools
  • The tool's output is part of the airborne software
    and can thus introduce errors
  • Creates or changes requirements, design, or code

27
Tool Qualification in DO-178B
  • If the answer is "Yes" to all of the questions
    below, the tool should be qualified
  • (1) Can the tool insert an error into the
    airborne software, or fail to detect an existing
    error in the software, within the scope of its
    intended usage?
  • (2) Will the tool's output not be verified or
    confirmed by other verification activities, as
    specified in Section 6 of RTCA/DO-178B?
  • (3) Are processes of RTCA/DO-178B eliminated,
    reduced, or automated by the use of the tool?
    That is, will the output from the tool be used to
    either meet an objective or replace an objective
    of RTCA/DO-178B, Annex A?

28
Example
  • Do we need to qualify a tool that checks
    differences between two files?
  • If the tool is used on read-only software files to
    satisfy CM objective A-8.3 for change review, then
    it should be qualified as a verification tool.
  • If the tool is used only to double-check a manual
    review, then there is no need for qualification.
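As a sketch of the kind of tool under discussion, a minimal file-difference checker might look like the following. The function name and the SHA-256 approach are illustrative assumptions, not part of DO-178B:

```python
import hashlib
from pathlib import Path

def files_differ(path_a: str, path_b: str) -> bool:
    """Report whether two files differ, by comparing SHA-256 digests.

    Illustrative only: per the example above, a tool like this needs
    qualification as a verification tool when its output is used to
    satisfy the A-8.3 change-review objective, but not when it merely
    double-checks a manual review.
    """
    digest_a = hashlib.sha256(Path(path_a).read_bytes()).hexdigest()
    digest_b = hashlib.sha256(Path(path_b).read_bytes()).hexdigest()
    return digest_a != digest_b
```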

29
Problems with Tool Qualification
  • Risk of disclosing tool vendors' proprietary
    information to potential competitors
  • Intellectual property rights may need to be
    waived by the vendor to achieve qualification
  • Development tools need to be qualified to the same
    level as the airborne software they produce
  • Airborne software runs on the target computer,
    while tools may run on a general-purpose
    workstation (e.g. running MS Windows)
  • Several DO-178B objectives do not apply to tools
    and cannot be satisfied
  • Only deterministic tools can be qualified!
  • Does this include the OS of the host computer?

30
Data Required for Tool Qualification
31
Current Progress with Tool Qualification in
DO-178C
32
Tool Qualification Team's Objectives
  • Develop a qualification approach that meets the
    needs of development tools
  • Keep the approach for current classes of
    verification tools as close to the same as possible
  • Develop an approach that allows for emerging tool
    technologies and uses
  • Provide an approach that enables reuse of tool
    qualification credit on multiple projects
  • Identify users' and developers' roles
  • Develop an approach which is clear but flexible
    for users
  • Develop an objective-based approach
  • Provide an approach that may be used by multiple
    domains

33
Proposed Tool Qualification Process: Uses Levels
and Objectives
  • Not yet addressed
  • Tool service history
  • COTS tools
  • Tool reuse
  • Changes to a previously qualified tool

[Flowchart: determining the tool qualification level]
  1. Does the tool function eliminate, reduce, or
     automate any DO-178C/ED-12C objectives?
     If No: no qualification required.
  2. If Yes: is the tool output verified?
     If Yes: no qualification required.
  3. If No: can the tool insert an error into
     Level A/B software? If Yes: Level TAB.
  4. If No: can the tool insert an error into
     Level C software? If Yes: Level TC.
  5. If No: can the tool insert an error into
     Level D software? If Yes: Level TD.
  6. If No: can the tool fail to detect an error in
     Level A-D software? If Yes: Level TV;
     if No: no qualification required.
The level determines the objectives to be met.
TV is a functional-level qualification.
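The flowchart's decision logic can be sketched in code. The function signature, parameter names, and level strings below are assumptions made for illustration, not taken from the proposed guidance:

```python
from typing import Optional

def tool_qualification_level(
    automates_objectives: bool,        # eliminates/reduces/automates DO-178C objectives?
    output_verified: bool,             # is the tool's output verified?
    software_level_affected: Optional[str],  # "A", "B", "C", "D", or None
    can_insert_error: bool,
    can_fail_to_detect_error: bool,
) -> Optional[str]:
    """Sketch of the proposed tool-qualification decision flow.

    Returns "TAB", "TC", "TD", "TV", or None when no qualification is
    required. Illustrative only.
    """
    if not automates_objectives:
        return None                    # tool takes no certification credit
    if output_verified:
        return None                    # output independently verified
    if can_insert_error:
        if software_level_affected in ("A", "B"):
            return "TAB"
        if software_level_affected == "C":
            return "TC"
        if software_level_affected == "D":
            return "TD"
    if can_fail_to_detect_error:
        return "TV"                    # functional-level qualification
    return None
```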
34
Packaging
  • SG3 considered several options for packaging the
    tool qualification process
  • Supplement
  • Rewrite of DO-178C/ED-12C Section 12
  • New Annex of DO-178C/ED-12C
  • New DO/ED document

35
New Packaging Approach
  • DO-178C/ED-12C, Section 12.2: identifies the need
    to qualify and explains how to determine the
    required level of qualification (2-3 pages)
  • Other domain documents or supplements, e.g.
    Formal Methods, Model-Based, Object-Oriented,
    HW, Sys, database, IMA, etc.
  • DO-XXX/ED-YYY: a new document that addresses how
    to qualify tools; objective-based (what to do by
    level); domain-independent (50-70 pages)

36
Terminology
  • Two options seem available
  • (1) Keep the terms "development" and
    "verification" tools and add text to explain how
    the criteria might apply to other tools.
  • (2) Remove the terms and focus more on what the
    tool actually does to determine the appropriate
    qualification level. Add a FAQ or some
    description up front to show how the criteria
    relate to the DO-178B terminology.
  • SG3 prefers option (2)

37
Tool Planning and Development
  • Tool Qualification Levels TAB, TC and TD
  • Tool Qualification Plan
  • Tool Development Plan
  • Tool Verification Plan
  • Tool Quality Assurance Plan
  • Tool Configuration Management Plan

[Diagram: the tool user requirements definition
process produces the Tool User Requirements (TUR);
the tool development process (tool requirements,
design, coding, and integration processes) produces
the Tool Executable Object Code; both feed the tool
user integration process.]
38
Tool V&V
  • Tool Verification Process (applies only to
    development tools)
  • Review and analysis of tool operational low-level
    requirements, architecture, source code,
    integration, and test cases, procedures and
    results
  • Testing
  • Requirements-based testing (normal & robustness);
    each requirement has at least one test case
  • Unintended Function Analysis
  • Structural coverage analysis
  • Data coupling and control coupling analysis
  • Tool user verification and validation process
    objectives
  • Applies to all types of qualified tools
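The "each requirement has at least one test case" expectation amounts to a coverage check over a traceability matrix. A minimal sketch, assuming a simple test-to-requirement mapping (the data shapes and identifiers are hypothetical):

```python
def untested_requirements(
    requirements: set,   # requirement ids, e.g. {"TR-1", "TR-2"}
    trace: dict,         # test case id -> set of requirement ids it covers
) -> set:
    """Return the requirements with no associated test case.

    Illustrative helper for the requirements-based-testing check above;
    not a qualified tool and not an API from any standard.
    """
    covered = set().union(*trace.values()) if trace else set()
    return requirements - covered
```

A project would run a check like this over its verification cases and procedures before claiming the requirements-coverage objective.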

39
SG-4 Model-Based Development & Verification
40
MBDV Group Objectives
  • The objective of this proposed guidance
    supplement to DO-178B is to provide clarification
    as to how Model-based Development and
    Verification may be conducted so as to produce
    software in a manner that complies with the text
    and objectives of DO-178B. This guidance is
    applicable for processes where the source code is
    directly produced from a Model-based Design.
  • To be tailored according to the design assurance
    level

41
Help with Terminology!
  • Terms needing definition
  • Modelling Technique
  • Model Simulator
  • Modelling Process
  • Modelling Tool
  • Example definition
  • "Structural Code Coverage Analysis: Structural
    Coverage Analysis based on Code Structure"
  • Typical comments
  • "Many of these 'definitions' are not definitions,
    but simply rewritings of the words supposedly
    being defined. As such, they provide no
    information not contained in the phrase itself."

42
MBD Foundation Concepts (1)
  • Modelling techniques may be suitable to express
    requirement and/or design and/or architecture
    lifecycle data items.
  • Modelling standards shall be defined and used for
    each modelling technique. The standard shall
    provide a means to satisfy all traceability
    expectations.

43
MBD Foundation Concepts (2)
  • MBDV aspects may lie in both the system and the
    software area. However, the MBDV supplement only
    addresses the models that are involved in the
    production of the software development lifecycle
    items.
  • The guidance contained in the supplement should
    apply regardless of who performs the modelling
    activities: system engineers or software
    engineers.

44
MBD Foundation Concepts (3)
  • At any level of abstraction, a model can contain
    derived requirements.
  • All derived requirements shall be identified.
  • All derived requirements shall be justified,
    validated and assessed for potential safety
    impact.

45
MBD Foundation Concepts (4)
  • Each lifecycle data item output from the
    modelling process shall be verified
  • Each successive level of abstraction shall be
    consistent with its parent requirements.
  • Each successive level of abstraction shall be
    compliant with the modelling standards.
  • Model Simulators may be used to prove the
    correctness and completeness of the models.

46
SG-6 Formal Methods
  • As presented in plenary session

47
The Problem
  • DO-178B Section 6 calls explicitly for Review,
    Analysis and Test, rather than setting out the
    objectives for verification and leaving the
    applicant to create the appropriate verification
    plan.
  • As a result, specific methods are deemed the only
    way to meet specific objectives.
  • There are issues with specific technologies, such
    as Formal Methods, OO and MBD, with respect to
    how the current Section 6 objectives can be met.

48
We are not trying to eliminate testing from
DO-178C!
49
The Proposal
  • To generalise the wording to use "verification"
    instead of specific methods (Review, Analysis &
    Test), without eliminating the need for testing
    on the target hardware.
  • To retain the complete testing process under the
    verification of executable object code.

50
The Verification Process
[Diagram: the same flow of DO-178B verification
objectives shown on slide 8: System Requirements ->
High-Level Requirements -> Software Architecture /
Low-Level Requirements -> Source Code -> Executable
Object Code, annotated with the A-3 through A-7
review, analysis, and test objectives.]
51
Comparison of Old -> New
  • 6.0 SOFTWARE VERIFICATION PROCESS
  • 6.1 Software Verification Process Objectives
  • 6.2 Software Verification Process Activities
  • 6.3 Software Reviews and Analyses
  • 6.3.1 Reviews and Analyses of the High-Level
    Requirements
  • a. Compliance with system requirements
  • b. Accuracy and consistency
  • c. Compatibility with the target computer
  • d. Verifiability
  • e. Conformance to standards
  • f. Traceability
  • g. Algorithm aspects
  • 6.3.2 Reviews and Analyses of the Low-Level
    Requirements
  • a. Compliance with high-level requirements
  • b. Accuracy and consistency
  • c. Compatibility with the target computer
  • d. Verifiability
  • e. Conformance to standards
  • f. Traceability

New:
6.0 SOFTWARE VERIFICATION PROCESS
6.1 Software Verification Process Objectives
6.2 Software Verification Process Activities
6.3 Detailed Guidance for Verification Activities
6.3.1 Verification Activities for the High-Level Requirements
  a. Compliance with system requirements
  b. Accuracy and consistency
  c. Compatibility with the target computer
  d. Verifiability
  e. Conformance to standards
  f. Traceability
  g. Algorithm aspects
6.3.2 Verification Activities for the Low-Level Requirements
  a. Compliance with high-level requirements
  b. Accuracy and consistency
  c. Compatibility with the target computer
  d. Verifiability
  e. Conformance to standards
  f. Traceability
  g. Algorithm aspects
52
Comparison of Old -> New
  • 6.3.3 Reviews and Analyses of the Software
    Architecture
  • a. Compliance with high-level requirements
  • b. Consistency
  • c. Compatibility with the target computer
  • d. Verifiability
  • e. Conformance to standards
  • f. Partitioning integrity
  • 6.3.4 Reviews and Analyses of the Source Code
  • a. Compliance with low-level requirements
  • b. Compliance with the software architecture
  • c. Verifiability
  • d. Conformance to standards
  • e. Traceability
  • f. Accuracy and consistency
  • 6.3.5 Reviews and Analysis of the Outputs of the
    Integration Process

New:
6.3.3 Verification Activities for the Software Architecture
  a. Compliance with high-level requirements
  b. Consistency
  c. Compatibility with the target computer
  d. Verifiability
  e. Conformance to standards
  f. Partitioning integrity
6.3.4 Verification Activities for the Source Code
  a. Compliance with low-level requirements
  b. Compliance with the software architecture
  c. Verifiability
  d. Conformance to standards
  e. Traceability
  f. Accuracy and consistency
6.3.5 Verification Activities for the Executable Object Code
  a. Completeness and correctness
  b. Compliance with the high-level requirements
  c. Robustness for high- and low-level requirements
  d. Compliance with the low-level requirements
  e. Compatibility with the target computer
6.3.5.1 Software Testing
6.3.5.2 Test Environment
53
Comparison of Old -> New
  • 6.3.6 Reviews and Analyses of the Test Cases,
    Procedures, and Results
  • 6.4 Software Testing Process
  • 6.4.1 Test Environment
  • 6.4.2 Requirements-Based Test Case Selection
  • 6.4.2.1 Normal Range Test Cases
  • 6.4.2.2 Robustness Test Cases
  • 6.4.3 Requirements-Based Testing Methods
  • 6.4.4 Test Coverage Analysis
  • 6.4.4.1 Requirements-Based Test Coverage Analysis
  • 6.4.4.2 Structural Coverage Analysis
  • 6.4.4.3 Structural Coverage Analysis Resolution

New:
6.3.6 Verification Activities for the Analyses, Test Cases, Procedures and Results
  a. Analysis and test cases
  b. Analysis and test procedures
  c. Analysis and test results
6.3.6.1 Coverage Analysis
6.3.6.1.1 Requirements Coverage Analysis
6.3.6.1.2 Structural Coverage Analysis
6.3.6.1.3 Structural Coverage Analysis Resolution

Transfers from the old structure:
  6.4 Software Testing Process -> 6.3.5.1
  6.4.1 Test Environment -> 6.3.5.2
  6.4.2, 6.4.2.1, 6.4.2.2, 6.4.3 -> 6.3.5
  6.4.4 -> 6.3.6.1
  6.4.4.1 -> 6.3.6.1.1
  6.4.4.2 -> 6.3.6.1.2
  6.4.4.3 -> 6.3.6.1.3
56
Major Comments Raised
  • The paper lowers the bar for testing
    significantly (to zero!)
  • Review and analysis are the only applicable
    methods for verification of higher-level life
    cycle data.
  • Testing is the only applicable method for
    meeting the objectives for the executable object
    code.

57
In Summary
  • Revision A of the paper emphasises the reliance on
    testing where there are no other accepted means
    of verification.
  • DO-178C used alone needs to be as forceful about
    the need for testing as DO-178B.
  • Only the use of approved guidance in conjunction
    with DO-178C could alter the amount of testing
    required.
  • Added statement: "In order to satisfy the
    verification objectives for the executable object
    code, testing should be performed unless
    appropriate analysis is used in accordance with
    approved guidance. As a minimum, testing should
    be carried out as part of hardware/software
    integration to ensure that the Executable Object
    Code is compatible with the target computer."

58
Thank You / Questions?
  • SC205/WG71: http://ultra.pr.erau.edu/SCAS/

59
Packaging (cont.)
60
Packaging (cont.)