1
What Do You Mean It Doesn't Do What We Thought?
  • Validating a Design

2
Agenda: Design Validation
  • Concepts and implications
  • Specification mitigations
  • Design mitigations
  • Test set mitigation
  • Summary

3
Concepts
  • Validation: confirmation, through the provision
    of objective evidence, that the requirements for
    a specified intended use or application have been
    fulfilled

4
Implications
  • The issues relating to validation encompass those
    of verification
  • Additional concerns with validation
  • Caused by the need to match the application to
    the product
  • Application has been translated to specification
    through the requirements process
  • Requirements process is by nature imperfect
  • Sometimes the specification does not satisfy the
    needs of the application
  • Result: a verified product might be invalid
  • May require significant rework to the product
  • May require accepting reduced functionality
    (waiver)
  • A goal of the development process is to minimize
    validation failures
  • Begins with review during the requirements process
    (hopefully the primary point)
  • Mitigate by design activities
  • Reduce by robust test set design

5
The Implication Illustrated
  • Alice electronics (detector component and CDH
    component)
  • Joint requirement: process > 10 k sustained
    events per second
  • Individual requirements defined for detector and
    CDH processing
  • Both met individual requirements for processing
  • When combined, only 6-7 k sustained events per
    second
  • Verification of individual units led to invalid
    system
  • What went wrong?
  • The overall requirements were not broken down
    correctly (see the sketch after this list)
  • The CDH and DE test sets were not high fidelity
  • Verified functionality, not performance
  • We got lucky that a waiver was acceptable
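
A minimal illustrative sketch of how this can happen, using hypothetical numbers rather than the actual Alice figures: two units that each exceed 10 k events/s in isolation fall into the 6-7 k range once chained, if the interface serializes work that the requirement breakdown implicitly assumed would overlap.

```python
# Hypothetical numbers for illustration only (not the actual Alice figures).
# Each unit clears 10 k events/s on its own, but only because the individual
# test sets exercised each unit in isolation.

DETECTOR_RATE = 12_000   # events/s sustained by the detector electronics alone
CDH_RATE      = 15_000   # events/s sustained by the CDH electronics alone

# Implicit assumption behind the requirement breakdown: the two stages overlap
# (pipeline), so the system rate is set by the slower stage.
pipelined_rate = min(DETECTOR_RATE, CDH_RATE)              # 12,000 events/s

# If the real interface serializes the stages, the per-event times add instead.
serialized_rate = 1 / (1 / DETECTOR_RATE + 1 / CDH_RATE)   # ~6,700 events/s

print(f"Pipelined assumption : {pipelined_rate:,.0f} events/s (meets 10 k)")
print(f"Serialized behavior  : {serialized_rate:,.0f} events/s (6-7 k range)")
```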

6
Specification Mitigation
  • Only list requirements, not desirements
  • State unambiguous performance requirements
  • Build in adequate margin
  • Not open-ended enhancement, but
  • Enough to accommodate performance shortfalls
  • Ruthlessly remove TBDs
  • Insist on definite test methods for mitigation
    (see the sketch after this list)
  • Remember: unless application needs can be
    unambiguously specified, they won't be met!
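
As a hedged sketch of how these checks can be made mechanical: a small lint over the requirements list that flags TBDs, undefined verification methods, and performance requirements with no stated margin. The field names and example entries below are assumptions for illustration, not from the presentation.

```python
# Hypothetical requirements lint: flag TBDs, undefined test methods, and
# missing performance margin. Field names and entries are illustrative.

requirements = [
    {"id": "REQ-010", "text": "Sustain > 10,000 events/s end to end",
     "verification": "test", "margin_pct": 20},
    {"id": "REQ-011", "text": "Command latency < TBD ms",
     "verification": "TBD", "margin_pct": None},
]

METHODS = {"test", "analysis", "inspection", "demonstration"}

def lint(reqs):
    findings = []
    for r in reqs:
        if "TBD" in r["text"]:
            findings.append(f'{r["id"]}: TBD in requirement text')
        if r["verification"] not in METHODS:
            findings.append(f'{r["id"]}: no definite verification method')
        if not r.get("margin_pct"):
            findings.append(f'{r["id"]}: no stated performance margin')
    return findings

for finding in lint(requirements):
    print(finding)
```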

7
Design Mitigation
  • Implement specification exactly
  • Isolate various sub-sections (see the sketch
    after this list)
  • Minimizes corner cases and negative
    interactions
  • Allows correction with minimal impact when things
    don't work right
  • Verify complex functions early, thoroughly, and
    completely
  • Allows early look at potential problems
  • Analysis / simulation / what-ifs should be as
    realistic as possible
  • Insist on end-user review of implementation
  • Allows user community to comment
  • Minimizes misunderstandings upon delivery
  • Develop test plans that have high fidelity to the
    end application
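
One hedged sketch of the isolation point: keep each sub-section behind a narrow, explicit interface so a correction, or a more realistic simulator for early verification, can be dropped in without touching the rest. The names here are invented for illustration.

```python
# Hypothetical isolation sketch: blocks couple only through narrow interfaces,
# so either block can be reworked or swapped for a simulator independently.
from typing import Protocol


class EventSource(Protocol):
    def next_event(self) -> bytes: ...


class EventProcessor(Protocol):
    def process(self, event: bytes) -> bytes: ...


class Pipeline:
    """Couples the two blocks only through the interfaces above."""

    def __init__(self, source: EventSource, processor: EventProcessor):
        self.source = source
        self.processor = processor

    def step(self) -> bytes:
        # A fix inside either block stays behind its own interface.
        return self.processor.process(self.source.next_event())
```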

8
Test Set Mitigation
  • Ensure interfaces are maximally flight-like
  • Precludes misunderstandings of characteristics
  • Provides early indication of problems
  • Don't emulate only one characteristic of the
    interface
  • Make test set reasonably sophisticated
  • Sufficient complexity to reproduce operational
    timing
  • Adequate functionality for stress testing
  • Run all interfaces at maximum speed with margin
    (see the stress-test sketch after this list)
  • Don't let the same group build the tested unit
    (design) and the unit tester (test bench)
  • Identical assumptions might go into both ends of
    an interface
  • Faithful reproduction is dependent on familiarity
    (if possible, test bench should be provided by
    end user)
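
A hedged sketch of a stress check in that spirit: offer events as fast as the bench can, then confirm the unit under test sustains the specified rate plus margin and drops nothing. The driver function, rates, and 20% margin are assumptions for illustration.

```python
# Hypothetical stress-test sketch: drive the interface flat out and require
# the sustained rate to exceed the specification plus margin, with no drops.
# send_event(), the rates, and the margin are illustrative assumptions.
import time

REQUIRED_RATE = 10_000      # events/s from the (hypothetical) specification
MARGIN = 1.2                # demand 20% headroom above the specified rate
DURATION_S = 10.0           # long enough to expose sustained-rate limits


def send_event(seq: int) -> bool:
    """Placeholder for the real interface driver; returns False if the
    unit under test drops or back-pressures the event."""
    return True


def stress_interface() -> None:
    sent = dropped = 0
    start = time.monotonic()
    while time.monotonic() - start < DURATION_S:
        if not send_event(sent):
            dropped += 1
        sent += 1
    achieved = sent / (time.monotonic() - start)
    print(f"sustained {achieved:,.0f} events/s, dropped {dropped}")
    assert dropped == 0, "unit under test dropped events"
    assert achieved >= REQUIRED_RATE * MARGIN, "insufficient rate margin"


stress_interface()
```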

9
Test Set Mitigation (cont.)
  • Make the control interface as application-like as
    possible
  • Forces correct command structures / types
  • Allows all test scripts to be reproduced at
    higher levels
  • If at all possible, incorporate early interface
    tests of real engineering hardware
  • Keep the test (or simulation) environment fixed
    unless the flight system changes
  • Don't change test equipment hardware
    configurations
  • Apples-to-apples comparisons during tests are vital
  • Ensure that flight changes are reflected in test
    set design

10
Test Set Mitigation (cont.)
  • Use the same controls for test set development as
    for flight unit development
  • Configuration management
  • Software development
  • Peer reviews
  • Build in diagnostics so that anomalies can be
    traced to test equipment or unit under test
  • Ensure that test results mean something
  • Pass / fail criteria clear (see the sketch after
    this list)
  • Allowable flight parameter variations included
  • Reasonable displays (with significant information
    clearly shown)
  • Ensure that test set accommodates calibration
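
A hedged sketch of pass/fail evaluation with the allowable variations encoded explicitly, so a "pass" has a defined meaning; the parameter names and limits are assumptions for illustration, not flight values.

```python
# Hypothetical pass/fail evaluation: limits encode the allowable flight
# parameter variations. Names and limits are illustrative assumptions.

LIMITS = {
    "supply_current_mA":  (450.0, 520.0),   # lower and upper bound
    "event_rate_k_per_s": (10.0, None),     # lower bound only
    "crc_errors":         (None, 0),        # upper bound only
}


def evaluate(measurements: dict) -> bool:
    all_pass = True
    for name, (lo, hi) in LIMITS.items():
        value = measurements[name]
        ok = (lo is None or value >= lo) and (hi is None or value <= hi)
        print(f"{name:22s} {value:>8}  limits [{lo}, {hi}]  "
              f"{'PASS' if ok else 'FAIL'}")
        all_pass &= ok
    return all_pass


result = evaluate({"supply_current_mA": 487.0,
                   "event_rate_k_per_s": 11.3,
                   "crc_errors": 0})
print("TEST PASSED" if result else "TEST FAILED")
```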

11
Summary
  • Successful verification does not always guarantee
    successful validation
  • Techniques can be incorporated that improve the
    likelihood that validation will succeed
  • Careful specification development
  • Thorough and cautious design techniques
  • Extensive test set fidelity to flight
    requirements
  • Effective techniques for validation are extra
    effort
  • More time consuming
  • More expensive
  • But, definitely worth it.