Title: Requirements-Based Testing
1. Requirements-Based Testing
- Dr. Mats P. E. Heimdahl
- University of Minnesota Software Engineering Center
- Dr. Steven P. Miller
- Dr. Michael W. Whalen
- Advanced Computing Systems, Rockwell Collins
- 400 Collins Road NE, MS 108-206
- Cedar Rapids, Iowa 52498
- spmiller_at_rockwellcollins.com
2. Outline of Presentation
- Motivation
- Validation Testing
- Conformance Testing
- What's Next
3. How We Develop Software
SW High-Level Reqs. Development
HW/SW Integration Testing
SW Design Description Dev. (SW Low-Level Reqs. & SW Arch.)
SW Integration Testing
SW Source Code Dev.
SW Low-Level Testing
SW Integration (Executable Code Production)
4. How We Will Develop Software (From V to a Y)
SW High-Level Reqs. Development
HW/SW Integration Testing
Software Model
SW Integration Testing
How do we know our model is correct?
Validation Testing Formal Verification
Can we trust the code generator?
Conformance Testing
SW Integration (Executable Code Production)
5. Outline of Presentation
- Motivation
- Validation Testing
- Conformance Testing
- What's Next
6. How We Will Develop Software (From V to a Y)
SW High-Level Reqs. Development
HW/SW Integration Testing
Software Model
SW Integration Testing
How do we know our model is correct?
SW Integration (Executable Code Production)
7. Modeling Process
SW High-Level Reqs. Development
8. Problem: Modeling Frenzy
SW High-Level Reqs. Development
Desired Model Properties
Software Model
How do we know the model is right? How do we
test the model?
SW Integration (Executable Code Production)
9. One Solution: Redefine Requirements
System Reqs. Development
HW/SW Integration Testing
The model is the requirements
Software Model
SW Integration Testing
Use Engineering Judgment when Testing
SW Integration (Executable Code Production)
11. Testing Does Not Go Away
System Reqs. Development
HW/SW Integration Testing
Software Model
SW Integration Testing
Extensive Testing (MC/DC)
SW Integration (Executable Code Production)
12. It Simply Moves
System Reqs. Development
HW/SW Integration Testing
Software Model
SW Integration Testing
Extensive Testing (MC/DC)
SW Integration (Executable Code Production)
13. Do It Right!
SW High-Level Reqs. Development
Specification Testing: Is the Model Right?
14. How Much to Test?
State Coverage
Masking MC/DC?
MC/DC
Decision Coverage?
Transition Coverage?
Something New??
Def-Use Coverage?
Where Do the Tests Come From?
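To make the MC/DC criterion listed above concrete, here is a minimal sketch (not from the slides) for the hypothetical decision `(a and b) or c`: each condition must be shown to independently affect the outcome, i.e. for each condition there is a pair of tests that differ only in that condition and flip the decision.

```python
# MC/DC on a toy decision: every condition independently affects the outcome.

def decision(a: bool, b: bool, c: bool) -> bool:
    return (a and b) or c

# One independence pair per condition; within each pair, only the named
# condition changes, and the decision outcome flips.
mcdc_pairs = {
    "a": ((True, True, False), (False, True, False)),
    "b": ((True, True, False), (True, False, False)),
    "c": ((False, False, True), (False, False, False)),
}

for cond, (t1, t2) in mcdc_pairs.items():
    assert decision(*t1) != decision(*t2), cond

print("MC/DC independence shown for all 3 conditions")
```

The union of these pairs is a candidate MC/DC test suite; n+1 tests usually suffice for n conditions because pairs share vectors.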
15. Requirements-Based Testing
SW High-Level Reqs. Development
Cover the Properties!
16. Properties Are Requirements
17. Requirements-Based Testing: Advantages
- Objective Measurement of Model Validation Efforts
- Requirements Coverage in Model-based Development
- Help Identify Missing Requirements
- Measure coverage of the model
- Basis for Automated Generation of Requirements-based Tests
- Even If Properties Are Not Used for Verification, They Can Be Used for Test Automation
How Are Properties Covered with
Requirements-based Tests?
18. Property Coverage
- If the onside FD cues are off, the onside FD cues shall be displayed when the AP is engaged
- G((!Onside_FD_On & !Is_AP_Engaged) -> X(Is_AP_Engaged -> Onside_FD_On))
- Property Automata Coverage
  - Cover a Synchronous Observer Representing the Requirement (Property)
- Structural Property Coverage
  - Demonstrate Structurally Interesting Ways in Which the Requirement (Property) Is Met
19. Property Automata Coverage
- Cover the Accepting State Machine As Opposed to the Structure of the Property
- Büchi Coverage
  - State Coverage, Transition Coverage, Lasso Coverage
20. Alternative Machine
- Different synthesis algorithms give different automata
- Will affect the test cases required for coverage
21. Structural Property Coverage
- Define Structural Coverage Criteria for the Property Specification
- Traditional Condition-based Criteria Such As MC/DC Are Prime Candidates
- Property Coverage Is Different From Code Coverage
- Coverage of Code and Models
  - Evaluate a decision with a specific combination of truth values in the decision
- Coverage of Properties
  - Run an execution scenario that illustrates a specific way a requirement (temporal property) is satisfied
22. Example
- G((!Onside_FD_On & !Is_AP_Engaged) -> X(Is_AP_Engaged -> Onside_FD_On))
- Demonstrate That Somewhere Along Some Execution Trace Each MC/DC Case Is Met
  - Only the positive MC/DC cases
  - The negative cases should have no traces
- In the Case of G(p) ("Globally p Holds") We Need to Find a Test Where
  - in the prefix the requirement p is met
  - we reach a state of the trace where the requirement p holds because of the specific MC/DC case of interest; let us call this case a
  - then the requirement p keeps on holding through the remainder of the trace
- p U (a U X(G p))
[Trace diagram: p holds in each prefix state, a holds at one state, and p holds in every state thereafter]
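On a finite trace, the obligation p U (a U X(G p)) described above can be checked mechanically; the encoding below is my own sketch of those three bullets (p in a prefix, the MC/DC case a at one state, p thereafter), with toy predicates:

```python
# Check that a finite trace witnesses the obligation p U (a U X(G p)):
# p holds in a prefix, then the MC/DC case `a` holds at a state where p
# also holds, then p holds for the remainder of the trace.

def satisfies_obligation(trace, p, a):
    """trace: list of states; p, a: predicates over a state."""
    for i, s in enumerate(trace):
        if a(s) and p(s):
            prefix_ok = all(p(t) for t in trace[:i])
            suffix_ok = all(p(t) for t in trace[i + 1:])
            if prefix_ok and suffix_ok:
                return True
    return False

# Toy predicates: p is "state is nonzero"; a is the specific case "state == 7".
p = lambda s: s != 0
a = lambda s: s == 7
assert satisfies_obligation([1, 2, 7, 3, 4], p, a)
assert not satisfies_obligation([1, 0, 7, 3], p, a)  # p fails in the prefix
```

A test generator would search for one such trace per positive MC/DC case a; the negative cases should admit no trace at all.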
23. Summary
- Objective Measurement of Model Validation Efforts
- Requirements Coverage in Model-based Development
- Help Identify Missing Requirements
- Basis for Automated Generation of Requirements-based Tests
- Even If Properties Are Not Used for Verification, They Can Be Used for Test Automation and Test Measurement
- Challenges
- How Are Properties Specified?
- Combination of Observers and Temporal Properties
- What Coverage Criteria Are Suitable?
- How Is Automation Achieved?
- How Do We Eliminate Obviously Bad Tests? Should We?
- How Do We Generate Realistic Test Cases?
- Rigorous Empirical Studies Badly Needed
24. Outline of Presentation
- Motivation
- Validation Testing
- Conformance Testing
- What's Next
25. How We Will Develop Software (From V to a Y)
SW High-Level Reqs. Development
HW/SW Integration Testing
Software Model
SW Integration Testing
Can we trust the code generator?
SW Integration (Executable Code Production)
26. Correct Code Generation: How?
- Provably Correct Compilers
- Very Hard (and Often Not Convincing)
- Proof Carrying Code
- Generate Test Suites From Model
- Compare Model Behavior With Generated Code
- Unit Testing Is Not Eliminated, but Largely Automated
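The comparison of model behavior with generated code is a back-to-back test: the same model-derived input sequences are run through both artifacts in lockstep and the outputs are compared. The step functions below are hypothetical stand-ins, with a seeded fault in the "generated code" for illustration:

```python
# Back-to-back conformance testing: run each input sequence through the
# model interpreter and the generated code, comparing outputs step by step.

def conformance_check(model_step, code_step, sequences, init_state):
    """Return the first divergence as (sequence, step, model_out, code_out),
    or None if every sequence agrees at every step."""
    for seq in sequences:
        ms = cs = init_state
        for step, inp in enumerate(seq):
            ms, m_out = model_step(ms, inp)
            cs, c_out = code_step(cs, inp)
            if m_out != c_out:
                return (seq, step, m_out, c_out)
    return None

# Toy saturating counter as the "model"; the "generated code" forgets to
# saturate, mimicking a code-generator fault.
def model_step(state, inp):
    nxt = min(state + inp, 3)
    return nxt, nxt

def code_step(state, inp):
    nxt = state + inp
    return nxt, nxt

divergence = conformance_check(model_step, code_step, [[1, 1], [2, 2]], 0)
assert divergence == ([2, 2], 1, 3, 4)  # divergence at step 1 of [2, 2]
```

Any divergence points either at a code-generator fault or at a semantic gap between the model interpreter and the target language.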
27. Existing Capabilities
- Several Commercial and Research Tools for Test-Case Generation
- T-VEC
- Theorem Proving and Constraint Solving techniques
- Reactis from Reactive Systems Inc.
- Random, Heuristic, and Guided Search
- University of Minnesota
- Bounded Model Checking
- NASA Langley
- Bounded Model Checking / Decision Procedures / Constraint Solving
- Tools Applicable to Relevant Notations
- In Our Case Simulink
28. An Initial Experiment
- Used a Model of the Mode Logic of a Flight Guidance System As a Case Example
- Fault Seeding
- Representative Faults
- Generated 100 Faulty Specifications
- Generate Test Suites
- Selection of Common (and Not So Common) Criteria
- Fault Detection
- Ran the Test Suites Against the Faulty Specifications
- Recorded the Total Number of Faults Detected
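The evaluation loop of such an experiment can be sketched in a few lines; the oracle and the seeded-fault variants below are toy stand-ins, not the flight guidance model:

```python
# Fault-seeding evaluation: run every test suite against every faulty
# variant of the specification and count the faults each suite detects
# (a fault is detected when its output differs from the oracle's).

def detects(tests, oracle, faulty):
    return any(oracle(x) != faulty(x) for x in tests)

def fault_detection(suites, oracle, faulty_variants):
    """suites: {name: [test inputs]}; returns faults detected per suite."""
    return {name: sum(detects(tests, oracle, f) for f in faulty_variants)
            for name, tests in suites.items()}

oracle = abs
faults = [lambda x: x, lambda x: -x]       # two seeded faults
suites = {"weak": [0], "strong": [-3, 2]}
assert fault_detection(suites, oracle, faults) == {"weak": 0, "strong": 2}
```

Comparing these counts across suites generated to different coverage criteria is exactly the fault-finding comparison reported on the next slide.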
29. Fault-Finding Results
30. Model Cheats Test Generator
FCS Architecture
31. Effect of Test Set Size
32. Summary
- Automated Generation of Conformance Tests
- Current Technology Largely Allows This Automation
- Challenges
- Development of Suitable Coverage Criteria
- Effect of Test Set Size on Test Set Effectiveness
- Effect of Model Structure on Coverage Criteria Effectiveness
- Traceability of Tests to Constructs Tested
- Empirical Studies of Great Importance
33. Outline of Presentation
- Motivation
- Validation Testing
- Conformance Testing
- What's Next
34. New Challenges for Testing
- Model Validation: Requirements-based Testing
- How Do We Best Formalize the Requirements?
- What Coverage Criteria Are Feasible?
- Which Coverage Criteria Are Effective (If Any)?
- How Do We Generate Realistic Tests?
- Will This Be a Practical (Tractable) Solution?
- Conformance Testing
- What Coverage Criteria Are Effective?
- Detecting Faults From Manual Coding
- Detecting Faults From Code Generation
- Relationship Between Model Structure and Criteria Effectiveness
- Traceability From Tests to Model
- Relationship Between Model Coverage and Code Coverage
- Optimizations in the Code Generator Will Compromise Coverage
35. Discussion
36. Perfection Is Not Necessary
Missed Faults
- Tools and Models Only Need to Be Better Than Manual Processes
- How Do We Demonstrate This?
- Empirical Studies Are of Great Importance
I Think Many Already Are
37. DO-178B Test Objectives
- The executable code complies with the high-level requirements.
- The executable code complies with the specification (low-level requirements).
- Test coverage of high-level requirements is achieved.
- Test coverage of specification (low-level requirements) is achieved.
- Test coverage of the executable code is achieved.