Title: SOFTWARE TESTING
1. SOFTWARE TESTING
- MN3309 - Session 17
- Prof. Mark Nissen
2. Agenda
- Software Management Review
- Software Testing
- Peer Inspection Review
- Software Inspection Exercise
- EPLARS Caselet (hidden)
- Summary
3. Software Management
- Poor management is the #1 cause of failure
- Quality must be built into software
- All S/W has errors & defects
- Errors - introduced via rqmts & design
- Defects - introduced via code & fix
- Testing & defect removal key to quality
- Cannot remove all defects
- Cost of error/defect removal increases exponentially with program phase (see sketch below)
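A minimal numeric sketch of that growth, using assumed Boehm-style multipliers (the slide states only the exponential trend, not these numbers):

```python
# Sketch: relative cost to fix one defect, by the phase in which it is
# found. Multipliers are assumed for illustration (roughly Boehm-style);
# only the exponential trend comes from the slide.
RELATIVE_FIX_COST = {
    "requirements": 1,
    "design": 5,
    "code": 10,
    "test": 20,
    "operations": 100,
}

for phase, cost in RELATIVE_FIX_COST.items():
    print(f"{phase:>12}: {cost:>3}x a requirements-phase fix")
```

The later a defect surfaces, the more downstream artifacts must be reworked, which is what drives the multiplier.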
4. Nature of S/W Testing
- Labor-intensive activity
- Up to 50% of development cost
- 50-100 defects per KSLOC (avg.)
- Only confirm presence of defects
- Cannot confirm absence of defects
- Like trying to prove a theory is true
- Testing applies to all S/W phases
- Must be integral to S/W process
- Coder is least likely to ID own defects?
5. IV&V
- Independent verification & validation
- Verification?
- Validation?
- Why independent?
- S/W process analysis important? BPR?
- IV&V for critical systems or all?
- How to determine criticality?
- IV&V as investment in quality?
6. Defect Detection & Removal
- Measured with great precision
- Removal is indicator of program health
- 3 classes of defects & tests
- Unit/component
- Integration
- System
- Remove 85% prior to delivery (Jones; sketch below)
- What about 5000 year bugs?
- Defect removal does not remove error?
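A back-of-the-envelope sketch tying these figures together (50-100 defects/KSLOC injected, 85% removed before delivery); the 100 KSLOC program size is an assumed example value:

```python
# Defect arithmetic from the slides' figures (50-100 defects injected
# per KSLOC; 85% removed before delivery, per Jones). The 100 KSLOC
# program size is an assumed example value.
ksloc = 100
removal_efficiency = 0.85

for density in (50, 100):                 # defects per KSLOC
    injected = ksloc * density
    latent = injected * (1 - removal_efficiency)
    print(f"{density}/KSLOC: {injected:>6} injected, "
          f"{latent:>6.0f} latent defects at delivery")
```

Even at 85% removal efficiency, a program of this size ships with hundreds of latent defects, which is why removal alone cannot guarantee quality.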
7. Root Cause Analysis
8. Structural & Behavioral Testing
- Structural testing
- White-box or glass-box testing
- Know module structure & code
- Ideally test all paths through code
- Settle for coverage
- Behavioral testing
- Black-box or functional testing
- No insight into internal module
- Ideally test all inputs to system
- Settle for coverage (see sketch below)
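A minimal sketch of the two views side by side; the shipping-fee function and its test values are hypothetical:

```python
# Hypothetical function under test: flat fee up to 10 kg, per-kg
# surcharge above that.
def shipping_fee(weight_kg: float) -> float:
    if weight_kg <= 10:
        return 5.0                           # path A
    return 5.0 + 0.5 * (weight_kg - 10)      # path B

# Structural (white-box) view: one test per path through the code.
assert shipping_fee(8) == 5.0                # exercises path A
assert shipping_fee(14) == 7.0               # exercises path B

# Behavioral (black-box) view: input partitions and boundaries,
# chosen from the spec without looking inside the function.
assert shipping_fee(0) == 5.0                # lower boundary
assert shipping_fee(10) == 5.0               # partition boundary
assert shipping_fee(10.5) == 5.25            # just past the boundary
print("all tests pass")
```

Neither view is exhaustive in practice - real modules have too many paths and too many inputs - which is why both settle for a coverage criterion.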
9. Cargo Movement Ops
- Early unit testing - why?
- End users involved during coding
- Validated user interface
- Tested S/W modules as developed
- Contractor fixed defects as found
- Results
- Decreased development time
- Identified critical errors during development
- Hands-on training for gov personnel
10. AFOTEC Testing Objectives
- Usability - user surveys
- Effectiveness - validation
- Software maturity
- Differs from S/W development maturity
- Plot cum change points vs. time
- Reliability
- Supportability
- MEAT?
11. Change Point Tracking
12. ERROR SCRs - DISCOVERY & CORRECTION
CURRENT STATUS
A key positive indicator will be when the
discovery line flattens out as the rate of new
errors encountered approaches zero.
The correction rate for Priority 1 & 2 SCRs
continues to keep pace with the discovery rate.
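A minimal sketch of the tracking arithmetic behind a chart like this; the weekly SCR counts are invented for illustration:

```python
# Cumulative SCR discovery vs. correction tracking. Weekly counts are
# invented for illustration; the slide shows only the chart's shape.
discovered = [12, 10, 9, 7, 4, 2, 1, 0]   # new SCRs found each week
corrected  = [ 8,  9, 9, 8, 6, 4, 1, 0]   # SCRs fixed each week

cum_d = cum_c = 0
for week, (d, c) in enumerate(zip(discovered, corrected), start=1):
    cum_d += d
    cum_c += c
    print(f"week {week}: found={cum_d:3d} fixed={cum_c:3d} open={cum_d - cum_c}")

# Positive indicator from the slide: the discovery line flattens out
# as the rate of new errors encountered approaches zero.
print("discovery flattening:", discovered[-1] == 0)
```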
13. Software Documentation
- Must support entire life cycle
- Technical & managerial docs
- Significant cost element
- Critical to PDSS
- Valueless if not current & accurate
- Online (in-place) documentation
- Cost savings & demand-pull
- Advantages & disadvantages?
- Online readings for course/C-SAWS?
14. Military vs. Commercial Software Effort
15. Peer Inspection Review
- 15% increase in development cost
- 25-35% increase in total productivity
- Eliminate 80% of all S/W defects
- With testing, decrease latent S/W defects by factor of 10 (OOM)
- Most cost-effective quality technique (see sketch below)
- Use in RFP & source selection
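A minimal sketch of the payoff arithmetic these figures imply; the 1,000 latent-defect baseline is an assumed example value:

```python
# Inspection payoff arithmetic from the slide's figures. The 1,000
# latent-defect baseline (testing alone) is an assumed example value.
latent_testing_only = 1_000
oom_factor = 10              # inspections + testing: factor-of-10 (OOM) drop
latent_with_inspection = latent_testing_only / oom_factor

dev_cost_increase = 0.15     # inspections add ~15% to development cost
print(f"latent defects: {latent_testing_only} -> {latent_with_inspection:.0f}")
print(f"bought with a {dev_cost_increase:.0%} development-cost increase")
```

An order-of-magnitude drop in shipped defects for a 15% cost increase is why the slide calls inspection the most cost-effective quality technique.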
16. Peer Inspection Objectives
- Find errors at the earliest possible point in the development cycle,
- Ensure that the appropriate parties technically agree on the work,
- Verify that the work meets predefined criteria,
- Formally complete a technical task, and
- Provide data on the product and the inspection process.
17. Peer Inspection Benefits
- They ensure that associated team members are technically aware of their own and each other's products
- They help to build high-performing technical teams
- They help to use the organization's best talents
- They provide team members with a sense of achievement and participation
- They help the participants develop their skills as reviewers
- They provide an orderly means to implement a standard of software engineering excellence throughout the development program, and
- They pass along the lessons learned of more experienced engineers to their junior, less experienced peers.
18. Peer Inspection by Phase
19. Peer Inspection Resource Effects
20. Software Inspection Exercise
- Divide into two teams (PMO & COM)
- 1 moderator (leads inspection)
- 1 reader (explains code)
- 1 recorder (documents defects)
- Others critique & assist
- Use code from Reuse Exercise
- Re-input on board if necessary
- Inspect whole module
- Note design errors & code defects
- Entry conditions if re-inspection needed
21. EPLARS Caselet
- Army Tactical C2 System (ATCCS)
- AFATDS
- FAADC2I
- All Source Analysis System
- Maneuver Control System
- Combat Service Support Control System
- Technological Evolution
22. THE ARMY TACTICAL COMMAND & CONTROL SYSTEM
[Architecture diagram: the five ATCCS battlefield functional areas linked by three communications systems]
- MANEUVER: Maneuver Control System
- FIRE SUPPORT: Advanced Field Artillery Tactical Data System
- AIR DEFENSE: Forward Area Air Defense Command & Control System
- INTEL / ELECTRONIC WARFARE: All Source Analysis System
- COMBAT SERVICE SUPPORT: Combat Service Support Control System
- Links: CNR (Combat Net Radio), ACUS (Area Common User System), ADDS (Army Data Distribution System)
23. AFATDS FIELDING & CURRENT TECHNOLOGY
[Timeline chart, 1980-1995: AFATDS terminal fielding plotted against contemporary processor generations]
- Intel family (DOS / PC): 80286, 80386DX, 80486DX
- Motorola family (Apple / Macintosh): 68000, 68020, 68030, 68040
- Fielded terminals: Fire Support Handheld Terminal Unit (HTU) 80286 @ 6 MHz; Fire Support Control Terminal (FSCT) 330 68020 @ 16 MHz, 375 68030 @ 50 MHz, 382 68040 @ 25 MHz; LCU 80486DX @ 66 MHz; 735 PA-RISC @ 99 MHz; CHS-1; Forward Entry Device (FED)
- ASARC milestone marked on timeline
24. EPLARS System
- Network of UHF radio systems
- Managed by network control station
- Carried by soldiers, vehicles, aircraft
- Spread spectrum, anti-jam, crypto
- Provide data comm to ATCCS
- Near real-time communications
- 1/2 M SLOC, many interfaces
- ACAT ID
25. Acquisition Planning Phases
- I - feasibility study (79-80)
- II - feasibility demo (80-82)
- III/IV - prototype development (82-87)
- V - H/W & S/W for tech testing (85-88)
- TT/IOT&E (88-89)
- LRIP scheduled (88-)
- EMD scheduled (89-)
26. Programmatics
- EPLARS interface rqmt with ATCCS
- ATCCS - parallel development
- Interface is known technical challenge
- EPG agency needed ATCCS simulator
- EPLARS in PEO COMM (CECOM)
- PM ADDS (EPLARS & JTIDS)
- 2 project offices (NJ & CA)
- Matrix structure
- 2 people assigned to software
27. Test Results
- Test phase I (88)
- Simulation S/W inadequate
- Used anyway
- Error propagation through test results
- Technical problems with EPLARS
- Software & firmware revisions needed
- Simulator improvements needed
- Test phase II (89)
- Software & firmware problems persisted
- EPLARS failed test (ACAT ID deviation)
28. PMO Advice
- Assess situation
- What mistakes were made?
- How should it have been done?
- What do you say to GAO?
- Recommend action
- How to correct current problems?
- How to restructure program?
29. Summary
- Poor management is the #1 cause of failure
- Cost of fixing S/W grows exponentially with phase
- Testing is labor intensive
- Up to 50% of development cost
- Only confirm presence of defects
- IV&V key for critical systems
- Documentation key for maintenance
- Inspection has best quality ROI