Title: Software Testing at Florida A&M University
Software Testing at Florida A&M University -- Perspective, Direction and Opportunities
- Dr. Edward L. Jones
- CIS Department
- Florida A&M University
- Tallahassee, FL
In 25 Words or Less
- FAMU a major player
- Vision -- software quality institute
- Technology transfer mission
- Incremental research expansion
- Collaboration opportunities
- Faculty and student recruitment
- A work in progress . . .
Outline
- Part I -- Introduction
- Part II -- Technology Transfer
- Part III -- Research
- Part IV -- Conclusion
Part I -- Introduction
- Where I'm Coming From
- Where We Are
- What Can Academics Do?
My Institution
- 11,000 students total
- 600 CIS majors
- 15 Faculty
- Heavy teaching
- Graduate program (20 students)
- Increasing research
My Experience
- 13 years Harris Corporation
- Software engineering technology
- $200M NASA project
- methodology expert
- software inspections, testing
- process training
- 20 years university teaching
- 3 years focusing on testing
Faculty Research Interests
- High-Performance Computing
- Harmon, Tseng, Chandra
- Databases and Data Engineering
- Chandra, Humphries
- Software Engineering
- Jones, Humphries -- testing, object database
- Evans -- cryptography, compression
- Intelligent Systems
- Allen, Riggs, Granville -- speech/natural language, IR, data mining, fuzzy systems
CIS Department Research
- High Performance Computing
- Analysis/Modeling
- Hardware/Software Architectures
- Large object databases
- Data compression
- Computational Science
- (collaboration with other departments)
- Algorithms
- Computation
- Software Engineering
- Development
- Assurance / Testing
- Cryptography
- Intelligent Systems
- NL/speech processing
- Agents
- IR / Data mining
(Diagram links the areas with labels such as "solve using," "how to engineer," "how to use in," and "application to software engineering.")
My Perspective
- Testing is not just for testers!
- In ideal world, fewer testers required
- Verification vs testing
- More skilled developers
- No silver bullet, just bricks
- Simple things provide leverage
- Testing in-the-small -- the classroom and the lab
- Technology transfer crucial
The Problem
In Industry: Testing is 50% of software cost!
The Problem
In the University . . . Where's the Beef?
Why So Little Testing?
- Heredity -- teachers not trained
- Priority -- no room in curriculum
- Attitude
- Digression from the fun part
- Testing a necessary evil
- Not the creative part
- Testing is HARD to teach!
Testing is a BROAD Subject!
- Simple: finding defects in software
- Complicating dimensions
- What Level?
- Why?
- How?
- When?
- Exhaustive treatment not possible
SPRAE Experience Framework
- Specification -- no spec, no test
- Premeditation -- follow a process
- Repeatability -- tester independence
- Accountability -- documentation
- Economy -- cost effectiveness
A Model Test Life Cycle
(Besides Teaching a Course?)
The Vision
- Quality Awareness in Curriculum
- Solid Framework
- Enhanced Student Skill Set
- Expanded Student Opportunities
- Industry Partnerships
- Dissemination
A Software Testing Course
- 80% practice, 20% theory
- 2/3 fine-grained testing (functions)
- 1/3 object and application testing
- Test cases, drivers, and scripts
- Decision tables -- the "formalism"
- Function, boundary, white-box testing
- Effectiveness: coverage, error seeding
Course -- Lessons Learned
- Advantages
- A "taste" of how hard testing really is
- Preparation for advanced study
- Rounds out analytical programming skills
- Deficiencies
- Course not available to all students
- Students compartmentalize knowledge
Testing in Action -- Automated Program Grading
- Shock and outrage at exactness
- Important -- grade at stake
- Teacher overhead
- Testable specification
- Test automation up-front costs
- Benefits
- Better, objective grading process
- Students begin to think like testers!!
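The grading loop can be sketched as a tiny harness (a hypothetical illustration, not the actual grading scripts): each test case pairs inputs with the expected output from the testable specification, and the grade is the fraction of cases that pass.

```python
def grade(student_fn, test_cases):
    """Run student_fn on (inputs, expected) pairs; return score and failure list."""
    failures = []
    for inputs, expected in test_cases:
        try:
            actual = student_fn(*inputs)
        except Exception as exc:            # a crash counts as a failure
            failures.append((inputs, expected, repr(exc)))
            continue
        if actual != expected:
            failures.append((inputs, expected, actual))
    score = 100.0 * (len(test_cases) - len(failures)) / len(test_cases)
    return score, failures

# Example: grading a submission against an absolute-value specification.
cases = [((5,), 5), ((-5,), 5), ((0,), 0)]
score, fails = grade(abs, cases)
```

The up-front cost is writing the testable specification as cases; after that, every submission is graded objectively and identically.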
Technology Transfer: Program Grading Services
- Provided by TestLab students
- Experience designing automated tests
- Shell programming
- Instructor plays active part
- Refines specification
- Creates grading plan
- Benefits
- Faculty more inclined to incorporate a testing course insert
Testing in Action -- Plagiarism Deterrent
- Mutation analysis/error seeding applied to suspicious student program
- Student must debug modified program
- TestLab project
- Tunable error seeding
- Goal: debugging training (environment)
- Benefits
- Less plagiarism or more skilled programmers
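A minimal sketch of the error-seeding idea (hypothetical; the TestLab's tunable seeder is not shown here): mutate one comparison operator in the suspicious program's source text, so the student must find and debug the seeded fault.

```python
import random

# Operator swaps used for seeding; purely illustrative choices.
MUTATIONS = {"<=": "<", ">=": ">", "==": "!="}

def seed_error(source, rng=None):
    """Return the source text with one comparison operator mutated."""
    rng = rng or random.Random(0)
    present = [op for op in MUTATIONS if op in source]
    if not present:
        return source                # nothing seedable
    op = rng.choice(present)
    return source.replace(op, MUTATIONS[op], 1)

prog = "def full_time(hours):\n    return hours >= 40\n"
mutant = seed_error(prog)            # boundary fault: >= becomes >
```

Tuning would control how many and which kinds of faults are injected; a single-operator swap is the simplest case.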
Other Notable Approaches
- Florida Tech Center for Software Engineering Research
- Industry sponsored projects
- Cem Kaner NSF project -- artifacts/modules
- Embry-Riddle Aeronautical University
- SEI Personal/Team Processes -- quality-centric curriculum
- USF Software Testing Center
Part II -- TECHNOLOGY TRANSFER
Caught versus Taught
- Caught
- Attitudes
- Respect for consequence of not testing
- Taught
- Techniques for specific testing instance
- Basic analytical and programming skills
- Strategy must contain both elements
A Holistic Approach
What Is Meant By Holistic?
- Testing an integral part of curriculum
- Multiple test experiences
- Experiences in each course
- Repetition and reinforcement
- Accumulation of experiences
- Experiences cover test lifecycle
- Holistic, NOT Exhaustive!
Why SPRAE?
- SPRAE: essential values for the practice of testing
- Explains the what and why
- Sets goals for test experiences
- Value system reinforced with each new experience
- A framework for life-long learning.
Core Course Experiences
- Grade another student's program
- Treasure Hunt -- find seeded errors
- Write test cases before coding
- Develop and certify components
- Blind test -- develop specification for a working program.
The Software TestLab
- Environment for discovery learning
- Basic testing skills
- Mentoring / Competency based training
- Evolving Laboratory
- Tools and tutorials
- Staffed by students and faculty
- Problems/solutions test bed
- Dissemination Strategy
Software TestLab -- Context
(Diagram: test artifacts flow into the TestLab from CS courses, technology transfer, testing enthusiasts, donated/public test artifacts, and graduate testing research; TestLab artifacts feed the Test Arcade, certification/tutorials, Arcade problems (meta-data), and TestLab student training.)
Student Mentorship Model
- Managed skill development
- Clear achievement goals
- Key Practices x Competency Levels
- Progress certification
- Student-Student mentoring
- Recognition Program
Key Practices
- Practitioner -- performs defined test
- Builder -- constructs test machinery
- Designer -- designs test cases
- Analyst -- sets test goals, strategy
- Inspector -- verifies process/results
- Environmentalist -- maintains test tools environment
- Specialist -- performs test life cycle.
Specialist I -- Competencies
(Matrix: numbered competencies, levels 1-5 and beyond, for each key practice -- Test Practitioner, Test Builder, Test Designer, Test Analyst, Test Inspector, Test Environmentalist -- which together make up the Test SPECIALIST.)
Test Specialist I
- Practitioner I -- Run function test script; document test results.
- Builder I -- Develop function test driver.
- Designer II -- Develop functional and boundary test cases from decision table.
- Analyst I -- Develop decision table from functional specification.
- Inspector I -- Verify adherence to function test process and standards.
Training Topics
- TestLab Environment Basic Training
- Black-Box Function (unit) Testing
- Black-Box Object Testing
- White-Box Function Testing
- Clear-Box Object Testing
- . . .
Problems/Solutions Testbed
- Repository of testing problems
- Repository of student test artifacts
- Best in class -> solutions testbed
- Deficient solutions -> almost testbed
- Testbed used for tester certification
The Test Arcade
- Fun and games approach
- Players compete solving testing problem
- Scored on time and accuracy
- Ranked list of players by score
- HELP facility provides a "teaching" mode
- Supports testing contests, certification
- NSF Funding requested
Part III -- TESTLAB RESEARCH
Research Obstacles
- Search for the silver bullet
- All eggs in one basket, i.e., testing
- In-the-large empirical studies
- Graduate/research student pool
- Critical mass of faculty
Research Funding
- $85K corporate support
- $50K from Dell Star Program (TestLab)
- $35K for students / travel
- Task on $1.5M NSF MII grant
- $500K NSF ITR proposal submitted
Seminal Projects
- Decision-based test methods
- Reliability testing training simulator
- Test arcade
- Testing via design mutation
- Test patterns/verification agents
- Other Ideas
Decision Based Testing (DBT)
- Decision Table
- Logic model, lightweight formality
- Simple to teach
- Systematic test condition generation
- Column -> behavioral equivalence class
- Conditions -> basis for boundary analysis
- Complication
- DT variables computed from inputs
Simple Example -- DBT and SPRAE
- Compute pay with overtime calculation based on employee classification
- Example illustrates
- SPRAE principles
- Test case design
- Opportunities for automation
Principle: Specification
Specification: Compute pay for an employee, given Hours worked and hourly pay Rate; overtime is 1.5 times the hourly Rate, for Hours above 40. Salaried employees earn over $20/hour, and are always paid for 40 hours.
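The specification can be rendered as a decision table directly in code (a sketch; the Rate > 20 salaried threshold and the overtime formula are taken from the specification above):

```python
# Decision table for the pay specification: one rule per behavior column.
# Each rule: (name, applicability predicate, expected-pay function).
RULES = [
    ("salaried (Rate > 20)",      lambda h, r: r > 20,              lambda h, r: 40 * r),
    ("hourly, no overtime",       lambda h, r: r <= 20 and h <= 40, lambda h, r: h * r),
    ("hourly, overtime (H > 40)", lambda h, r: r <= 20 and h > 40,
                                  lambda h, r: 40 * r + 1.5 * r * (h - 40)),
]

def pay(hours, rate):
    """Compute pay by finding the single applicable decision rule."""
    for _name, applies, result in RULES:
        if applies(hours, rate):
            return result(hours, rate)
    raise ValueError("no rule applies")
```

One functional test case per rule then follows mechanically: pay(45, 25) exercises the salaried rule, pay(30, 10) the no-overtime rule, and pay(45, 10) the overtime rule.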
Principle: Premeditation
Use a systematic process to devise test cases
based on the specification.
- Functional testing
- Decision rule (column) -> behaviors
- One test case per behavior
- Determine expected result.
Decision Analysis
(Decision table: each rule column maps conditions to a Behavior.)
Functional Test Cases
(Table: one functional test case per behavior.)
Boundary Test Cases
(Table of boundary test cases.)
Principle: Repeatability
Processes for test case creation and test execution must yield equivalent results, independent of the tester.
- Systematic processes for deriving
- Functional test cases
- Boundary test cases
- Processes can be automated
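For instance, boundary-value derivation is mechanical enough to automate (a sketch assuming integer-valued inputs; the step size would differ for real-valued boundaries):

```python
def boundary_values(threshold, step=1):
    """Values just below, at, and just above a numeric decision boundary."""
    return [threshold - step, threshold, threshold + step]

# Boundary cases for the condition "Hours > 40" in the pay example:
hours_cases = boundary_values(40)
```

Because the procedure is deterministic, any two testers (or a tool) derive the same cases, which is exactly the repeatability requirement.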
Principle: Accountability
Keep records that document test process and
artifacts.
- Documentation answers
- What tests were planned?
- Which tests were conducted?
- Who did what testing, and when?
- What were the results?
- How were the results interpreted?
Repeatability -- Test Scripts
Functional Test Cases
Boundary Test Cases
Accountability -- Test Log
Test Results Summary
Failure Description
Principle: Economy
Test activities must not require excessive time or effort.
- Systematic processes
- Automation
- Test drivers/harnesses (classical)
- Test case generation/assistance
- Test results analysis (oracle, database).
DBT -- Related Work
- State-based testing
- State models a form of decision table
- State explosion problem -- infeasibility of exhaustive testing
- Published methods
- Parnas Software Cost Reduction (SCR) methodology
- Object-oriented testing
Commercial/Research Decision Table Tool
- Prologa (PROcedural Logic Analyzer)
- Interactive, rule-based
- Table construction and manipulation
- Facilitates tasks such as checking, adding, or reordering conditions.
- Reference
- Robbin, F., Vanthienen, J., Developing Legal
Knowledge Based Systems Using Decision Tables,
Communications of the ACM, 1993.
TestLab Decision-Based Test Tools
(Tool-flow diagram: DT-Editor, DT2-logicspec, DT-Reverse, BTD-Generate, and TC-verify connect the artifacts -- DT, test data, test case specifications, testset adequacy, boundary test data, and source code.)
Custom Decision Table Tools
- Support software specification and testing
- TC-verify determines adequacy based on DT rules
- Post facto verification of white-box coverage (when DT-Reverse used)
- Execution-free adequacy test
Seminal Projects
- Decision-based test methods
- Reliability testing training simulator
- Test arcade
- Testing via design mutation
- Test patterns / verification agents
- Other Ideas
Reliability Test Simulator
- Prototype from Graduate IV&V Course
- Published in ISSRE 2001 Proceedings
- Excerpts from Presentation
Reliability Simulation -- TBDs
- Simulation for experimentation
- Imperfect defect removal (ref )
- Operational testing
- Thesis topic(s)
- Model validation vs. published/industry results
- Management decision-support via simulation
Seminal Projects
- Decision-based test methods
- Reliability testing training simulator
- Test arcade
- Testing via design mutation
- Test patterns / verification agents
- Other Ideas
Test Arcade
Problem: Testbed of problem templates; problem/answer generator; presentation; response manager; score manager.
Approach: Identify problem templates and develop a knowledge base for representation and generation. Target problems to specific skill levels.
Research: Refine competency levels x test instances. Accumulate problem set per (level, instance) pairs. Tool: database, web access.
Seminal Projects
- Decision-based test methods
- Reliability testing training simulator
- Test arcade
- Testing via design mutation
- Test patterns / verification agents
- Other Ideas
Design Mutation
- Reflects design misinterpretations
- Mutant killed when Fail_i is nonempty
(Diagram: the design is mutated into variants D1, D2, ..., Dn; each Di, exercised by test set Tset_i, produces failure set Fail_i.)
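The kill criterion can be stated in a few lines (an illustrative sketch using boolean functions as stand-ins for design models):

```python
def failure_set(original, mutant, tset):
    """Fail_i: test cases on which the mutant disagrees with the original."""
    return [tc for tc in tset if original(*tc) != mutant(*tc)]

def killed(original, mutant, tset):
    """A mutant is killed when its failure set is nonempty."""
    return bool(failure_set(original, mutant, tset))

# A boundary mutation (> becomes >=) is killed only by a test at the boundary.
overtime  = lambda hours: hours > 40
mutant_ot = lambda hours: hours >= 40
```

killed(overtime, mutant_ot, [(41,), (30,)]) is False because no case hits the boundary; adding the case (40,) kills the mutant, which is exactly why boundary test cases matter.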
Mutation of Decision Table
(Tables: original decision table vs. mutant behaviors.)
Impact of Mutations on Functional Test Cases
(Table: mutated rules yield bad test cases.)
Design Mutation -- Prospectus
- Application
- Requirements/design models
- Guided inspection -- ensure mutants not being created
- Testing -- creation of BAD test cases
- Practical if automated
- Facilitated by formal representation
Seminal Projects
- Decision-based test methods
- Reliability testing training simulator
- Test arcade
- Testing via design mutation
- Test patterns / verification agents
- Other Ideas
Test Patterns (Example)
- GUI input controls
- Properties determine required testing
- Derive black-box test patterns for GUI input controls
- Propose automated test agents knowledgeable about a specific test pattern
Test Patterns (Continued)
- Test patterns for design patterns
- Empirical test patterns reflecting organizational practices
- Reverse engineer practices into patterns
- Forward engineer patterns into practice
Test Agents
- Properties of Agents
- Responsibility
- Knowledge
- Autonomy
- Scope of Agents
- Functional Unit (F)
- Architectural Component (A)
- System (S)
Test Agents -- Prospectus
- Intelligence
- Generator of checklist / procedure for human tester
- Watch-dog and status reporter
- Reminder to complete test action
- Performer of test action
- Coordinator of test agents
Seminal Projects
- Decision-based test methods
- Reliability testing training simulator
- Test arcade
- Testing via design mutation
- Test patterns / verification agents
- Other Ideas
Testing: A State of Practice Study
Problem: Develop a method for characterizing the state of practice in software testing.
Approach: Use frameworks like the Test Maturity Model and SPRAE to characterize/assess the state of practice.
Research: Relate attributes of testing practice to qualitative and quantitative indicators of effectiveness and satisfaction. Devise an easy-to-use evaluation system that identifies areas needing improvement and provides insightful measures of performance.
Adaptive Operational Testing
Problem: Perfecting software by OT is biased by expected feature usage. Even when errors are flushed out for one feature, the test emphasis remains the same. OT leads to a slow rate of error discovery in other features.
Approach: Given features A, B, C with probabilities pA, pB, pC, and MTBFs of tA, tB, tC, shift feature probability as feature reliability increases.
Research: Determine criteria for shifting the ps so that feature starvation does not occur. Tool: Reliability simulator to prototype solutions.
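One simple shifting criterion (an illustrative sketch, not the proposed research result) weights each feature inversely to its observed MTBF, so features that have become reliable receive proportionally less test emphasis:

```python
def shifted_profile(mtbf):
    """mtbf: {feature: mean time between failures}; return shifted probabilities."""
    weights = {f: 1.0 / t for f, t in mtbf.items()}   # less reliable -> more weight
    total = sum(weights.values())
    return {f: w / total for f, w in weights.items()}

# Feature A is twice as reliable as B, so B gets twice A's test probability.
profile = shifted_profile({"A": 10.0, "B": 5.0})
```

A starvation guard (for example, a probability floor per feature) would be layered on top, per the research question above.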
Speech Support for Intuitive Testing
Problem: Intuitive testing exploits tester expertise and experience but lacks the formality which ensures repeatability and documentation of the test and results.
Approach: Use speech input to capture test cases, test results, and error reports. Generate test artifacts: (1) test script, (2) test log, and (3) problem reports.
Research: Develop dialog model. Experiment with student system testing.
Specification-Driven Testing of Speech-Based Systems
Problem: Use a formal dialogue model as the basis for test case design and test adequacy evaluation.
Approach: Given the graphical dialogue model used by the FAMU JSBB tool, (1) define dialogue coverage adequacy criteria, (2) extend the JSBB code generation to include coverage instrumentation, (3) design a test case generator based on the dialogue model, and (4) implement a coverage analyzer.
Research: Evaluate effectiveness of model-based test cases using control-flow and dialogue-model coverage measures.
Testing and Reverse Engineering
- Testing answers the questions
- What do we have here? (exploratory)
- Is this what we wanted? (acceptance)
- Is this what we expected? (scripted)
- Testing is the last chance to document the as-built system
- Exploratory testing -- can it be sooner?
Testability via Self-Diagnostics in Objects
Problem: Lack of observability (Voas) in O-O software complicates testability. Encapsulation prevents error propagation to the outside.
Approach: Design in self-diagnostics along with a means of propagation to the outside. Use multiple techniques, such as assertions, for state verification.
Research: Investigate the observability problem in object-based testing. Apply and extend frameworks/methods like JUnit to implement self-diagnostics.
Acknowledgments
- Lucent Technologies
- Dell Computer
- 3M
- Telcordia
- Students
- NSF (EIA-9906590)
- Lockheed Martin
- Abbott Laboratories
- Eli Lilly and Company
- Software Research, Inc.
Future
- Evolve TestLab Mentorship Model
- Transfer to Selected Courses
- TestLab students as transfer agents
- Disseminate Results
- Web site, newsletter, conferences
- Procure Funding for Research
- Find Research Collaborators
Who Wins?
- Students -- expanded career options
- Academic Departments
- Departmental Faculty
- Corporate Employers
- The State of Florida
- The Software Industry
Opportunities to Partner: USF-FAMU
- Artifacts testbed development
- TestLab student exchange
- Student/Faculty Internship(s)
- State-of-Practice studies
- Thesis committee participation
- Scholar in residence
Questions?