Software Testing at Florida A&M -- PowerPoint PPT Presentation (Transcript)
1
Software Testing at Florida A&M University:
Perspective, Direction and Opportunities
  • Dr. Edward L. Jones
  • CIS Department
  • Florida A&M University
  • Tallahassee, FL

2
  • Part I
  • INTRODUCTION

3
My Institution
  • 12,000 students total
  • 600 CIS majors
  • 15 Faculty
  • Heavy teaching
  • Graduate program (20 students)
  • Increasing research

4
My Experience
  • 13 years Harris Corporation
  • Software engineering technology
  • $200M NASA project
  • methodology expert
  • software inspections, testing
  • process training
  • 20 years university teaching
  • 3 years focusing on testing

5
The Vision
  • Quality Awareness in Curriculum
  • Solid Framework
  • Enhanced Student Skill Set
  • Expanded Student Opportunities
  • Industry Partnerships
  • Research Dissemination

6
Software TestLab - Context
[Diagram: the Software TestLab exchanges artifacts with CS courses,
Technology Transfer, Testing Enthusiasts (donated/public test artifacts),
Graduate Testing Research, the Test Arcade (Arcade Problems with meta-data),
Certification/Tutorials, and TestLab Student Training.]
7
My Perspective
  • Testing is not just for testers!
  • In an ideal world, fewer testers are required
  • Verification vs. testing
  • More skilled developers
  • No silver bullet, just bricks
  • Simple things provide leverage
  • Testing in-the-small -- classroom and lab
  • Technology transfer is crucial

8
SPRAE Testing Framework
  • Specification -- basis for testing
  • Premeditation -- follow a process
  • Repeatability -- tester independence
  • Accountability -- documentation
  • Economy -- cost effectiveness
9
Why SPRAE?
  • A value system for testing practice
  • Explains the what and why
  • Sets goals for test experiences
  • Each experience reinforces values
  • A framework for life-long learning.

10
A Test Life Cycle
11
  • Part II
  • EDUCATION MISSION

(What can Academics Do Besides Teaching A Course?)
12
Caught and Taught
  • Caught
  • Attitudes
  • Respect for consequence of not testing
  • Taught
  • Techniques for specific kinds of testing
  • Basic analytical and programming skills
  • Strategy must contain both elements

13
A Holistic Approach
14
What Is Meant By Holistic?
  • Testing an integral part of curriculum
  • Multiple test experiences
  • Experiences in each course
  • Repetition and reinforcement
  • Accumulation of experiences
  • Experiences cover test lifecycle
  • Holistic, NOT Exhaustive!

15
A Software Testing Course
  • 80% practice, 20% theory
  • 2/3 fine-grained testing (functions)
  • 1/3 object and application testing
  • Test cases, drivers, and scripts
  • Decision tables as the "formalism"
  • Function, boundary, white-box testing
  • Effectiveness: coverage, error seeding

16
Course -- Lessons Learned
  • Advantages
  • A "taste" of how hard testing really is
  • Preparation for advanced study
  • Rounds out analytical and programming skills
  • Deficiencies
  • Course not available to all students
  • Students compartmentalize knowledge

17
Technology Transfer -- Program Grading Services
  • Provided by TestLab students
  • Experience designing automated tests
  • Shell programming
  • Instructor plays active part
  • Refines specification
  • Creates grading plan
  • Benefits
  • Faculty more inclined to incorporate testing
    module into course

18
Technology Transfer -- Plagiarism Deterrent
  • Mutation analysis/error seeding applied to
    suspicious student program
  • Student must debug modified program
  • TestLab project: tunable error seeding
  • Benefits
  • Tool for teaching students to debug
  • Less plagiarism or more skilled programmers
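The error-seeding idea above can be sketched in a few lines. This is a hypothetical helper in Python (the names and the operator set are mine, not the TestLab tool): it swaps one operator in a source string, producing a mutant the student must debug.

```python
import random
import re

# Operator swaps used to seed a fault (illustrative choices, not the
# TestLab tool's actual mutation set).
MUTATIONS = {"+": "-", "-": "+", "*": "//", "<": "<=", ">": ">="}

def seed_error(source: str, rng: random.Random) -> str:
    """Return a mutant of `source` with one randomly chosen operator swapped."""
    # Locate every occurrence of a mutable operator in the source text.
    sites = [(m.start(), m.group())
             for m in re.finditer(r"<=|>=|[+\-*<>]", source)
             if m.group() in MUTATIONS]
    if not sites:
        return source  # nothing to mutate
    pos, op = rng.choice(sites)
    return source[:pos] + MUTATIONS[op] + source[pos + len(op):]

# Usage: seed one fault into a tiny program.
rng = random.Random(0)
mutant = seed_error("def area(w, h): return w * h", rng)
```

Making the seeding "tunable", as the slide suggests, would amount to parameterizing how many sites are mutated and which operator classes are eligible.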

19
Core Course Test Experiences
  • Grade another student's program
  • Blind test -- develop specification by testing a
    working program.
  • Treasure Hunt -- find/fix seeded errors
  • Write test cases before coding
  • Develop, certify and reuse components

20
The Software TestLab
  • Environment for discovery learning
  • Basic testing skills
  • Mentoring / Competency based training
  • Evolving Laboratory
  • Tools and tutorials
  • Staffed by students and faculty
  • Problems/solutions test bed
  • Dissemination Strategy

21
Software TestLab - Context
[Diagram repeated from slide 6.]
22
Student Mentorship Model
  • Managed skill development
  • Clear achievement goals
  • Key Practices x Competency Levels
  • Progress certification
  • Student-Student mentoring
  • Recognition Program

23
Key Practices
  • Practitioner -- performs defined test
  • Builder -- constructs test machinery
  • Designer -- designs test cases
  • Analyst -- sets test goals, strategy
  • Inspector -- verifies process/results
  • Environmentalist -- maintains test tools
    environment
  • Specialist -- performs test life cycle.

24
Specialist I - Competencies
[Matrix: competency levels 1-5 for each key practice -- Test Practitioner,
Test Builder, Test Designer, Test Analyst, Test Inspector, and Test
Environmentalist -- combining into the Test SPECIALIST.]
25
Test Specialist I
  • Practitioner I - Run function test script;
    document test results.
  • Builder I - Develop function test driver.
  • Designer II - Develop functional and boundary
    test cases from decision table.
  • Analyst I - Develop decision table from
    functional specification.
  • Inspector I - Verify adherence to function test
    process and standards.

26
Training Topics
  • TestLab Environment Basic Training
  • Black-Box Function (unit) Testing
  • Black-Box Object Testing
  • White-Box Function Testing
  • Clear-Box Object Testing
  • . . .

27
Problems/Solutions Testbed
  • Repository of testing problems
  • Repository of student test artifacts
  • Best in class → solutions testbed
  • Deficient solutions → almost testbed
  • Testbed used for tester certification

28
The Test Arcade
  • Fun and Games approach
  • Players compete solving testing problem
  • Scored on time and accuracy
  • Ranked list of players by score
  • HELP facility provides a "teaching" mode
  • Supports testing contests, certification
  • NSF Funding requested

29
  • Part III
  • RESEARCH ACTIVITIES

30
Research Obstacles
  • Search for the silver bullet
  • All eggs in one basket, i.e., testing only
  • In-the-large empirical studies
  • Access to project data
  • Experimental design
  • Graduate/research student pool
  • Critical mass of faculty

31
Research Funding
  • $85K corporate support
  • $50K from Dell Star Program (TestLab)
  • $35K for student / travel
  • Task on $1.5M NSF MII grant
  • $500K NSF ITR proposal submitted (2x)

32
Seminal Projects
  • Decision-based test methods
  • Reliability testing training simulator
  • Test arcade
  • Testing via design mutation
  • Test patterns/verification agents
  • Other Ideas

33
Decision Based Testing (DBT)
  • Decision Table
  • Logic model, lightweight formality
  • Simple to teach
  • Systematic test condition generation
  • Column → behavioral equivalence class
  • Conditions → basis for boundary analysis
  • Complication
  • DT variables computed from inputs

34
Simple Example: DBT and SPRAE
  • Compute pay with overtime calculation based on
    employee classification
  • Example illustrates
  • SPRAE principles
  • Test case design
  • Opportunities for automation

35
Principle: Specification
Specification: Compute pay for an employee, given
Hours worked and hourly pay Rate. Overtime is 1.5
times hourly Rate, for Hours above 40. Salaried
employees earn over $20/hour, and are always paid
for 40 hours.
36
Principle: Premeditation
Use a systematic process to devise test cases
based on the specification.
  • Functional testing
  • Decision rule (column) → behaviors
  • One test case per behavior
  • Determine expected result.

37
Decision Analysis
[Table: decision table mapping the conditions (Salaried, i.e. Rate > $20?
Hours > 40?) to each pay Behavior.]
38
Functional Test Cases
[Table: one functional test case per decision-table column.]
39
Boundary Test Cases
[Table: boundary test cases at the condition boundaries, Hours = 40 and
Rate = $20.]
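The pay specification and its decision-table test cases can be made concrete as a minimal Python sketch (the function and case names are mine; the rules follow the specification on the previous slide):

```python
def compute_pay(hours: float, rate: float) -> float:
    """Pay per the specification: overtime at 1.5x above 40 hours;
    salaried employees (Rate > $20/hour) always paid for 40 hours."""
    if rate > 20:                                    # salaried column
        return 40 * rate
    if hours <= 40:                                  # hourly, no overtime
        return hours * rate
    return 40 * rate + 1.5 * rate * (hours - 40)     # hourly with overtime

# One functional test case per decision-table column, plus a boundary
# case at Hours = 40 and Rate = $20.
CASES = [
    (50, 30, 1200.0),   # salaried; hours ignored: 40 * 30
    (30, 10,  300.0),   # hourly, under 40 hours
    (50, 10,  550.0),   # hourly with overtime: 400 + 150
    (40, 20,  800.0),   # boundary: exactly 40 hours, exactly $20/hour
]
for hours, rate, expected in CASES:
    assert compute_pay(hours, rate) == expected
```

Each tuple plays the role of one decision-table column; the expected result is premeditated from the rule, not from running the code.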
40
Principle: Repeatability
Processes for test case creation and test
execution must yield equivalent results,
independent of the tester.
  • Systematic processes for deriving
  • Functional test cases
  • Boundary test cases
  • Processes can be automated

41
Repeatability -- Test Scripts
[Figure: test scripts for the functional and boundary test cases.]
42
Principle: Accountability
Keep records that document test process and
artifacts.
  • Documentation answers
  • What tests were planned?
  • Which tests were conducted?
  • Who did what testing, and when?
  • What were the results?
  • How were the results interpreted?

43
Accountability -- Test Log
[Figure: test log with Test Results Summary and Failure Description
sections.]
44
Principle: Economy
Test activities must not require
excessive time or effort.
  • Systematic processes
  • Risk-based
  • Automation
  • Test drivers/harnesses (classical)
  • Test case generation/assistance
  • Test results analysis (oracle, database).

45
DBT -- Related Work
  • State-based testing
  • State models are a form of decision table
  • State explosion problem -- infeasibility of
    exhaustive testing
  • Published methods
  • Parnas Software Cost Reduction (SCR) methodology
  • Object-oriented testing

46
Commercial/Research Decision Table Tool
  • Prologa (PROcedural Logic Analyzer)
  • Interactive, rule-based
  • Table construction and manipulation
  • Facilitates tasks such as checking, adding or
    reordering conditions.
  • Reference
  • Robbin, F., Vanthienen, J., Developing Legal
    Knowledge Based Systems Using Decision Tables,
    Communications of the ACM, 1993.

47
TestLab Decision-Based Test Tools
[Diagram (flow approximate): DT-Editor produces a decision table (DT);
DT-2-logicspec derives Test Data Specifications; TD-verify checks
Testset Adequacy of test data; BTD-Generate produces Boundary Testdata;
DT-Reverse recovers a DT from Source Code.]
48
Custom Decision Table Tools
  • Support software specification and testing
  • TD-verify determines functional test set adequacy
    based on DT rules
  • Post facto verification of white-box coverage
    (when DT-Reverse is used)
  • Execution-free adequacy test
  • Formalism to support test arcade

49
Seminal Projects
  • Decision-based test methods
  • Reliability testing training simulator
  • Test arcade
  • Testing via design mutation
  • Test patterns / verification agents
  • Other Ideas

50
Test Arcade
Problem: Testbed of problem templates;
problem/answer generator; presentation;
response manager; score manager.
Approach: Identify problem templates and develop a
knowledge base for representation and generation.
Target problems to specific skill levels.
Research: Refine competency levels x test
instances. Accumulate a problem set per (level,
instance) pair. Tool: database + web access.
51
Seminal Projects
  • Decision-based test methods
  • Reliability testing training simulator
  • Test arcade
  • Testing via design mutation
  • Test patterns / verification agents
  • Other Ideas

52
Reliability Test Simulator
  • Prototype from Graduate IV&V Course
  • Published in ISSRE 2001 Proceedings
  • Excerpts from Presentation

53
Reliability Simulation -- TBDs
  • Simulation for experimentation
  • Imperfect defect removal
  • Operational testing
  • Thesis topic(s)
  • Model validation vs. published/industry results
  • Management decision-support via simulation
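The imperfect-defect-removal idea can be sketched as a toy simulation (my own minimal model, not the published simulator): each interval of operational testing may expose defects, but a repair attempt only succeeds with some probability, so failed fixes leave the defect in place.

```python
import random

def simulate(n_defects: int, detect_prob: float, fix_prob: float,
             intervals: int, rng: random.Random) -> list[int]:
    """Return remaining-defect counts after each test interval."""
    remaining = n_defects
    history = []
    for _ in range(intervals):
        # Each remaining defect is independently exposed this interval.
        exposed = sum(rng.random() < detect_prob for _ in range(remaining))
        # Imperfect removal: each exposed defect is actually repaired
        # only with probability fix_prob.
        fixed = sum(rng.random() < fix_prob for _ in range(exposed))
        remaining -= fixed
        history.append(remaining)
    return history

# Usage: one reliability-growth trajectory under imperfect removal.
rng = random.Random(42)
curve = simulate(n_defects=50, detect_prob=0.1, fix_prob=0.7,
                 intervals=30, rng=rng)
```

Validating such a model against published/industry results, as the slide proposes, would mean fitting `detect_prob` and `fix_prob` to observed failure data.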

54
Seminal Projects
  • Decision-based test methods
  • Reliability testing training simulator
  • Test arcade
  • Testing via design mutation
  • Test patterns / verification agents
  • Other Ideas

55
Design Mutation
  • Reflects design misinterpretations
  • Mutant killed when Fail_k is nonempty

[Diagram: the Design is mutated into variants D1..Dn; each variant Dk is
exercised by test set Tset_k, yielding failure set Fail_k.]
56
Mutation of Decision Table
[Table: original decision-table behavior beside the Mutant Behavior.]
57
Impact of Mutations on Functional Test Cases
[Table: mutated columns yield bad test cases.]
58
Design Mutation -- Prospectus
  • Application
  • Requirements/design models
  • Guided inspection -- ensure mutants not being
    created
  • Testing -- creation of BAD test cases
  • Practical if automated
  • Facilitated by formal representation
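The kill criterion can be illustrated with a toy Python sketch (my own example, reusing the pay decision table; the mutated condition is hypothetical): mutate one decision rule, rerun the original functional test set, and collect the failure set.

```python
def pay(hours, rate):
    """Original design: salaried means Rate strictly above $20/hour."""
    if rate > 20:
        return 40 * rate
    if hours <= 40:
        return hours * rate
    return 40 * rate + 1.5 * rate * (hours - 40)

def pay_mutant(hours, rate):
    """Design mutant: the salaried condition misread as rate >= 20."""
    if rate >= 20:
        return 40 * rate
    if hours <= 40:
        return hours * rate
    return 40 * rate + 1.5 * rate * (hours - 40)

# Original functional test set, including cases at the $20 boundary.
tset = [(50, 30), (30, 10), (50, 10), (40, 20), (50, 20)]
# Fail_k: inputs where the mutant disagrees with the original design.
fail = [(h, r) for h, r in tset if pay(h, r) != pay_mutant(h, r)]
assert fail  # nonempty, so this mutant is killed
```

Note that only the boundary case (50, 20) distinguishes the mutant: a test set without it would let this design misinterpretation survive.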

59
Seminal Projects
  • Decision-based test methods
  • Reliability testing training simulator
  • Test arcade
  • Testing via design mutation
  • Test patterns / verification agents
  • Other Ideas

60
Test Patterns (Example)
  • GUI input controls
  • Properties determine required testing
  • Derive black-box test patterns for GUI input
    controls
  • Propose automated test agents knowledgeable about
    a specific test pattern

61
Test Patterns (Continued)
  • Test patterns for design patterns
  • Empirical test patterns reflecting organizational
    practices
  • Reverse engineer practices into patterns
  • Forward engineer patterns into practice

62
Test Agents
  • Properties of Agents
  • Responsibility
  • Knowledge
  • Autonomy
  • Scope of Agents
  • Functional Unit (F)
  • Architectural Component (A)
  • System (S)

63
Test Agents -- Prospectus
  • Intelligence
  • Generator of checklist / procedure for human
    tester
  • Watch-dog and status reporter
  • Reminder to complete test action
  • Performer of test action
  • Coordinator of other test agents

64
Seminal Projects
  • Decision-based test methods
  • Reliability testing training simulator
  • Test arcade
  • Testing via design mutation
  • Test patterns / verification agents
  • Other Ideas

65
Testing: A State-of-Practice Study
Problem: Develop a method for characterizing the
state of practice in software testing.
Approach: Use frameworks like the Test Maturity
Model and SPRAE to characterize/assess the state of
practice.
Research: Relate attributes of testing practice
to qualitative and quantitative indicators of
effectiveness and satisfaction. Devise an
easy-to-use evaluation system that identifies
areas needing improvement and provides insightful
measures of performance.
66
Adaptive Operational Testing
Problem: Perfecting software by OT is biased by
expected feature usage. Even when errors are
flushed out for one feature, the test emphasis
remains the same. OT leads to a slow rate of error
discovery in other features.
Approach: Given features A, B, C with probabilities
p_A, p_B, p_C, and MTBFs of t_A, t_B, t_C, shift
feature probability as feature reliability
increases.
Research: Determine criteria for shifting the p's so
that feature starvation does not occur. Tool:
reliability simulator to prototype solutions.
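One simple way to shift the p's (my own illustration, not the proposed criterion) is to weight feature selection inversely to observed MTBF, so already-reliable features get proportionally less test emphasis:

```python
def reweight(mtbf: dict[str, float]) -> dict[str, float]:
    """Selection probability proportional to 1/MTBF, normalized to sum to 1."""
    inv = {feature: 1.0 / t for feature, t in mtbf.items()}
    total = sum(inv.values())
    return {feature: v / total for feature, v in inv.items()}

# Feature A is most reliable (highest MTBF), so it receives the least
# test emphasis after reweighting.
probs = reweight({"A": 100.0, "B": 20.0, "C": 10.0})
assert abs(sum(probs.values()) - 1.0) < 1e-9
assert probs["C"] > probs["B"] > probs["A"]
```

Avoiding feature starvation, as the research question asks, would require a floor on each p or a softer weighting than pure 1/MTBF.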
67
Testing and Reverse Engineering
  • Testing answers the questions
  • What do we have here? (exploratory)
  • Is this what we wanted? (acceptance)
  • Is this what we expected? (scripted)
  • Testing is the last chance to document the
    as-built system
  • Exploratory testing -- can it be sooner?

68
Testability via Self-Diagnostics in Objects
Problem Lack of observability Voas in O-O
software complicates testability. Encapsulation
prevents error propagation to outside.
Approach Design in self diagnostics along with
means of propagation to outside.
Research Research observability problem in
object-based testing. Apply and extend
frameworks/methods like JUnit to implement
self-diagnostics.
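The designed-in self-diagnostics idea can be sketched in Python (rather than JUnit; the class and its invariants are mine): the object exposes a check of its own encapsulated invariants, so a test harness can observe internal health without breaking encapsulation.

```python
class BoundedStack:
    """Toy object with designed-in self-diagnostics."""

    def __init__(self, capacity: int):
        self._items: list[int] = []
        self._capacity = capacity

    def push(self, x: int) -> None:
        # Silently refuse pushes beyond capacity (keeps the invariant).
        if len(self._items) < self._capacity:
            self._items.append(x)

    def self_check(self) -> list[str]:
        """Return a list of violated invariants; empty means healthy.
        This is the designed-in propagation path to the outside."""
        problems = []
        if len(self._items) > self._capacity:
            problems.append("size exceeds capacity")
        if self._capacity < 0:
            problems.append("negative capacity")
        return problems

# Usage: the harness observes internal state via the diagnostic hook.
s = BoundedStack(2)
s.push(1); s.push(2); s.push(3)   # third push is refused
assert s.self_check() == []        # invariants hold, observable externally
```

A JUnit-style extension would call such a hook automatically after every test method, turning each test into an invariant probe as well.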
69
  • Part IV
  • CONCLUSION

70
Support Acknowledgment
  • Lucent Technologies
  • Dell Computer
  • 3M
  • Telcordia
  • Students
  • NSF (EIA-9906590)
  • Lockheed Martin
  • Abbott Laboratories
  • Eli Lilly and Company
  • Software Research, Inc.

71
Future
  • Evolve TestLab Mentorship Model
  • Transfer to Selected Courses
  • TestLab students transfer agents
  • Disseminate Results
  • Web site, newsletter, conferences
  • Procure Funding for Research
  • Find Research Collaborators

72
  • Opportunities To

73
Questions?