1
Introducing Test Automation Into An Organization
  • Lessons Learned So Far

Lorraine Bier
2
Presenting Lessons Learned During Test
Automation: Overview
  • Some background: who we are and where we were
    when we decided to automate
  • Our approach to automation
  • First step: deciding on the overall approach
  • Second step: establishing the list of tests to
    automate
  • Third step: choosing the tools to use
  • Fourth step: defining each test sufficiently for
    automation
  • Fifth step: writing and deploying the tests
  • Lessons learned at each step of the way!

3
Background on Crossroads Systems
  • Austin-based company (CRDS, www.crossroads.com)
  • Founded 1994
  • Currently 130 employees
  • 26 programmers
  • 16 testers
  • Developer/manufacturer of storage routers
  • Permits multi-protocol servers and storage
    devices to operate as peers on storage networks
  • Easy, cost-effective way to store and retrieve
    data and protect its integrity
  • Boxed and embedded products
  • Custom-designed cards (not chips)
  • Real-time embedded software

4
The Motivation for Automation
  • Test organization was using manual procedures
  • Protocol analyzers and error-injection tools were
    also used
  • In use since start-up days when no formal product
    specs were available
  • Based on programmers' descriptions of the product
    as built; no traceability to the source of the data
  • High-level procedures required very experienced
    testers
  • Barrier to hiring new personnel
  • Decided to automate in 2002
  • Wanted to gain the benefits of:
  • Execution speed
  • Repeatability of results
  • Easier execution for testers
  • The products' use of standard protocols makes them
    good candidates for automation

5
Step 1: Deciding the Overall Approach
  • Question: automate existing procedures or start
    fresh?
  • Evaluation of the existing procedures showed
    problems:
  • Incomplete coverage, unclear purpose,
    insufficient evaluation criteria
  • Our decision: start fresh! (do not automate old
    procedures)
  • Better able to ensure even test coverage by
    focusing on the product, not old procedures
  • Game plan:
  • Formed a dedicated automation team separate from
    testers supporting projects
  • Created an outline of needed tests
  • Did not look to create an integrated test bed
  • Surveyed automation tools, performed tradeoff
    analysis, and selected the tools needed to support
    the tests
  • Prioritized tests to introduce automation where
    it's most needed
  • Refined documentation processes and templates to
    support the effort
  • Started automating in priority order

6
Step 2: Planning the List of Tests
  • Goal: complete, non-redundant coverage
  • Focused on software functionality, but must
    exercise all supported platforms
  • Must support both feature-verification testing
    and overall system behavior
  • Approach: architects and senior engineers
    brainstormed and refined the test category structure
  • Several proposed structures were considered and
    rejected
  • Settled on an outline that reflects the software
    architecture, then specific protocol standards
  • The structure's outline is nested three deep
    (sketched below)
  • Reflecting software architecture helps ensure
    coverage
  • More intuitive for software developers reviewing
    the test information
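
A minimal sketch of what such a three-deep outline could look like in Python; the category names here are hypothetical illustrations, not Crossroads' actual structure. The top level mirrors the software architecture, with protocol standards and concrete test areas below it:

  # Hypothetical three-deep test category outline: architecture first,
  # then protocol standards, then concrete test areas.
  TEST_OUTLINE = {
      "Protocol Front End": {
          "Fibre Channel": ["Login/Logout", "Frame Handling"],
          "SCSI": ["Command Set", "Error Recovery"],
      },
      "Management Interface": {
          "SNMP": ["MIB Coverage", "Trap Generation"],
      },
  }

  def walk(outline, depth=0):
      # Print the outline so reviewers can eyeball coverage and overlap.
      for name, children in outline.items():
          print("  " * depth + name)
          if isinstance(children, dict):
              walk(children, depth + 1)
          else:
              for leaf in children:
                  print("  " * (depth + 1) + leaf)

  walk(TEST_OUTLINE)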

7
Step 3: Choosing the Tools
  • Looked for tools that supported the major
    functional test areas
  • Wanted as small a set as possible
  • Performed a tradeoff analysis (a scoring sketch
    follows this slide)
  • Got vendor demos
  • Ran evaluation copies
  • Established criteria (including price
    considerations)
  • Decisions:
  • Acquired the tools that met the criteria:
  • I-tech Eagle 6160 for low-level FC
  • I-tech Passport for SANMark testing (FC)
  • Tabernus Aries for generating I/O
  • InterWorking Labs (IWL) Silvercreek for SNMP
    testing
  • Rational Robot for User Interface testing
  • Rational Test Director for overall automation
    test execution
  • Misc. freeware as available: Ethereal, IOmeter,
    FCONOFF, others
  • Decided against some tools that did not meet
    criteria
  • Interoperability problems, unjustifiable cost,
    customer support problems
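
The tradeoff analysis mentioned above can be as simple as a weighted scoring matrix. A minimal Python sketch; the criteria, weights, and candidate scores are purely illustrative, not the team's actual data:

  # Hypothetical weighted-criteria scoring for a tool tradeoff analysis.
  CRITERIA = {"coverage": 0.4, "ease_of_use": 0.2,
              "vendor_support": 0.2, "price": 0.2}

  candidates = {
      "Tool A": {"coverage": 4, "ease_of_use": 3, "vendor_support": 5, "price": 2},
      "Tool B": {"coverage": 5, "ease_of_use": 2, "vendor_support": 3, "price": 4},
  }

  def score(tool):
      # Weighted sum over all criteria (each scored 1-5).
      return sum(CRITERIA[c] * tool[c] for c in CRITERIA)

  for name, marks in sorted(candidates.items(),
                            key=lambda kv: score(kv[1]), reverse=True):
      print(f"{name}: {score(marks):.2f}")  # Tool B: 3.80, Tool A: 3.60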

8
Step 4: Defining Each Test Sufficiently for
Automation
  • The need for rigorous test designs was obvious:
    must decide what to do before how to do it
  • Want development engineers to review the test
    without having to learn the tool/read the code
  • Still making up for lack of complete requirements
    on oldest features
  • Iterated several times on best test design
    template for automation
  • Tried IEEE, DoD, etc.
  • Settled on what supports the most important
    information:
  • Assumptions and Constraints
  • Test Approach (narrative overview)
  • Test Cases
  • Test Cases subdivided into (sketched below):
  • Included
  • Potential
  • Excluded
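
One way to record this subdivision in a design template, sketched in Python. The three status values come from the slide; the field names and example cases are invented for illustration:

  from dataclasses import dataclass
  from enum import Enum

  class CaseStatus(Enum):
      INCLUDED = "included"    # implemented by this design
      POTENTIAL = "potential"  # candidate for a later iteration
      EXCLUDED = "excluded"    # deliberately out of scope

  @dataclass
  class TestCase:
      identifier: str
      objective: str
      status: CaseStatus
      rationale: str = ""  # especially useful for EXCLUDED cases

  cases = [
      TestCase("FC-001", "Verify port login sequence", CaseStatus.INCLUDED),
      TestCase("FC-002", "Fabric rebuild under load", CaseStatus.POTENTIAL),
      TestCase("FC-003", "Vendor-specific extensions", CaseStatus.EXCLUDED,
               "Not supported by the product"),
  ]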

9
Step 5: Writing and Deploying the Tests
  • First wrote the tools' programs and scripts
    according to the design
  • Modified the design as needed based on the tools'
    performance
  • Then wrote the procedures for test techs to
    execute them
  • Reviewed the procedures with the test executors
  • Gave hands-on training in set-up and execution
  • The review and hands-on demos served as a
    verification test for the procedures

10
Lessons Learned From the Overall Plan
  • Automation needs highly specialized people
  • Strong test theory, programming abilities, domain
    expertise, sys admin skills
  • Most testers cannot be automators
  • Dedicated team was necessary but has drawbacks
  • Must allow automators to focus on technical
    issues, not a specific project's time constraints
    and needs
  • Automation and Execution groups can grow apart,
    lose touch with each other's efforts

11
Lessons Learned from Establishing the Test List
  • Partitioning a system completely into
    non-overlapping pieces is hard!
  • Supporting both function and system testing added
    complexity
  • When partitioning
  • Keep it simple; expand as needed
  • Be concrete regarding the scope and purpose of
    each category
  • When prioritizing
  • Create a weighting scheme to deliver the biggest
    bang for the buck; consider (sketched below):
  • Frequency of execution
  • Difficulty of automation
  • Time for automation vs. manual execution
  • Tie test priorities to projects (helps motivate
    roll-out)
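
A weighting scheme over those three factors can be reduced to a single score. This is a hedged sketch: the formula, scale, and example inputs are hypothetical and should be tuned to your own projects:

  # Illustrative priority score built from the three factors above.
  def automation_priority(runs_per_year, manual_hours,
                          automation_effort_hours, difficulty):
      # difficulty: 1 (easy) .. 5 (very hard), used as a penalty.
      yearly_savings = runs_per_year * manual_hours
      payback = yearly_savings / automation_effort_hours  # rough ROI
      return payback / difficulty

  # A smoke test run 50x/year, 2h manual, 40h to automate, easy: high priority.
  print(automation_priority(50, 2, 40, difficulty=1))  # -> 2.5
  # A rarely run, hard-to-automate test scores far lower.
  print(automation_priority(4, 3, 80, difficulty=4))   # -> 0.0375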

12
Lessons Learned from Choosing the Tools
  • Caveat Emptor
  • Marketing terms can be misleading regarding scope
    of capabilities
  • Check the tool's supported version of a standard
    vs. your implemented version
  • Look at the tool's implementation details for
    ease of use
  • There can be restrictions on moving licenses from
    server to server
  • Dongles are a drag! Too easy to lose, need
    manual/physical tracking
  • Need good working relationships with tool
    vendors when debugging problems
  • Trade!  Send them equipment!
  • Consider whitepapers, joint findings
  • Be prepared for tool vendors to change their
    product line out from under you!
  • Keep good records of communications, commitments

13
Lessons Learned from Defining the Tests
  • Just because it can be done doesn't mean you
    should do it
  • Perform a build-or-buy analysis
  • Check you've made the appropriate tool choice
  • Some tests are not worth the time spent
    automating, given their frequency or duration of
    manual execution
  • Test Design documents were extremely beneficial
  • Excellent vehicle for programmer inputs
  • Fosters team building, inter-group cooperation
  • Found many bugs just writing the designs
  • Especially efficient when code is being developed
    concurrently with test cases
  • Allows outsourcing
  • NOTE: THESE POINTS ARE TRUE FOR CREATING MANUAL
    TESTS, TOO!

14
Lessons Learned from Writing Tests
  • When automating, you'll be debugging the tools
    along with your procedures and the product
  • One more thing to be broken!
  • Be prepared for tools not working as expected
  • Wait on fix or redo approach?
  • Helps to have a tool expert for each tool
  • Go-to person for questions
  • Standardize engineering for tool usage
  • Expect maintenance
  • Allow time to learn the tool
  • Remember to engineer for reusability across
    products (sketched below)
  • Test automation still takes lots of engineering.
    It takes time!
  • It is a software development effort in itself.
    (If there are several man-years in a function, why
    would you expect to code the test in weeks?)
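
A minimal pytest-style sketch of engineering for reuse: the test body stays constant while product specifics live in a parameterized fixture. The product names and fixture contents are hypothetical:

  import pytest

  PRODUCTS = ["router_model_a", "router_model_b"]  # hypothetical models

  @pytest.fixture(params=PRODUCTS)
  def device(request):
      # A real harness would open a connection to the unit under test here.
      return {"model": request.param, "mgmt_ip": "192.0.2.1"}

  def test_reports_model(device):
      # Placeholder check; a real test might query sysDescr over SNMP.
      assert device["model"] in PRODUCTS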

15
Lessons Learned from Deploying the Tests
  • Problems getting tests out of the development
    group and into the hands of test techs
  • Training and hand-holding help
  • Schedule time for it!
  • Working on tests they are most interested in
    helps
  • A static configuration fosters execution
  • If the automated test is too difficult to use, it
    reduces the advantage of the automation
  • People will avoid running it
  • Resulting bugs' severity and importance can be
    hard to assess
  • Can't easily tie them back to real-world impact
  • Automated testing results taken less seriously by
    developers

16
Misc. Lessons, Final Thoughts
  • Configuration Management is crucial
  • Need to relate designs to procedures; they don't
    map 1:1 (see the mapping sketch below)
  • Need EVERYTHING under configuration control
  • Need good librarianship!
  • Metrics:
  • Average of 31 Included, 12 Potential, 13 Excluded
    test cases in test designs
  • Procedure length averages 9 pages
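
Because designs and procedures don't map 1:1, it helps to keep the mapping itself under configuration control. A sketch with invented document IDs:

  # Many-to-many mapping between test designs and the procedures that
  # implement them; IDs are hypothetical.
  DESIGN_TO_PROCEDURES = {
      "TD-FC-01": ["TP-017", "TP-018"],  # one design, two procedures
      "TD-SNMP-02": ["TP-018"],          # one procedure serving two designs
  }

  def procedures_without_design(all_procedures):
      # Flag procedures no design references: a basic librarianship check.
      referenced = {p for procs in DESIGN_TO_PROCEDURES.values()
                    for p in procs}
      return sorted(set(all_procedures) - referenced)

  print(procedures_without_design(["TP-017", "TP-018", "TP-099"]))
  # -> ['TP-099']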

17
Backup Slides
  • Reference Material

18
Definitions
  • Test Case
  • A set of inputs/execution conditions and expected
    results developed for a particular objective,
    such as exercising a particular program path or
    verifying compliance with a specific requirement.
  • Test Design
  • A description of test cases that should be
    exercised to prove correct product behavior.
  • Test Procedure
  • A deterministic, reusable, repeatable set of
    instructions for the set-up, execution, and
    evaluation of results. One or more Test
    Procedures implement a Test Design's cases.

19
Test Design Table of Contents
  • 1. Introduction
  • 1.1 Identification
  • 1.2 Definitions & Acronyms
  • 1.3 Design Assumptions and Constraints
  • 1.4 Test Objectives Identification
  • 1.4.1 Implemented Cases (table)
  • 1.4.2 Potential Cases (table)
  • 1.4.3 Excluded Cases (table)
  • 2. Test Cases
  • 2.1 (Test Case Name)
  • 2.1.1 Test Description
  • 2.1.2 Referenced Documents
  • 2.1.3 Technical Approach
  • 2.1.4 Configuration/Devices
  • 2.1.5 Required Test Tools
  • 2.1.6 Expected Results/ Qualification Method
  • 2.2 (next test case), etc.
  • Appendix 1: Test Case Rationale Discussion