Successful Methods in Test Automation - PowerPoint PPT Presentation
1
Successful Methods in Test Automation
  • Ray Arell
  • (ray.arell@intel.com)

The views expressed are those of the presenter,
not his employer.
2
Agenda
  • Background
  • Lessons learned
  • Questions

3
Background: Example of Large-Scale Automation
An enterprise test and platform automation
infrastructure: testing is ordered via the
intranet, run automatically on parallel systems
for fast TPT, and results are emailed back to the
user.
  • Today's Users
    • 160 total worldwide
    • Build, Regression, Functional, and Coverage testing
    • Typical usage: 450K tests/wk
    • 20M tests run in 2002
  • Test Floor Systems
    • 140 test systems (200 by EOY)
    • 8 platforms (10 by EOY)
    • 6 OS versions (multiplied by DX versions)
    • 19K tests

4
Parallel Execution: Faster Time-to-Information
300 people-hours reduced to 30 machine-hours
5
Background: Manual vs. Automated
  • Manual Execution
    • 30 people testing (~$90K/month)
    • 20,000 tests per month
    • $4.50 per test
  • Automation
    • 25 people maintaining/enhancing (~$80K/month)
    • 2,000,000 tests per month
    • 100X vs. manual
    • $0.04 per test
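The per-test costs above are simple division of monthly spend by monthly test volume; a minimal sketch using the slide's figures (variable names are illustrative):

```python
# Cost-per-test arithmetic from the slide's figures.
manual_monthly_cost = 90_000        # 30 people testing, ~$90K/month
manual_tests_per_month = 20_000
manual_cost_per_test = manual_monthly_cost / manual_tests_per_month  # $4.50

auto_monthly_cost = 80_000          # 25 people maintaining/enhancing, ~$80K/month
auto_tests_per_month = 2_000_000
auto_cost_per_test = auto_monthly_cost / auto_tests_per_month        # $0.04

# Volume gain: 2M vs. 20K tests per month.
throughput_gain = auto_tests_per_month / manual_tests_per_month      # 100x
print(manual_cost_per_test, auto_cost_per_test, throughput_gain)
```

Note the staffing is nearly the same in both columns; the win is throughput, not headcount.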

6
Background: It can get big
7
Lessons learned: Test integration and management
  • Do not try to automate a moving test target!
  • Establish standards for how automated tests are
    written
  • Have and maintain an owner for each test and
    suite
  • Establish a low-touch test qualification/integration
    process
  • Version your tests and keep good records of
    changes
  • Run virus detection on all software!
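The owner-per-test and versioning rules above can be enforced mechanically at qualification time. A minimal sketch, assuming a per-test metadata manifest (the field names and `qualify` helper are hypothetical, not from the original system):

```python
# Sketch of low-touch test qualification: reject any test whose
# manifest is missing an owner or a version, per the slide's rules.
required_fields = {"name", "owner", "version"}

def qualify(manifest: dict) -> bool:
    """Return True only if every required metadata field is present."""
    return required_fields <= manifest.keys()

good = {"name": "cache_stress", "owner": "jdoe", "version": "1.2"}
bad = {"name": "orphan_test"}   # no owner -> nobody to maintain it

print(qualify(good), qualify(bad))
```

Gating integration on metadata like this keeps the check low-touch: no human review is needed to catch ownerless or unversioned tests.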

8
Lessons learned: Test execution
  • Focus on parallel execution
    • Remember that the longest test dictates the TPT
  • It needs to be fully automatic
    • Enables "code by day and test by night"
  • Implement administration functions
    • Monitor infrastructure health
    • Stop/pause testing
    • Adjust test priority
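The "longest test dictates the TPT" point can be seen directly with a toy parallel run: with enough workers, total turnaround tracks the single longest test rather than the sum of all durations. A sketch (test names and durations are invented):

```python
# Sketch: run independent tests in parallel; total turnaround (TPT)
# is bounded below by the longest single test, not the sum.
from concurrent.futures import ThreadPoolExecutor
import time

tests = {"boot_smoke": 0.05, "regression_suite": 0.2, "coverage_run": 0.1}

def run_test(name, duration):
    time.sleep(duration)          # stand-in for real test execution
    return name, "PASS"

start = time.monotonic()
with ThreadPoolExecutor(max_workers=len(tests)) as pool:
    results = list(pool.map(lambda kv: run_test(*kv), tests.items()))
elapsed = time.monotonic() - start

# Elapsed time tracks the 0.2s regression_suite, not the 0.35s total.
print(results, round(elapsed, 2))
```

This is why splitting one long suite into smaller independent tests often buys more TPT than adding machines.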

9
Example: Test Monitor Application
10
Lessons learned: Use a web interface
  • Gives the ability to control and order testing
    worldwide via the intranet, 24/7
  • Allow users to:
    • Upload the SW product to be tested
    • Monitor progress
    • Control testing
  • Make the interface user-context driven

11
Example: Web Interface
12
Example: Monitor usage
13
Lessons learned: Debug features
  • Design your automation system to anticipate that
    tests will fail!
  • Design your tests to provide proper failure root
    cause
  • Tests should also work outside the automation
    infrastructure for easy product and test debug
  • Implement a remote debug capability
  • Structure your test execution to run simple to
    complex
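The simple-to-complex ordering above pays off in root-cause analysis: the first failure you hit is the simplest reproducer. A minimal sketch, with hypothetical test names and complexity ranks:

```python
# Sketch: order execution from simple to complex so the first
# failure is the cheapest reproducer for debug.
tests = [
    ("full_platform_scenario", 3),   # most complex
    ("unit_sanity", 1),              # simplest
    ("subsystem_integration", 2),
]

def first_failing(tests, results):
    """Run in complexity order; return the first failing test, if any."""
    for name, _complexity in sorted(tests, key=lambda t: t[1]):
        if not results[name]:
            return name              # simplest failure -> best debug target
    return None

results = {"unit_sanity": True,
           "subsystem_integration": False,
           "full_platform_scenario": False}
print(first_failing(tests, results))
```

Here both an integration and a full-platform test fail, but the ordering surfaces `subsystem_integration` first, narrowing where to look.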

14
Lessons learned: Product quality criteria linkage
  • Make sure each test has a voice in shipping the
    product
    • If not, question the ROI of running it
  • Directly link the passing of tests to when
    product development is done
  • Establish a clear escalation path
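"Each test has a voice in shipping" can be made literal by deriving the ship decision from release-linked test results. A minimal sketch (the gating function and test names are illustrative, not the presenter's actual system):

```python
# Sketch: development is "done" only when every release-linked
# test passes; any single failing gate blocks the ship decision.
def ready_to_ship(gate_results: dict) -> bool:
    """Every test linked to the release criteria must pass."""
    return all(gate_results.values())

gate_results = {"nightly_regression": True,
                "coverage_target": True,
                "platform_matrix": False}

print(ready_to_ship(gate_results))
```

The inverse is the ROI check: a test whose result could never flip this function's output has no voice, and its cost should be questioned.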

15
Example: Integrated Process
[Flowchart: code development/bug fix feeds an on-demand ULT
(pre-check-in); on a failure, a defect is logged and the code is
rebuilt with the latest baseline code; an optional self-check ULT
precedes check-in; checked-in code flows through an automated
nightly ULT regression (post-check-in) across the engineering,
pre-production, and production baselines, feeding the product DST.]
16
Lessons learned: Traceability
  • Keep records on all test results
    • Result
    • Last time passed
    • etc.
  • Put results on the web
  • Track your test performance
    • Pass/fail rates at each phase of your product
      life cycle
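The per-phase pass/fail tracking above reduces to a small aggregation over result records. A sketch, assuming one record per test run (the record fields and phase names are hypothetical):

```python
# Sketch: aggregate stored test-result records into pass rates
# per product life-cycle phase, for the web dashboard.
from collections import defaultdict

records = [
    {"test": "boot_smoke", "phase": "alpha", "passed": True},
    {"test": "boot_smoke", "phase": "beta",  "passed": True},
    {"test": "regress_17", "phase": "alpha", "passed": False},
    {"test": "regress_17", "phase": "beta",  "passed": True},
]

def pass_rate_by_phase(records):
    totals, passes = defaultdict(int), defaultdict(int)
    for r in records:
        totals[r["phase"]] += 1
        passes[r["phase"]] += r["passed"]       # True counts as 1
    return {phase: passes[phase] / totals[phase] for phase in totals}

print(pass_rate_by_phase(records))
```

Rising per-phase pass rates are the trend a release team wants to see; a flat or falling rate late in the life cycle is an escalation signal.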

17
Example Performance Tracking
Component Teams use automation s elf-check work
Nightly Regression team use automation data to
find regression issues
ECT (Next Page)
Formal Build
Eng Build
Self Build
Data Reaction
18
Lessons learned: Other Factors
  • Watch your test capacity
  • Make sure that people know the limitations of
    your infrastructure
  • Make sure management understands the automation
    ROI
  • Monitor your infrastructure health
  • Publish your known issue list
  • Publish a FAQ on the web
  • Lab space

19
Lessons learned: Other Factors
  • Evaluate "build it yourself" vs. 3rd party
  • Do a full needs assessment of your test library
  • Look for the key features needed in the tool kit
  • Talk to other customers of the tool
    • Ask about maintainability
    • Scalability
    • Training
  • Evaluate the support cost of all tool options
    • License fees for both test clients and servers
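The build-vs-buy evaluation above ultimately reduces to comparing total cost over the same support horizon, including per-client and per-server license fees. A sketch with entirely hypothetical dollar figures (none of these numbers come from the presentation):

```python
# Sketch: total-cost comparison for "build it yourself" vs. a
# 3rd-party tool. All figures are invented for illustration.
def total_cost(upfront, yearly_support, client_licenses, server_licenses,
               per_client_fee, per_server_fee, years):
    licenses = (client_licenses * per_client_fee
                + server_licenses * per_server_fee)
    return upfront + licenses + yearly_support * years

# Build: high upfront engineering, no license fees.
build = total_cost(upfront=500_000, yearly_support=150_000,
                   client_licenses=0, server_licenses=0,
                   per_client_fee=0, per_server_fee=0, years=3)

# Buy: lower upfront, but license fees for every client and server.
buy = total_cost(upfront=100_000, yearly_support=80_000,
                 client_licenses=140, server_licenses=8,
                 per_client_fee=1_000, per_server_fee=5_000, years=3)

print(build, buy)
```

The point of the exercise is the shape of the comparison, not the numbers: client/server license fees scale with the test floor, so they dominate at large fleet sizes.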

20
Questions?
21
About the Presenter
  • Ray Arell manages the Validation Architecture
    department within Intel's Desktop Platforms
    Group. Ray has over 18 years of development,
    validation, and management experience, and has
    been with Intel for 11 years. During this time,
    he has worked on a variety of teams focused on
    CPU, chipsets, and graphics system-level testing
    including the i386, i486, and Pentium
    processors, and supporting chipsets. He is also
    the co-author of Change-Based Test Management:
    Improving the Software Validation Process (ISBN
    0971786127).
  • Email: ray.arell@intel.com