1. Model Based Testing of Large-Scale Software
How Can a Simple Model Test a Complex System?
- Victor Kuliamin
- kuliamin@ispras.ru
- ISP RAS, Moscow
2. Real Software Systems
- They are huge and have a lot of functions
- They have very complex interfaces
- They are developed by hundreds of people
- They are distributed and concurrent
System              Year  Size (MLOC)
Windows 3.1         1992    3
Windows NT 3.1      1993    6
Windows 95          1995   15
Windows NT 4.0      1996   16.5
Red Hat Linux 5.2   1998   12
Debian Linux 2.0    1998   25
Windows 2000        1999   29
Red Hat Linux 6.2   2000   17
Sun StarOffice 5.2  2000    7.6
Debian Linux 2.2    2000   59
Red Hat Linux 7.1   2001   30
Windows XP          2001   45
Red Hat Linux 8.0   2002   50
Debian Linux 3.0    2002  105
System            Year  Dev Team Size
Windows NT 3.1    1993    200
Windows NT 3.5    1994    300
Windows NT 4.0    1996    800
Debian Linux 1.2  1996    120
Debian Linux 2.0  1998    400
Windows 2000      1999   1400
Debian Linux 2.2  2000    450
Debian Linux 3.0  2002   1000
3. Quality of Real Software Systems
- They are tested a lot
- But
- Details of their behavior are not well defined
- And they still do have serious bugs
System          Year  Test Team Size
Windows NT 3.1  1993   140
Windows NT 3.5  1994   230
Windows NT 4.0  1996   700 (0.9)
Windows 2000    1999  1700 (1.2)

System      Test Cases (K)
MS Word XP       35
Oracle 10i      100
Windows XP   >2000 (?)
4. Model Based Testing: a Solution?
- Potential to test very large systems with high adequacy
- Parallelization of work on the system and its tests
- Googling "model based testing case study" gives:
  - 630 links on 230 sites
  - 60 separate case studies concerned with industry since 1990
- Most MBT case studies are small
  - <10 case studies concern systems of size >30 KLOC
- Most MBT techniques are based on state models and hence prone to the state explosion problem
5. Fighting Complexity
- There is no simple way to test a complex system adequately
- But a manageable way exists: use of general engineering principles
  - Abstraction
  - Separation of concerns
  - Modularization
  - Reuse
6. UniTesK Solutions
- Modularize the system under test: contract specifications of components
- Modularize the test system: a flexible test system architecture
  - Adapters binding the test system and the SUT
  - Contracts (oracles) checking the SUT's behavior
  - Test coverage goals based on contracts
  - Test data generators for a single operation
  - Testing models (test scenarios): test sequence composition
- The more abstract the contract, the more abstract the testing model
- Reusability of contracts, testing models, and test data generators
7. Software Contracts
Contracts (preconditions, postconditions, data integrity constraints) help to describe components at different abstraction levels.
[Diagram: components A-D, each with its own contract (Contract A-D), grouped into subsystems I and II, which have contracts of their own (Contract I, Contract II)]
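As an illustration of such a contract (a sketch in Python, not the actual UniTesK spec-extension syntax; the bounded-queue component is a hypothetical example):

```python
# Sketch: a contract with a precondition, a postcondition, and a data
# integrity constraint, for a hypothetical bounded-queue component.

class BoundedQueue:
    def __init__(self, capacity):
        self.capacity = capacity
        self.items = []

    def invariant(self):
        # Data integrity constraint: the queue never exceeds its capacity.
        return 0 <= len(self.items) <= self.capacity

    def enqueue(self, x):
        # Precondition: there must be free space in the queue.
        assert len(self.items) < self.capacity, "pre: queue is full"
        old_len = len(self.items)
        self.items.append(x)
        # Postcondition: exactly one more element, new element at the tail.
        assert len(self.items) == old_len + 1 and self.items[-1] == x
        assert self.invariant()
```

Checked this way, the contract doubles as a test oracle: any call that violates it fails immediately.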
8. Test Coverage Goals
post {
  if ( f(a, b) && g(a) ) ...
  else if ( h(a, c) && !g(b) ) ...
  else ...
}
Each branch of the postcondition defines a coverage goal.
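A minimal sketch of this idea in Python (f, g, h here are hypothetical stand-ins for the slide's predicates; each if/else branch of the postcondition becomes one coverage goal):

```python
# Sketch: deriving coverage goals from the branches of a postcondition.
# f, g, h are assumed predicates mirroring the slide's f(a,b), g(.), h(a,c).

def f(a, b): return a + b > 0
def g(x):    return x % 2 == 0
def h(a, c): return a < c

def postcondition_branch(a, b, c):
    """Return the coverage goal (branch id) that the call (a, b, c) hits."""
    if f(a, b) and g(a):
        return "goal-1"
    elif h(a, c) and not g(b):
        return "goal-2"
    else:
        return "goal-3"

# Which goals does a small argument grid cover?
covered = {postcondition_branch(a, b, c)
           for a in range(-2, 3) for b in range(-2, 3) for c in range(-2, 3)}
```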
9. Testing Model
[Diagram: the operation's domain over parameters and model states, partitioned into coverage goals 1-3]
10. Test Data Generation
- Computation of single-call arguments
- Test data generation is based on simple generators and coverage filtering
[Diagram: the parameter/state space around the current state, with generated test data filtered against coverage goals 1-3]
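The combination of simple generators and coverage filtering can be sketched as follows (the generator values and the sign-based coverage model are assumptions for illustration):

```python
import itertools

# Sketch: simple per-parameter generators combined by Cartesian product,
# then filtered so that each coverage goal is exercised only once.

def int_gen():
    # A deliberately simple generator for an int parameter.
    return [-1, 0, 1, 7]

def goal_of(a, b):
    # Hypothetical coverage model: the sign combination of the arguments.
    return (a > 0, b > 0)

def filtered_test_data():
    covered = set()
    for a, b in itertools.product(int_gen(), int_gen()):
        goal = goal_of(a, b)
        if goal not in covered:     # coverage filtering: drop redundant data
            covered.add(goal)
            yield a, b

tests = list(filtered_test_data())  # one argument tuple per coverage goal
```

Of 16 candidate pairs, only 4 survive the filter, one per goal.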
11. The Whole Picture
[Diagram: the testing model, behavior model, and coverage model linked to the system under test through on-the-fly test sequence generation and single-input checking]
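On-the-fly test sequence generation can be sketched over a tiny assumed testing model (a two-state machine for a queue-like component; states, operations, and the coverage criterion are all illustrative, not UniTesK output):

```python
# Sketch: traverse the testing model's state machine until every
# (state, operation) transition has been covered. In a real run, each
# applied operation would also be checked against the oracle (contract).

TRANSITIONS = {                      # (state, operation) -> next state
    ("empty", "put"): "nonempty",
    ("nonempty", "put"): "nonempty",
    ("nonempty", "get"): "empty",
}

def generate_sequence(start="empty"):
    state, sequence, to_cover = start, [], set(TRANSITIONS)
    while to_cover:
        enabled = [op for (s, op) in TRANSITIONS if s == state]
        # Prefer an operation whose transition is not yet covered.
        uncovered = [op for op in enabled if (state, op) in to_cover]
        op = (uncovered or enabled)[0]
        sequence.append(op)
        to_cover.discard((state, op))
        state = TRANSITIONS[(state, op)]
    return sequence
```

The sequence is built while the test runs, so the next input can depend on the state actually reached.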
12. Testing Concurrency
[Timing diagram: stimuli s11, s12, s21, s31 and reactions r11, r12, r21, r22 applied to the target system, overlapping in time]
- A multisequence is used instead of a sequence of stimuli
- Stimuli and reactions form a partially ordered set
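A sketch of that partial order (event names follow the timing diagram; the start/end times are assumed for illustration):

```python
# Sketch: a multisequence of stimuli (s*) and reactions (r*) as a
# partially ordered set. Event e1 happens-before e2 only if e1 finished
# before e2 started; overlapping events are unordered.

EVENTS = {                # assumed (start, end) times
    "s11": (0, 1), "r11": (2, 3), "s12": (4, 5), "r12": (6, 7),
    "s21": (0, 2), "r21": (5, 6),
    "s31": (3, 4),
}

def happens_before(e1, e2):
    return EVENTS[e1][1] < EVENTS[e2][0]

# The order is partial: s11 precedes r11, but s11 and s21 overlap in
# time, so neither precedes the other.
```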
13. Checking Composed Behavior
- Plain concurrency: the behavior of the system is equivalent to some sequential ordering of its actions (the plain concurrency axiom)
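The plain-concurrency check can be sketched for a hypothetical shared counter (the actions and state are assumptions; real UniTesK checks operate on contract models):

```python
import itertools
from functools import reduce

# Sketch: an observed outcome of concurrent calls is accepted iff SOME
# sequential ordering (interleaving) of the invoked actions produces it.

def apply(state, action):
    # Hypothetical atomic actions on an integer state.
    return {"inc": state + 1, "double": state * 2}[action]

def plausible_results(actions, initial=0):
    """Final states reachable by some sequential order of the actions."""
    return {reduce(apply, order, initial)
            for order in itertools.permutations(actions)}

def check_plain_concurrency(actions, observed, initial=0):
    return observed in plausible_results(actions, initial)
```

For concurrent "inc" and "double" from 0, outcomes 1 and 2 are both acceptable; anything else violates the axiom.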
15. The Case Study
- 1994-1996: ISP RAS / Nortel Networks project on functional test suite development for a Switch Operating System kernel
- Size of the SUT: 250 KLOC, 530 interface operations
- 44 components were identified
- 60 KLOC of specifications and 40 KLOC of test scenarios developed in 1.5 years by 6 people
- A lot of bugs found in the SUT, which had been in use for 10 years; several of them cause cold restart
- 30% of the specifications are used to test other components
- 3 versions of the SUT were tested by 2000 (500 KLOC); changes in the test suite were <5%
16. Other Case Studies
- IPv6 implementations, 2001-2003
  - Microsoft Research
  - Mobile IPv6 (in Windows CE 4.1)
  - Oktet
- Intel compilers, 2001-2003
- Web-based banking client management system
- Enterprise application development framework
- Billing system
- Components of TinyOS
- http://www.unitesk.com
17. UniTesK Tools
- J@T, 2001: Java / NetBeans, Eclipse (planned)
- J@T-C Link, 2003: C / NetBeans, MS Visual Studio
- CTesK, 2002: C / Visual Studio 6.0, gcc
- Ch@se, 2003: C# / Visual Studio .NET 7.1
- OTK, 2003: specialized tool for compiler testing
18. References
- V. Kuliamin, A. Petrenko, I. Bourdonov, and A. Kossatchev. UniTesK Test Suite Architecture. Proc. of FME 2002, LNCS 2391, pp. 77-88, Springer-Verlag, 2002.
- V. Kuliamin, A. Petrenko, N. Pakoulin, I. Bourdonov, and A. Kossatchev. Integration of Functional and Timed Testing of Real-time and Concurrent Systems. Proc. of PSI 2003, LNCS 2890, pp. 450-461, Springer-Verlag, 2003.
- V. Kuliamin, A. Petrenko. Applying Model Based Testing in Different Contexts. Proceedings of the seminar on Perspectives of Model-Based Testing, Dagstuhl, Germany, September 2004.
- A. Kossatchev, A. Petrenko, S. Zelenov, S. Zelenova. Using Model-Based Approach for Automated Testing of Optimizing Compilers. Proc. Intl. Workshop on Program Understanding, Gorno-Altaisk, 2003.
- V. Kuliamin, A. Petrenko, A. Kossatchev, and I. Burdonov. The UniTesK Approach to Designing Test Suites. Programming and Computer Software, Vol. 29, No. 6, 2003, pp. 310-322. (Translated from Russian)
- S. Zelenov, S. Zelenova, A. Kossatchev, A. Petrenko. Test Generation for Compilers and Other Formal Text Processors. Programming and Computer Software, Vol. 29, No. 2, 2003, pp. 104-111. (Translated from Russian)
19. Contacts
- Victor V. Kuliamin
- kuliamin@ispras.ru
- 109004, B. Kommunisticheskaya, 25, Moscow, Russia
- Web: http://www.ispras.ru/groups/rv/rv.html
- Phone: +7-095-9125317
- Fax: +7-095-9121524
20. Thank you!