1
Emerging Trends in Software Model Checking
  • Presented By
  • Deivanai Chidambaram
  • Gayathri Varadarajan
  • Priyanka Namburu

2
Agenda
  • Motivation
  • Introduction
  • Popular Model Checkers
  • An Industrial Case Study
  • Conclusion

3
An Ideal Verification Scheme
  • A verification method should:
  • Have a formal foundation
  • Facilitate the implementation of tools
  • Offer a high degree of automation
  • Require little user interaction
  • Be scalable
  • Relate easily to complex artifacts

4
Model Checking: A Formal Definition
  • Model checking is a formal verification technique
    tuned for finding errors by comprehensively
    exploring the state space defined by a system.

6
Automata-Based Model Checking
  • Used for explicit-state model checking
  • A finite automaton (a Büchi automaton) is used to
    model infinite behaviors
  • The LTL specification determines the automaton's
    accepting states
  • An erroneous run is one whose behavior has a
    non-empty intersection with the language of the
    (negated) LTL specification

7
LTL Specification
  • Büchi automaton B: models the system
  • Never-claim automaton A: derived from the negated
    LTL formula
  • Condition for an acceptable run (property holds):
  • L(B) ∩ L(A) = ∅
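The emptiness condition above can be made concrete with a small sketch. This is an illustration only, not SPIN's actual algorithm: the automata B and A below are hypothetical hand-coded transition tables, and the full Büchi emptiness test (which requires a reachable accepting cycle) is simplified here to reachability of an accepting state in the product.

```c
#include <stdbool.h>
#include <assert.h>

/* Hypothetical system automaton B: 3 states, alphabet {0,1}.
   trans_B[state][action] = next state. */
static const int trans_B[3][2] = {{1, 0}, {1, 2}, {2, 2}};

/* Hypothetical never-claim A, accepting runs in which action 1
   occurs twice in a row (state 2 is accepting and absorbing). */
static const int trans_A[3][2] = {{0, 1}, {0, 2}, {2, 2}};
static const bool accept_A[3] = {false, false, true};

/* DFS over the product B x A: returns true iff some reachable
   product state has an accepting A-component, i.e. the
   intersection L(B) ∩ L(A) is non-empty (restricted here to
   plain reachability for brevity). */
static bool bad_reachable(int sb, int sa, bool seen[3][3]) {
    if (seen[sb][sa]) return false;
    seen[sb][sa] = true;
    if (accept_A[sa]) return true;
    for (int a = 0; a < 2; a++) {
        if (bad_reachable(trans_B[sb][a], trans_A[sa][a], seen))
            return true;
    }
    return false;
}

bool property_violated(void) {
    bool seen[3][3] = {{false}};
    return bad_reachable(0, 0, seen);
}
```

Here B does allow action 1 twice in a row, so the never claim has an accepting run and the property is reported as violated.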

8
When Is Software Model Checking Cost-Effective?
  • In application domains where the cost of just a
    handful of high-severity bugs is high
  • Where code inspections and scenario-by-scenario,
    requirement-driven testing are critical
  • In such settings, the cost of applying model
    checking is likely to be significantly offset by
    the high cost (and embarrassment) of severe
    software failures in the field

9
Model Checking Tools for C Language
  • SLAM
  • BLAST

10
  • Software verification tools
  • Input: C program and a specification property
  • Output: "safe" if the program satisfies the
    property
  • else
  • - "not safe", together with an error trace

11
Overview of SLAM and BLAST
[Figure: from A Survey of Tools for Model Checking
and Model-Based Development, Elisabeth A. Strunk,
M. Anthony Aiello, John C. Knight, Eds., Technical
Report CS-2006-17, Department of Computer Science,
University of Virginia, June 2006]
12
SLAM
  • Main goal: to check C programs for safety
    properties using model checking
  • Three main components:
  • - c2bp: computes a boolean abstraction
  • - bebop: performs reachability analysis
  • - newton: verifies the feasibility of error paths

13
Contd.
  • Specification language: SLIC (Specification
    Language for Interface Checking)
  • Specifications are built from:
  • - State machines
  • - State variables
  • - A set of events
  • - State transitions
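SLIC has its own syntax; purely as an illustration of the same idea, the state variables, events, and transitions of a specification can be mimicked in plain C as a hand-written monitor. The locking rule below ("never acquire a held lock, never release a free one") is a hypothetical example, not an actual SLIC specification.

```c
#include <assert.h>

/* State variable, as a SLIC spec would declare it. */
static int locked = 0;
static int violation = 0;

/* Event handlers: the transitions of the specification's
   state machine. A violation is recorded rather than aborting,
   for illustration. */
void on_acquire(void) {
    if (locked) violation = 1;  /* acquiring a held lock: error */
    locked = 1;
}

void on_release(void) {
    if (!locked) violation = 1; /* releasing a free lock: error */
    locked = 0;
}

int property_violated(void) { return violation; }
```

Driving the monitor with the event sequence acquire, acquire flags a violation; SLAM searches for such event sequences statically rather than observing them at run time.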

14
Working of SLAM
[Figure: from Strunk, Aiello, and Knight, Technical
Report CS-2006-17, University of Virginia, June 2006]
15
  • c2bp
  • - Input: C program and a set of predicates
  • - Output: boolean program
  • bebop
  • - Input: boolean program
  • - Output: "yes" if the error is reachable
  • newton
  • - If an error path is found, checks it for
    feasibility
  • - If feasible, outputs the error trace
  • - If spurious, discovers additional predicates

16
c2bp
  • Computing the boolean abstraction is:
  • - Linear in the size of the program
  • - Exponential in the number of predicates
  • c2bp abstracts modularly, treating in isolation:
  • - the program
  • - procedures
  • - statements
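The kind of abstraction c2bp computes can be illustrated on a toy statement. This is a hand-worked sketch, not c2bp's output: we track the single hypothetical predicate p := (x > 0) across the statement x = x - 1, modelling the abstract transfer function directly.

```c
#include <stdbool.h>
#include <assert.h>

/* A toy concrete statement: x = x - 1. */
int concrete_step(int x) { return x - 1; }

/* Its abstraction over the single predicate p := (x > 0).
   After x = x - 1:
     - if p was false (x <= 0), p stays false;
     - if p was true, the new x may be 0 or positive, so the
       abstraction must allow both outcomes. */
typedef enum { P_FALSE, P_TRUE, P_UNKNOWN } abs_val;

abs_val abstract_step(abs_val p) {
    if (p == P_FALSE) return P_FALSE;
    return P_UNKNOWN; /* sound over-approximation */
}
```

Each abstract state is one truth value per predicate, which is why the cost is exponential in the number of predicates.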

17
Bebop
  • The model checker proper
  • The most time-consuming component
  • States are represented as bit vectors
  • Provides the error path

18
Newton
  • Symbolically simulates the error trace
  • If the trace is spurious:
  • - new predicates yield a refined abstraction
  • Otherwise, if the abstraction is imprecise:
  • - SLAM invokes another refinement method,
    constrain
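Newton's feasibility question can be conveyed with a deliberately naive sketch. A real tool consults a theorem prover; here, for intuition only, a hypothetical error path has collected the contradictory constraints x > 5 and x < 3, and a brute-force search over a small range stands in for the decision procedure.

```c
#include <stdbool.h>
#include <assert.h>

/* Hypothetical constraints gathered along an abstract error
   path. The path is feasible iff some input satisfies all of
   them simultaneously. */
static bool on_path(int x) { return x > 5 && x < 3; }

/* Brute-force "decision procedure" over a small domain
   (illustration only; it cannot prove infeasibility in
   general). */
bool path_feasible(void) {
    for (int x = -100; x <= 100; x++)
        if (on_path(x)) return true;
    return false; /* spurious: refine with new predicates */
}
```

Since no value satisfies both constraints, the trace is spurious and the abstraction must be refined.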

19
Problems faced
  • Dealing with pointers
  • Dealing with imprecision
  • Scalability

20
Constrain
  • Examines each step of the trace.
  • Refines the abstract transition relation.
  • Improves accuracy of the abstraction.

21
Dealing with pointers
  • Arises when abstracting from C to boolean
    programs
  • In C, call-by-reference is simulated with
    pointers
  • Boolean programs support only call-by-value
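The mismatch can be illustrated in C itself. This is a hypothetical example, not SLAM's actual transformation: a pointer-based "call-by-reference" function is rewritten in the call-by-value style a boolean program requires, returning the updated value instead of writing through a pointer.

```c
#include <assert.h>

/* Call-by-reference as simulated in C, via a pointer. */
void inc_ref(int *x) { *x = *x + 1; }

/* The same update in call-by-value style: the new value is
   returned to the caller rather than stored through a pointer. */
int inc_val(int x) { return x + 1; }

/* Helper demonstrating that the two styles agree. */
int inc_via_ref(int x) { inc_ref(&x); return x; }
```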

22
  • Successfully checks control-dominated properties
  • Handles recursive procedures
  • Satisfies the properties of a good model checker:
  • - Soundness
  • - Completeness
  • - Usefulness

23
  • Precise in bug detection
  • The error trace is mapped back to the original
    program
  • Results:
  • - Largest driver processed: 60K LOC
  • - Largest abstraction: several hundred boolean
    variables

24
BLAST
  • Berkeley Lazy Abstraction Software Verification
    Tool
  • Abstract-check-refine approach
  • Key concepts:
  • - Lazy abstraction
  • - Interpolation-based predicate discovery

25
Methodology
  • Input
  • -C program
  • -Specification property
  • Output
  • -Instrumented program with error label

26
[Figure: from Strunk, Aiello, and Knight, Technical
Report CS-2006-17, University of Virginia, June 2006]
27
  • spec.opt merges the program and specification to
    form the instrumented program
  • The instrumented program is fed to pblast.opt
    (BLAST's model checker)
  • If there is no path to the error label, the
    program is safe and a proof is generated
  • If paths are found, they are checked for
    feasibility

28
  • If feasible, an error trace is produced
  • Otherwise, the model is refined further

29
  • The BLAST specification language is similar to C
  • A specification looks for patterns in the program
  • Checks and actions are inserted where the
    patterns match

30
Abstract-Check-Refine Approach
  • Three phases
  • -Abstract
  • -Check
  • -Refine

31
Abstract phase
  • A set of predicates is used to create the
    abstraction
  • Truth assignments to the predicates represent the
    abstract states

32
Check phase
  • Checks the safety property
  • If the abstract model is safe, the original
    program is safe; otherwise an error path is
    generated
  • If the error path is spurious, go to the refine
    phase

33
Refine phase
  • Add more predicates.
  • Model refined further.
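The three-phase loop on the preceding slides can be sketched as a schematic driver. This is an illustration of the abstract-check-refine control flow only, not BLAST's implementation (BLAST interleaves the phases lazily); the check and refine callbacks are hypothetical stand-ins whose behavior is hard-coded for the demo.

```c
#include <stdbool.h>
#include <assert.h>

/* The abstraction, reduced to a single "precision" counter
   standing in for the current set of predicates. */
typedef struct { int precision; } Abstraction;

/* Stand-in check phase: below precision 2 a (spurious)
   counterexample is found; at precision 2 the model is safe. */
static bool check(const Abstraction *a, bool *spurious) {
    if (a->precision >= 2) return true;
    *spurious = true;
    return false;
}

/* Stand-in refine phase: "add predicates". */
static void refine(Abstraction *a) { a->precision++; }

/* The abstract-check-refine loop itself. */
bool cegar(void) {
    Abstraction a = { .precision = 0 };
    for (;;) {
        bool spurious = false;
        if (check(&a, &spurious)) return true;  /* proof: safe */
        if (!spurious) return false;            /* real bug */
        refine(&a);                             /* more predicates */
    }
}
```

The loop terminates either with a safety proof, as here after two refinements, or with a genuine error trace.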

34
Lazy Abstraction
  • Integrates the three phases
  • Principles involved:
  • - On-the-fly abstraction
  • - On-demand refinement

35
On-the-fly abstraction
  • The same level of precision everywhere is
    wasteful, since some states are unreachable
  • States are abstracted only when needed by the
    check phase

36
On-demand refinement
  • Model reused after refinement.
  • Regions proven to be safe are not refined.

37
Comparison of SLAM and BLAST
  • Similarities:
  • - Static analysis
  • - CEGAR (counterexample-guided abstraction
    refinement)
  • - Both extract a finite-state model from a C
    program
  • - Both verify safety properties
  • - Both tested on device drivers
  • - Both handle C language constructs

38
  • Differences:
  • - BLAST uses lazy abstraction
  • - Its predicate discovery is on-demand, which
    conserves time and space
  • - The specification languages differ

39
  • Abstraction:
  • - In SLAM, the boolean program is built before
    model checking
  • - In BLAST, an abstract reachability tree is
    built on-the-fly during model checking
  • Recursive functions:
  • - SLAM handles them
  • - BLAST does not

40
SPIN
  • Designed for analyzing the logical consistency of
    software systems
  • Focused on proving the correctness of process
    interactions
  • Abstracts away internal sequential computations
  • Detects design errors

41
PROMELA
  • System models are written in PROMELA (Process
    Meta Language)
  • The language makes good abstraction easier
  • Emphasis on process synchronization and
    coordination

42
  • SPIN performs:
  • - Simulations of the system execution
  • - Generation of a verifier in C
  • SPIN reports:
  • - Deadlocks
  • - Unspecified receptions

43
Contd.
  • - Unexecutable code
  • - Incompleteness
  • - Race conditions
  • - Unwarranted assumptions
  • The verifier is used to:
  • - check logical consistency
  • - verify the correctness of system invariants

44
Architecture
[Figure: SPIN architecture, from An Introduction to
the SPIN Model Checker: Project Report, Tool
Description and Analysis, Xiang Yin]
45
  • SPIN is initiated from the command line or
    through XSPIN
  • Users specify a high-level model in PROMELA
  • Correctness properties are specified as LTL
    requirements
  • The LTL translator converts each LTL formula to
    PROMELA

46
  • SPIN generates a verification program from the
    high-level model
  • The verifier is compiled and executed
  • If a counterexample is detected, it is fed back
    to the SPIN simulator

47
Graphical Interface
  • Optional, not required
  • Eases the whole SPIN workflow
  • Provides a clean view of commands
  • Executes SPIN commands in the background
  • Graphical display of message flows, etc.

48
Modes of operation
  • Two modes of operation
  • -Simulation
  • -Verification

49
Simulation
  • Step-by-step trace of the system execution
  • - Copes with larger state spaces
  • Three forms:
  • - Random
  • - Interactive
  • - Guided
  • Use: to inspect an error trail or counterexample

50
Verification
  • The verifier is generated, compiled, and run
    separately
  • The verification procedure is based on
    reachability analysis
  • Uses DFS or BFS
  • The state space is constructed on-the-fly
  • Avoids preconstruction of the global state graph
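The on-the-fly reachability analysis described above can be sketched in miniature. This is an illustration only, not SPIN's generated verifier: states are small integers, the successor function below is a hypothetical transition relation computed on demand, and no global state graph is ever built up front.

```c
#include <stdbool.h>
#include <assert.h>

#define N 16  /* size of the toy state space: states 0..N-1 */

/* Successors computed on demand (hypothetical transition
   relation: double the state, or add 3, staying within range). */
static int successors(int s, int out[2]) {
    int n = 0;
    if (2 * s < N) out[n++] = 2 * s;
    if (s + 3 < N) out[n++] = s + 3;
    return n;
}

/* Depth-first search with a visited set, expanding states only
   as they are reached. */
static bool dfs(int s, bool visited[N], int error_state) {
    if (visited[s]) return false;
    visited[s] = true;
    if (s == error_state) return true;
    int succ[2];
    int n = successors(s, succ);
    for (int i = 0; i < n; i++)
        if (dfs(succ[i], visited, error_state)) return true;
    return false;
}

bool error_reachable(int error_state) {
    bool visited[N] = {false};
    return dfs(0, visited, error_state);
}
```

From state 0 the reachable states are 0, 3, 6, 9, 12, and 15; any other "error state" is proved unreachable without the graph ever being stored in full.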

51
Model Checking Software with VeriSoft
  • An industrial case study

52
VeriSoft: Call-Processing Library
  • Runs in every significant base station from
    Lucent Technologies
  • Supports handoff
  • In 3G wireless networks, the call-processing
    software implements complex dynamic
    resource-allocation algorithms to manage handoffs

53
Challenges
  • A large number of possible scenarios
  • The module is embedded in a highly networked
    environment composed of multiple processes,
    invoking call-processing functions through
    multiple interfaces
  • A huge application implementing a complex
    architecture

54
  • Software Model Checking: An Industrial Case Study
  • Satish Chandra, Patrice Godefroid, Christopher
    Palm
  • Proceedings of the 24th International Conference
    on Software Engineering

55
Limitations of Using VeriSoft for Model Checking
  • Test automation
  • Integration into the testing environment
  • Test drivers
  • Specifying properties
  • State explosion

56
Conclusion
  • Naïve usage: the chance of getting interesting
    feedback is small
  • Effective usage:
  • - extremely effective test coverage
  • - hard-to-find bugs are detected
  • Expertise is needed to limit state-space
    explosion
  • Trade-off between the cost of finding bugs and
    the cost of recovering from failures

57
  • Thank You.