Title: Predicate Learning and Selective Theory Deduction for Solving Difference Logic
1. Predicate Learning and Selective Theory Deduction for Solving Difference Logic
- Chao Wang, Aarti Gupta, Malay Ganai
- NEC Laboratories America
- Princeton, New Jersey, USA
- August 21, 2006
Presentation only; for more information, please see Wang et al. LPAR'05 and Wang et al. DAC'06.
2. Difference Logic
- Logic for modeling systems at the word level
- A subset of quantifier-free first-order logic
  - Boolean connectives over difference predicates of the form (x - y ≤ c)
- Formal verification applications
  - Pipelined processors, timed systems, embedded software
  - e.g., the back-end of the UCLID verifier
- Existing solvers
  - Eager approach: Strichman et al. '02, Talupur et al. '04, UCLID
  - Lazy approach: TSAT, MathSAT, DPLL(T), Saten, SLICE, Yices, HTP, ...
  - Hybrid approach: Seshia et al. '03, UCLID, SD-SAT
3. Our contribution
- Lessons learned from previous works
  - Incremental conflict detection and zero-cost theory backtracking (Wang et al. LPAR'05)
  - Exhaustive theory deduction (Nieuwenhuis, Oliveras CAV'05)
  - Eager chordal transitivity constraints (Strichman et al. FMCAD'02)
- What's new?
  - Incremental conflict detection PLUS selective theory deduction, with little additional cost
  - Dynamic predicate learning to combat exponential blow-up
4. Outline
- Preliminaries
- Selective theory (implication) deduction
- Dynamic predicate learning
- Experiments
- Conclusions
5. Preliminaries
Difference logic formula = Boolean skeleton over difference predicates
Difference predicates: A: (x - y ≤ 2), B: (z - x ≤ -7), C: (y - z ≤ 3), D: (w - y ≤ 10)
[Figure: Boolean skeleton and the constraint graph for the assignment (A, B, C, D), with one weighted edge per asserted predicate over nodes x, y, z, w]
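To make the graph encoding concrete, here is a minimal Python sketch (an illustration, not the solver's data structure) that builds the constraint graph for a set of difference predicates, using the common convention that a constraint u - v ≤ c becomes an edge v -> u with weight c:

def constraint_graph(predicates):
    """predicates: dict name -> (u, v, c), meaning u - v <= c."""
    edges = []
    for name, (u, v, c) in predicates.items():
        edges.append((v, u, c, name))   # edge v -> u with weight c
    return edges

preds = {"A": ("x", "y", 2), "B": ("z", "x", -7),
         "C": ("y", "z", 3), "D": ("w", "y", 10)}
print(constraint_graph(preds))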
6. Theory conflict = infeasible Boolean assignment
- Negative-weight cycle ⇔ theory conflict
- Theory conflict → lemma (blocking clause) → Boolean conflict
A: (x - y ≤ 2), B: (z - x ≤ -7), C: (y - z ≤ 3), D: (w - y ≤ 10)
Conflicting clause (¬A ∨ ¬B ∨ ¬C) evaluates to (false ∨ false ∨ false)
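For illustration, a plain (non-incremental) negative-cycle check already exposes this conflict; the sketch below is not the SLICE algorithm, just an off-the-shelf Bellman-Ford pass over the constraint graph of the assignment (A, B, C), whose cycle y -> x -> z -> y has weight 2 - 7 + 3 = -2 < 0:

# Illustrative sketch (Python): detect a theory conflict with Bellman-Ford.
# Each constraint  u - v <= c  is the edge  v -> u  with weight c.

def has_negative_cycle(nodes, edges):
    dist = {n: 0 for n in nodes}            # virtual source at distance 0
    for _ in range(len(nodes) - 1):
        for (a, b, w) in edges:
            if dist[a] + w < dist[b]:
                dist[b] = dist[a] + w
    # one more pass: any further improvement means a negative cycle
    return any(dist[a] + w < dist[b] for (a, b, w) in edges)

edges = [("y", "x", 2),    # A: x - y <= 2
         ("x", "z", -7),   # B: z - x <= -7
         ("z", "y", 3)]    # C: y - z <= 3
print(has_negative_cycle({"x", "y", "z"}, edges))   # True -> theory conflict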
7. Theory implication = implied Boolean assignment
- If adding an edge would create a negative cycle → the negated edge is implied
- Theory implication → variable assignment → Boolean implication (BCP)
A: (x - y ≤ 2), B: (z - x ≤ -7), C: (y - z ≤ 3), D: (w - y ≤ 10)
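As a sketch of this deduction (again illustrative, not the paper's procedure): with A and B asserted, tentatively adding C's edge closes the negative cycle shown above, so ¬C is a theory implication that can be handed to BCP as a forced assignment.

# Illustrative sketch (Python): deduce the negation of a predicate P
# whenever asserting P would create a negative cycle.

def has_negative_cycle(nodes, edges):
    dist = {n: 0 for n in nodes}
    for _ in range(len(nodes) - 1):
        for (a, b, w) in edges:
            if dist[a] + w < dist[b]:
                dist[b] = dist[a] + w
    return any(dist[a] + w < dist[b] for (a, b, w) in edges)

assigned = [("y", "x", 2),   # A: x - y <= 2
            ("x", "z", -7)]  # B: z - x <= -7
candidate = ("z", "y", 3)    # C: y - z <= 3 (unassigned predicate)

if has_negative_cycle({"x", "y", "z"}, assigned + [candidate]):
    print("imply not C")     # fed back to the SAT solver as a unit implication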
8. Negative cycle detection
- Called repeatedly to solve many similar subproblems
  - For conflict detection (incremental, efficient)
  - For implication deduction (often expensive)
- Incremental detection versus exhaustive deduction
  - SLICE: incremental cycle detection -- O(n log n)
  - DPLL(T): exhaustive theory deduction -- O(n m)

                        SLICE (LPAR'05)   DPLL(T) Barcelogic (CAV'05)
Conflict detection      Incremental       No
Implication deduction   No                Exhaustive
9. Data from LPAR'05: comparing the SLICE solver
(SMT benchmarks repository, as of 08-2005)
[Scatter plots: SLICE vs. UCLID, vs. MathSAT, vs. ICS 2.0, vs. DPLL(T) Barcelogic, vs. DPLL(T)B (linear scale), vs. TSAT]
Points above the diagonals → wins for the SLICE solver
10. From the previous results
- We have learned that
  - Incremental conflict detection → more scalable
  - Exhaustive theory deduction → also helpful
- Can we combine their relative strengths?
- Our new solution
- Incremental conflict detection (SLICE)
- Zero-cost theory backtracking (SLICE)
- PLUS selective theory deduction with O(n) cost
11. Outline
- Preliminaries
- Selective theory (implication) deduction
- Dynamic predicate learning
- Experiments
- Conclusions
12. Constraint Propagation
Deduce()
  // Boolean constraint propagation (BCP)
  while (!implications.empty())
    set_var_value(implications.pop())
    if (detect_conflict()) return CONFLICT
    add_new_implications()
  // Theory constraint propagation
  if (ready_for_theory_propagation())
    if (theory_detect_conflict()) return CONFLICT
    theory_add_new_implications()
13. Incremental conflict detection
Ramalingam 1999, Bozzano et al. 2005, Cotton 2005, Wang et al. LPAR'05
Relax edge (u, v):
  if (d[v] > d[u] + w[u,v])
    d[v] = d[u] + w[u,v]
    pi[v] = u
[Figure: constraint graph over nodes x, y, z, w; distances initialized to 0, edge weights including -7]
- Add an edge → relax, relax, relax, ...
- Remove an edge → do nothing (zero-cost backtracking in SLICE)
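The sketch below illustrates the add-an-edge case under the assumptions just described (a feasible distance function d and parent array pi maintained across calls, in the style of Ramalingam/Cotton; it is not the SLICE source code): asserting an edge triggers a chain of relaxations, and a conflict is reported as soon as a relaxation tries to improve the distance of the new edge's own source node.

from collections import deque

# Minimal sketch of incremental negative-cycle detection (not the SLICE code).
# Invariant before each call: d[b] <= d[a] + w for every existing edge a -> b.

def add_edge(d, pi, adj, u, v, w):
    adj.setdefault(u, []).append((v, w))
    if d[u] + w >= d[v]:
        return False                      # still feasible, nothing to do
    d[v] = d[u] + w
    pi[v] = u
    queue = deque([v])
    while queue:
        a = queue.popleft()
        for (b, wb) in adj.get(a, []):
            if d[a] + wb < d[b]:
                if b == u:                # relaxation reached the new edge's source
                    return True           # negative cycle -> theory conflict
                d[b] = d[a] + wb
                pi[b] = a
                queue.append(b)
    return False

# Example: start from the empty graph (all distances 0) and assert A, B, then C.
d = {n: 0 for n in "xyzw"}; pi = {}; adj = {}
print(add_edge(d, pi, adj, "y", "x", 2))    # A: x - y <= 2  -> False
print(add_edge(d, pi, adj, "x", "z", -7))   # B: z - x <= -7 -> False
print(add_edge(d, pi, adj, "z", "y", 3))    # C: y - z <= 3  -> True (conflict)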
14. Selective theory deduction
Post(x) = {x, z, ...};  Pre(y) = {y, w, ...};  pi[y] = w, established through relax
[Figure: constraint graph over nodes x, y, z, w illustrating the Pre and Post sets]
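A simplified reading of the idea (an illustrative sketch only; the DAC'06 procedure derives the relevant node sets in O(n) from the d[] and pi[] arrays kept by the incremental relaxation, rather than the explicit searches used here): after a new edge is asserted, only paths through that edge can produce new implications, so only predicates s - t ≤ c with s among the ancestors of the new edge's source and t among the descendants of its target need to be examined.

# Simplified sketch of selective theory deduction (illustrative only).
# Convention: constraint s - t <= c is the edge t -> s with weight c; its
# negation is implied when some path s ~> t has weight W with W + c < 0.

def reachable(adj, src):
    seen, stack = {src}, [src]
    while stack:
        a = stack.pop()
        for b, _ in adj.get(a, []):
            if b not in seen:
                seen.add(b)
                stack.append(b)
    return seen

def shortest(adj, nodes, src):
    dist = {n: float("inf") for n in nodes}
    dist[src] = 0
    for _ in range(len(nodes) - 1):
        for a in nodes:
            for b, w in adj.get(a, []):
                if dist[a] + w < dist[b]:
                    dist[b] = dist[a] + w
    return dist

def selective_deduce(adj, radj, nodes, new_src, new_dst, candidates):
    pre = reachable(radj, new_src)      # nodes that can reach the new edge's source
    post = reachable(adj, new_dst)      # nodes reachable from the new edge's target
    implied = []
    for name, s, t, c in candidates:    # candidate predicate: s - t <= c
        if s in pre and t in post:      # only these can yield NEW implications
            if shortest(adj, nodes, s)[t] + c < 0:
                implied.append("not " + name)
    return implied

# Asserted so far: A (edge y -> x, weight 2); B (edge x -> z, weight -7) just added.
adj  = {"y": [("x", 2)], "x": [("z", -7)]}
radj = {"x": [("y", 2)], "z": [("x", -7)]}
cands = [("C", "y", "z", 3), ("D", "w", "y", 10)]   # unassigned predicates
print(selective_deduce(adj, radj, {"x", "y", "z", "w"}, "x", "z", cands))
# -> ['not C']; D is never examined because w cannot reach x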
15. Outline
- Preliminaries
- Selective theory (implication) deduction
- Dynamic predicate learning
- Experiments
- Conclusions
16. Diamonds with O(2^n) negative cycles
[Figure: chain of diamonds with edge predicates e0, e1, e2, ... and a -1 weight closing the cycle]
Observations: with the existing predicates (e1, e2, ...) → an exponential number of lemmas, since each negative cycle chooses one of the two parallel paths in every diamond. Adding new predicates (E1, E2, E3) and dummy clauses (E1 ∨ ¬E1), (E2 ∨ ¬E2) → an almost linear number of lemmas. Eager chordal transitivity constraints were previously used by Strichman et al. FMCAD'02.
17. Add new predicates to reduce lemmas
[Figure: constraint graph over nodes x, y, z, w with a short-cut edge for the new predicate]
Heuristics to choose GOOD predicates (short-cuts):
- Nodes that show up frequently in negative cycles
- Nodes that are re-convergence points of the graph
(Conceptually) adding a dummy constraint (E3 ∨ ¬E3)
Predicates: E1: (x - y < 5), E2: (y - x < 5);  Lemma: (¬E1 ∨ ¬E2)
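As a hypothetical sketch of what a learned short-cut might look like (the exact clauses produced by the DAC'06 procedure may differ; the function name and clause shapes below are illustrative assumptions): a new predicate E summarizing a chained sub-path is introduced together with a linking lemma (the original predicates imply E) and the dummy clause (E ∨ ¬E) that forces the SAT solver to case-split on E, so later negative cycles through that sub-path are blocked by short clauses over E.

# Hypothetical sketch of learning a short-cut predicate (not the exact DAC'06 scheme).
# Given constraints forming a chain  x -> ... -> z  with total weight W, introduce
# E: (x - z <= W), the linking lemma (not p1 or ... or not pk or E), and the dummy
# clause (E or not E) so the SAT solver case-splits on E.

def learn_shortcut(path_preds):
    """path_preds: list of (name, s, t, c) with each constraint s - t <= c,
    chained so that consecutive constraints share a variable."""
    first_s = path_preds[0][1]
    last_t = path_preds[-1][2]
    weight = sum(c for (_, _, _, c) in path_preds)
    new_pred = ("E", first_s, last_t, weight)          # E: first_s - last_t <= weight
    linking_lemma = ["not " + n for (n, _, _, _) in path_preds] + ["E"]
    dummy_clause = ["E", "not E"]
    return new_pred, linking_lemma, dummy_clause

# A: x - y <= 2 and C: y - z <= 3 chain into the short-cut E: x - z <= 5.
print(learn_shortcut([("A", "x", "y", 2), ("C", "y", "z", 3)]))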
18. Experiments with SLICE
- Implemented on top of SLICE
  - i.e., Wang et al. LPAR'05
- Controlled experiments
  - Flexible theory propagation invocation
    - Per predicate assignment, per BCP, or per full assignment
  - Selective theory deduction
    - No deduction, forward-only, or both directions
  - Dynamic predicate learning
    - With or without
19. When to call the theory solver?
On the DTP benchmark suite
per BCP versus per predicate assignment
per BCP versus per full assignment
Points above the diagonals → wins for per BCP
20. Comparing theory deduction schemes
On the DTP benchmark suite
Fwd-only deduction vs. no deduction (total: 660 seconds)
Both-directions vs. no deduction (total: 1138 seconds)
Points above the diagonals → wins for no deduction
21. Comparing dynamic predicate learning
On the diamonds benchmark suite
22. Comparing dynamic predicate learning
On the DTP benchmark suite
Dynamic predicate learning vs. no predicate learning
Points above the diagonals → wins for no predicate learning
23. Lessons learned
- When to invoke the theory solver
  - Calling it after every BCP finishes gives the best performance
- Selective implication deduction
  - Little added cost, but improves performance significantly
- Dynamic predicate learning
  - Reduces the exponential blow-up in certain examples
  - In the spirit of predicate abstraction
Questions?