1 Mutually compatible and incompatible merges for the search of the smallest consistent DFA
- Sandro Spina, John Abela
  Department of CS & AI, University of Malta
- François Coste
  INRIA/IRISA, Campus de Beaulieu, 35042 Rennes Cedex, France
2 Evidence Driven State Merging
- The motivation behind our work was to improve the (greedy) heuristic used by EDSM. Work was also carried out on diversifying the search strategy.
- EDSM [Price98] is very effective at inferring regular languages, except when training data is sparse.
- According to [Price98], Abbadingo-style problems can be solved with high confidence (0.93) when the number of matched state labels is greater than 10.
- EDSM determines its merge sequence (greedily) using a heuristic which compares language suffixes between two states in a DFA (a sketch of this score follows below).
- Three complementary tracks:
  - Improve the heuristic score.
  - Improve the search strategy.
  - Combine the two.
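A minimal sketch of the EDSM evidence score referred to above, assuming the hypothesis below the two states is still tree-shaped (as in an APTA) and represented by two illustrative dictionaries, label (state to +1/-1 for accept/reject, absent if unlabelled) and delta (state to symbol to child). The names edsm_score, label and delta are illustrative, not the authors' implementation.

```python
NEG_INF = float("-inf")

def edsm_score(q1, q2, label, delta):
    """Count matched state labels when the subtrees rooted at q1 and q2
    are folded together; return -inf if two labels conflict (invalid merge)."""
    score = 0
    l1, l2 = label.get(q1), label.get(q2)
    if l1 is not None and l2 is not None:
        if l1 != l2:
            return NEG_INF            # accept/reject conflict: merge is invalid
        score += 1                    # both labelled and equal: one piece of evidence
    # Walk the two subtrees in lock step over the symbols they share.
    for a in set(delta.get(q1, {})) & set(delta.get(q2, {})):
        child = edsm_score(delta[q1][a], delta[q2][a], label, delta)
        if child == NEG_INF:
            return NEG_INF
        score += child
    return score
```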
3 Sharing Evidence
- Q. Whenever EDSM does not correctly infer the target language, can we (using a greedy depth-first search) improve the learner's merge path by gathering and combining information (state label matches) from multiple valid merges? Does the combination of their evidence scores result in valuable information? Can this information be used to guide the search?
- We think so! Some of the initial results are encouraging:
  - Target size convergence improves drastically.
  - Classification rate does not improve consistently.
- EDSM score: based on the analysis of a single merge.
- S-EDSM score: a combination of several single-merge analyses.
4 Pairwise Compatible Merges
- Let M be the set of all possible merges.
- A merge ⟨q1, q2⟩ is said to be valid if all the states in the subtree of q1 are state-compatible with the corresponding states in the subtree of q2.
- Let M1, M2 ∈ M be two valid merges.
- We define the relation ⋈ ⊆ M × M as follows:
  - M1 ⋈ M2 if M2 remains a valid merge in the hypothesis obtained by applying M1.
- If M1 ⋈ M2, we say that M1 is pairwise compatible with M2 (a sketch of this test follows below).
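A sketch of the pairwise-compatibility test defined above, under the assumption that two helpers exist: apply_merge(h, m), returning the hypothesis obtained by performing merge m (merge and fold for determinism), and is_valid_merge(h, m), performing the state-compatibility check of this slide. Both helpers and the function name are assumptions for illustration.

```python
def pairwise_compatible(h, m1, m2, apply_merge, is_valid_merge):
    """Return True iff m1 is pairwise compatible with m2, i.e. m2 is still a
    valid merge in the hypothesis obtained by applying m1."""
    if not (is_valid_merge(h, m1) and is_valid_merge(h, m2)):
        return False                    # the relation is defined on valid merges
    h_after_m1 = apply_merge(h, m1)     # hypothesis with m1 performed (merge + fold)
    # Assumes the states named in m2 can still be resolved in h_after_m1,
    # e.g. through union-find representatives maintained by apply_merge.
    return is_valid_merge(h_after_m1, m2)
```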
5 Pairwise Compatible Merges (Simple Example)
6 Mutually Compatible Merges
- Suppose that M1, M2, M3 ∈ M, where M1 ⋈ M2 and M2 ⋈ M3.
- This does not necessarily imply that M1 ⋈ M3, because some states in M2 can be labelled differently by M1 and M3.
- Therefore ⋈ is not transitive.
- In order to obtain a transitive relation, M1 ⋈ M3 needs to be checked as well to create the mutually compatible set {M1, M2, M3} (see the sketch after this list).
- The cardinality of a set of mutually compatible merges could direct S-EDSM's heuristic score. This is currently not implemented.
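Since ⋈ is not transitive, a mutually compatible set has to be verified pair by pair. The sketch below grows such a set greedily around a seed merge, assuming a compatible(m1, m2) predicate implementing the pairwise relation; the greedy construction, and the idea of using the set's cardinality as a score ingredient, are illustrative assumptions.

```python
def mutually_compatible_set(seed, candidates, compatible):
    """Greedily grow a set of merges around `seed` in which every pair is
    pairwise compatible (a clique, since the relation is not transitive)."""
    group = [seed]
    for m in candidates:
        if m is seed:
            continue
        # m joins the set only if it is compatible with every member so far.
        if all(compatible(m, other) for other in group):
            group.append(m)
    return group
```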
7 S-EDSM Algorithm
8 Initial Results
- Shared Evidence Driven State Merging (S-EDSM) implements only pairwise compatibility, by creating classes M1 ⋈ M2 ⋈ ... ⋈ Mn for the top 30 valid merges. Scores are recalculated and the best merge is determined and executed. Various strategies can be implemented (one possible step is sketched below).
- In terms of classification rate we are still not consistently performing better than classic EDSM.
- S-EDSM approximates the size of the target automaton better. However, this improvement does NOT help on its own; it is only (possibly) an indication of a direction to follow.
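One possible reading of a single S-EDSM step, based on the description above: rank candidates with the classic EDSM score, keep the top 30, and re-score each candidate using the evidence of the candidates it is pairwise compatible with. The helpers (valid_merges, edsm_score, pairwise_compatible, apply_merge) and the additive combination of scores are assumptions, not necessarily the exact strategy used in the experiments.

```python
def s_edsm_step(h, valid_merges, edsm_score, pairwise_compatible, apply_merge,
                top_k=30):
    """Select and perform one merge by sharing evidence among the top_k
    candidates ranked by the classic EDSM score."""
    candidates = sorted(valid_merges(h), key=lambda m: edsm_score(h, m),
                        reverse=True)[:top_k]
    if not candidates:
        return h, None                       # nothing left to merge

    def shared_score(m):
        # Add to m's own evidence the evidence of every other candidate
        # that is pairwise compatible with it (one possible combination).
        return edsm_score(h, m) + sum(
            edsm_score(h, other)
            for other in candidates
            if other is not m and pairwise_compatible(h, m, other))

    best = max(candidates, key=shared_score)
    return apply_merge(h, best), best
```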
9 Results II (400-State Target Size Convergence)
- This graph documents 10 consecutive problems downloaded from Gowachin. The training set consisted of 20,000 strings.
10 Results III (256-State Target: Classification)
11 Pairwise Incompatible Merges for Search: Classical Search Tree
- (Diagram: a classical search tree in which every candidate merge m1, ..., m5 is drawn from the full set M.)
12 Pairwise Incompatible Merges for Search: Candidate Limitation after Backtrack
- (Diagram: the same search tree, but after backtracking over m1 the alternative m3 is restricted to M ∩ I(m1), and after backtracking over m2 the alternative m5 is restricted to M ∩ I(m2).)
13 Pairwise Incompatible Merges for Search
- Rationale:
  - A merge m' ∉ I(m) may still be tried after m, so exploring it again after refusing m is redundant.
  - Restricting the candidates after a backtrack to I(m) introduces diversity in the search.
- EDSM: I(m) may be computed [Coste & Fredouille, ICGI'00].
- S-EDSM: I(m) is available for free.
- Significant improvement when applied to the first 3 choices (a sketch of the scheme follows below).
- Where is the scheme best applied, given the choice m3 ∈ M ∩ I(m1)?
  - After merging m3?
  - After not merging m3?
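A sketch of the candidate-limitation scheme, assuming helpers candidate_merges(h) (candidates ordered by heuristic score), incompatible(h, m1, m2) (membership of m1 in I(m2)), apply_merge(h, m) and is_solution(h); the depth_limit parameter mirrors the observation that the restriction pays off on the first 3 choices. This is a simplified depth-first view, not the authors' implementation.

```python
def limited_search(h, candidate_merges, incompatible, apply_merge, is_solution,
                   depth=0, depth_limit=3):
    """Depth-first search over merges; after backtracking over a merge m at one
    of the first `depth_limit` levels, the remaining siblings are restricted to
    merges incompatible with m."""
    if is_solution(h):
        return h
    refused = []                              # merges tried and backtracked at this node
    for m in candidate_merges(h):
        if refused and depth < depth_limit and not all(
                incompatible(h, m, r) for r in refused):
            continue                          # m is compatible with a refused merge: skip it
        result = limited_search(apply_merge(h, m), candidate_merges, incompatible,
                                apply_merge, is_solution, depth + 1, depth_limit)
        if result is not None:
            return result
        refused.append(m)
    return None
```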
14 Future Directions
- Develop a calculus to describe merge interactions. Implement all the relations and functions (mutual compatibility, dominance, etc.) of the calculus, and analyse the results achieved from these different implementations.
- Combine the heuristic with better search strategies and study the best combination of heuristic and search strategy. Introduce diversity in the exploration of the search space by limiting the choice of candidate merges after a backtrack.
- Noisy data! Can S-EDSM perform better by combining information between different merges? Perhaps, with information gathered from merge interactions, S-EDSM can discover noise in the training set.
- Ultimately we want to see how far DFA learning can be pushed in terms of data sparseness.
- Thank you.