Title: Synthesis For Finite State Machines
2. FSM (Finite State Machine) Optimization
- State tables
- State minimization: identify and remove equivalent states
- State assignment: assign a unique binary code to each state
- Combinational logic optimization: use unassigned state codes as don't-cares
- Net-list
3. FSM Optimization
[Figure: a state-transition diagram over states S1-S4 with input/output edge labels, and the corresponding implementation: primary inputs (PI) and present-state lines (PS, via u1, u2) feed a block of combinational logic producing primary outputs (PO) and next-state lines (NS, via v1, v2).]
4. State Minimization
- Goal: identify and remove redundant states (states that cannot be distinguished by observing the FSM's I/O behavior)
- Why:
  - 1. reduces the number of latches
    - assign minimum-length encodings
    - the encoding length grows only as the logarithm of the number of states
  - 2. increases the number of unassigned state codes
    - a heuristic to improve state assignment and logic optimization
5. State Minimization: Definition
- Completely-specified state machines
  - two states are equivalent if their outputs are identical for all input combinations and their next states are equivalent for all input combinations
  - equivalence of states is an equivalence relation, which partitions the states into disjoint equivalence classes
- Incompletely-specified state machines (compatibility rather than equivalence)
6. Classical State Minimization
- 1. Partition the states based on the input/output values asserted in each state.
- 2. Refine the partitions so that all states in a partition transition into the same next-state partition (under corresponding inputs).
7. Example

    input  state  next  output    partition refinement
      0      A     B      0
      1      A     C      0
      0      B     D      0       (A,B,C,D,E,F,H)(G)
      1      B     E      0
      0      C     F      0       (A,B,C,E,F,H)(G)(D)
      1      C     A      0
      0      D     H      0       (A,C,E,H)(G)(D)(B,F)
      1      D     G      0
      0      E     B      0       (A,C,E)(G)(D)(B,F)(H)
      1      E     C      0
      0      F     D      0
      1      F     E      0
      0      G     F      1
      1      G     A      0
      0      H     H      0
      1      H     A      0

The final partition (A,C,E)(G)(D)(B,F)(H) shows that states A, C, E are equivalent, as are B and F.
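The refinement above can be reproduced mechanically. Below is a minimal Python sketch of the classical procedure from the previous slide, run on this table; the fsm dictionary layout and the helper names are illustrative, not from any particular tool.

    fsm = {  # (input, present-state) -> (next-state, output)
        ('0', 'A'): ('B', '0'), ('1', 'A'): ('C', '0'),
        ('0', 'B'): ('D', '0'), ('1', 'B'): ('E', '0'),
        ('0', 'C'): ('F', '0'), ('1', 'C'): ('A', '0'),
        ('0', 'D'): ('H', '0'), ('1', 'D'): ('G', '0'),
        ('0', 'E'): ('B', '0'), ('1', 'E'): ('C', '0'),
        ('0', 'F'): ('D', '0'), ('1', 'F'): ('E', '0'),
        ('0', 'G'): ('F', '1'), ('1', 'G'): ('A', '0'),
        ('0', 'H'): ('H', '0'), ('1', 'H'): ('A', '0'),
    }
    states = sorted({s for _, s in fsm})
    inputs = sorted({i for i, _ in fsm})

    def block_of(partition, s):          # index of the class containing s
        return next(k for k, blk in enumerate(partition) if s in blk)

    # Step 1: partition states by the outputs they assert.
    by_output = {}
    for s in states:
        key = tuple(fsm[(i, s)][1] for i in inputs)
        by_output.setdefault(key, set()).add(s)
    partition = list(by_output.values())

    # Step 2: split classes until, for each input, every class
    # transitions into a single next-state class.
    while True:
        refined = []
        for blk in partition:
            groups = {}
            for s in blk:
                key = tuple(block_of(partition, fsm[(i, s)][0]) for i in inputs)
                groups.setdefault(key, set()).add(s)
            refined.extend(groups.values())
        if len(refined) == len(partition):
            break
        partition = refined

    print(sorted(sorted(b) for b in partition))
    # [['A', 'C', 'E'], ['B', 'F'], ['D'], ['G'], ['H']]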
8. State Assignment
- Assign a unique code to each state to produce a logic-level description
  - utilize unassigned codes effectively as don't-cares
- Choices for an S-state machine:
  - minimum-bit encoding: ceil(log2 S) bits
  - maximum-bit encoding
  - one-hot encoding: one bit per state
  - something in between
- Modern techniques:
  - hypercube embedding of face constraints derived for collections of states (Kiss, Nova)
  - adjacency embedding guided by weights derived between state pairs (Mustang)
9. Hypercube Embedding Technique
- Observation: one-hot encoding is the easiest to decode.
  - "Am I in state 2, 5, 12, or 17?"
  - binary: x4'x3'x2'x1x0' (00010) + x4'x3'x2x1'x0 (00101) + x4'x3x2x1'x0' (01100) + x4x3'x2'x1'x0 (10001)
  - one-hot: x2 + x5 + x12 + x17
- But one-hot uses too many flip-flops.
- Exploiting this observation:
  - 1. two-level minimization after one-hot encoding identifies useful state groups for decoding
  - 2. assigning the states in each group to a single face of the hypercube allows a single product term to decode the group of states
10. State Group Identification
- Ex: state machine

    input  current-state  next-state  output
      0       start          S6         00
      0        S2            S5         00
      0        S3            S5         00
      0        S4            S6         00
      0        S5           start       10
      0        S6           start       01
      0        S7            S5         00
      1       start          S4         01
      1        S2            S3         10
      1        S3            S7         10
      1        S4            S6         10
      1        S5            S2         00
      1        S6            S2         00
      1        S7            S6         00

- Symbolic implicant: represents a transition from one or more states to a next state under some input condition.
11. Representation of Symbolic Implicants
- The symbolic cover representation is related to multiple-valued logic.
- Positional cube notation: a p-valued logic variable is represented as p bits (V1, V2, ..., Vp).
- Ex: the value V = 4 of a 5-valued variable: (00010)
- A set of values is represented by a single string; e.g. V = 2 or V = 4: (01010)
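As a small illustration, here is a sketch of a helper that builds positional cube strings; the function name and the 1-based value convention are assumptions made for this example.

    def positional_cube(values, p):
        """Encode a set of values {1..p} of a p-valued variable as p bits."""
        return ''.join('1' if v in values else '0' for v in range(1, p + 1))

    print(positional_cube({4}, 5))     # value 4 of a 5-valued variable -> 00010
    print(positional_cube({2, 4}, 5))  # the set {2, 4}                 -> 01010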
12. Minimization of Multi-valued Logic
- Find a minimum multiple-valued-input cover (e.g. with espresso).
- Ex: a minimal multiple-valued-input cover:

    0 0110001 0000100 00
    0 1001000 0000010 00
    1 0001001 0000010 10
13. State Group
- Consider the first symbolic implicant: 0 0110001 0000100 00
  - it shows that input 0 maps state-2, state-3, or state-7 into state-5 and asserts output 00
- This example shows the effect of symbolic logic minimization: it groups together the states that are mapped by some input into the same next state while asserting the same output.
- We call such a set of states a state group: if the states in a state group are given adjacent binary encodings (a face of the hypercube) that no other state shares, then the state group can be decoded by a single cube.
14. Group Face
- Group face: the minimal-dimension subspace containing the encodings assigned to that group.
- Ex: encodings 0100 and 0110 → group face 01-0 (the third bit is free).
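A one-line sketch of this definition: keep every bit position on which all encodings of the group agree, and free the rest (the helper name and the '-' notation for a free dimension are assumptions).

    def group_face(encodings):
        """Minimal cube (face) containing all given binary encodings."""
        return ''.join(bits[0] if len(set(bits)) == 1 else '-'
                       for bits in zip(*encodings))

    print(group_face(['0100', '0110']))  # -> 01-0, as in the example above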
15. Hyper-cube Embedding
[Figure: embeddings of states on a 3-cube with axes a, b, c for the state groups {2,5,12,17} and {2,6,17}; one assignment places each group on a face of the cube, while the other, marked "wrong!", leaves a group off any single face.]
16. Hyper-cube Embedding
[Figure: embeddings on a 3-cube with axes a, b, c for the state groups {2,6,17} and {2,4,5}; again one assignment satisfies both face constraints, while the other, marked "wrong!", does not.]
17. How to Check if a State Assignment Satisfies the Constraint Matrix?
- Step 1: Find the group face of each encoded state group.
- Step 2: For all states, check that no state lying outside a state group intersects that group's face.
18. Example
Constraint matrix A, state encoding S, and group-face matrix F:

    A = 0110001        S = 010 (start)
        1001000            110 (S2)
        0001001            101 (S3)
                           000 (S4)
                           001 (S5)
                           011 (S6)
                           100 (S7)

Step 1: compute the group face of each row of A under the encoding S:

    F = 1--    (spanned by 110, 101, 100: states S2, S3, S7)
        0-0    (spanned by 010, 000: states start, S4)
        -00    (spanned by 000, 100: states S4, S7)

Step 2: check the encoding of state-6, 011. Since state-6 does not belong to group 1, 2, or 3, its code must intersect none of the group faces:

    011 ∩ 1-- = ∅,  011 ∩ 0-0 = ∅,  011 ∩ -00 = ∅

so the encoding of state-6 satisfies the constraints.
19. Other State Encoding
If the encoding of state-6 were 111 instead:

    111 ∩ 1-- = 111 ≠ ∅

so state-6 would lie on the face of group 1 without belonging to it; this encoding does not satisfy the constraints. (The remaining checks, 111 ∩ 0-0 = ∅ and 111 ∩ -00 = ∅, pass.)
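The two-step check of the last three slides can be sketched as follows, reusing the example's A and S; the helper names are illustrative.

    def group_face(encodings):
        return ''.join(b[0] if len(set(b)) == 1 else '-' for b in zip(*encodings))

    def intersects(code, face):
        return all(f in ('-', c) for c, f in zip(code, face))

    A = ['0110001', '1001000', '0001001']                   # constraint matrix
    S = ['010', '110', '101', '000', '001', '011', '100']   # start, S2..S7

    # Step 1: the face spanned by each group's encodings.
    F = [group_face([S[i] for i, m in enumerate(row) if m == '1']) for row in A]
    print(F)  # -> ['1--', '0-0', '-00']

    # Step 2: a state must not lie on the face of a group it is outside of.
    def satisfies(state_idx, code):
        return all(A[g][state_idx] == '1' or not intersects(code, F[g])
                   for g in range(len(A)))

    print(satisfies(5, '011'))  # True: 011 misses all three faces
    print(satisfies(5, '111'))  # False: 111 lies on face 1-- of group 1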
20. Algorithm for State Assignment
- Step 1: Select an uncoded state (or a subset of states).
- Step 2: Determine the encodings for that state (or states) that satisfy the constraint relation.
- Step 3: If no such encoding exists, increase the state-code dimension and go to Step 2.
- Step 4: Assign an encoding to the selected state (or states).
- Step 5: If all states have been encoded, stop; else go to Step 1.
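Below is a rough sketch of this loop, under two simplifying assumptions: Step 2 searches exhaustively over codes of the current length, and constraints are checked greedily against the faces spanned only by the codes assigned so far. Step 3's lengthening follows the rule on the next slide. All names are illustrative, and the greedy result may use more bits than the optimum.

    from itertools import product

    def face(codes):
        return ''.join(b[0] if len(set(b)) == 1 else '-' for b in zip(*codes))

    def on_face(code, f):
        return all(x in ('-', c) for c, x in zip(code, f))

    def consistent(A, codes, s, cand):
        """Greedy check: with cand tentatively given to state s, no encoded
        state may lie on the partial face of a group it does not belong to."""
        trial = {**codes, s: cand}
        for row in A:
            members = [trial[i] for i in trial if row[i] == '1']
            if len(members) < 2:
                continue
            f = face(members)
            if any(row[i] == '0' and on_face(c, f) for i, c in trial.items()):
                return False
        return True

    def assign_states(A, n_states, n_bits):
        codes = {}
        for s in range(n_states):                          # Step 1
            for bits in product('01', repeat=n_bits):      # Step 2
                cand = ''.join(bits)
                if cand not in codes.values() and consistent(A, codes, s, cand):
                    codes[s] = cand                        # Step 4
                    break
            else:                                          # Step 3: lengthen
                codes = {i: c + '0' for i, c in codes.items()}
                # new code = (a group mate's old code, else any vector) + '1'
                mate = next((codes[i][:-1] for row in A if row[s] == '1'
                             for i in codes if row[i] == '1'), '0' * n_bits)
                codes[s], n_bits = mate + '1', n_bits + 1
        return codes                                       # Step 5

    A = ['0110001', '1001000', '0001001']
    print(assign_states(A, n_states=7, n_bits=3))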
21. Step 3
- The coding length can always be increased by one bit.
- New state assignment:
  - for the states already assigned, append a 0 at the end
  - for the new state ns:
    - case 1: if ns does not belong to any state group, the encoding of ns is c1, where c is any vector
    - case 2: if ns belongs to some state group, the encoding of ns is c1, where c is the encoding of any state that belongs to that state group
22. Example
- Ex: to encode a new state (state-5), we get a new constraint matrix.
- For the states already assigned, we get a new encoding: each old code with a 0 appended.
- For the new state (state-5), the candidate encodings are ns = 101 or 111.
23. Hyper-cube Embedding Method
- Advantages:
  - uses a two-level logic minimizer to identify good state groups
  - retains almost all of the advantages of one-hot encoding, but with fewer state bits
24. Adjacency-Based State Assignment
- Basic algorithm (see the sketch after this list):
  - (1) Assign a weight w(s,t) to each pair of states
    - the weight reflects the desire to place the two states adjacent on the hypercube
  - (2) Define a cost function for an assignment of codes to the states
    - penalize the weights by the distance between the state codes, e.g. w(s,t) * distance(enc(s), enc(t))
  - (3) Find an assignment of codes that minimizes this cost function summed over all pairs of states
    - a heuristic finds an initial solution
    - pair-wise interchange (or simulated annealing) improves it
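Here is a small sketch of the cost function of step (2) and the pair-wise interchange of step (3); the weights w and the initial codes are hypothetical.

    from itertools import combinations

    def hamming(a, b):
        return sum(x != y for x, y in zip(a, b))

    def cost(codes, w):
        """Sum over state pairs of w(s,t) * distance(enc(s), enc(t))."""
        return sum(wt * hamming(codes[s], codes[t]) for (s, t), wt in w.items())

    def pairwise_interchange(codes, w):
        """Greedily swap two states' codes while the cost keeps dropping."""
        best = cost(codes, w)
        improved = True
        while improved:
            improved = False
            for s, t in combinations(list(codes), 2):
                codes[s], codes[t] = codes[t], codes[s]      # try a swap
                c = cost(codes, w)
                if c < best:
                    best, improved = c, True                 # keep it
                else:
                    codes[s], codes[t] = codes[t], codes[s]  # undo it
        return codes

    w = {('s0', 's1'): 3, ('s0', 's2'): 1, ('s1', 's2'): 2}  # hypothetical
    codes = {'s0': '00', 's1': '11', 's2': '01'}             # initial solution
    print(cost(codes, w))
    print(pairwise_interchange(codes, w))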
25. Adjacency-Based State Assignment
- Mustang: a weight-assignment technique based on loosely maximizing common cube factors.
26. How to Assign Weights to State Pairs
- Assign weights to state pairs based on the ability to extract a common-cube factor if the two states are placed adjacent on the hyper-cube.
27. Fan-Out-Oriented (examine present-state pairs)
- A present-state pair transitions to the same next state:
  S1 → S2 and S3 → S2
- Add n to w(S1,S3) because of S2 (n: the number of encoding bits).
28. Fan-Out-Oriented (examine present-state pairs)
- A present-state pair asserts the same output:
  S1 → S2 asserting output j, and S3 → S4 asserting output j
- Add 1 to w(S1,S3) because of output j.
29. Fan-In-Oriented (examine next-state pairs)
- The same present state transitions to both states of a next-state pair:
  S1 → S2 and S1 → S4
- Add n/2 to w(S2,S4) because of S1.
30. Fan-In-Oriented (examine next-state pairs)
- The same input causes transitions to both states of a next-state pair:
  S1 → S2 under input i, and S3 → S4 under input i
- Add 1 to w(S2,S4) because of input i.
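The four rules of slides 27-30 can be collected into one weight computation. The sketch below loosely follows the rules as stated here; the actual Mustang weighting differs in its details, and the transition-tuple format is an assumption.

    from itertools import combinations
    from collections import defaultdict

    def mustang_weights(transitions, n):
        """transitions: (input, present, next, output-bits); n: #encoding bits."""
        w = defaultdict(int)

        def bump(s, t, amount):
            if s != t:
                w[tuple(sorted((s, t)))] += amount

        for (i1, p1, n1, o1), (i2, p2, n2, o2) in combinations(transitions, 2):
            if n1 == n2:                        # same next state (slide 27)
                bump(p1, p2, n)
            common = sum(a == b == '1' for a, b in zip(o1, o2))
            bump(p1, p2, common)                # shared asserted outputs (slide 28)
            if p1 == p2:                        # same present state (slide 29)
                bump(n1, n2, n / 2)
            if i1 == i2:                        # same input (slide 30)
                bump(n1, n2, 1)
        return dict(w)

    ts = [('0', 'S1', 'S2', '1'), ('0', 'S3', 'S2', '1')]
    print(mustang_weights(ts, n=2))  # -> {('S1', 'S3'): 3}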
31. Which Method Is Better?
- FSMs with no useful two-level face constraints → adjacency embedding
- FSMs with many two-level face constraints → face embedding