Title: Decoding of Convolutional Codes
1. Decoding of Convolutional Codes
- Let C_m be the set of allowable code sequences of length m.
- Not all sequences in {0,1}^m are allowable code sequences!
- Each code sequence can be represented by a unique path through the trellis diagram.
- What is the probability that the code sequence c is sent and the binary sequence r is received?
- Over a BSC with bit-error probability p (set by the modulation), this is P(r | c) = p^d (1-p)^(m-d), where d is the Hamming distance between c and r.
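A minimal Python sketch of this likelihood (the function name and argument names are illustrative, not from the slides):

    def bsc_likelihood(c, r, p):
        """P(r | c) = p^d * (1-p)^(m-d) for a BSC with crossover probability p,
        where d is the Hamming distance between code sequence c and received r."""
        d = sum(ci != ri for ci, ri in zip(c, r))
        m = len(c)
        return (p ** d) * ((1 - p) ** (m - d))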
2. Decoding Rule for Convolutional Codes
- Maximum Likelihood Decoding Rule
- Choose the code sequence through the trellis which has the smallest Hamming distance to the received sequence!
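A one-line justification of this rule (assuming the BSC crossover probability satisfies p < 1/2):

    \log P(r \mid c) = m \log(1-p) + d_H(c, r) \log\frac{p}{1-p}

Since log(p/(1-p)) < 0 for p < 1/2, the likelihood is maximized by the code sequence c with the smallest Hamming distance d_H(c, r) to the received sequence.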
3. The Viterbi Algorithm
- The Viterbi Algorithm (Viterbi, 1967) is a clever way of implementing Maximum Likelihood Decoding.
- Computer scientists will recognize the Viterbi Algorithm as an example of a CS technique called Dynamic Programming.
- Reference: G. D. Forney, "The Viterbi Algorithm," Proceedings of the IEEE, 1973.
- Chips are available from many manufacturers which implement the Viterbi Algorithm for K < 10.
- Can be used for either hard or soft decision decoding. We consider hard decision decoding initially.
4. Basic Idea of Viterbi Algorithm
- There are 2^(rm) code sequences in C_m.
- This number of sequences approaches infinity as m becomes large.
- Instead of searching through all possible sequences, find the best code sequence "one stage at a time".
5. The Viterbi Algorithm (Hamming Distance Metric)
- Initialization
- Let time i = 0.
- We assign each state j a metric Z_j(0) at time 0.
- We know that the code must start in state 0. Therefore we assign
- Z_0(0) = 0
- Z_j(0) = ∞ for all other states j
6. The Viterbi Algorithm (continued)
- Consider decoding of the i-th segment.
- Let r_i be the segment of n bits received between times i and i+1.
- There are several code segments of n bits which lead into state j at time i+1. We wish to find the most likely one.
- Let k denote the state from which a candidate code segment c_{k,j} emerged.
- For each state j, we take the branch from state k as the path leading into state j if Z_k(i) + d_H(r_i, c_{k,j}) is the smallest over all code segments leading into state j.
7. The Viterbi Algorithm (continued)
- Iteration
- Let Z_j(i+1) = min over predecessor states k of [ Z_k(i) + d_H(r_i, c_{k,j}) ]
- Let i = i + 1
- Repeat the previous step (a code sketch of the whole procedure follows below)
- Incorrect paths drop out as i approaches infinity.
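To make the iteration concrete, here is a minimal hard-decision Viterbi sketch in Python. It is not from the slides: the state numbering, the function names (build_trellis, viterbi_decode_hard), and the representation of the generators as tap tuples are assumptions made for illustration.

    def build_trellis(generators, K):
        """next_state[s][b] and out_bits[s][b] for each of the 2**(K-1) states s
        and input bit b; each generator is a tuple of K taps (0/1)."""
        n_states = 2 ** (K - 1)
        next_state = [[0, 0] for _ in range(n_states)]
        out_bits = [[None, None] for _ in range(n_states)]
        for s in range(n_states):
            for b in (0, 1):
                # shift-register contents, newest input first
                reg = [b] + [(s >> i) & 1 for i in range(K - 2, -1, -1)]
                out_bits[s][b] = [sum(g_i * x_i for g_i, x_i in zip(g, reg)) % 2
                                  for g in generators]
                next_state[s][b] = (b << (K - 2)) | (s >> 1)
        return next_state, out_bits

    def viterbi_decode_hard(received, generators, K):
        """Hard-decision Viterbi decoding with the Hamming-distance metric Z_j(i).
        `received` is a flat list of 0/1 channel bits, n per trellis stage."""
        n = len(generators)
        next_state, out_bits = build_trellis(generators, K)
        n_states = 2 ** (K - 1)
        INF = float("inf")
        Z = [INF] * n_states
        Z[0] = 0                                   # the code must start in state 0
        paths = [[] for _ in range(n_states)]      # surviving input bits per state
        for i in range(0, len(received), n):
            r_i = received[i:i + n]                # segment received during stage i
            new_Z = [INF] * n_states
            new_paths = [None] * n_states
            for s in range(n_states):
                if Z[s] == INF:
                    continue                       # state not reachable yet
                for b in (0, 1):
                    j = next_state[s][b]
                    d = sum(x != y for x, y in zip(out_bits[s][b], r_i))
                    if Z[s] + d < new_Z[j]:        # keep the closest path into j
                        new_Z[j] = Z[s] + d
                        new_paths[j] = paths[s] + [b]
            Z, paths = new_Z, new_paths
        best = min(range(n_states), key=lambda s: Z[s])
        return paths[best], Z[best]                # decoded bits, accumulated distance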
8. Viterbi Algorithm Decoding Example
- r = 1/2, K = 3 code from previous example
- (0 0 1 1 0 1 0 0 1 0 1 0 1 1) is sent
- (0 1 1 1 0 1 0 0 1 0 1 0 1 1) is received.
- What path through the trellis does the Viterbi Algorithm choose?
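One possible way to run the earlier sketch on this received sequence. The generator taps below, (1,1,1) and (1,0,1) (octal 7 and 5), are an assumption: the slides only identify the code as the r = 1/2, K = 3 code from the previous example.

    # Hypothetical usage of the earlier sketch; generators (7,5) octal assumed.
    received = [0,1, 1,1, 0,1, 0,0, 1,0, 1,0, 1,1]
    bits, dist = viterbi_decode_hard(received, generators=[(1,1,1), (1,0,1)], K=3)
    print(bits, dist)   # surviving input bits and their accumulated Hamming distance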
9. Viterbi Algorithm Decoding Example (continued)
10. Viterbi Decoding Examples
- There is a company, Alantro, with an example Viterbi decoder on the web, made available to promote their website
- http://www.alantro.com/viterbi/workshop.html
- Your browser must have Java enabled
11. Summary of Encoding and Decoding of Convolutional Codes
- Convolutional codes are encoded using a finite state machine.
- The optimal decoder for convolutional codes finds the path through the trellis which lies at the shortest distance to the received signal.
- The Viterbi algorithm reduces the complexity of this search by finding the optimal path one stage at a time.
- The complexity of the Viterbi algorithm is proportional to the number of states, which grows exponentially with the constraint length.
12. Implementation of Viterbi Decoder
- Complexity is proportional to the number of states, which increases exponentially with the constraint length K (as 2^K)
- Very suited to parallel implementation
- Each state has two transitions into it
- Each state has two transitions out of it
- Each node must compute two branch metrics, add them to the previous path metrics, and compare
- Much analysis has gone into optimizing the implementation of this
- Butterfly calculation (see the sketch below)
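A sketch of one add-compare-select "butterfly", using the state numbering and arrays from the earlier decoder sketch (the pairing of states below follows from that particular numbering; it is not spelled out on the slides):

    def acs_butterfly(Z, out_bits, r_i, m, half):
        """Update successor states m and m + half (half = 2**(K-2)),
        which are fed by the predecessor pair 2m and 2m + 1."""
        winners = {}
        for b, j in ((0, m), (1, m + half)):
            candidates = []
            for s in (2 * m, 2 * m + 1):
                branch = sum(x != y for x, y in zip(out_bits[s][b], r_i))  # branch metric
                candidates.append((Z[s] + branch, s))                      # add
            winners[j] = min(candidates)                                   # compare, select
        return winners  # successor state -> (new path metric, winning predecessor)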
13. Other Applications of Viterbi Algorithm
- Any problem that can be framed in terms of sequence detection can be solved with the Viterbi Algorithm
- MLSE Equalization
- Decoding of continuous phase modulation
- Multiuser detection
14. Continuous Operation
- When continuous operation is desired, the decoder will automatically synchronize with the transmitted signal without knowing its state
- Optimal decoding requires waiting until all bits are received before tracing back the path
- In practice, it is usually safe to assume that all paths have merged after approximately 5K time intervals
- Diminishing returns after a delay of 5K (see the sketch below)
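A sketch of this fixed-delay traceback. The per-stage survivor arrays decided_bit[t][j] and predecessor[t][j] are hypothetical bookkeeping for the decisions made by the add-compare-select step; they are not defined on the slides.

    def traceback_bit(decided_bit, predecessor, Z, t, K):
        """Emit the information bit decided about 5*K stages before time t,
        starting the traceback from the state with the best current metric."""
        depth = 5 * K
        if t + 1 < depth:
            return None                               # not enough delay accumulated yet
        j = min(range(len(Z)), key=lambda s: Z[s])    # best metric at time t
        bit = None
        for stage in range(t, t - depth, -1):         # walk back roughly 5*K stages
            bit = decided_bit[stage][j]
            j = predecessor[stage][j]
        return bit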
15. Frame Operation of Convolutional Codes
- Frequently, we desire to transfer short (e.g., 192 bit) frames with convolutional codes.
- When we do this, we must find a way to terminate the code trellis
- Truncation
- Zero-Padding
- Tail-biting
- Note that the trellis code is serving as a block code in this application
16. Trellis Termination: Zero Padding
- Add K-1 0s to the end of the data sequence to force the trellis back to the all-zeros state (see the sketch below)
- Performance is good
- Now both the starting and ending states are known by the decoder
- Wastes bits in a short frame
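A minimal sketch of the zero-padding termination (the function name is illustrative):

    def terminate_with_zeros(info_bits, K):
        """Append K-1 zeros so the encoder is driven back to the all-zeros state;
        the decoder can then require the surviving path to end in state 0."""
        return list(info_bits) + [0] * (K - 1)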
17. Performance of Convolutional Codes
- When the decoder chooses a path through the trellis which diverges from the correct path, this is called an "error event"
- The probability that an error event begins during the current time interval is the "first-event error probability" P_e
- The minimum Hamming distance separating any two distinct paths through the trellis is called the free distance d_free
18. Calculation of Error Event Probability
- What is the pairwise probability of choosing a path at distance d from the correct path?
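For reference, the standard answer for hard decisions on a BSC with crossover probability p (the usual textbook expression, not reproduced from a slide):

    P_d = \sum_{k=(d+1)/2}^{d} \binom{d}{k} p^k (1-p)^{d-k}          (d odd)

    P_d = \frac{1}{2} \binom{d}{d/2} p^{d/2} (1-p)^{d/2}
          + \sum_{k=d/2+1}^{d} \binom{d}{k} p^k (1-p)^{d-k}          (d even)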
19. Calculation of First Event Error Probability
20. Evaluating Error Probability Using the Transfer Function Bound
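The transfer-function bound referred to here is usually stated as follows (standard form, with a_d the number of paths at Hamming distance d from the correct path):

    P_e \le \sum_{d = d_{free}}^{\infty} a_d P_d \le T(D) \Big|_{D = 2\sqrt{p(1-p)}}

using the Bhattacharyya-type bound P_d \le [2\sqrt{p(1-p)}]^d for the hard-decision BSC.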
21. Finding T(D) from State Diagram
- Break the all-0s state in two, creating a starting state and a terminating state
- Re-label every output 1 as a D
- a_d is the number of distinct paths leading from the starting state to the terminating state while generating the function D^d (see the worked example below)
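As a worked illustration, assuming the running example is the standard K = 3, r = 1/2 code with octal generators (7,5), this procedure gives

    T(D) = \frac{D^5}{1 - 2D} = D^5 + 2 D^6 + 4 D^7 + \cdots

so a_d = 2^(d-5) for d >= 5 and the free distance is d_free = 5.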
22. Example of State Diagram
23. Performance Example for Convolutional Code
24. Performance of r = 1/2 Convolutional Codes with Hard Decisions
25. Performance of r = 1/3 Convolutional Codes with Hard Decisions
26. Punctured Convolutional Codes
27. Practical Examples of Convolutional Codes