1. Joint Source-Channel Coding for Correlated Sensors
Wei Zhong and Javier Garcia-Frias
Department of Electrical and Computer Engineering
University of Delaware
2. Outline
- Joint Source-Channel Coding of Correlated Sensors over a Multiple Access Channel
  - Slepian-Wolf coding for correlated sources
  - Gaussian multiple access channel
  - Shannon's separate source-channel coding is NOT optimum in this case
  - Proposed system
- Combining Data Fusion with Joint Source-Channel Coding of Correlated Sensors
  - Proposed system
  - Many sensors observing a hidden source
  - Data fusion performed at the base station jointly with decoding
3. Correlated Sources: Practical Applications
- Sensor networks: several sensors in a given environment receive correlated information. Sensors have very low complexity, do not communicate with each other, and send information to a processing unit (base station)
- Use of turbo-like codes (LDGM codes) to exploit the correlation, so that the transmitted energy necessary to achieve a given performance is reduced
  - Data compression
  - Joint source-channel coding
4. Slepian-Wolf Coding of Correlated Sources
Diagram: correlated sources U1 and U2 are compressed by Encoder 1 and Encoder 2 with no collaboration between them; a joint decoder recovers both.
- Main result of the Slepian-Wolf theorem
  - Optimum data compression (i.e., as if each encoder knew both sources) can be achieved without collaboration between the encoders.
5. Achievable Region of Slepian-Wolf Coding
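The region plotted on this slide is the standard Slepian-Wolf result; for two sources, every rate pair satisfying

R_1 \ge H(U_1 \mid U_2), \qquad R_2 \ge H(U_2 \mid U_1), \qquad R_1 + R_2 \ge H(U_1, U_2)

is achievable, i.e., the sum rate matches what a single encoder observing both sources would need.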
6. Gaussian Multiple Access Channel
Diagram: senders S1, S2, ..., Sn transmit simultaneously; the joint decoder observes r = S1 + S2 + ... + Sn + N, where N is additive Gaussian noise.
- Information theory has established results for the GMAC (the standard sum-rate expression is recalled below)
  - Random coding achieves capacity
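For reference (a standard result, stated with generic symbols not taken from the slides): for n senders with equal power P over noise of variance \sigma^2, the GMAC sum capacity is

\sum_{i=1}^{n} R_i \le \frac{1}{2} \log_2\!\left(1 + \frac{nP}{\sigma^2}\right),

achieved with independent random Gaussian codebooks and joint decoding.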
7. Joint Source-Channel Coding of Correlated Sources over MAC
Diagram: each correlated source Si passes through its own encoder; the joint decoder observes the sum of the coded signals plus Gaussian noise N.
What is the capacity/limit? How to achieve
it? Still an open problem
8. System Model
Diagram: binary sequences S1 (...1010110) and S2 (...0110111) are separately encoded; the joint decoder observes the sum of the two coded signals plus noise N. The sequence e marks the positions where S1 and S2 differ.
- S1, S2 are binary sequences
- i.i.d. correlation characterized by Pr(e = 1) = p, i.e., S1 differs from S2 with probability p (a small sketch of this model follows)
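A minimal sketch of this correlation model (function and variable names are illustrative, not from the slides):

import numpy as np

def correlated_sources(length, p, seed=None):
    """Draw S1 uniform binary and S2 = S1 XOR e, with Pr(e = 1) = p i.i.d."""
    rng = np.random.default_rng(seed)
    s1 = rng.integers(0, 2, size=length)
    e = (rng.random(length) < p).astype(int)   # difference sequence
    return s1, s1 ^ e

s1, s2 = correlated_sources(100_000, p=0.01, seed=0)
print((s1 != s2).mean())   # empirical estimate of p, close to 0.01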
9. Most Closely Related Work
- Early work
  - MAC with arbitrarily correlated sources (T. M. Cover et al., 1980)
    - Separate source-channel coding not optimum
    - Bounds, non-closed form
- Binary correlated sources
  - Turbo-like codes for correlated sources over MAC (J. Garcia-Frias et al., 2003)
    - Turbo codes
    - Low Density Generator Matrix (LDGM) codes
    - Interleaver design, exploiting correlation
  - Correlated sources and wireless channels (H. El Gamal et al., 2004)
    - LDGM codes
    - Not a pure MAC; needs independent links for a small fraction of parity bits
10. Theoretical Limits Assuming Separation between Source and Channel Coding
- Theoretical limit unknown
- The separation limit is achieved by Slepian-Wolf source coding + optimum channel coding (the corresponding sum rate is computed below)
- Ei: energy constraint for sender i (we assume E1 = E2)
- Ri: information rate for sender i (we assume R1 = R2 = R/2)
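For the binary model of the system slide, the Slepian-Wolf sum rate evaluates to 1 + h(p) bits per source pair, where h is the binary entropy function; a quick computation (my own sketch, not code from the slides):

from math import log2

def h2(p):
    """Binary entropy function in bits."""
    return 0.0 if p in (0, 1) else -p * log2(p) - (1 - p) * log2(1 - p)

p = 0.01
# H(U1, U2) = H(U1) + H(U2 | U1) = 1 + h2(p) for the i.i.d. difference model
print(f"Slepian-Wolf sum rate at p = {p}: {1 + h2(p):.3f} bits per source pair")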
11. LDGM Codes
- Systematic linear codes with a sparse generator matrix G = [I P], where P is a sparse L x M matrix
- u = (u1, ..., uL): systematic bits
- c = uP: coded (parity) bits
- LDGM codes are LDPC codes, since H = [P^T I] is also sparse
- Advantage over turbo codes: less decoding complexity
- Advantage over standard LDPC codes: less encoding complexity (a toy encoding sketch follows)
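A toy sketch of systematic LDGM encoding over GF(2) (illustrative code; the sizes and sparsity level are assumptions):

import numpy as np

rng = np.random.default_rng(0)
L, M = 8, 4                                   # toy sizes; the slides use L = 10,000
P = (rng.random((L, M)) < 0.3).astype(int)    # sparse part of G = [I P]

def ldgm_encode(u, P):
    """Systematic LDGM encoding: codeword = [u | c] with c = uP over GF(2)."""
    c = u @ P % 2
    return np.concatenate([u, c])

u = rng.integers(0, 2, size=L)
print(ldgm_encode(u, P))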
12. LDGM Codes in Channel Coding (BSC)
- Message length: 10,000
- Code rate Rc = 0.5 with different degrees (X, Y)
- As noticed by MacKay, LDGM codes are bad (the error floor does not decrease with the block length)
- Solution: concatenated scheme
13. Serial Concatenated LDGM Codes
At BER = 10^-5, performance is 0.8 dB from the theoretical limit, comparable to LDPC and turbo codes.
14. LDGM Encoder for Correlated Senders over MAC: Single LDGM Encoder per Sender
Diagram: sender 1's information bits (u_1^1, ..., u_L^1) and sender 2's bits (u_1^2, ..., u_L^2) each pass through the same LDGM encoder before transmission.
- To exploit the correlation, each sender is encoded using the same LDGM code
15. LDGM Encoder for Correlated Senders over MAC
Diagram: the information bits and the parity bits of sender 1 and sender 2, aligned position by position.
- Information bits are correlated: p = Pr(u_k^1 ≠ u_k^2)
- Parity bits are correlated: p' = Pr(c_k^1 ≠ c_k^2)
- Parity bits are generated as c = uP, i.e., each parity bit is the modulo-2 sum of the information bits selected by one column of P (the resulting parity correlation is sketched below)
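If a parity bit is the XOR of X information bits and the per-position differences are i.i.d., the piling-up lemma gives p' = (1 - (1 - 2p)^X) / 2. This derivation is my own illustration, since the slide's expression is not reproduced:

def parity_correlation(p, X):
    """Pr(c1 != c2) for a parity bit that XORs X information bits whose
    pairwise differences are i.i.d. Bernoulli(p) (piling-up lemma)."""
    return (1 - (1 - 2 * p) ** X) / 2

for X in (2, 3, 4):
    print(f"X = {X}: p' = {parity_correlation(0.01, X):.4f}")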
16. Drawback of the Single LDGM Encoder Scheme
- Each sender is encoded with the same LDGM codebook
  - The decoder graph is completely symmetric
  - At the receiver, even if the decoder can recover the sum perfectly, there is no way to tell which sequence corresponds to sender 1 and which to sender 2
- Solution
  - Introduce asymmetry in the decoding graph
  - Concatenated scheme with additional interleaved parity bits
17. LDGM Encoder for Correlated Senders over MAC: Concatenated Scheme
Diagram: each sender's information bits pass through an outer encoder (Eouter) and an inner encoder (Einner); sender 2's output additionally passes through a channel interleaver before transmission.
- Each sender is encoded by a serial concatenated LDGM code
- Sender 2's sequence is scrambled by a special channel interleaver (a sketch follows this list):
  - Information bits are not interleaved (most correlation preserved)
  - Inner coded bits are partially interleaved (trade-off between exploiting correlation and introducing asymmetry)
  - Outer coded bits are totally interleaved (little correlation; introduces asymmetry)
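A minimal sketch of such a channel interleaver (structure inferred from the bullets above; the interleaved fraction and all names are assumptions):

import numpy as np

def sender2_interleave(info, inner, outer, frac_inner=0.5, seed=None):
    """Channel interleaver for sender 2: information bits untouched, a
    fraction of inner parity bits permuted, all outer parity bits permuted."""
    rng = np.random.default_rng(seed)
    inner = np.asarray(inner).copy()
    sel = rng.choice(len(inner), size=int(frac_inner * len(inner)), replace=False)
    inner[sel] = inner[rng.permutation(sel)]   # partial permutation
    return np.concatenate([np.asarray(info), inner, rng.permutation(outer)])

out = sender2_interleave(np.arange(6), np.arange(6, 12), np.arange(12, 18), seed=0)
print(out)   # info block unchanged; parity blocks scrambled

Keeping the information bits aligned preserves the source correlation that the joint decoder exploits, while the scrambled parity bits break the symmetry of the decoder graph.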
18. LDGM Decoder for Correlated Senders over MAC: Concatenated Scheme
- Detailed message passing expressions can be
obtained by applying Belief Propagation over the
graph
19. Simulation Results: Single LDGM Scheme
- Information sequences divided into blocks of length L = 10,000
- Rate-1/3 LDGM codes
- p = 0.01
- Error floor at 0.5p
20. Simulation Results: Single LDGM Scheme
- As SNR increases:
  - The error due to channel noise fades away
  - The interference stays constant, due to the ambiguity (symmetry in the decoder graph) explained before
- Comment: the single LDGM scheme is capable of transforming X1 + X2 + N into an almost noise-free X1 + X2 (leaving the interference intact)
21. Simulation Results: Concatenated Scheme
- Trade-off between error floor and threshold, driven by the fraction of interleaved inner parity bits
22. Conclusion
- For correlated sources over a MAC, code design should exploit the correlation
- Joint source-channel coding using LDGM codes can indeed outperform separate source-channel coding
23. Combining Data Fusion and Joint Source-Channel Coding of Correlated Sensors
Diagram: a hidden source S is observed by sensors S1, S2, ..., Sn through independent BSCs; each sensor applies JSC coding and transmits over its own AWGN channel to a joint decoding-and-data-fusion block at the base station.
- Many dumb sensors observe a hidden source S with observation error (modeled by a BSC, i.e., each bit is flipped with probability Pe); a sketch of this observation model follows the list
- Separate channel for each sensor (TDMA, FDMA, etc.)
- Joint decoding and data fusion at the base station
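A minimal sketch of the sensor observation model (names and parameter values are illustrative):

import numpy as np

def sensor_observations(s, pe, n_sensors, seed=None):
    """Each of n sensors observes hidden source s through an independent BSC(pe)."""
    rng = np.random.default_rng(seed)
    flips = (rng.random((n_sensors, s.size)) < pe).astype(int)
    return s ^ flips                      # row i = sensor i's noisy observation

s = np.random.default_rng(1).integers(0, 2, size=10_000)   # hidden source
obs = sensor_observations(s, pe=0.1, n_sensors=5, seed=2)
print((obs != s).mean(axis=1))            # per-sensor empirical Pe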
24. Receiver Structure
Diagram: joint decoding-and-data-fusion block (receiver structure).
- Turbo decoding / information exchange
- Centralized structure
- Complexity grows linearly with the number of sensors
25. Theoretical Analysis (1)
- Pe: sensor observation error parameter
- BER: bit error rate of the estimate of the hidden source S
- Increasing the number of sensors produces good BER even for high Pe (a rough illustration follows)
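As a rough illustration of this trend (my own sketch, assuming ideal fusion reduces to majority voting over n independent BSC(Pe) observations of each bit):

from math import comb

def majority_ber(pe, n):
    """Residual BER after majority voting over n independent BSC(pe)
    observations of each hidden-source bit (n odd)."""
    return sum(comb(n, k) * pe**k * (1 - pe)**(n - k)
               for k in range(n // 2 + 1, n + 1))

for n in (1, 3, 5, 7, 9):
    print(f"n = {n}: BER = {majority_ber(0.1, n):.2e}")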
26. Theoretical Analysis (2): Simulation Results
- Eso/N0: energy per sensor bit (BPSK) required to achieve reliable communication
- Simulation results show that, as the number of sensors grows, only a small gain in SNR is achieved
- Need optimized code design
27. Simulation Results
- Simulation results show that, as the number of sensors grows, the gap from the theoretical limit increases
28. Conclusion
- Given a target performance in terms of BER, we can offer solutions with different configurations (sensor quality vs. network size)
- For our sensor network model, LDGM codes can offer near-optimum performance with very low coding complexity when the number of sensors is small
- When the number of sensors is large, the performance degrades; our ongoing work is to optimize the code design for this case