1
Distributed Source Coding and Recent Advancements
in Practical Constructions
  • SC 500 Course Project
  • Ye Wang

2
Outline
  • Distributed source coding problem
  • Slepian-Wolf results
  • Proof outline
  • Discussion of practicality
  • Linear binning
  • Linear channel codes
  • LDPC codes
  • Application to example problem
  • Concluding Remarks

3
Distributed Source Coding
  • X and Y are correlated
  • What rates RX and RY are achievable for conveying
    X and Y?

4
Slepian-Wolf Theorem
  • Possible to convey X and Y with small probability
    of error if and only if (checked numerically below):
  • RX ≥ H(X|Y)
  • RY ≥ H(Y|X)
  • RX + RY ≥ H(X,Y)
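
A quick numeric check of the corner point used later (slide 13): assuming the binary setup Y = X ⊕ Z with Z ~ Bernoulli(p) independent of X, a minimal Python sketch evaluates RX = H(X) and RY = H(Y|X) = H(Z):

  import numpy as np

  def h2(p):
      # Binary entropy in bits; h2(0) = h2(1) = 0 by convention
      if p in (0.0, 1.0):
          return 0.0
      return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

  q, p = 0.5, 0.1          # X ~ Bernoulli(q), Z ~ Bernoulli(p)
  RX = h2(q)               # corner point: RX = H(X)
  RY = h2(p)               # RY = H(Y|X) = H(Z), since Z is independent of X
  print(RX, RY, RX + RY)   # RX + RY meets H(X,Y) = h2(q) + h2(p) with equality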

5
Random Binning Proof
  • Assign each length-n sequence Xn a random index
    from (1, 2nRx) uniformly, I.I.D.
  • Do same for Yn over indices (1, 2nRy)
  • Send bin indices for Xn and Yn
  • Decoder picks the jointly typical pair from the
    selected bin
  • Pe lt e if Slepian-Wolf conditions met

[Figure: xn sequences mapped into 2^(nRX) bins, yn sequences into 2^(nRY) bins, with the jointly typical (xn, yn) pairs marked.]
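
A brute-force sketch of this scheme, assuming toy parameters (n = 12, sequences encoded as integers) and using "Hamming distance close to n·p" as a stand-in for joint typicality:

  import numpy as np

  rng = np.random.default_rng(0)
  n, RX, RY, p = 12, 0.7, 0.7, 0.05      # toy sizes; the proof needs large n

  # Random binning: every length-n sequence gets an i.i.d. uniform bin index
  bins_x = rng.integers(0, 2 ** int(n * RX), size=2 ** n)
  bins_y = rng.integers(0, 2 ** int(n * RY), size=2 ** n)

  def decode(bx, by):
      # Exhaustive search of the two selected bins for the pair whose
      # Hamming distance is closest to n*p (exponential, as slide 6 notes)
      best, best_gap = None, float("inf")
      for xs in np.flatnonzero(bins_x == bx):
          for ys in np.flatnonzero(bins_y == by):
              gap = abs(bin(int(xs) ^ int(ys)).count("1") - n * p)
              if gap < best_gap:
                  best, best_gap = (int(xs), int(ys)), gap
      return best

  x = int(rng.integers(0, 2 ** n))                            # source sequence xn
  y = x ^ sum(int(rng.random() < p) << i for i in range(n))   # correlated yn
  print(decode(bins_x[x], bins_y[y]) == (x, y))               # usually True at these rates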
6
Practicality of Random (Unstructured) Codes
  • Need to store 2^n (nRX + nRY) bits to represent the
    codebooks (one bin index per sequence)
  • Need to check joint typicality of 2^(n(1-RX)) × 2^(n(1-RY))
    code word pairs per bin pair
  • Exponential in both complexity and memory
  • Problematic, since n must be large to achieve a low
    probability of error

7
Binning with Random Matrices
  • Produce bin indices by multiplying x, y with
    random matrices HX (n by nRX), HY (n by nRY)
  • zX = x HX, zY = y HY (mod-2 arithmetic)
  • Elements of the matrices are generated i.i.d.
    Bernoulli(1/2)
  • Probability of two different x and x′ being in
    the same bin is 2^(-nRX) (similarly for y)
  • This is all that is needed in the Slepian-Wolf proof,
    so matrices that work must exist
  • Reduces memory requirements and adds structure (sketch below)
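
A minimal NumPy sketch of matrix binning, assuming toy sizes (n = 10, RX = 0.6):

  import numpy as np

  rng = np.random.default_rng(1)
  n, RX = 10, 0.6
  mX = int(n * RX)                       # syndrome (bin index) length

  HX = rng.integers(0, 2, size=(n, mX))  # entries i.i.d. Bernoulli(1/2)

  x = rng.integers(0, 2, size=n)         # a length-n source sequence
  zX = x @ HX % 2                        # bin index zX = x HX (mod 2)

  # Two sequences collide iff (x xor x') HX = 0, which a random HX
  # satisfies with probability 2^(-n*RX) per pair
  x2 = rng.integers(0, 2, size=n)
  print(zX, np.array_equal(x2 @ HX % 2, zX))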

8
Binary Linear Channel Code
  • 2^k codewords in the space {0,1}^n
  • Forms a k-dimensional subspace (k < n)

[Figure: channel codewords in {0,1}^n, each surrounded by its decoding ball.]
9
Arithmetic Definition
  • Collection of 2^k code words in the space {0,1}^n
  • Forms a k-dimensional subspace
  • G (k × n) generator matrix spans the codeword subspace
  • s^n = u^k G, where s^n is a codeword and u^k ∈ {0,1}^k
  • H (n × (n-k)) parity check matrix spans the null space
    of the code
  • Produces coset labels called syndromes
  • z^(n-k) = r^n H, where r^n ∈ {0,1}^n and z^(n-k) is the
    syndrome (worked example below)
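
A worked instance of these definitions in NumPy, using the (7,4) Hamming code as a convenient small example (the specific code is an assumption, not from the slides):

  import numpy as np

  # Row-vector convention from the slide: s = u G, z = r H, and G H = 0 (mod 2)
  P = np.array([[1, 1, 0],
                [1, 0, 1],
                [0, 1, 1],
                [1, 1, 1]])
  G = np.hstack([np.eye(4, dtype=int), P])   # k x n = 4 x 7 generator matrix
  H = np.vstack([P, np.eye(3, dtype=int)])   # n x (n-k) = 7 x 3 parity check matrix

  u = np.array([1, 0, 1, 1])                 # information word u in {0,1}^k
  s = u @ G % 2                              # codeword s = u G
  print("codeword:", s)
  print("syndrome:", s @ H % 2)              # all-zero: codewords lie in H's null space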

10
Properties of Syndromes
  • z = r H, r ∈ {0,1}^n, z the syndrome
  • Code words have the all-zero syndrome
  • Only a function of the noise added to a codeword
    (checked below):
  • rH = (s ⊕ n)H = sH ⊕ nH = 0 ⊕ nH = nH

[Figure: codewords s, received words r, and noise words n; words differing by a codeword share a syndrome: r1H = n1H, r2H = n2H.]
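
A self-contained check of rH = nH, reusing the same (7,4) toy code from the previous sketch:

  import numpy as np

  P = np.array([[1, 1, 0], [1, 0, 1], [0, 1, 1], [1, 1, 1]])
  H = np.vstack([P, np.eye(3, dtype=int)])           # 7 x 3 parity check matrix
  s = np.array([1, 0, 1, 1]) @ np.hstack([np.eye(4, dtype=int), P]) % 2

  n_vec = np.array([0, 0, 1, 0, 0, 1, 0])            # noise flipping bits 2 and 5
  r = (s + n_vec) % 2                                # received word r = s xor n
  print(np.array_equal(r @ H % 2, n_vec @ H % 2))    # True: rH = nH, s drops out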
11
Decoding Procedures
  • Received word r = s ⊕ n, so the noise is n = r ⊕ s
  • MAP rule: pick the most likely code word s given r
  • For a BSC with equiprobable code words: the nearest code word
  • Equivalently: pick the most likely noise vector n given
    the syndrome rH
  • For a BSC with flip probability < 0.5: the lowest-weight
    noise vector
  • Exhaustive search (sketch below)
  • Not practical
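
A sketch of the exhaustive decoder: it returns the lowest-weight noise vector consistent with a given syndrome, the MAP estimate for a BSC with flip probability below 0.5 (and exponential in n, hence impractical):

  import numpy as np
  from itertools import product

  def exhaustive_syndrome_decode(H, z):
      # Scan all 2^n noise candidates; keep the lightest one with syndrome z
      best = None
      for bits in product([0, 1], repeat=H.shape[0]):
          cand = np.array(bits)
          if np.array_equal(cand @ H % 2, z):
              if best is None or cand.sum() < best.sum():
                  best = cand
      return best

  P = np.array([[1, 1, 0], [1, 0, 1], [0, 1, 1], [1, 1, 1]])
  H = np.vstack([P, np.eye(3, dtype=int)])                    # (7,4) toy code again
  print(exhaustive_syndrome_decode(H, np.array([1, 1, 0])))   # weight-1 noise on bit 0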

12
Low Density Parity Check (LDPC) Codes
  • Linear code with a sparse parity check matrix
  • Decoding algorithm: the Sum-Product Algorithm (SPA)
  • Approximates maximum likelihood (ML) decoding
  • Iterative message passing algorithm
  • Complexity tied to the sparseness of the H matrix:
    O(n log n)
  • Provably approaches capacity
  • SPA can be conceptualized in two ways:
  • Finds the ML code word given the received vector
  • Finds the ML noise vector given the syndrome of the noise
  • More generally, finds the ML word given bit probabilities
    and a syndrome (sketch below)
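
A minimal dense-loop sketch of SPA used as a syndrome decoder, in the row-vector convention z = r H of slide 9; a practical LDPC decoder would exploit the sparsity of H rather than looping over a dense matrix:

  import numpy as np

  def spa_syndrome_decode(H, z, p, max_iter=50):
      # Estimate the noise n with syndrome z = n H (mod 2), n_i ~ Bernoulli(p)
      n, m = H.shape
      L0 = np.log((1 - p) / p)               # prior LLR of each noise bit
      mv = np.where(H == 1, L0, 0.0)         # variable-to-check messages
      mc = np.zeros((n, m))                  # check-to-variable messages
      sgn = 1.0 - 2.0 * z                    # syndrome bit 1 flips the check parity
      for _ in range(max_iter):
          for i in range(m):                 # check-node update (tanh rule)
              vs = np.flatnonzero(H[:, i])
              t = np.tanh(mv[vs, i] / 2.0)
              for a, j in enumerate(vs):
                  prod = sgn[i] * np.prod(np.delete(t, a))
                  mc[j, i] = 2.0 * np.arctanh(np.clip(prod, -0.999999, 0.999999))
          llr = L0 + mc.sum(axis=1)          # posterior LLR per noise bit
          n_hat = (llr < 0).astype(int)      # hard decision
          if np.array_equal(n_hat @ H % 2, z):
              return n_hat                   # syndrome satisfied: stop early
          for j in range(n):                 # variable-node update (extrinsic)
              for i in np.flatnonzero(H[j, :]):
                  mv[j, i] = llr[j] - mc[j, i]
      return n_hat

  P = np.array([[1, 1, 0], [1, 0, 1], [0, 1, 1], [1, 1, 1]])
  H = np.vstack([P, np.eye(3, dtype=int)])   # toy H; real LDPC matrices are long and sparse
  print(spa_syndrome_decode(H, np.array([1, 1, 0]), p=0.1))   # recovers the bit-0 flip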

13
Source Coding Example
  • Xi i.i.d. Bernoulli(q)
  • Yi = Xi ⊕ Zi ∀i, with Zi i.i.d. Bernoulli(p), p < 0.5
  • Operating at a corner point:
  • RX = H(X) = h2(q), RY = H(Y|X) = H(Z) = h2(p)
  • Encoding:
  • Send X with an entropy coder; send yH (the syndrome of Y)
  • Use H from an LDPC code designed for a BSC with flip
    probability p
  • Decoding:
  • Decode X with the entropy decoder
  • Compute xH (the syndrome of X)
  • Compute xH ⊕ yH = zH
  • Decode Z with the LDPC SPA
  • Compute Y = X ⊕ Z (end-to-end sketch below)
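
A toy end-to-end run of this scheme, reusing spa_syndrome_decode and the 7 × 3 H from the sketches above; a real system would use a long, sparse LDPC H designed for a BSC(p), and entropy coding of X is omitted since X is conveyed losslessly either way:

  import numpy as np

  rng = np.random.default_rng(2)
  p = 0.1
  P = np.array([[1, 1, 0], [1, 0, 1], [0, 1, 1], [1, 1, 1]])
  H = np.vstack([P, np.eye(3, dtype=int)])

  x = rng.integers(0, 2, size=7)             # X_i i.i.d. Bernoulli(q), q = 0.5 here
  z = (rng.random(7) < p).astype(int)        # Z_i i.i.d. Bernoulli(p)
  y = (x + z) % 2                            # Y = X xor Z

  syn_y = y @ H % 2                          # encoder for Y sends only this syndrome

  syn_z = (x @ H % 2 + syn_y) % 2            # decoder: xH xor yH = zH
  z_hat = spa_syndrome_decode(H, syn_z, p)   # recover the noise Z from its syndrome
  y_hat = (x + z_hat) % 2                    # finally Y = X xor Z
  print("recovered Y:", np.array_equal(y_hat, y))   # often True; the toy code only corrects low-weight Z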

14
Concluding Remarks
  • The correlation channel between X and Y acts like a
    transmission channel
  • Strategic idea for the corner point:
  • Convey X with an entropy coder
  • X then serves as a noisy version of Y
  • The encoder for Y conveys the additional channel code
    information
  • The decoder acts like a channel decoder in order to
    recover Y
  • Schonberg, Pradhan, and Ramchandran propose a
    generalized method to achieve the tradeoff points