1
Multiple Access Channels with Correlated Sources
and Feedback (Sufficient Conditions for
Source-Channel Separation)
  • Sriram Vishwanath
  • in collaboration with
  • W. Wu, B. Smith and S. Sridharan
  • University of Texas at Austin

2
Overview
  • Single user
  • MAC with correlated sources: a tough problem
  • MAC with feedback: a tough problem
  • Why combine the two? An easier problem (perhaps)
  • Source-channel separation for MAC channels
  • Source-channel separation in general networks
  • Conclusions

3
Single User DMC
(Figure: source and transmitter Tx, channel, receiver Rx)
  • Source-channel separation holds
  • Feedback leads to no increase in capacity

4
Source-Channel Separation
  • Source-channel separation inspires many of our
    current architectures
  • Pros (big):
  • Allows for coding at separate layers
  • Complexity reduction
  • Caution: an asymptotic result; not true for systems with stringent block-length constraints

5
Role of Feedback
  • Apparently useless (no capacity gain for a single-user DMC)
  • It can, in many cases (e.g. Gaussian), greatly reduce:
  • Coding complexity
  • Code latency

6
MAC Channel
(Figure: transmitters Tx1 and Tx2, common receiver Rx)
  • In general:
  • NO source-channel separation
  • Feedback increases the capacity region

7
Capacity of MAC with Feedback
  • In general an open problem
  • Two-user Gaussian case: Ozarow
  • Result: the capacity region is the cutset outer bound
  • R1 < I(X1; Y | X2)
  • R2 < I(X2; Y | X1)
  • R1 + R2 < I(X1, X2; Y)
  • over all CDFs F(x1, x2)

8
Gaussian MAC with feedback
  • Gaussian inputs extremize the outer bound
  • R1 < log(1 + (1 − ρ²)P1)
  • R2 < log(1 + (1 − ρ²)P2)
  • R1 + R2 < log(1 + P1 + P2 + 2ρ√(P1P2))
  • Achieved by a modified Schalkwijk-Kailath coding scheme
  • Feedback
  • Increases region
  • Simplifies coding
  • Reduces latency
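As a quick numeric companion, the three bounds above can be evaluated directly. This is a sketch, with the function name mine and the 1/2 pre-log for a real-valued channel added (the slide writes log(·) without it); ρ is the input correlation induced by the feedback scheme:

```python
import math

def ozarow_region(P1, P2, rho, N=1.0):
    """Boundary of the two-user Gaussian MAC-with-feedback capacity region
    (Ozarow), parameterized by the input correlation rho.
    Rates in bits per channel use for a real AWGN channel."""
    R1 = 0.5 * math.log2(1 + (1 - rho**2) * P1 / N)
    R2 = 0.5 * math.log2(1 + (1 - rho**2) * P2 / N)
    Rsum = 0.5 * math.log2(1 + (P1 + P2 + 2 * rho * math.sqrt(P1 * P2)) / N)
    return R1, R2, Rsum
```

At ρ = 0 the sum-rate bound reduces to the ordinary no-feedback MAC sum capacity; increasing ρ trades the individual-rate bounds for a larger sum rate.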

9
One more gain from feedback..
  • Feedback induces source-channel separation

10
MAC channel with correlated sources
(Figure: correlated sources U at Tx1 and V at Tx2, common receiver Rx)
  • U and V correlated, and no feedback
  • Well known that there is no source-channel separation [Cover, El Gamal & Salehi; Gastpar; Dueck]

11
MAC channel with correlated sources (and no
feedback)
  • Finding the actual capacity region is nearly impossible
  • Only extreme cases, like U = V, are understood
  • U = V results in a MISO capacity gain
  • Even if U and V are 99.9% correlated, the capacity region is a very hard problem (!)
  • Isolating even 1 bit of common information is very difficult [Gacs & Korner; Ahlswede & Csiszar]

12
The meat of the talk
  • For the two-user Gaussian MAC with feedback, source-channel separation holds
  • For a general MAC with feedback, if the correlation between U and V is greater than a threshold value, then source-channel separation holds
  • Proof in all cases: Fano's inequality

13
Two user Gaussian MAC with Feedback Correlation
  • Achievability: separate coding
(Figure: U → Slepian-Wolf binning → Schalkwijk-Kailath encoder at Tx1; V → SW binning → SK encoder at Tx2; the receiver runs an SK decoder followed by an SW decoder)
14
Two user Gaussian MAC with Feedback
  • By separate source-channel coding (Slepian-Wolf binning with Ozarow coding) we achieve
  • H(U|V) < I(X1; Y | X2)
  • H(V|U) < I(X2; Y | X1)
  • H(U,V) < I(X1, X2; Y)
  • for any joint c.d.f. F(x1, x2)
  • Converse: by Fano's
  • Intuition: arbitrary correlation between X1 and X2 is already possible. Correlation between U and V is thus useless.
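The separation claim reduces to a rate-region comparison: Slepian-Wolf binning needs the triple (H(U|V), H(V|U), H(U,V)), and Ozarow coding supplies the channel region. A hedged sketch of that feasibility check (the helper name, the 1/2 pre-log of a real channel, and the grid search over ρ are mine, not from the talk):

```python
import math

def separation_feasible(H_U_given_V, H_V_given_U, H_UV, P1, P2, N=1.0,
                        steps=2000):
    """Search the input correlation rho for a point of the Ozarow
    feedback capacity region that dominates the Slepian-Wolf triple
    (H(U|V), H(V|U), H(U,V))."""
    for i in range(steps + 1):
        rho = i / steps
        R1 = 0.5 * math.log2(1 + (1 - rho**2) * P1 / N)
        R2 = 0.5 * math.log2(1 + (1 - rho**2) * P2 / N)
        Rs = 0.5 * math.log2(1 + (P1 + P2 + 2 * rho * math.sqrt(P1 * P2)) / N)
        if H_U_given_V <= R1 and H_V_given_U <= R2 and H_UV <= Rs:
            return True
    return False
```

If the check succeeds for some ρ, separate source and channel coding carries the sources over the feedback MAC.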

15
Converse
  • nH(U|V) ≤ Σi [ H(Yi | Y^{i-1}, V^n) − H(Yi | Y^{i-1}, V^n, U^n) ]   (usual Fano's)
  •   = Σi [ H(Yi | Y^{i-1}, V^n, X2i) − H(Yi | Y^{i-1}, V^n, U^n, X1i, X2i) ]   (with feedback, X1i and X2i are functions of the conditioning variables)
  •   ≤ Σi [ H(Yi | X2i) − H(Yi | X1i, X2i) ] = Σi I(X1i; Yi | X2i)
  • and that's it!

16
DMC MAC with Correlated Sources and Feedback
  • Achievable with separate source-channel coding
  • H(U|V) < I(X1; Y | X2, T)
  • H(V|U) < I(X2; Y | X1, T)
  • H(U,V) < I(X1, X2; Y)
  • for any p(t)p(x1|t)p(x2|t)
  • T is the cooperation auxiliary random variable
  • Note: the ability to cooperate is limited by the first two inequalities.

17
Achievability proof
(Figure: Tx1 decodes V and Tx2 decodes U via feedback; the receiver decodes both U and V. The links are labeled H(U|V) < I(X1; Y | X2, T), H(V|U) < I(X2; Y | X1, T), and H(U,V) < I(X1, X2; Y).)
  • Extension of the Cover, El Gamal & Salehi argument

18
Achievable Region vs. Outer Bound
  • Achievable region (for p(t)p(x1|t)p(x2|t)):
  • H(U|V) < I(X1; Y | X2, T)
  • H(V|U) < I(X2; Y | X1, T)
  • H(U,V) < I(X1, X2; Y)
  • Cutset outer bound:
  • H(U|V) < I(X1; Y | X2)
  • H(V|U) < I(X2; Y | X1)
  • H(U,V) < I(X1, X2; Y)
  • Missing term: I(T; Y | Xi) in the first two inequalities of the achievable region
19
When does separation hold?
  • Let (T, X1, X2) be a triplet such that
  • (X1, X2) extremizes the outer bound
  • (X1, X2) are conditionally independent given T
  • And for this triplet
  • H(U|V) < I(X1; Y | X2, T)
  • H(V|U) < I(X2; Y | X1, T)
  • Then
  • That extremum point of the outer bound is achievable
  • Source-channel separation holds for that boundary point
  • Intuition: if U and V are correlated beyond a threshold value (highly correlated), then source-channel separation holds.

20
A more general network
(Figure: sources U1, …, Um at transmitters Tx1, …, Txm feed a network; every receiver Rx1, …, Rxn wants all of U1, …, Um)
21
Towards a general statement
  • Statement 1: For a system with m transmitters (messages) and n receivers where
  • Condition 1: every receiver wishes to receive all m messages
  • Condition 2: the cutset outer bound can be achieved for m independent messages
  • Then source-channel separation holds
  • Proof: by Fano's again

22
An even more general (but loose) statement
  • Statement 2: For a system with m transmitters (messages) and n receivers where
  • Condition 1: every receiver wishes to receive all m messages
  • Condition 2: U1, …, Um are highly correlated (beyond a threshold value)
  • Then source-channel separation holds
  • Proof: by Fano's again

23
Conclusions
  • The ability to cooperate at the channel level induces source-channel separation
  • For a two-user Gaussian MAC with feedback, we always have source-channel separation
  • For an arbitrary m-transmitter, n-receiver network, it may hold if correlations are large enough

24
Sensor Networks
  • Many distributed nodes can make measurements and
    cooperate to propagate information through the
    system, perhaps to a single endpoint
  • Key observation: nodes located near each other may have correlated information. What effect does this have on information flow?

25
The Relay Channel
  • Introduced by van der Meulen
  • The discrete memoryless relay channel consists of:
  • An input X
  • A relay output Y1
  • A relay sender X1, which can depend upon previously received Y1
  • A channel output Y
  • A conditional probability distribution p(y, y1 | x, x1)

(Figure: transmitter with input X; relay receives Y1 and inputs X1; receiver receives Y)
26
The Relay Channel in Sensor Networks
  • A relay channel in a sensor network differs in that both sender and relay have access to correlated sources of information
  • Can exploit the relay's source as side information
  • If a MAC channel, i.e. no channel between transmitter and relay:
  • Then correlation less than 100% is nearly useless (Gacs & Korner)
  • If a relay channel, correlation is very useful!

27
The Relay Channel
  • Capacity of the general relay channel is unknown
  • Upper bound by a cut-set argument
  • Achievable rate by Cover & El Gamal, Thm 1
  • This rate is achieved by block-Markov coding
  • Introduces correlation between the transmitter input and the relay input
  • Decode-and-forward policy

28
Block-Markov Coding
  • Overview of Block-Markov Coding for Classic Relay
    Channel
  • Relay Terminal completely decodes a message index
    w from the set 1..2nR sent by the transmitter
  • Relay sends the bin index of the message that it
    received to aid the receiver in decoding
  • This is helpful, because the transmitters
    codeword is dependent on the bin index that the
    relay is transmitting in the same block
  • Codebook generation
  • Fix any p(x,x1) as
  • Generate 2nR0 codewords x1n as
  • Index them as x1n(s)
  • For each of these x1n codewords, generate 2nR xn
    codewords as
  • Index these as xn(ws)
  • Independently bin the messages w into the 2nR0
    bins s

29
Block-Markov Coding
  • Encoding:
  • Messages are sent in a total of B blocks
  • In the first block:
  • The relay sends the codeword x1^n corresponding to a predetermined null message, say x1^n(φ)
  • The transmitter sends a codeword x^n dependent on the first message w1 and the null message, say x^n(w1|φ)
  • In block b:
  • Assuming the relay correctly decoded the message w_{b-1} sent in the previous block, the relay sends the codeword x1^n(s_{b-1})
  • The transmitter sends x^n(w_b | s_{b-1})
  • Same bin index s that the relay is simultaneously sending
  • Shifted by one block to allow the relay to decode the current message
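The schedule above can be sketched in a few lines. The dictionary layout, the uniform random binning, and the use of bin index 0 for the null message are illustrative choices of mine, not the talk's construction:

```python
import random

def block_markov_schedule(messages, num_bins, seed=0):
    """Sketch of the block-Markov schedule: in block b the relay sends
    x1^n(s_{b-1}), where s_{b-1} is the bin index of the message it decoded
    in block b-1, while the transmitter superimposes x^n(w_b | s_{b-1})."""
    rng = random.Random(seed)
    bin_of = {}                 # hypothetical uniform random binning
    s_prev = 0                  # bin of the predetermined null message
    schedule = []
    for b, w in enumerate(messages, start=1):
        schedule.append({"block": b, "tx": (w, s_prev), "relay": s_prev})
        s_prev = bin_of.setdefault(w, rng.randrange(num_bins))
    return schedule
```

Note that in every block the transmitter's codeword is conditioned on exactly the bin index the relay is simultaneously sending, shifted by one block.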

30
Block-Markov Transmission
Block:             1            2             ...  B-1                   B
Message:           w1           w2            ...  w_{B-1}               w_B
Transmitter:       x^n(w1|φ)    x^n(w2|w1)    ...  x^n(w_{B-1}|w_{B-2})  x^n(w_B|w_{B-1})
Relay:             x1^n(φ)      x1^n(w1)      ...  x1^n(w_{B-2})         x1^n(w_{B-1})

At the end of each block:
Relay decodes:     w1           w2            ...  w_{B-1}               w_B
Receiver decodes:  -            w1            ...  w_{B-2}               w_{B-1}
31
Block-Markov Decoding
(Figure: source U → input X; relay observes Y1 and sends X1; receiver observes Y)
  • Decoding:
  • At the relay:
  • Can determine the message if R < I(X; Y1 | X1)
  • At the receiver:
  • Can determine the message if R < I(X, X1; Y)
  • Thus we get R < min( I(X; Y1 | X1), I(X, X1; Y) )
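For a concrete feel for the min(·) rate, here is a sketch that optimizes the standard Gaussian decode-and-forward expression over the transmitter-relay input correlation ρ. The link gains and the scalar-channel parameterization are assumptions for illustration, not the talk's model:

```python
import math

def C(snr):
    # AWGN capacity in bits per (real) channel use
    return 0.5 * math.log2(1 + snr)

def decode_forward_rate(P, P1, g_sr, g_sd, g_rd, N=1.0, steps=1000):
    """R = max over rho of min(I(X;Y1|X1), I(X,X1;Y)) for a scalar
    Gaussian relay channel with illustrative link gains:
    g_sr source->relay, g_sd source->destination, g_rd relay->destination."""
    best = 0.0
    for i in range(steps + 1):
        rho = i / steps
        to_relay = C((1 - rho**2) * g_sr**2 * P / N)
        to_dest = C((g_sd**2 * P + g_rd**2 * P1
                     + 2 * rho * g_sd * g_rd * math.sqrt(P * P1)) / N)
        best = max(best, min(to_relay, to_dest))
    return best
```

With a strong source-to-relay link the decode-and-forward rate exceeds the direct-link capacity; with no source-to-relay link the relay can decode nothing and the rate collapses.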

32
Relay Channel with Correlated Side Information
  • Key addition to the coding strategy: list decoding
  • With no side information, block-Markov requires that all of H(U) be sent to the relay
  • If the relay has V correlated with U, we only need to push H(U|V) bits of information across the channel
  • Will show that this relay channel has an achievable rate R < I(X; Y1 | X1) + I(U; V)
  • Implies that if the correlation between the sources, i.e. I(U;V)/H(U), is large enough, we have found capacity!

33
Block-Markov Coding with List Decoding
  • Encoding:
  • Identical to block-Markov
  • Decoding at the relay:
  • Form two lists of message indices:
  • One of U^n jointly typical with Y1^n when X1^n is known
  • A second of U^n jointly typical with V^n
  • Choose the U^n that appears in both lists
  • R < I(X; Y1 | X1) + I(U; V)
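The gain from list decoding is exactly the I(U;V) term. For doubly symmetric binary sources, V = U ⊕ Z with Z ~ Bernoulli(ε) (an assumed source model, not one from the talk), I(U;V) = 1 − h2(ε), so the rate bonus can be computed directly:

```python
import math

def h2(p):
    """Binary entropy in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def list_decoding_rate(r_channel, eps):
    """R < I(X;Y1|X1) + I(U;V) with r_channel = I(X;Y1|X1) and
    I(U;V) = 1 - h2(eps) for the doubly symmetric binary source pair."""
    return r_channel + (1.0 - h2(eps))
```

At ε = 1/2 the sources are independent and the bonus vanishes; at ε = 0 the relay's side information contributes a full bit per source symbol.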

34
Example MIMO Channel
  • An example where capacity is found
  • Description of the example MIMO system:
  • Transmitter to relay is point-to-point with gain h
  • Relay and transmitter each have a single antenna, with power constraints P and P1
  • Receiver has two antennas and matrix gain H
  • Independent Gaussian noise model with unit noise power
  • Desire to transmit the single source U

Y1 = hX + η1
Y = H [X, X1]^T + [η_Y1, η_Y2]^T
35
Example Channel
  • Assumption: the transmitter-to-relay link is better than the transmitter-to-destination link
  • Result: with 23% correlation, the full cooperative capacity can be obtained
  • With no cooperation, the data rate is 1 bit/sec
  • With block-Markov but no correlated relay: 1.34 bit/sec
  • With a correlated relay, block-Markov coding and joint source-channel coding: 1.55 bit/sec!
  • On the flip side, for a given data rate, lower power consumption

36
Summary
  • Started with the observation that nodes in a sensor network may have correlated data
  • Showed that this side information can be used to increase rate (or decrease power usage)
  • This is a joint source-channel coding strategy
  • For some relay channels with correlated sources, capacity can be found

37
Part II: Network Coding in Interference Networks
38
Overview
  • Introduction
  • Interference vs. Non-Interference Networks
  • Network Coding in Non-Interference Networks
  • Gaussian Interference Networks
  • Classic Network Coding Example on a Gaussian
    Network
  • Network Coding on Bow-Tie Network
  • Network Coding with Node Cooperation
  • Summary

39
Introduction
  • Show by a series of examples:
  • Network coding in an interference network can act the same as network coding in the equivalent non-interference network
  • It can provide a benefit in an interference network when it does not in the equivalent non-interference network
  • Node cooperation and network coding used together can additionally increase throughput

40
Non-Interference vs. Interference Networks
  • Arbitrary network configuration with conditional
    probability distributions across links

(Figure: source nodes 1 and 2 connected through the network to terminal nodes 1, 2, and 3)
41
Non-Interference vs. Interference Networks
42
Classic Network Coding
  • Ahlswede, Cai, Li & Yeung, 2000
  • Model:
  • Links transmit a single symbol with no errors
  • Nodes can perform linear operations on received symbols
  • Result:
  • Multicasting from a single source to multiple receivers can be performed at the min-cut max-flow rate [ACLY 2000]
  • Example:
  • The ubiquitous routing vs. network coding example

43
The Network Coding Example

(Figure: the butterfly network)
  • Routing: 1.5 bits to each receiver
  • Network coding: 2 bits to each receiver
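The coding side of the butterfly example is one XOR per symbol. A minimal sketch (function name mine) of what the center node and the two destinations do:

```python
# One symbol of the classic butterfly example: the center node forwards
# a XOR b on the bottleneck; each destination already has one source on
# its direct link and recovers the other source by XORing again.
def butterfly_deliver(a, b):
    coded = a ^ b                # sent on X-Y, then Y-D1 and Y-D2
    b_at_d1 = a ^ coded          # D1 has a directly, recovers b
    a_at_d2 = b ^ coded          # D2 has b directly, recovers a
    return b_at_d1, a_at_d2
```

Each destination thus receives both bits per use, whereas routing must time-share the bottleneck.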
44
Gaussian Interference Networks
  • Use additive complex white Gaussian noise channels in the illustrative examples
  • Each node has a transmit power constraint
  • Received signal Y = Σ Xi + Z, with Z ~ N(0, N)
  • Point-to-point channel:
  • Capacity R = lg(1 + P/N)
  • Multiple access channel with independent messages:
  • Sum rate capacity R1 + R2 = lg(1 + P1/N + P2/N)

(Figure: point-to-point link with power P; MAC with inputs X1, X2 at powers P1, P2 and common output Y)
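Plugging the numbers used in the examples that follow, P = 3/2 and N = 1, into these formulas reproduces the link capacities quoted on the next slides (helper name is mine):

```python
import math

def cap(snr):
    # complex AWGN channel, bits per channel use (no 1/2 pre-log)
    return math.log2(1 + snr)

P, N = 3 / 2, 1.0
point_to_point = cap(P / N)           # lg(5/2), about 1.32 bits
mac_sum = cap(P / N + P / N)          # lg(1 + 3/2 + 3/2) = lg(4) = 2 bits
```

These are exactly the lg(5/2) direct-link rate and the 2-bit MAC cut that appear in the network coding example below.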
45
Gaussian Interference Networks
  • Broadcast channel with independent messages:
  • Sum rate capacity R1 + R2 = lg(1 + P/N)
  • Multicasting over the broadcast channel:
  • The same message can be sent to both receivers at the point-to-point rate
  • (broadcast channel with a common message)
  • First example: the classic network-coding configuration with Gaussian channels
  • Use P = 3/2 and N = 1 at all nodes
  • To compare with the bit-pipe model

46
Gaussian Network Coding Example

(Figure: sources A and B, each node with power P; relay nodes X and Y; destinations D1 and D2. Direct links have capacity C = lg(1 + 3/2); the MAC cut C1 has capacity C = lg(1 + 3/2 + 3/2) = 2.)
47
Interference Network Example Routing
  • Strategies for the classic configuration in the interference model:
  • Ignore the center nodes X, Y completely:
  • Receive lg(5/2) bits/transmission each
  • Routing strategy:
  • Constraints if multicasting at the broadcast nodes:
  • MACs: 2 bits/transmission
  • Direct links: lg(5/2) bits/transmission
  • RA = RB = RXY = RT = 1 is viable
  • Route so that the X-Y and Y-D links carry A's information
  • 1 bit to D1, 2 bits to D2: 1.5 bits on average
  • Can show with linear programming that no better pure routing strategy exists

48
Interference Network Example Network Coding
  • Instead, use the same strategy as in the non-interference network:
  • Set all links to rate 1 bit/transmission
  • Send A ⊕ B on links X-Y, Y-D1, and Y-D2
  • Each terminal receives 2 bits/transmission
  • The rate of 2 bits/transmission is the maximum with independent codebooks (cut C1 across the multiple access channel)
  • Exceeding the independent-codebooks rate is the subject of the node-cooperation section
  • Essentially, taking the same actions as the non-interference network

49
Aside Multicasting
  • In the previous example, all broadcast nodes operated by multicasting
  • The same amount of information crosses cut C1 in either mode
  • But a greater amount crosses each of C2 and C3 in the multicasting mode

(Figure: multicasting at power P vs. broadcasting with split powers αP and (1 − α)P; cuts C1, C2, C3; receivers R1, R2. In both modes R1 + R2 = lg(1 + P).)
50
Example Multicasting Outperforms Broadcasting
  • In the following example, multicasting is superior to broadcasting
  • Three independent sources
  • Achievable rate across cut C1 when the center node multicasts
  • The final term removes the rate of the common message
  • When the center node broadcasts

(Figure: independent sources S0, S1, S2, each with power P; receivers R1 and R2; cut C1 around the center node)
51
Effect of Multicasting
  • Intuition: the broadcast requirement of the interference network acts like a "T" configuration in the non-interference network
  • Adding a bottleneck link of equal capacity onto the node in the non-interference case constrains the two downlinks to send the same signal
  • Otherwise, it is not operating at full capacity
  • This analogy is useful for the next example

(Figure: multicasting node with power P in the interference network, and the equivalent "T" configuration with a bottleneck link in the non-interference network)
52
Bow-Tie Network
  • In the non-interference network, routing performs optimally
  • Interference network strategies:
  • Ignore the center node X:
  • Allows lg(1 + 3/2) bits/transmission to each terminal node
  • Routing: center node X alternately chooses A or B to forward
  • Rate of 1.5 bits/transmission to each terminal node
  • Network coding:
  • Node X multicasts A ⊕ B to both destination nodes
  • Rate of 2 bits/transmission to each terminal node, which is optimal
  • Network coding is useful in some interference networks even when it provides no benefit in the non-interference network with the same configuration

(Figure: the bow-tie network: sources A and B, center node X, destinations D1 and D2)
53
Node Cooperation
  • Network coding is one method of inducing correlations on a network
  • Other methods:
  • Using correlated codebooks at separate nodes
  • Block-Markov coding for the relay channel
  • For non-interference networks, independent coding is shown to be optimal
  • We demonstrate an interference-network counterexample
  • Example: network coding and node cooperation in tandem increase performance

(Figure: node T0 has both U and V, with power P0; T1 has U and T2 has V, with power P each; destinations D1 and D2)
54
Node Cooperation
  • Nodes T1 and T2 have access to data sources U and V, respectively, while node T0 has access to both U and V
  • Nodes T1 and T2 have power constraint P, while node T0 has constraint P0
  • When P = P0 = 3/2:
  • Routing: 1.5 bits/transmission
  • Network coding with independent codebooks: 2 bits/transmission

55
Node Cooperation
  • Network coding with node cooperation scheme:
  • Three codebooks: for the U message, the V message, and the U ⊕ V message
  • Equal time-sharing over two modes of operation
  • In the first mode:
  • Node T1 transmits the U codeword scaled to power P
  • Node T2 transmits the V codeword scaled to power P − δ
  • Node T0 splits its power to transmit both U and U ⊕ V
  • Second mode: reverse of the first mode

56
Node Cooperation
(Figure: power split: T1 uses P, T0 uses δ0 on U and P0 − δ0 on U ⊕ V, T2 uses P − δ; destinations D1 and D2)
  • Network coding with node cooperation scheme:
  • T1 and T0 cooperate to send the codeword for U to D1
  • T0 sends the codeword for U ⊕ V to both D1 and D2
  • T2 sends the codeword for V to D2
  • Destinations receive their own sources directly
  • Destinations can decode the opposite source using the U ⊕ V message
  • With an appropriate choice of parameters, an average rate of 2.06 bits/transmission is achievable at each node
57
Summary
  • The bit-pipe non-interference model is not complete:
  • It neglects interactions between channels
  • In a wireless scenario, it imposes an orthogonality constraint on the channels
  • Any gains due to node cooperation are necessarily lost
  • The noisy channel model captures these effects:
  • More difficult to handle
  • Results for non-interference networks can be duplicated
  • Configurations exist where the bit-pipe model has no network coding gain, while the interference model benefits from network coding
  • Node cooperation strategies can increase throughput beyond the rate achieved using independent codebooks