Title: Multi-terminal Channel Capacity with State Information
1. Multi-terminal Channel Capacity with State Information
- Arak Sutivong, Young-Han Kim, and Mung Chiang
- Information Theory Group
- (Professor Tom Cover)
2. Research Goals
- To better understand the role of side information in multi-terminal information theory:
  - Side information at the transmitter(s)
  - Side information at the receiver(s)
  - Side information at both
- Side information at the transmitter(s) allows for transmit optimization (e.g., water-filling, beam-forming, adaptive transmission).
- Wish to quantify its effect on channel capacity.
- Consider primarily discrete memoryless channels (DMCs).
- Extend existing results from point-to-point to multi-terminal channels. (Focus of this poster)
3. Causal vs. Noncausal Side Information
Examples

Causal Side Information
  Scenarios:
  - Fading coefficients
  - Channel interference levels
  - State of a Markov channel
  - Channel gains
  Applications:
  - Adaptive rate/power control over Rayleigh fading channels
  - MIMO beam-forming
  - Precoding
  - Multi-tone water-filling

Noncausal Side Information
  Scenarios:
  - Memory defect locations
  - Host signal (watermarking)
  - Crosstalk in DSL
  - Known jamming sequence
  Applications:
  - Storage over defective memory
  - Digital watermarking
  - DSL crosstalk cancellation
  - Transmission in the presence of a known third-party jamming sequence
4. Causal Side Information (I)
- First introduced by Shannon in 1958.
- The channel state S is available causally at the transmitter: at time i, the encoder knows S_1, ..., S_i.
5. Causal Side Information (II)
- Transform the channel with state into a channel without state, but with an exponentially larger input alphabet.
- Conceptually, the transmitted symbol at time i is a function of the message W and the current state S_i.
- The two channels have the same capacity:
  - In the original channel, each symbol of a codeword comes from X.
  - In the transformed channel, each symbol of a codeword is a strategy t: S -> X, i.e., a vector of length |S| with each element coming from X (so T = X^|S| and |T| = |X|^|S|).
- P_e -> 0 if R < I(T; Y), i.e., R < I(X(.); Y); hence C = max_{p(t)} I(T; Y). (A toy numerical example follows.)
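A minimal numerical sketch of this strategy-channel construction, under assumptions not taken from the poster (a toy binary channel whose behavior is flipped by the state, a uniform state distribution): it enumerates the strategies t: S -> X, builds the equivalent state-free channel p(y|t), and evaluates max_{p(t)} I(T;Y) with a short Blahut-Arimoto iteration.

```python
# Toy example, not from the poster: binary state S flips a BSC(0.1), and the
# encoder knows S causally.  We enumerate Shannon strategies t: S -> X, build
# the equivalent state-free channel p(y|t), and maximize I(T;Y) numerically.
import itertools
import numpy as np

p_s = np.array([0.5, 0.5])                      # assumed state distribution
p_y_xs = np.array([
    [[0.9, 0.1], [0.1, 0.9]],                   # state 0: BSC(0.1)
    [[0.1, 0.9], [0.9, 0.1]],                   # state 1: channel inverts the input
])                                              # p_y_xs[s][x][y] = p(y | x, s)

# Strategies t = (t(0), t(1)); for |X| = |S| = 2 there are |X|^|S| = 4 of them.
strategies = list(itertools.product(range(2), repeat=2))

# Equivalent state-free channel: p(y | t) = sum_s p(s) p(y | x = t(s), s)
p_y_t = np.array([sum(p_s[s] * p_y_xs[s][t[s]] for s in range(2))
                  for t in strategies])

def blahut_arimoto(W, iters=200):
    """Capacity (bits/use) of a DMC with row-stochastic matrix W[input, output]."""
    p = np.full(W.shape[0], 1.0 / W.shape[0])
    for _ in range(iters):
        q = p @ W                               # induced output distribution
        d = np.sum(W * np.log2(np.where(W > 0, W / q, 1.0)), axis=1)
        p = p * 2.0 ** d
        p /= p.sum()
    q = p @ W
    d = np.sum(W * np.log2(np.where(W > 0, W / q, 1.0)), axis=1)
    return float(np.sum(p * d))                 # I(T;Y) at the (near-)optimal p(t)

# Without side information the averaged channel here is useless (capacity 0);
# with causal side information the strategy t(s) = s restores ~0.531 bits.
print("C_causal = max_{p(t)} I(T;Y) =", blahut_arimoto(p_y_t), "bits")
```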
6. Noncausal Side Information (I)
- First studied by Gelfand and Pinsker in 1980.
- The channel state S^n is available noncausally at the transmitter: the entire state sequence is known before transmission begins.
- Capacity: C = max_{p(u|s), x(u,s)} [ I(U; Y) - I(U; S) ] (where U is a finite-alphabet auxiliary random variable).
7. Noncausal Side Information (II)
- Generate 2^{n(R + I(U;S))} U^n codewords.
- Randomly color them with 2^{nR} colors.
- Encoder:
  - For the given S^n, find a codeword U^n such that
    - (U^n, S^n) is jointly strongly typical, and
    - the color of U^n is W, the message index.
  - Based on U^n and S^n, compute X^n and transmit it.
- Decoder:
  - Find a U^n such that (U^n, Y^n) is jointly strongly typical, and declare the color of this U^n to be W.
- P_e -> 0 if R < I(U; Y) - I(U; S). (A small numerical check follows.)
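As a sanity check on the rate expression, the sketch below evaluates I(U;Y) - I(U;S) for the classical binary memory with stuck-at defects (the "memory defect locations" scenario listed earlier). The defect model, the defect probability, and the choice U = X are standard for this example but are assumptions here, not taken from the poster; the result matches the known 1 - p capacity of that memory.

```python
# Numeric check of R = I(U;Y) - I(U;S) on the binary memory with stuck-at
# defects (writer knows the defect locations noncausally, reader does not).
# The defect probability and the choice U = X are assumptions for this sketch.
import numpy as np

def entropy_bits(p):
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

p_def = 0.3                                    # assumed total defect probability
# States: 0 = stuck at 0, 1 = stuck at 1, 2 = working cell
p_s = np.array([p_def / 2, p_def / 2, 1 - p_def])

# Encoder writes the stuck value at a defective cell and a fair bit otherwise,
# so the written symbol U = X and the (noiseless) read-back is Y = X.
joint = np.zeros((3, 2, 2))                    # joint[s, u, y]
for s in range(3):
    for u in range(2):
        p_u_given_s = 0.5 if s == 2 else float(u == s)
        joint[s, u, u] = p_s[s] * p_u_given_s  # y = u

p_su = joint.sum(axis=2)
p_uy = joint.sum(axis=0)
p_u, p_y = p_uy.sum(axis=1), p_uy.sum(axis=0)

I_UY = entropy_bits(p_u) + entropy_bits(p_y) - entropy_bits(p_uy)
I_US = entropy_bits(p_u) + entropy_bits(p_s) - entropy_bits(p_su)

print("I(U;Y) - I(U;S) =", I_UY - I_US)        # 0.7
print("1 - p           =", 1 - p_def)          # known capacity of this memory
```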
8. Gaussian Channels with State Information
Causal
- The state S^n and its power Q affect the channel capacity. However, it can be shown that the loss is at most the shaping gain (about 0.25 bit).

Noncausal (Costa 1983, "Writing on Dirty Paper")
- The state S^n, regardless of its power, can be made irrelevant by the transmitter, as long as the transmitter has complete knowledge of S^n. (A numerical check of Costa's construction follows.)
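A brief numerical check of Costa's construction, under assumed power values: for Y = X + S + Z with Gaussian X, S, Z and auxiliary U = X + a*S, the Gelfand-Pinsker rate I(U;Y) - I(U;S) has a closed Gaussian form, and a grid search over a recovers the interference-free capacity (1/2) log2(1 + P/N) at a near P/(P+N).

```python
# Costa's choice U = X + a*S for Y = X + S + Z, with X ~ N(0,P) independent of
# S ~ N(0,Q), Z ~ N(0,N).  Powers below are assumed values for illustration.
import numpy as np

P, Q, N = 1.0, 10.0, 1.0

def dirty_paper_rate(a):
    """Gelfand-Pinsker rate I(U;Y) - I(U;S) in bits for U = X + a*S."""
    var_u, var_y = P + a**2 * Q, P + Q + N
    cov_uy, cov_us = P + a * Q, a * Q
    I_UY = 0.5 * np.log2(var_u * var_y / (var_u * var_y - cov_uy**2))
    I_US = 0.5 * np.log2(var_u * Q / (var_u * Q - cov_us**2))
    return I_UY - I_US

a_grid = np.linspace(0.01, 0.99, 981)
rates = np.array([dirty_paper_rate(a) for a in a_grid])

print("best a on the grid  :", a_grid[np.argmax(rates)])      # ~ P/(P+N) = 0.5
print("best rate (bits)    :", rates.max())
print("interference-free   :", 0.5 * np.log2(1 + P / N))      # same value
```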
9. Degraded Broadcast Channels
Causal
- S_i may represent fading coefficients or interference levels up to time i (fed back from the receivers).

Noncausal
- S^n may represent locally generated interference that is known noncausally (e.g., leakage of signals from other co-located transmit antennas).
(Achievable Region)
10. Gaussian Broadcast Channels
Causal
- The capacity region is unknown.

Noncausal
- The extra noise S^n is innocuous as long as the transmitter knows about it! (Rate pairs are sketched below.)
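A small sketch of the rate pairs in question, under assumed power and noise values: the standard power-split rates of the degraded Gaussian broadcast channel without interference, which the poster argues remain achievable when the transmitter dirty-paper codes against a known S^n added at both receivers.

```python
# Standard power-split rates of the degraded Gaussian BC (no interference),
# with assumed P, N1, N2; the poster's claim is that dirty-paper coding against
# a known S^n achieves the same pairs even when S^n is added to both outputs.
import numpy as np

P, N1, N2 = 10.0, 1.0, 4.0                     # assumed power and noise levels, N1 < N2

def bc_rate_pair(beta):
    """Rates for fraction beta of the power assigned to the stronger receiver."""
    R1 = 0.5 * np.log2(1 + beta * P / N1)
    R2 = 0.5 * np.log2(1 + (1 - beta) * P / (beta * P + N2))
    return R1, R2

for beta in (0.0, 0.25, 0.5, 0.75, 1.0):
    R1, R2 = bc_rate_pair(beta)
    print(f"beta = {beta:4.2f}:  R1 = {R1:.3f},  R2 = {R2:.3f}  bits")
```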
11. Multiple Access Channels
Causal
- S_i may represent the state of the channel that is revealed to both transmitters just before each transmission instant.

Noncausal
- S^n may represent a jamming sequence introduced by a third party, unknown to the receiver, but both transmitters happen to know it.
(Achievable Region)
12. Gaussian Multiple Access Channels
Causal
- The capacity region is unknown.

Noncausal
- Once again, the extra noise S^n is harmless as long as both transmitters know about it! (The corresponding rate region is sketched below.)
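A similar sketch for the Gaussian multiple access channel, with assumed powers: the standard interference-free pentagon, which the poster argues is preserved when both transmitters know S^n noncausally and can code against it.

```python
# Standard Gaussian MAC pentagon (no interference), with assumed powers; the
# poster's claim is that this region survives an additive state S^n that both
# transmitters know noncausally.
import numpy as np

P1, P2, N = 2.0, 5.0, 1.0                      # assumed transmit powers and noise

def C(snr):
    return 0.5 * np.log2(1 + snr)

R1_max, R2_max, R_sum = C(P1 / N), C(P2 / N), C((P1 + P2) / N)

print("R1      <=", R1_max)
print("R2      <=", R2_max)
print("R1 + R2 <=", R_sum)
# Corner points reached by the two successive-decoding orders:
print("corner A:", (R1_max, R_sum - R1_max))
print("corner B:", (R_sum - R2_max, R2_max))
```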
13. Multi-terminal Communications
(Bound)
- The capacity region, even without state information, is still unknown; existing upper bounds are based on a max-flow min-cut idea.

Causal
- No known nontrivial bounds.

Noncausal
- (Bound)
14. Concluding Remarks
Causal Side Information
- Results exist primarily for discrete alphabets.
- Utilizes extended alphabets by transforming channels with state into channels without state.
- Lacks a theory for continuous-alphabet channels, even for the additive Gaussian channel Y = X + S + Z!
- Open problems: capacity regions for most multi-terminal channels (e.g., multiple-access, broadcast, relay, interference, and two-way channels) are still unknown.

Noncausal Side Information
- Many results exist for both discrete and continuous alphabets.
- Utilizes auxiliary random variables and random binning.
- Intuition from point-to-point extends well to multi-terminal settings.
- Moral: compensate for, rather than override, the state.
- Costa's results extend to MAC, BC, relay, etc.
- Open problems: capacity regions for interference, two-way, and general multi-terminal channels are still unknown.