1
Information, Channel Capacity and Multi-User Rate Regions: Selected Concepts from Information Theory
  • Thomas Deckert
  • Vodafone Chair Mobile Communications Systems

2
  • Preliminaries

3
What Channel?
Single link
Multiple links
Multiple access
Broadcast
Relaying
Arbitrary
4
Single Link Channel Model
[Diagram: transmitted signal x passes through channel H; receiver noise v is added, giving received signal y]
y = Hx + v
Hk,l: channel from TX antenna l to RX antenna k
Special cases: AWGN, flat MIMO, frequency-selective (D taps), OFDM (diagonal over sub-carriers)
5
Channel Capacity
Capacity = maximum mutual information
All rates below capacity are achievable.
Topics: AWGN capacity, MIMO capacity, water-filling, ergodic capacity, outage capacity, capacity region
6
Topics covered
  • What is information?
  • Mutual information of transmitted and received signals
  • Link capacity = maximum mutual information
  • Channel knowledge and water-filling
  • Code word length and channel variations: ergodic and outage capacity
  • Multiple users: capacity region

7
  • Single Link

8
Concept of Information
What is the information content of a signal?
Signal x = realization of a random variable X with values X1, X2, …, XK. How much does the event "Xi occurs" tell us about X? → information measure H(Xi)
  • Desired properties of H(Xi):
  • H(Xi) ≥ 0
  • H(Xi) = f(Pr(Xi)): Pr(Xi) low → H(Xi) large
  • X1, X2 independent → H(X1, X2) = H(X1) + H(X2)
  • → H(Xi) = −log2 Pr(Xi) (numeric sketch below)
  • base 2 → measured in bits / channel use
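A minimal numeric sketch of these properties (my illustration, not from the slides), assuming a toy distribution over four values:

import numpy as np

# Self-information of an outcome with probability p, in bits
def self_information(p):
    return -np.log2(p)

probs = np.array([0.5, 0.25, 0.125, 0.125])  # assumed Pr(X1)..Pr(X4)

# Rare outcomes carry more information: Pr(Xi) low -> H(Xi) large
for i, p in enumerate(probs, start=1):
    print(f"H(X{i}) = {self_information(p):.2f} bits")

# Additivity for independent events: H(X1, X2) = H(X1) + H(X2)
print(self_information(probs[0] * probs[1]))                    # 3.0
print(self_information(probs[0]) + self_information(probs[1]))  # 3.0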

9
Mutual Information
  • (Average) mutual information I(X;Y)
  • What will I know about X if I know a realization of Y?
  • Reduction in uncertainty due to the additional observation
  • Measure of the dependence of X and Y
  • I(X;Y) ≥ 0
  • X independent of Y ⇔ Pr(Xi, Yj) = Pr(Xi) Pr(Yj) ⇒ I(X;Y) = 0 (sketch below)
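A short Python sketch (my own example distributions, not from the slides) computing I(X;Y) from a joint pmf, confirming I(X;Y) ≥ 0 and I(X;Y) = 0 under independence:

import numpy as np

# I(X;Y) = sum_ij Pr(Xi,Yj) log2( Pr(Xi,Yj) / (Pr(Xi) Pr(Yj)) )
def mutual_information(p_xy):
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    mask = p_xy > 0                      # 0 log 0 = 0 by convention
    return float((p_xy[mask] * np.log2(p_xy[mask] / (p_x @ p_y)[mask])).sum())

# Dependent case: Y equals X with probability 0.8
p_dep = np.array([[0.4, 0.1],
                  [0.1, 0.4]])
# Independent case: Pr(Xi, Yj) = Pr(Xi) Pr(Yj)
p_ind = np.outer([0.5, 0.5], [0.5, 0.5])

print(mutual_information(p_dep))   # > 0
print(mutual_information(p_ind))   # 0.0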

10
Mutual Information and Error Probability
Transmit 1 of Q code words over n channel uses
[Diagram: the transmitter picks an arbitrary code word x out of Q candidates and sends it over the channel; the receiver matches y against the Q code words]
Rate R = log2(Q) / n, i.e. Q = 2^(nR)
Given y there is one right x in a set of 2^(nI(X;Y)) code words
  • Should have Q ≤ 2^(nI(X;Y))
  • ⇒ R ≤ I(X;Y) (worked numbers below)
(Coding Theorem by Shannon, 1948)
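To make the counting concrete, a tiny sketch with assumed numbers (n, Q and I(X;Y) here are hypothetical):

import numpy as np

n, Q = 100, 2**40        # hypothetical: 2^40 code words over 100 channel uses
R = np.log2(Q) / n       # rate in bits per channel use
I_xy = 0.5               # assumed mutual information of the channel, bits/use

print(f"R = {R} bits/use")               # 0.4
print("reliable?", Q <= 2**(n * I_xy))   # Q <= 2^(n I(X;Y))  <=>  R <= I(X;Y)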
11
Maximum Mutual Information Capacity
  • What is the maximum rate for the channel?
  • Maximize I(X;Y) over the input distribution Pr(Xi) → channel capacity C

[Diagram: channel transition probabilities Pr(Yj | Xi)]
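As an illustration of the maximization (my example, not from the slides): for a binary symmetric channel the capacity can be found by searching over input distributions, and the result matches the closed form 1 − H2(ε):

import numpy as np

def mutual_information(p_x, P_trans):
    # p_x: input pmf; P_trans[i, j] = Pr(Yj | Xi)
    p_xy = p_x[:, None] * P_trans
    p_y = p_xy.sum(axis=0)
    mask = p_xy > 0
    return (p_xy[mask] * np.log2(p_xy[mask] / (p_x[:, None] * p_y)[mask])).sum()

eps = 0.1                                        # BSC crossover probability
P = np.array([[1 - eps, eps], [eps, 1 - eps]])

# Brute-force search over input distributions Pr(X = 1) = q
qs = np.linspace(0.01, 0.99, 99)
C = max(mutual_information(np.array([1 - q, q]), P) for q in qs)
print(C)   # close to 1 - H2(0.1) ≈ 0.531, attained at q = 0.5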
12
Instantaneous Channel Capacity
  • Recall: capacity = maximum mutual information, given H and a power constraint
  • Achieved for a Gaussian transmit signal
  • AWGN (M = N = 1, H = 1): C = log2(1 + SNR), checked below
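A one-line check of the AWGN formula (the SNR values are my own):

import numpy as np

# AWGN capacity C = log2(1 + SNR), SNR = P / sigma^2
snr_db = np.array([0, 10, 20])
print(np.log2(1 + 10 ** (snr_db / 10)))   # ≈ [1.00, 3.46, 6.66] bits/use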

13
Maximizing Mutual Information
  • Aim: maximize I(x; y | H) over the transmit covariance
  • Linear algebra: after rotating into the channel's singular-vector basis, the optimal covariance should be diagonal (see the sketch below)

→ Choose the transmit covariance based on H
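A sketch of the quantity being maximized, assuming the standard log-det form C = log2 det(I + H Q H^H / σ²) with an equal-power covariance (all parameter values are assumptions):

import numpy as np

rng = np.random.default_rng(0)
M = N = 4
H = (rng.standard_normal((N, M)) + 1j * rng.standard_normal((N, M))) / np.sqrt(2)

P, sigma2 = 1.0, 0.1
Q = (P / M) * np.eye(M)              # equal-power transmit covariance

A = np.eye(N) + H @ Q @ H.conj().T / sigma2
C = np.log2(np.linalg.det(A).real)   # det is real for Hermitian A
print(C)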
14
Channel Structure
  • Singular value decomposition of H: H = U · diag(√λ1, …, √λM) · W^H
  • → Channel = collection of parallel 1-tap channels (example: M = N)

[Diagram: transmitter processing W, parallel 1-tap channels with gains λ1, …, λM, receiver processing U, mapping input x to output y]
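A small Python check of this decomposition (random H, my example):

import numpy as np

rng = np.random.default_rng(1)
M = N = 3
H = rng.standard_normal((N, M))

U, s, Wh = np.linalg.svd(H)          # H = U diag(s) W^H, with W = Wh^H
lam = s**2                           # effective power gains lambda_i
print(lam)

# Rotating by U and W turns the channel into parallel 1-tap channels:
# U^H H W is diagonal with the singular values on the diagonal
print(np.round(U.conj().T @ H @ Wh.conj().T, 10))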
15
Rate-maximizing Input Distribution
  • Mutual information is maximized if
  • the data streams are mutually independent, and
  • power goes into channels with high effective gain (λi large)
  • → Water-filling solution (sketch below)

[Water-filling diagram: power poured over the inverse channel gains of channels i = 1, …, 5]
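A sketch of the water-filling allocation (gain values and power budget are my assumptions):

import numpy as np

def waterfill(gains, P, sigma2=1.0):
    # Allocate p_i = max(mu - sigma2/lambda_i, 0) so that sum_i p_i = P
    inv = sigma2 / np.asarray(gains, dtype=float)
    inv_sorted = np.sort(inv)
    for k in range(len(inv), 0, -1):          # try using the k best channels
        mu = (P + inv_sorted[:k].sum()) / k   # candidate water level
        if mu > inv_sorted[k - 1]:            # all k allocated powers positive?
            break
    return np.maximum(mu - inv, 0.0)

lam = np.array([2.0, 1.0, 0.5, 0.1, 0.05])    # channels i = 1..5
p = waterfill(lam, P=1.0)
print(p, p.sum())                             # weak channels get zero power
print(np.log2(1 + lam * p).sum())             # resulting rate (sigma2 = 1)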
16
Capacity and Channel Knowledge
Instantaneous channel capacity C(H)
Ergodic (average) channel capacity CE = E[C(H)] (Monte Carlo sketch below)
  • M independent Gaussian transmit signals
  • Channel known at transmitter:
  • Pre-distortion W
  • Power allocation Λ
  • Rate R(H) = C(H) → on average R = CE
  • Channel known at receiver:
  • Front-end processing U
  • Power levels Λ
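A Monte Carlo sketch of CE = E[C(H)] for an i.i.d. Rayleigh MIMO channel with equal-power inputs (all parameters assumed):

import numpy as np

rng = np.random.default_rng(2)
M = N = 4
P, sigma2, trials = 1.0, 0.1, 2000

def capacity(H):
    # Equal-power capacity log2 det(I + (P/M) H H^H / sigma2)
    A = np.eye(N) + (P / M) * (H @ H.conj().T) / sigma2
    return np.log2(np.linalg.det(A).real)

C = np.array([capacity((rng.standard_normal((N, M))
                        + 1j * rng.standard_normal((N, M))) / np.sqrt(2))
              for _ in range(trials)])
print("ergodic capacity C_E ≈", C.mean())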

17
Reduced Transmitter Channel Knowledge
  • Transmitter knows the distribution of H but not H itself
  • No pre-processing (W) possible
  • Transmit N independent equal-power signals
  • Transmit at what rate?
  • Constant?
  • Below average?

18
Channel Knowledge Example
[Plot: Pr(C(H) > R) versus rate R, comparing "transmitter knows H" against "transmitter does not know H"; the ergodic capacity CE is marked on the rate axis. Setup: M = N = 4, Hij i.i.d. Gaussian with E[|Hij|²] = 1; one channel use = one transmission of x]
19
Decoding Delay and Outages
  • Long code words see all possible channel states → ergodic capacity
  • Short tolerable decoding delay → guaranteed rate?
  • p-outage capacity Cp: rate possible in (1−p) of all channel states (sketch below)
  • Delay-limited / zero-outage capacity (p → 0): Rayleigh, Rice, Nakagami fading → C0 = 0

[Plot: Pr(C(H) > R) versus R; Cp is the rate at which the curve crosses (1−p)]
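A Monte Carlo sketch of the p-outage capacity for a scalar Rayleigh channel (the SNR and p values are my assumptions):

import numpy as np

rng = np.random.default_rng(3)
snr, trials, p = 10.0, 100_000, 0.01

# Rayleigh fading SISO: C(H) = log2(1 + |H|^2 SNR), with |H|^2 ~ Exp(1)
g = rng.exponential(1.0, trials)
C = np.log2(1 + g * snr)

# Cp = rate supported in a (1 - p) fraction of channel states = p-quantile of C
C_p = np.quantile(C, p)
print(f"{p:.0%}-outage capacity ≈ {C_p:.3f} bits/use, ergodic ≈ {C.mean():.3f}")
# As p -> 0, Cp -> 0 under Rayleigh fading (zero-outage capacity C0 = 0)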
20
  • Multiple Users

21
Multiple Users
  • Multiple transmitting and receiving users → network information theory

22
Example Multiple Access of 2 Users
  • Given channels. Capacity region: all rate pairs (R1, R2) with R1 ≤ I(x1; y | x2), R2 ≤ I(x2; y | x1), and R1 + R2 ≤ I(x1, x2; y)
  • for some product input distribution Pr(x1) Pr(x2) (numeric sketch below)
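A numeric sketch for a two-user Gaussian multiple-access channel y = x1 + x2 + v (the powers and noise variance are my assumptions; the pentagon follows the standard Gaussian MAC result):

import numpy as np

P1, P2, s2 = 1.0, 2.0, 1.0

R1_max = np.log2(1 + P1 / s2)          # R1 <= I(x1; y | x2)
R2_max = np.log2(1 + P2 / s2)          # R2 <= I(x2; y | x1)
R_sum = np.log2(1 + (P1 + P2) / s2)    # R1 + R2 <= I(x1, x2; y)

# Corner points of the pentagon, achieved by successive decoding:
A = (R1_max, np.log2(1 + P2 / (s2 + P1)))   # decode x2 first, subtract, then x1
B = (np.log2(1 + P1 / (s2 + P2)), R2_max)   # decode x1 first, subtract, then x2
print(A, B)
print(A[0] + A[1], R_sum)                   # corner sums equal the sum-rate bound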

23
  • Conclusion

24
Summary
  • Mutual information I(x; y)
  • Reduction in uncertainty about x by observing y
  • Link capacity C
  • Mutual information maximized over the transmit distribution
  • Choose R < C for low probability of transmission error
  • Transmitter knows channel → water-filling
  • Transmitter does not know channel → excite inputs uniformly
  • Long code words → ergodic capacity
  • Short code words → outage capacity
  • Multiple users: concept of capacity → capacity region

25
References
  • [1] T.M. Cover and J.A. Thomas, Elements of Information Theory, 1st ed. New York: John Wiley & Sons, 1991.
  • [2] A. Goldsmith, Wireless Communications. New York: Cambridge Univ. Press, 2005, preprint. Online. Available: http://wsl.stanford.edu/~andrea/Wireless/Book.ps
  • [3] I.E. Telatar, "Capacity of Multi-antenna Gaussian Channels," European Transactions on Telecommunications (ETT), vol. 10, no. 6, pp. 585-595, Nov./Dec. 1999.
  • [4] S.V. Hanly and D.N. Tse, "Multiaccess Fading Channels - Part II: Delay-Limited Capacities," IEEE Transactions on Information Theory, vol. 44, no. 7, pp. 2816-2831, Nov. 1998.