Title: Part 5: Response of Linear Systems
Part 5: Response of Linear Systems
- Linear Filtering of a Random Signal
- Power Spectrum Analysis
- Linear Estimation and Prediction Filters
- Mean-Square Estimation
6. Linear Filtering of a Random Signal
- Linear System
- Our goal is to study the output process
statistics in terms of the input process
statistics and the system function.
Deterministic Systems
Memoryless Systems
The output Y(t) in this case depends only on the present value of the input X(t), i.e., Y(t) = g(X(t)) for some function g(·).
Linear Time-Invariant Systems
Time-Invariant System: a shift in the input results in the same shift in the output.
Linear Time-Invariant (LTI) System: a linear system with the time-invariance property.
Linear Filtering of a Random Signal
Theorem 6.1
Pf
Theorem 6.2
If the input to an LTI filter with impulse response h(t) is a wide sense stationary process X(t), the output Y(t) has the following properties:
(a) Y(t) is a WSS process with expected value
    μ_Y = E[Y(t)] = μ_X ∫ h(t) dt
and autocorrelation function
    R_Y(τ) = ∫ h(u) ∫ h(v) R_X(τ + u − v) dv du
(b) X(t) and Y(t) are jointly WSS with I/O cross-correlation
    R_XY(τ) = ∫ h(u) R_X(τ − u) du
(c) The output autocorrelation is related to the I/O cross-correlation by
    R_Y(τ) = ∫ h(−u) R_XY(τ − u) du
(all integrals are over (−∞, ∞))
Theorem 6.2 (Cont'd)
Pf
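As a numerical illustration of Theorem 6.2 (not part of the original slides), the sketch below discretizes a hypothetical causal filter h(t) = e^(−t)u(t) and a hypothetical input with μ_X = 2 and R_X(τ) = e^(−|τ|), then evaluates μ_Y = μ_X ∫ h(t) dt and the cross-correlation R_XY(τ) = h(τ) * R_X(τ) on a grid; both the filter and the input statistics are assumptions chosen only for this demonstration.

```python
import numpy as np

dt = 1e-3
t = np.arange(0, 10, dt)          # grid for the causal filter support
h = np.exp(-t)                    # hypothetical impulse response h(t) = e^(-t) u(t)

mu_X = 2.0                        # hypothetical input mean
mu_Y = mu_X * h.sum() * dt        # Theorem 6.2(a): mu_Y = mu_X * integral of h(t)
print(mu_Y)                       # ~2.0, since the integral of e^(-t) over [0, inf) is 1

tau = np.arange(-10, 10, dt)
R_X = np.exp(-np.abs(tau))        # hypothetical input autocorrelation
# Theorem 6.2(b): R_XY(tau) = integral h(u) R_X(tau - u) du (a convolution)
R_XY = np.convolve(h, R_X) * dt
tau_XY = np.arange(len(R_XY)) * dt + tau[0]   # lag axis of the full convolution
i0 = np.argmin(np.abs(tau_XY))                # index of tau = 0
print(R_XY[i0])                   # ~0.5 = integral of e^(-u) e^(-u) du over [0, inf)
```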
Example 6.1
X(t), a WSS stochastic process with expected value μ_X = 10 volts, is the input to an LTI filter with impulse response h(t). What is the expected value of the filter output process Y(t)?
Sol
Ans: 2(e^0.5 − 1) V ≈ 1.30 V
Example 6.2
A white Gaussian noise process W(t) with autocorrelation function R_W(τ) = η₀δ(τ) is passed through the moving-average filter h(t). For the output Y(t), find the expected value E[Y(t)], the I/O cross-correlation R_WY(τ), and the autocorrelation R_Y(τ).
Sol
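Assuming the moving-average filter here is h(t) = 1/T for 0 ≤ t ≤ T (the usual form; the exact definition is on the original slide), Theorem 6.2 gives E[Y(t)] = 0, R_WY(τ) = η₀ h(τ), and a triangular R_Y(τ) of height η₀/T and base 2T. The sketch below evaluates these numerically; η₀ and T are placeholder values.

```python
import numpy as np

eta0, T = 1.0, 1e-3            # placeholder noise level and averaging window
dt = T / 1000
t = np.arange(0, T, dt)
h = np.ones_like(t) / T        # assumed h(t) = 1/T on [0, T]

# R_WY(tau) = integral h(u) * eta0 * delta(tau - u) du = eta0 * h(tau)
R_WY = eta0 * h

# R_Y(tau) = integral h(-u) R_WY(tau - u) du: a triangle of height eta0/T and base 2T
R_Y = np.convolve(h[::-1], R_WY) * dt
print(R_Y.max(), eta0 / T)     # peak at tau = 0 equals eta0 / T
```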
Theorem 6.3
If a stationary Gaussian process X(t) is the input to an LTI filter h(t), the output Y(t) is a stationary Gaussian process with expected value and autocorrelation given by Theorem 6.2.
Pf: Omitted.
Example 6.3
For the white noise moving-average process Y(t) in Example 6.2, let η₀ = 10⁻¹⁵ W/Hz and T = 10⁻³ s. For an arbitrary time t₀, find P[Y(t₀) > 4×10⁻⁶].
Sol
Ans: Q(4) ≈ 3.17×10⁻⁵
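The stated answer follows because Y(t₀) is Gaussian (Theorem 6.3) with zero mean and variance R_Y(0) = η₀/T = 10⁻¹² (under the h(t) = 1/T assumption above), so σ = 10⁻⁶ and P[Y(t₀) > 4×10⁻⁶] = Q(4). A quick check (scipy is used here only as a convenience):

```python
from math import sqrt
from scipy.stats import norm

eta0, T = 1e-15, 1e-3
sigma = sqrt(eta0 / T)          # standard deviation of Y(t0): sqrt(R_Y(0)) = 1e-6
p = norm.sf(4e-6 / sigma)       # Q(4) = P[Y(t0) > 4e-6] for a zero-mean Gaussian
print(sigma, p)                 # 1e-6 and about 3.17e-5
```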
Theorem 6.4
The random sequence Xn is obtained by sampling the continuous-time process X(t) at a rate of 1/Ts samples per second. If X(t) is a WSS process with expected value E[X(t)] = μ_X and autocorrelation R_X(τ), then Xn is a WSS random sequence with expected value E[Xn] = μ_X and autocorrelation function R_X[k] = R_X(kTs).
Pf
Example 6.4
Continuing Example 6.3, the random sequence Yn is obtained by sampling the white noise moving-average process Y(t) at a rate of fs = 10⁴ samples per second. Derive the autocorrelation function R_Y[n] of Yn.
Sol
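Under the same assumptions as above, Theorem 6.4 simply samples the triangular autocorrelation: with Ts = 1/fs = 10⁻⁴ s, R_Y[n] = R_Y(nTs) = (η₀/T)(1 − |n|/10) for |n| ≤ 10 and 0 otherwise. A sketch:

```python
import numpy as np

eta0, T, fs = 1e-15, 1e-3, 1e4
Ts = 1 / fs                                   # sampling period, 1e-4 s
n = np.arange(-15, 16)
R_Y_cont = lambda tau: (eta0 / T) * np.maximum(1 - np.abs(tau) / T, 0)
R_Y_seq = R_Y_cont(n * Ts)                    # Theorem 6.4: R_Y[n] = R_Y(n * Ts)
print(R_Y_seq[n == 0], R_Y_seq[n == 5])       # eta0/T = 1e-12, and half of that at n = 5
```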
Theorem 6.5
If the input to a discrete-time LTI filter with impulse response hn is a WSS random sequence Xn, the output Yn has the following properties:
(a) Yn is a WSS random sequence with expected value
    μ_Y = E[Yn] = μ_X Σn hn
and autocorrelation function
    R_Y[n] = Σi Σj h_i h_j R_X[n + i − j]
(b) Yn and Xn are jointly WSS with I/O cross-correlation
    R_XY[n] = Σi h_i R_X[n − i]
(c) The output autocorrelation is related to the I/O cross-correlation by
    R_Y[n] = Σi h_i R_XY[n + i]
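A discrete-time sketch of Theorem 6.5 (the FIR filter and the autocorrelation below are hypothetical, chosen only to exercise the formulas): μ_Y = μ_X Σ hn, R_XY is the convolution of hn with R_X[n], and R_Y follows by one more convolution with the time-reversed filter.

```python
import numpy as np

h = np.array([0.5, 0.3, 0.2])       # hypothetical FIR impulse response, n = 0, 1, 2
mu_X = 1.0
mu_Y = mu_X * h.sum()               # Theorem 6.5(a): mu_Y = mu_X * sum of h_n

k = np.arange(-20, 21)
R_X = 0.8 ** np.abs(k)              # hypothetical input autocorrelation R_X[k]

# (b) R_XY[n] = sum_i h_i R_X[n - i]; 'full' convolution, lags start at k[0]
R_XY = np.convolve(h, R_X)
lags_XY = np.arange(len(R_XY)) + k[0]

# (c) R_Y[n] = sum_i h_i R_XY[n + i], i.e. convolution with the reversed filter
R_Y = np.convolve(h[::-1], R_XY)
lags_Y = np.arange(len(R_Y)) + lags_XY[0] - (len(h) - 1)
print(mu_Y, R_Y[lags_Y == 0])       # output mean and output power R_Y[0]
```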
Example 6.5
A WSS random sequence Xn with μ_X = 1 and autocorrelation function R_X[n] is the input to the order M−1 discrete-time moving-average filter hn. For the case M = 2, find the following properties of the output random sequence Yn: the expected value μ_Y, the autocorrelation R_Y[n], and the variance Var[Yn].
Sol
Example 6.6
A WSS random sequence Xn with μ_X = 0 and autocorrelation function R_X[n] = σ²δ[n] is passed through the order M−1 discrete-time moving-average filter hn. Find the output autocorrelation R_Y[n].
Sol
Example 6.7
A first-order discrete-time integrator with WSS input sequence Xn has output Yn = Xn + 0.8Y_{n−1}. What is the impulse response hn?
Sol
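Driving the recursion Yn = Xn + 0.8Y_{n−1} with a unit impulse gives h0 = 1 and hn = 0.8 h_{n−1}, so hn = (0.8)^n for n ≥ 0. A quick check with scipy's IIR filtering routine (used here only as a convenience):

```python
import numpy as np
from scipy.signal import lfilter

n = np.arange(10)
delta = (n == 0).astype(float)            # unit impulse
# Yn = Xn + 0.8 Y_{n-1}  <=>  numerator b = [1], denominator a = [1, -0.8]
h = lfilter([1.0], [1.0, -0.8], delta)
print(np.allclose(h, 0.8 ** n))           # True: hn = (0.8)^n u[n]
```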
Example 6.8
Continuing Example 6.7, suppose the WSS input Xn, with expected value μ_X = 0 and a given autocorrelation function, is the input to the first-order integrator hn. Find the second moment, E[Yn²], of the output.
Sol
Theorem 6.6
If Xn is a WSS random sequence with expected value μ and autocorrelation function R_X[k], then a vector X of M consecutive samples has expected value E[X] = μ·1 (every entry equal to μ) and correlation matrix R_X whose (i, j)th entry is R_X[i − j], a symmetric Toeplitz matrix.
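As a concrete illustration (the autocorrelation below is hypothetical), the correlation matrix of M consecutive samples is the symmetric Toeplitz matrix built from R_X[0], ..., R_X[M−1], and the expected value vector has every entry equal to μ:

```python
import numpy as np
from scipy.linalg import toeplitz

M, mu = 4, 2.0
R = lambda k: 0.9 ** np.abs(k)       # hypothetical autocorrelation R_X[k]
R_mat = toeplitz(R(np.arange(M)))    # (i, j) entry is R_X[i - j]
mean_vec = mu * np.ones(M)           # E[X] = mu * 1 (a vector of means)
print(R_mat)
print(mean_vec)
```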
Example 6.9
The WSS sequence Xn has the autocorrelation function R_X[n] given in Example 6.5. Find the correlation matrix of a vector of consecutive samples of Xn.
Sol
Example 6.10
The order M−1 averaging filter hn given in Example 6.6 can be represented by the M-element vector h. With the input samples collected in a vector X and the output in a vector Y, the filtering operation can then be written in vector form.
- Linear Filtering of a Random Signal
- Power Spectrum Analysis
- Linear Estimation and Prediction Filters
- Mean-Square Estimation
7. Power Spectrum Analysis
- Definition: Fourier Transform
- Definition: Power Spectral Density
Theorem 7.1
Pf
Theorem 7.2
Pf
Example 7.1
Sol
Example 7.2
A white Gaussian noise process W(t) with autocorrelation function R_W(τ) = η₀δ(τ) is passed through the moving-average filter h(t). For the output Y(t), find the power spectral density S_Y(f).
Sol
Discrete-Time Fourier Transform (DTFT)
Definition
Example 7.3: Calculate the DTFT H(φ) of the order M−1 moving-average filter hn of Example 6.6.
Sol
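Assuming the order M−1 moving-average filter has hn = 1/M for n = 0, ..., M−1 (the usual form), its DTFT can be summed directly and compared with the closed form (1/M) e^(−jπφ(M−1)) sin(Mπφ)/sin(πφ); the check below is illustrative only.

```python
import numpy as np

M = 4
n = np.arange(M)
phi = np.arange(1, 100) / 200.0                # frequencies in (0, 0.5), avoiding phi = 0

# Direct DTFT: H(phi) = sum_n h_n exp(-j 2 pi phi n), with h_n = 1/M
H = (np.exp(-2j * np.pi * np.outer(phi, n)) / M).sum(axis=1)

# Closed form: linear-phase factor times a Dirichlet kernel
H_closed = np.exp(-1j * np.pi * phi * (M - 1)) * np.sin(M * np.pi * phi) / (M * np.sin(np.pi * phi))
print(np.allclose(H, H_closed))                # True
```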
Power Spectral Density of a Random Sequence
Definition
Theorem 7.3: Discrete-Time Wiener-Khintchine
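The discrete-time Wiener-Khintchine theorem identifies the power spectral density of a WSS sequence with the DTFT of its autocorrelation, S_X(φ) = Σ_k R_X[k] e^(−j2πφk). As an illustration with a hypothetical R_X[k] = a^|k|, the (truncated) sum can be compared against the known closed form (1 − a²)/(1 − 2a cos 2πφ + a²):

```python
import numpy as np

a = 0.9
phi = np.linspace(-0.5, 0.5, 101)
k = np.arange(-200, 201)                       # truncation of the doubly infinite sum
R_X = a ** np.abs(k)                           # hypothetical autocorrelation

# S_X(phi) = sum_k R_X[k] exp(-j 2 pi phi k)
S_X = (R_X * np.exp(-2j * np.pi * np.outer(phi, k))).sum(axis=1).real

S_closed = (1 - a**2) / (1 - 2 * a * np.cos(2 * np.pi * phi) + a**2)
print(np.allclose(S_X, S_closed, atol=1e-6))   # True, up to truncation error
```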
Theorem 7.4
Example 7.4
Sol
Example 7.5
Sol
Cross Spectral Density
Definition
Example 7.6
Sol
Example 7.7
Sol
Frequency Domain Filter Relationships
Time Domain: Y(t) = X(t) * h(t)
Frequency Domain: W(f) = V(f)H(f)
where V(f) = F[X(t)], W(f) = F[Y(t)], and H(f) = F[h(t)].
Theorem 7.5
Pf
Example 7.8
Sol
Example 7.9
Sol
Example 7.10
Sol
Theorem 7.6
Pf
I/O Correlation and Spectral Density Functions
Time Domain: R_XY(τ) = h(τ) * R_X(τ),  R_Y(τ) = h(−τ) * R_XY(τ)
Frequency Domain: S_XY(f) = H(f) S_X(f),  S_Y(f) = H*(f) S_XY(f) = |H(f)|² S_X(f)
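These relations can be checked numerically in discrete time. The sketch below takes a hypothetical FIR filter and input autocorrelation, computes R_Y through the time-domain convolutions of Theorem 6.5, and verifies that its DTFT equals |H(φ)|² S_X(φ):

```python
import numpy as np

h = np.array([0.5, 0.3, 0.2])                  # hypothetical FIR filter
k = np.arange(-30, 31)
R_X = 0.7 ** np.abs(k)                         # hypothetical input autocorrelation

# Time domain: R_Y = h(-) * h * R_X
R_Y = np.convolve(h[::-1], np.convolve(h, R_X))
lags_Y = np.arange(len(R_Y)) + k[0] - (len(h) - 1)

# Frequency domain: S_Y(phi) = |H(phi)|^2 S_X(phi)
phi = np.linspace(-0.5, 0.5, 81)
dtft = lambda x, lags: (x * np.exp(-2j * np.pi * np.outer(phi, lags))).sum(axis=1)
S_Y_time = dtft(R_Y, lags_Y).real
S_Y_freq = np.abs(dtft(h, np.arange(len(h)))) ** 2 * dtft(R_X, k).real
print(np.allclose(S_Y_time, S_Y_freq))         # True
```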
- Linear Filtering of a Random Signal
- Power Spectrum Analysis
- Linear Estimation and Prediction Filters
- Mean-Square Estimation
8. Linear Estimation and Prediction Filters
- Linear Predictor
- Used in cellular telephones as part of a speech compression algorithm.
- A speech waveform is considered to be a sample function of a WSS process X(t).
- The waveform is sampled at a rate of 8000 samples/sec to produce the random sequence Xn = X(nT).
- The prediction problem is to estimate a future speech sample X_{n+k} using the M previous samples X_{n−M+1}, X_{n−M+2}, ..., Xn.
- We need to minimize the cost, complexity, and power consumption of the predictor.
Linear Prediction Filters
We use linear prediction to estimate a future sample X = X_{n+k}. We wish to construct an LTI FIR filter hn with input Xn such that the filter output at time n is the linear minimum mean square error estimate of X_{n+k}. The predictor is then implemented by choosing the filter coefficients hn accordingly.
Theorem 8.1
Let Xn be a WSS random sequence with expected value E[Xn] = 0 and autocorrelation function R_X[k]. The minimum mean square error linear filter of order M−1 for predicting X_{n+k} at time n is the filter h whose coefficients satisfy the normal equations R_X h = r, where R_X is the correlation matrix of the M most recent samples and r is the cross-correlation vector between those samples and X_{n+k}.
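A small numerical sketch of Theorem 8.1: build the Toeplitz correlation matrix of the M most recent samples and the cross-correlation vector with X_{n+k}, then solve the normal equations. The autocorrelation used below is the one Example 8.1 appears to use, so treat the printed numbers as illustrative.

```python
import numpy as np
from scipy.linalg import toeplitz, solve

R = lambda k: (-0.9) ** np.abs(k)             # autocorrelation R_X[k] (as in Example 8.1)
M, step = 2, 1                                 # predict X_{n+1} from the M = 2 most recent samples

R_mat = toeplitz(R(np.arange(M)))              # correlation matrix of [X_{n-M+1}, ..., X_n]
r_vec = R(step + np.arange(M - 1, -1, -1))     # cross-correlations with X_{n+step}
a = solve(R_mat, r_vec)                        # predictor coefficients (normal equations)
mse = R(0) - r_vec @ a                         # minimum mean square error
print(a, mse)                                  # about [0, -0.9] and 0.19 for this R_X
```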
Example 8.1
Let Xn be a WSS random sequence with E[Xn] = 0 and autocorrelation function R_X[k] = (−0.9)^|k|. For M = 2 samples, find the coefficients of the optimum linear predictor for X = X_{n+1}, given X_{n−1} and Xn. What is the optimum linear predictor of X_{n+1}, given X_{n−1} and Xn? What is the mean square error of the optimal predictor?
Sol
Theorem 8.2
If the random sequence Xn has the autocorrelation function R_X[k] = b^|k| R_X[0], the optimum linear predictor of X_{n+k} given the M previous samples is X̂_{n+k} = b^k Xn, and the minimum mean square error is e*_L = R_X[0](1 − b^(2k)).
Pf
Linear Estimation Filters
We estimate X = Xn based on the noisy observations Yn = Xn + Wn, using the vector of the M most recent observations. The estimate is the output obtained by passing the sequence Yn through an LTI FIR filter hn. Xn and Wn are assumed to be independent WSS sequences with E[Xn] = E[Wn] = 0 and autocorrelation functions R_X[n] and R_W[n]. The linear minimum mean square error estimate of X given the observations is then formed from the filter output.
Theorem 8.3
Let Xn and Wn be independent WSS random sequences with E[Xn] = E[Wn] = 0 and autocorrelation functions R_X[k] and R_W[k], and let Yn = Xn + Wn. The minimum mean square error linear estimation filter of Xn of order M−1, given the input Yn, is the filter h whose coefficients satisfy R_Y h = r, where R_Y is the correlation matrix of the M most recent observations (with R_Y[k] = R_X[k] + R_W[k]) and r is the cross-correlation vector between those observations and Xn.
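A numerical sketch of Theorem 8.3: because Xn and Wn are independent and zero mean, R_Y[k] = R_X[k] + R_W[k], so the filter solves R_Y h = r with r equal to the cross-correlation between the observations and Xn. The autocorrelations below follow the (garbled) text of Example 8.2 and should be treated as illustrative assumptions.

```python
import numpy as np
from scipy.linalg import toeplitz, solve

M = 2
R_X = lambda k: (-0.9) ** np.abs(np.asarray(k))     # assumed signal autocorrelation
R_W = lambda k: 0.2 * (np.asarray(k) == 0)          # assumed white observation noise, power 0.2

R_Y = toeplitz(R_X(np.arange(M)) + R_W(np.arange(M)))  # R_Y[k] = R_X[k] + R_W[k]
r = R_X(np.arange(M - 1, -1, -1))                       # E[Y_{n-M+1+i} Xn] = R_X[M-1-i]
h = solve(R_Y, r)                                       # MMSE estimation filter coefficients
mse = R_X(0) - r @ h                                    # resulting mean square error
print(h, mse)
```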
Example 8.2
The independent random sequences Xn and Wn have zero expected value and autocorrelation functions R_X[k] = (−0.9)^|k| and R_W[k] = (0.2)δ[k]. Use M = 2 samples of the noisy observation sequence Yn = Xn + Wn to estimate Xn. Find the linear minimum mean square error estimation filter.
Sol
- Linear Filtering of a Random Signal
- Power Spectrum Analysis
- Linear Estimation and Prediction Filters
- Mean-Square Estimation
9. Mean-Square Estimation
Linear Estimation: Observe a sample function of a WSS random process Y(t) and design a linear filter to estimate a sample function of another WSS process X(t), where Y(t) = X(t) + N(t).
Wiener Filter: The linear filter that minimizes the mean square error.
Mean Square Error: e_L = E[(X(t) − X̂(t))²]
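For independent signal and noise, a standard result (this may be what Theorem 9.1 states; the slide's own content is not reproduced here) is that the optimum noncausal filter has frequency response H(f) = S_X(f)/(S_X(f) + S_N(f)), with resulting mean square error ∫ S_X(f)S_N(f)/(S_X(f) + S_N(f)) df. A minimal sketch with hypothetical spectra:

```python
import numpy as np

f = np.linspace(-5, 5, 401)
S_X = 2.0 / (1.0 + (2 * np.pi * f) ** 2)     # hypothetical signal PSD
S_N = 0.1 * np.ones_like(f)                  # hypothetical flat noise PSD

H = S_X / (S_X + S_N)                        # noncausal Wiener filter: favors high-SNR frequencies
df = f[1] - f[0]
mse = np.sum(S_X * S_N / (S_X + S_N)) * df   # approximate resulting mean square error
print(H.max(), mse)
```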
Theorem 9.1: Linear Estimation
Example 9.1
Sol