Title: Random Processes Introduction (2)
1 Random Processes Introduction (2)
- Professor Ke-Sheng Cheng
- Department of Bioenvironmental Systems Engineering
- E-mail: rslab_at_ntu.edu.tw
2 Stochastic continuity
8 Stochastic Convergence
- A random sequence, or a discrete-time random process, is a sequence of random variables X1(ζ), X2(ζ), …, Xn(ζ), …, where ζ is an outcome in the sample space Ω.
- For a specific ζ, Xn(ζ) is a sequence of numbers that might or might not converge. The notion of convergence of a random sequence can be given several interpretations.
9 Sure convergence (convergence everywhere)
- The sequence of random variables {Xn(ζ)} converges surely to the random variable X(ζ) if the sequence of functions Xn(ζ) converges to X(ζ) as n → ∞ for all ζ ∈ Ω, i.e.,
- Xn(ζ) → X(ζ) as n → ∞ for all ζ ∈ Ω.
12 Almost-sure convergence (Convergence with probability 1)
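- For reference, one standard formulation: {Xn(ζ)} converges to X(ζ) with probability 1 (almost surely) if
  \[ P\big(\{\zeta \in \Omega : \lim_{n\to\infty} X_n(\zeta) = X(\zeta)\}\big) = 1 . \]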
14 Mean-square convergence
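- For reference, one standard formulation: {Xn(ζ)} converges to X(ζ) in the mean-square sense if
  \[ \lim_{n\to\infty} E\big[\,(X_n(\zeta) - X(\zeta))^2\,\big] = 0 . \]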
15 Convergence in probability
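- For reference, one standard formulation: {Xn(ζ)} converges to X(ζ) in probability if, for every ε > 0,
  \[ \lim_{n\to\infty} P\big(\,|X_n(\zeta) - X(\zeta)| > \varepsilon\,\big) = 0 . \]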
17 Convergence in distribution
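- For reference, one standard formulation: {Xn} with cdfs Fn converges in distribution to X with cdf F if
  \[ \lim_{n\to\infty} F_n(x) = F(x) \]
  at every point x where F is continuous.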
18 Remarks
- Convergence with probability one applies to the individual realizations of the random process; convergence in probability does not.
- The weak law of large numbers is an example of convergence in probability.
- The strong law of large numbers is an example of convergence with probability 1.
- The central limit theorem is an example of convergence in distribution.
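As an illustration of the remarks above, the following sketch (Python with NumPy; the distribution, sample sizes, and variable names are illustrative choices, not part of the slides) shows the sample mean of an i.i.d. sequence settling near the true mean (law of large numbers) and the standardized sum behaving approximately like a standard normal variable (central limit theorem).

import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 2.0, 2.0   # mean and standard deviation of the Exp(mean = 2) samples below

# Law of large numbers: the sample mean settles near mu as n grows.
for n in (10, 1_000, 100_000):
    x = rng.exponential(scale=mu, size=n)
    print(f"n = {n:>7d}   sample mean = {x.mean():.4f}   (true mean = {mu})")

# Central limit theorem: the standardized sum is approximately N(0, 1).
n, reps = 500, 5_000
sums = rng.exponential(scale=mu, size=(reps, n)).sum(axis=1)
z = (sums - n * mu) / (sigma * np.sqrt(n))
print(f"standardized sums: mean ≈ {z.mean():.3f}, variance ≈ {z.var():.3f}")
print(f"P(Z <= 1.96) ≈ {(z <= 1.96).mean():.3f}   (standard normal: 0.975)")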
19 Weak Law of Large Numbers (WLLN)
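- For reference, one standard statement: if X1, X2, … are i.i.d. with finite mean μ, then the sample mean converges to μ in probability,
  \[ \lim_{n\to\infty} P\!\left( \left| \frac{1}{n}\sum_{i=1}^{n} X_i - \mu \right| > \varepsilon \right) = 0 \quad \text{for every } \varepsilon > 0 . \]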
20 Strong Law of Large Numbers (SLLN)
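- For reference, one standard statement: under the same i.i.d. assumption with finite mean μ,
  \[ P\!\left( \lim_{n\to\infty} \frac{1}{n}\sum_{i=1}^{n} X_i = \mu \right) = 1 . \]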
21 The Central Limit Theorem
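- For reference, one standard statement: if X1, X2, … are i.i.d. with mean μ and finite variance σ², then
  \[ \frac{\sum_{i=1}^{n} X_i - n\mu}{\sigma\sqrt{n}} \;\xrightarrow{d}\; N(0,1) \quad \text{as } n \to \infty . \]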
22 Venn diagram of the relations among types of convergence
Note that even sure convergence need not imply mean-square convergence. For example, on Ω = (0, 1) with uniform probability, Xn(ζ) = n for ζ < 1/n and 0 otherwise converges surely to 0, yet E[Xn²] = n → ∞.
23 Example
27 Ergodic Theorem
30 The Mean-Square Ergodic Theorem
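- One common formulation: write Sn = (X1 + X2 + ⋯ + Xn)/n for the sample average. Then Sn converges in mean square to a constant m if and only if
  \[ \frac{1}{n}\sum_{i=1}^{n} E[X_i] \to m \quad\text{and}\quad \operatorname{Var}(S_n) = \frac{1}{n^{2}}\sum_{i=1}^{n}\sum_{j=1}^{n}\operatorname{Cov}(X_i, X_j) \to 0 . \]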
31
- The above theorem shows that one can expect a sample average to converge to a constant in the mean-square sense if and only if the average of the means converges and the memory dies out asymptotically, that is, if the covariance decreases as the lag increases.
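To make this concrete, the following sketch (Python/NumPy; the two processes and their parameters are illustrative assumptions, not taken from the slides) compares a process whose memory dies out (an AR(1)-type recursion) with one whose memory never dies out (a random constant plus noise). The time average approaches the ensemble mean only in the first case.

import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Process A: X_t = 0.9 * X_{t-1} + W_t with W_t ~ N(0, 1).  The covariance decays
# like 0.9**lag, so the memory dies out and the time average approaches the
# ensemble mean, which is 0.
x = np.empty(n)
x[0] = rng.normal()
w = rng.normal(size=n)
for t in range(1, n):
    x[t] = 0.9 * x[t - 1] + w[t]
print("AR(1)-type process: time average =", x.mean())   # close to 0

# Process B: Y_t = A + W_t with A ~ N(0, 1) drawn once per realization.
# Cov(Y_t, Y_{t+k}) = Var(A) = 1 for every lag k, so the memory never dies out:
# the time average converges to the random level A, not to the ensemble mean 0.
a = rng.normal()
y = a + rng.normal(size=n)
print("random-constant process: time average =", y.mean(), "  A =", a)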
32 Mean-Ergodic Processes
33 Strong or Individual Ergodic Theorem
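- For reference, one standard statement (Birkhoff's pointwise ergodic theorem for sequences): if {Xn} is stationary and ergodic with E|X1| < ∞, then
  \[ \frac{1}{n}\sum_{i=1}^{n} X_i \;\to\; E[X_1] \quad \text{with probability 1.} \]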
36 Examples of Stochastic Processes
- iid random process
- A discrete-time random process X(t), t = 1, 2, …, is said to be independent and identically distributed (iid) if any finite number, say k, of random variables X(t1), X(t2), …, X(tk) are mutually independent and have a common cumulative distribution function FX(·).
37
- The joint cdf for X(t1), X(t2), …, X(tk) is given by the product of the common marginal cdf.
- It also yields the analogous product form in the discrete case, where p(x) represents the common probability mass function (see the expressions below).
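- In standard form, for an iid process as defined above:
  \[ F_{X(t_1),\ldots,X(t_k)}(x_1,\ldots,x_k) = \prod_{i=1}^{k} F_X(x_i), \qquad p_{X(t_1),\ldots,X(t_k)}(x_1,\ldots,x_k) = \prod_{i=1}^{k} p(x_i) . \]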
39 Random walk process
40
- Let π0 denote the probability mass function of X0. The joint probability of X0, X1, …, Xn factors as shown below.
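- A standard form for a random walk whose steps are i.i.d., independent of X0, and have pmf pS (the symbol pS is introduced here for illustration):
  \[ P(X_0 = x_0, X_1 = x_1, \ldots, X_n = x_n) = \pi_0(x_0)\,\prod_{i=1}^{n} p_S(x_i - x_{i-1}) . \]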
42
- The property shown below is known as the Markov property.
- A special case of the random walk is the Brownian motion.
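- A standard statement of the Markov property in the discrete-time, discrete-state setting used here:
  \[ P(X_{n+1} = x_{n+1} \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0) = P(X_{n+1} = x_{n+1} \mid X_n = x_n) . \]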
43 Gaussian process
- A random process X(t) is said to be a Gaussian random process if all finite collections of the random process, X1 = X(t1), X2 = X(t2), …, Xk = X(tk), are jointly Gaussian random variables for all k and all choices of t1, t2, …, tk.
- Joint pdf of jointly Gaussian random variables X1, X2, …, Xk (see the standard form below):
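- In standard form, with mean vector m = E[X] and (nonsingular) covariance matrix C:
  \[ f_{\mathbf{X}}(x_1,\ldots,x_k) = \frac{1}{(2\pi)^{k/2}\,|\mathbf{C}|^{1/2}} \exp\!\Big(-\tfrac{1}{2}(\mathbf{x}-\mathbf{m})^{T}\mathbf{C}^{-1}(\mathbf{x}-\mathbf{m})\Big) . \]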
45 Time series: AR random process
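The slide title refers to autoregressive (AR) time-series processes. As a minimal sketch (the parameters phi and sigma are illustrative choices, not taken from the slides), an AR(1) process can be generated recursively as X_t = phi * X_{t-1} + W_t with zero-mean white noise W_t:

import numpy as np

def simulate_ar1(phi: float, sigma: float, n: int, seed: int = 0) -> np.ndarray:
    """Generate one realization of the AR(1) recursion X_t = phi * X_{t-1} + W_t,
    where W_t is zero-mean Gaussian white noise with standard deviation sigma."""
    rng = np.random.default_rng(seed)
    x = np.empty(n)
    # Start from the stationary distribution (valid when |phi| < 1).
    x[0] = rng.normal(scale=sigma / np.sqrt(1.0 - phi**2))
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal(scale=sigma)
    return x

x = simulate_ar1(phi=0.8, sigma=1.0, n=5_000)
# For a stationary AR(1) process, the lag-k autocorrelation is phi**k.
lag1 = np.corrcoef(x[:-1], x[1:])[0, 1]
print(f"empirical lag-1 autocorrelation ≈ {lag1:.3f} (theory: 0.8)")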
46 The Brownian motion (one-dimensional, also known as random walk)
- Consider a particle that moves randomly on the real line.
- Suppose that at small time intervals δ the particle jumps a small distance ε, randomly and equally likely to the left or to the right.
- Let X_δ(t) be the position of the particle on the real line at time t.
47
- Assume the initial position of the particle is at the origin, i.e. X_δ(0) = 0.
- The position of the particle at time t can then be expressed as a sum of the individual steps (see below), where the steps are independent random variables, each having probability 1/2 of equaling +1 and −1.
- (⌊x⌋ represents the largest integer not exceeding x.)
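- In the usual notation for this construction (writing S1, S2, … for the ±1 steps, a labeling introduced here for clarity):
  \[ X_\delta(t) = \varepsilon\,\big(S_1 + S_2 + \cdots + S_{\lfloor t/\delta \rfloor}\big) . \]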
48 Distribution of X_δ(t)
- Let the step length ε equal √δ; then X_δ(t) = √δ (S1 + S2 + ⋯ + S_⌊t/δ⌋).
- For fixed t, if δ is small then the distribution of X_δ(t) is approximately normal with mean 0 and variance t, i.e., X_δ(t) ~ N(0, t).
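As a quick numerical check (a sketch with arbitrary values of δ and t, not part of the slides), the scaled random walk below has sample variance close to t and probabilities close to those of N(0, t):

import numpy as np

rng = np.random.default_rng(2)
delta, t = 1e-4, 2.0            # time step and observation time (illustrative values)
n_steps = int(t / delta)        # number of +/-1 steps up to time t
reps = 10_000                   # number of independent realizations

# The sum of n independent +/-1 steps equals 2*Binomial(n, 1/2) - n,
# so X_delta(t) = sqrt(delta) * (S_1 + ... + S_{floor(t/delta)}).
sums = 2.0 * rng.binomial(n_steps, 0.5, size=reps) - n_steps
x_t = np.sqrt(delta) * sums

print(f"sample mean     ≈ {x_t.mean():+.4f}   (theory: 0)")
print(f"sample variance ≈ {x_t.var():.4f}   (theory: t = {t})")
print(f"P(X_delta(t) <= sqrt(t)) ≈ {(x_t <= np.sqrt(t)).mean():.3f}   (normal value ≈ 0.841)")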
49 Graphical illustration of the distribution of X_δ(t)
50
- If t and h are fixed and δ is sufficiently small, then the displacement X_δ(t + h) − X_δ(t) is a scaled sum of roughly h/δ independent ±1 steps, which leads to the distribution on the next slide.
51 Distribution of the displacement
- The random variable X_δ(t + h) − X_δ(t) is normally distributed with mean 0 and variance h, i.e., X_δ(t + h) − X_δ(t) ~ N(0, h).
52
- The variance of X_δ(t) depends on t, while the variance of the displacement X_δ(t + h) − X_δ(t) does not.
- If the time intervals involved do not overlap, then the corresponding displacements are independent random variables.
54 Covariance and Correlation functions of