Brownian Motion, Fractional Brownian Motion

Provided by: abidmaqs
1
Brownian Motion, Fractional Brownian Motion
2
In 1827 the botanist R. Brown noticed that minute
particles suspended in a liquid moved on highly
irregular paths. This, and a similar phenomenon
for smoke particles in air, was explained
ultimately as resulting from molecular
bombardment of the particles.
Einstein published a mathematical study of this
motion, which eventually led to Perrin's Nobel
Prize-winning calculation of Avogadro's number.
3
In 1923 Wiener proposed a rigorous mathematical
model that exhibited random behavior similar to
that observed in Brownian Motion.
The paths described by this Wiener process in
three dimensions are quite irregular. This is a
good example of a natural phenomenon with a
fractal appearance. We give examples of such
processes from finance and meteorology.
4
A path may be described by a function f : R → R².
Brownian motion paths are, in general, fractals.
5
Recapitulation of a Few Statistical Concepts
In an experiment, a measurement is usually
denoted by a variable such as X. In a random
experiment, a variable whose measured value can
change (from one replicate of the experiment to
another) is referred to as a random variable.
For example, X might denote the current
measurement in the copper wire experiment. A
random variable is conceptually no different from
any other variable in an experiment.
We use the term random to indicate the noise
disturbances that can change its measured value.
Usually an uppercase letter is used to denote a
random variable.
6
Examples of continuous random variables:
electrical current, length, pressure,
temperature, time, voltage, weight.
Examples of discrete random variables: number of
scratches on a surface, proportion of defective
parts among 1000 tested, number of transmitted
bits received in error.
7
The probability distribution (or just the
distribution) of a random variable X is a
description of the set of probabilities
associated with the possible values of X.
µ = E(X) is called the mean or expected value of X.
8
Covariance and Correlation
In many experiments, more than one variable is
measured. For example, suppose both the diameter
and the thickness of an injection-molded disk are
measured and denoted by X and Y respectively.
These two random variables are often related. If
the pressure in the mold increases, there might
be an increase in the fill of the cavity that
results in larger values for both X and Y.
Similarly, a pressure decrease might result in
smaller values for both X and Y.
9
The correlation between random variables X and Y,
denoted ρ_XY, is defined by

ρ_XY = cov(X, Y) / (σ_X σ_Y).

If X and Y are independent, then ρ_XY = 0, that
is, they are uncorrelated.
An observed discrete time series is a sequence of
observations ordered by a time index t, where
time spans from minus infinity to plus infinity;
it may be viewed as a realization of a random
process, or as a segment of an infinite sequence.
Normal distribution
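As an illustrative sketch (not part of the original slides), the correlation ρ_XY = cov(X, Y) / (σ_X σ_Y) can be estimated from paired samples; the helper name `correlation` and the simulated data below are our own for the example:

```python
import math
import random

def correlation(xs, ys):
    """Sample Pearson correlation: cov(X, Y) / (sigma_X * sigma_Y)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs) / n)
    sy = math.sqrt(sum((y - my) ** 2 for y in ys) / n)
    return cov / (sx * sy)

random.seed(0)
xs = [random.gauss(0, 1) for _ in range(10000)]
# ys depends on xs, so the two are positively correlated.
ys = [x + 0.5 * random.gauss(0, 1) for x in xs]
# noise is generated independently of xs, so correlation should be near 0.
noise = [random.gauss(0, 1) for _ in range(10000)]
```

Here `correlation(xs, ys)` comes out close to the theoretical value 1/√1.25 ≈ 0.894, while `correlation(xs, noise)` is close to 0.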
10
Recall that a random variable X with probability
density function

f(x) = (1 / (σ √(2π))) exp(−(x − µ)² / (2σ²)),  for −∞ < x < ∞,

has a normal distribution with mean µ and variance σ².
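As a minimal numeric check of the normal density (an illustrative sketch; the helper name `normal_pdf` is ours), the function below evaluates f(x), and a Riemann sum confirms it integrates to about 1:

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Density f(x) = exp(-(x - mu)^2 / (2 sigma^2)) / (sigma sqrt(2 pi))."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# Riemann-sum check over [-8, 8] for mu = 1, sigma = 2:
# the density should integrate to (very nearly) 1.
dx = 0.001
total = sum(normal_pdf(-8 + i * dx, mu=1.0, sigma=2.0) * dx
            for i in range(int(16 / dx)))
```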
11
Brownian Motion
To motivate the definition, let us consider a
particle performing a random walk on the real
line. Suppose that at small time intervals τ the
particle jumps a small distance δ, randomly to
the left or to the right. This might be a
reasonable description of a particle undergoing
random molecular bombardment in one dimension.
Let X_τ(t) denote the position of the particle at
time t. Then, given the position X_τ(kτ) at time
kτ, the position X_τ((k+1)τ) is equally likely to
be X_τ(kτ) + δ or X_τ(kτ) − δ. Assuming that the
particle starts at the origin at time 0, then for
t > 0 the position at time t is described by the
random variable

X_τ(t) = δ (Y_1 + Y_2 + … + Y_⌊t/τ⌋)

12
where Y_1, Y_2, … are independent random variables,
each having probability ½ of equalling 1 and
probability ½ of equalling −1. Here ⌊t/τ⌋
denotes the largest integer less than or equal to
t/τ. We normalize the step length by taking
δ = τ^(1/2), so that

X_τ(t) = τ^(1/2) (Y_1 + Y_2 + … + Y_⌊t/τ⌋).

The central limit theorem tells us that, for
fixed t, if τ is small then the distribution of
the random variable X_τ(t) is approximately normal
with mean 0 and variance t, since the Y_i have
mean 0 and variance 1.
In the same way, if t and h are fixed and τ is
sufficiently small, then X_τ(t+h) − X_τ(t) is
approximately normal with mean 0 and variance h.
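The claim that X_τ(t) = √τ (Y_1 + … + Y_⌊t/τ⌋) has mean 0 and variance approximately t can be checked empirically; the sketch below (helper names are ours) simulates many walks and estimates the variance:

```python
import math
import random

def random_walk_position(t, tau, rng):
    """X_tau(t) = sqrt(tau) * (Y_1 + ... + Y_floor(t/tau)), Y_i = +1 or -1 fairly."""
    steps = int(t / tau)                     # floor(t / tau)
    return math.sqrt(tau) * sum(rng.choice((-1, 1)) for _ in range(steps))

rng = random.Random(42)
t, tau = 2.0, 0.01                           # 200 steps of length sqrt(tau) = 0.1
samples = [random_walk_position(t, tau, rng) for _ in range(5000)]
mean = sum(samples) / len(samples)
var = sum((x - mean) ** 2 for x in samples) / len(samples)
# By the central limit theorem, var should be close to t = 2.0.
```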
13
We also note that if 0 ≤ t_1 ≤ t_2 ≤ … ≤ t_{2m},
then the increments X_τ(t_2) − X_τ(t_1),
X_τ(t_4) − X_τ(t_3), …, X_τ(t_{2m}) − X_τ(t_{2m−1})
are independent random variables. We define
Brownian motion with the limit of the random walk
X_τ(t) as τ → 0 in mind.
Let (Ω, F, P) be a probability space. For our
purposes we call X a random function from [0, ∞)
to R if X(t) is a random variable for each t with
0 ≤ t < ∞. Occasionally we consider random
functions on a finite interval [t_1, t_2] instead,
in which case the development is similar. (In the
formal definition of a random process there is an
additional measurability condition, which need
not concern us here.)
14
Of course, we should think of X as defining a
sample function t ↦ X(ω, t) for each point ω in
the sample space Ω. Thus we think of the points
of Ω as parametrizing the functions X : [0, ∞) → R,
and we think of P as a probability measure on this
class of functions.
We define Brownian motion, or the Wiener process,
to be a random process X such that
(BM)
  (i) with probability 1, X(0) = 0 (i.e. the
      process starts at the origin) and X(t) is a
      continuous function of t;
  (ii) for any t ≥ 0 and h > 0, the increment
      X(t+h) − X(t) is normally distributed with
      mean 0 and variance h; thus

      P(X(t+h) − X(t) ≤ x) = (2πh)^(−1/2) ∫_{−∞}^{x} exp(−u² / (2h)) du
15
  (iii) if 0 ≤ t_1 ≤ t_2 ≤ … ≤ t_{2m}, the increments
      X(t_2) − X(t_1), X(t_4) − X(t_3), …,
      X(t_{2m}) − X(t_{2m−1}) are independent.

(There is some overkill in this definition: (iii)
may be deduced from (i) and (ii).)
Note that it is immediate from (i) and (ii) that
X(t) is itself normally distributed with mean 0
and variance t for each t. Observe that the
increments of X are stationary; that is,
X(t+h) − X(t) has a distribution independent of t.
(On a point of notation, we write E(X(t)) to
denote the expectation or mean value of X(t);
some readers may be used to seeing ⟨X(t)⟩,
thought of as the average of X(t) over the
functions in the sample space.)
16
The first question that arises is whether there
actually is a random function satisfying the
conditions (BM). It is quite hard to show that
Brownian motion does exist, and we do not do so
here.
The proof uses special properties of the normal
distribution. For example, given that
X(t_2) − X(t_1) and X(t_3) − X(t_2) are
independent and normal with mean 0 and variances
t_2 − t_1 and t_3 − t_2 respectively, the sum
X(t_3) − X(t_1) = (X(t_2) − X(t_1)) + (X(t_3) − X(t_2))
is necessarily normal with mean 0 and variance
t_3 − t_1. This is essential for the definition
(BM) to be self-consistent. It should at least
seem plausible that a process X(t) satisfying
(BM) exists, if only as a limit of the random
walks X_τ(t) as τ → 0.
17
Instead of proving existence, we mention two
methods of constructing Brownian sample
functions, for example with a computer. Both
methods can, in fact, be used as a basis for
existence proofs. The first method uses the
random walk approximation above: values of 1 or
−1 are assigned by coin tossing to Y_i for
1 ≤ i ≤ m, where m is large, and X_τ(t) is
plotted accordingly. If τ is small compared with
t, this should give a good approximation to a
Brownian sample function.
Alternatively, the random midpoint displacement
method may be used to obtain a sample function
X : [0, 1] → R. We define the values of X(k 2^−j),
where 0 ≤ k ≤ 2^j, by induction on j. We set
X(0) = 0 and choose X(1) at random from a normal
distribution with mean 0 and variance 1.
18
Next we select X(½) from a normal distribution
with mean ½(X(0) + X(1)) and variance ¼. At the
next step X(¼) and X(¾) are chosen, and so on.
At the jth stage the values X(k 2^−j) for odd k
are chosen independently from a normal
distribution with mean ½(X((k−1) 2^−j) +
X((k+1) 2^−j)) and variance 2^−(j+1), a quarter
of the spacing between the two known neighbours.
This procedure determines X(t) at all binary
points t = k 2^−j. Assuming that X is continuous,
X is then completely determined. It may be shown,
using properties of normal distributions, that
the functions thus generated have the
distributions given by (BM). The graph of a
Brownian sample function is shown in Figure 11.
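The random midpoint displacement construction just described can be sketched as follows (a minimal illustration, with our own function name; at stage j each new midpoint is drawn with variance 2^−(j+1), a quarter of the distance between its known neighbours):

```python
import random

def midpoint_displacement(levels, rng):
    """Brownian sample values at the binary points k / 2**levels of [0, 1].

    X(0) = 0, X(1) ~ N(0, 1); at stage j each new midpoint is drawn with
    mean equal to the average of its two neighbours and variance 2**-(j+1).
    """
    x = [0.0, rng.gauss(0.0, 1.0)]           # values at t = 0 and t = 1
    for j in range(1, levels + 1):
        sigma = 2 ** (-(j + 1) / 2)          # std dev = sqrt(2**-(j+1))
        new = []
        for left, right in zip(x, x[1:]):
            new.append(left)
            new.append(rng.gauss((left + right) / 2, sigma))
        new.append(x[-1])
        x = new
    return x                                  # 2**levels + 1 values

rng = random.Random(7)
path = midpoint_displacement(10, rng)         # X at t = k / 1024
```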
19
Figure 11
20
A fundamental property of Brownian motion is
that, with probability 1, the sample functions
satisfy a Hölder condition of exponent λ for each
λ < ½.
Suppose 0 < λ < ½. With probability 1, a Brownian
sample function X : [0, 1] → Rⁿ satisfies

|X(t + h) − X(t)| ≤ b |h|^λ  whenever |h| ≤ h_0,

for some h_0 > 0, where b depends only on λ.
21
With probability 1, a Brownian sample path in Rⁿ
(n ≥ 2) has Hausdorff dimension and box dimension
equal to 2.
With probability 1, the graph of a Brownian
sample function X : [0, 1] → R has Hausdorff and
box dimension equal to 1.5.
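The dimension-1.5 statement for the graph can be illustrated numerically. The sketch below (all names are ours; a crude estimate, not a proof) builds an approximate Brownian path by cumulative sums of Gaussian steps, counts grid boxes met by the graph at two grid sizes, and estimates the box dimension from the slope, which typically comes out near 1.5:

```python
import math
import random

def brownian_path(n, rng):
    """Approximate Brownian sample on [0, 1]: cumulative sums of N(0, 1/n) steps."""
    x = [0.0]
    for _ in range(n):
        x.append(x[-1] + rng.gauss(0.0, 1.0 / math.sqrt(n)))
    return x

def box_count(path, k):
    """Count boxes of side 1/k met by the graph {(t, X(t))}."""
    n = len(path) - 1
    boxes = set()
    for i in range(n):
        col = min(i * k // n, k - 1)          # column of the grid for this segment
        lo, hi = sorted((path[i], path[i + 1]))
        for row in range(math.floor(lo * k), math.floor(hi * k) + 1):
            boxes.add((col, row))
    return len(boxes)

rng = random.Random(1)
path = brownian_path(2 ** 14, rng)
k1, k2 = 16, 256
# Slope of log N(1/k) against log k estimates the box dimension (about 1.5).
dim = math.log(box_count(path, k2) / box_count(path, k1)) / math.log(k2 / k1)
```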
Fractional Brownian Motion
Brownian motion, although of central theoretical
importance, is for many purposes too restrictive.
A Brownian sample function is often regarded as a
typical random function, yet its graph has
dimension 1.5 almost surely. However, random
functions with graphs of other dimensions are
required for a variety of modeling purposes.
22
It may be shown that the Brownian process is the
unique probability distribution on functions
having independent increments that are stationary
and of finite variance. To obtain sample
functions with different characteristics it is
necessary to relax one or more of these
conditions.
There are two usual variations. Fractional
Brownian motion has increments which are normally
distributed but no longer independent. Stable
processes, on the other hand, dispense with the
finite-variance condition, and this can lead to
discontinuous functions. For simplicity, we just
discuss the graphs of these processes in the
1-dimensional case; analogous processes may be
defined in n dimensions.
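As an illustration of 1-dimensional fractional Brownian motion (a hedged sketch, not from the slides; the function names are ours): fBm with Hurst exponent H has covariance R(s, t) = ½(s^2H + t^2H − |s − t|^2H), so an exact sample at finitely many times can be drawn by Cholesky-factorizing the covariance matrix:

```python
import math
import random

def fbm_sample(n, hurst, rng):
    """Sample fractional Brownian motion B_H at t_i = i/n, i = 1..n.

    Uses the fBm covariance R(s, t) = (s^2H + t^2H - |s - t|^2H) / 2 and a
    plain Cholesky factorization, so the sample is exact in distribution.
    """
    ts = [(i + 1) / n for i in range(n)]
    h2 = 2 * hurst
    cov = [[(s ** h2 + t ** h2 - abs(s - t) ** h2) / 2 for t in ts] for s in ts]
    # Cholesky: cov = L L^T with L lower triangular.
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            acc = cov[i][j] - sum(L[i][k] * L[j][k] for k in range(j))
            L[i][j] = math.sqrt(acc) if i == j else acc / L[j][j]
    z = [rng.gauss(0.0, 1.0) for _ in range(n)]
    return [sum(L[i][k] * z[k] for k in range(i + 1)) for i in range(n)]

rng = random.Random(3)
path = fbm_sample(64, hurst=0.8, rng=rng)  # H = 0.8: positively correlated increments
```

H = ½ recovers ordinary Brownian motion; H > ½ gives persistent (positively correlated) increments and a smoother-looking graph, H < ½ an anti-persistent, rougher one.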