Title: Models - Stochastic Models
1. Models - Stochastic Models
2. Review of Last 3 Lectures (Chapter 1)
- Introduction to (Actuarial) Modelling
- Classifying Models
- Components of a Model
- Building a Model: 10 Helpful Steps
- Advantages of Modelling
- Drawbacks of Modelling (that must be guarded against)
- Key points to assess the suitability of a model
- Some further considerations in modelling
- Case Study: Lessons from econometric modelling in the UK over the last four decades
3. Next 3-5 Lectures (Chapter 2)
- Basic terminology
- Stochastic process, sample path, m-increment, stationary increment.
- Foundational concepts
- Stationary process, weak stationarity, the Markov property, filtrations.
- Some elementary examples
- White noise, random walk, moving average (MA).
- Some important practical examples
- Poisson process, compound Poisson process, Brownian motion (or Wiener process).
4. Chapter 2
- Basic Terminology and Foundational Concepts of Stochastic Processes
5. Definition of a Stochastic Process
- Definition: A stochastic process is a sequence or continuum of random variables indexed by an ordered set T.
- Generally, of course, T records time.
- A stochastic process is often denoted Xt, t ∈ T. I prefer <Xt>, t ∈ T, so as to avoid confusion with the state space.
- Recall that the set of values the random variables Xt are capable of taking is called the state space of the process, S.
6. Comment
- A stochastic process is used as a generic model for a time-dependent random phenomenon. So, just as a single random variable describes a static random phenomenon, a stochastic process is a collection of random variables Xt, one for each time t in some set J. The process is denoted <Xt>, t ∈ J.
- We are interested in modelling <Xt> at each time t, which in general depends on the previous values in the sequence (i.e., there is path-dependency).
7. Examples
- Example 1: Discrete White Noise
- A sequence of independent, identically distributed random variables Z0, Z1, Z2, ... is known as white noise.
- Important sub-classifications include zero-mean white noise, i.e., E[Zi] = 0, and symmetric white noise, where the distribution of the random variables is symmetric.
- Example 2: General random walk
- Let Z1, Z2, Z3, ... be white noise and define Xn = x0 + Z1 + Z2 + ... + Zn, with X0 = x0 (the initial value). Then <Xn> is a random walk.
- When Zt can only take the values ±1 the process is known as a simple random walk. Generally we set X0 = 0.
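A minimal simulation sketch of these two examples (assuming zero-mean Gaussian white noise and, for the simple random walk, p = 0.5; all parameter values here are illustrative):

    import numpy as np

    rng = np.random.default_rng(seed=42)
    n = 100                          # number of time steps (illustrative)

    # Example 1: zero-mean (Gaussian) white noise -- an iid sequence.
    Z = rng.standard_normal(n)

    # Example 2: general random walk, X_n = x0 + Z_1 + ... + Z_n.
    x0 = 0.0
    X = x0 + np.cumsum(Z)

    # Simple random walk: increments are +1 or -1 (here each with probability 0.5), X_0 = 0.
    steps = rng.choice([-1, 1], size=n)
    S = np.cumsum(steps)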
8. Defining a Given Stochastic Process
- Defining (or wholly understanding) <Xt> for all t ∈ T amounts to defining the joint distribution of (Xt1, Xt2, ..., Xtn) for all t1, ..., tn and all n.
- This is not easy to do, and is very cumbersome.
- So we generally use indirect means, e.g., by defining the transition process.
9. 2-Dimensional Distribution
10. Increments
- Consider Xt+m − Xt. This is known as an m-increment of the process.
- A 1-increment is simply known as an increment (of the process).
- Often, defining how the process evolves through time is easier to get a handle on, and is a more natural description of the process (e.g., evolution, many games, etc.).
- Hence stochastic processes are often defined by their initial value and by how each increment is generated. See how the random walk was defined above.
- A process is said to have independent increments if Xt+m − Xt is independent of the past of the process, for all t and m.
- A process is said to have stationary increments if the distribution of the increment Xt+m − Xt depends only on m and not on t.
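In symbols, a compact restatement of the two properties above (where "equal in distribution" is written with a d over the equals sign):

\[
\text{independent increments:}\quad X_{t+m} - X_t \ \text{is independent of}\ (X_s)_{s \le t}\ \text{for all } t \text{ and } m > 0;
\]
\[
\text{stationary increments:}\quad X_{t+m} - X_t \overset{d}{=} X_{s+m} - X_s \ \text{for all } s, t \text{ and } m > 0.
\]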
11. Sample Path
- The sample path of a process is a joint realisation of the random variables Xt, for all t ∈ T.
- Remarks:
- A sample path is a function from T to the state space.
- Each sample path has an associated probability.
- Example 3: Consider a model of salary progression, where the salary at future time t is modelled stochastically. Equally likely sample paths from this model are graphed alongside.
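The exact Salary(t) formula appears on the original slide; as a purely illustrative stand-in, the sketch below uses a hypothetical model Salary(t) = Salary(0) x (1+g1) x ... x (1+gt) with iid random annual growth rates gi, and draws several equally likely sample paths:

    import numpy as np

    rng = np.random.default_rng(1)
    years, n_paths = 20, 5
    salary0 = 30_000.0                       # illustrative starting salary

    # Hypothetical model: annual growth rates are iid Normal(4%, 2%).
    growth = rng.normal(loc=0.04, scale=0.02, size=(n_paths, years))
    paths = salary0 * np.cumprod(1.0 + growth, axis=1)
    # Each row of `paths` is one equally likely sample path of Salary(t), t = 1, ..., years.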
12. Stationarity
- Definition: A stochastic process is said to be stationary if the joint distributions of (Xt1, Xt2, ..., Xtn) and (Xt1+k, Xt2+k, ..., Xtn+k) are the same for all t1, ..., tn, all k, and all n.
- Remarks:
- Stationarity means that the statistical properties of the process are unaffected by a shift in time.
- In particular, Xt and Xt+k have the same distribution.
- It is a stringent requirement, and difficult to test.
- The assumption of stationarity sweats the data: it allows maximum use of the available data.
13. Five Quick Questions
- Is white noise stationary?
- Is a random walk stationary?
- Is the Salary(t) model a stationary model?
- Is the stochastic process of life stationary?
- Try to think of a stationary process which is not
iid.
14. Weak Stationarity
- Definition: A stochastic process is said to be weakly stationary if
- E[Xt] = E[Xk] for all t and k, and
- Cov(Xt, Xt+m) is a function of m only, for all t and m.
- Remarks:
- Strong stationarity implies weak stationarity.
- The concept is used extensively in time series analysis.
- Remark: Weak stationarity is not a foundational concept; it says relatively little about the underlying distribution and dependence structure. It is more practical, though.
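A minimal numerical illustration (not from the slides): simulate a weakly stationary process and check that the sample mean and lag-1 autocovariance look the same wherever in the series you estimate them:

    import numpy as np

    rng = np.random.default_rng(7)
    n = 10_000
    Z = rng.standard_normal(n)

    # X_n = Z_n + 0.5 * Z_{n-1}: weakly (indeed strictly) stationary, but not iid.
    X = Z[1:] + 0.5 * Z[:-1]

    # Compare the first and second halves of the series: means and lag-1 autocovariances
    # should agree (up to sampling error) if the process is weakly stationary.
    for half in (X[: n // 2], X[n // 2 :]):
        mean_est = half.mean()
        autocov1 = np.cov(half[:-1], half[1:])[0, 1]
        print(round(mean_est, 3), round(autocov1, 3))   # expect ~0 and ~0.5 in both halves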
15. The Markov Property
- When the future evolution of the system depends only on its current state, and is not affected by its past, the system is said to possess the Markov property.
- Definition: Let <Xt>, t ∈ ℕ (the natural numbers), be a (discrete-time) stochastic process. Then <Xt> is said to have the Markov property if, for all t,
- P[Xt+1 | Xt, Xt−1, Xt−2, ..., X0] = P[Xt+1 | Xt].
- Definition: Let <Xt>, t ∈ ℝ (the real numbers), be a (continuous-time) stochastic process. Then <Xt> is said to have the Markov property if, for all t and all sets A,
- P[Xt ∈ A | Xs1 = x1, Xs2 = x2, ..., Xs = x] = P[Xt ∈ A | Xs = x],
- where s1 < s2 < ... < s < t.
16. Markov Processes
- Definition: A stochastic process that has the Markov property is known as a Markov process.
- If the state space and time are both discrete, the process is known as a Markov chain (see Chapter 3).
- When the state space is discrete but time is continuous, the process is known as a Markov jump process (see Chapter 4).
17. To Prove
- Lemma 1.1: A process with independent increments has the Markov property.
- Proof: on board.
- Lemma 1.2: Our definition of the Markov property (discrete time) is equivalent to
- P[Xt+1 | Xs, Xs−1, Xs−2, ..., X0] = P[Xt+1 | Xs], where s ≤ t.
- Proof: on board.
18. More Examples of Stochastic Processes
- Example 4: An MA(p) process
- Let Z1, Z2, Z3, ... be white noise and let βi be a real number for each i. Then <Xn> is a moving average process of order p iff (iff = if and only if)
- Xn = β0 Zn + β1 Zn−1 + ... + βp Zn−p.
- Remarks:
- Note the process is stationary but not independent and identically distributed (iid).
- Moving average processes are stationary but not, in general, Markovian.
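A minimal simulation sketch of an MA(p) process (the coefficient values are arbitrary illustrations):

    import numpy as np

    rng = np.random.default_rng(3)
    n, p = 5_000, 2
    beta = np.array([1.0, 0.6, 0.3])     # beta_0, beta_1, beta_2 (illustrative)

    Z = rng.standard_normal(n + p)       # white noise
    # X_n = beta_0*Z_n + beta_1*Z_{n-1} + ... + beta_p*Z_{n-p};
    # np.convolve with mode="valid" computes exactly this weighted moving sum.
    X = np.convolve(Z, beta, mode="valid")

    # Sample autocovariances at lags 1, 2, 3: nonzero up to lag p, roughly 0 beyond,
    # and depending only on the lag, consistent with (weak) stationarity.
    autocov = [np.cov(X[m:], X[:-m])[0, 1] for m in (1, 2, 3)]
    # Population values for these coefficients: 0.78, 0.30, 0.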
19. More Examples of Stochastic Processes
- Definition: A Poisson process with rate λ is a continuous-time process Nt, t ≥ 0, such that
- N0 = 0;
- <Nt> has independent increments;
- <Nt> has Poisson-distributed increments, i.e.,
- P[Nt+s − Ns = n] = e^(−λt) (λt)^n / n!, where n ∈ ℕ (n = 0, 1, 2, ...).
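A minimal simulation sketch (using the standard fact that the inter-arrival times of a Poisson process with rate λ are iid Exponential(λ); parameter values are illustrative):

    import numpy as np

    rng = np.random.default_rng(11)
    lam, horizon = 2.0, 10.0             # rate lambda and time horizon (illustrative)

    # Inter-arrival times are iid Exponential(lam); arrival times are their cumulative sums.
    inter_arrivals = rng.exponential(scale=1.0 / lam, size=1_000)
    arrival_times = np.cumsum(inter_arrivals)
    arrival_times = arrival_times[arrival_times <= horizon]

    # N_t = number of arrivals up to and including time t, evaluated on a grid of times.
    grid = np.linspace(0.0, horizon, 101)
    N = np.searchsorted(arrival_times, grid, side="right")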
20. Remarks on the Poisson Process
- The Poisson process is a Markov jump process, i.e., it has the Markov property with a discrete state space in continuous time.
- It is not even weakly stationary (e.g., E[Nt] = λt increases with t).
- Think of it as the stochastic generalisation of the deterministic natural numbers: stochastic counting.
- It is a central process in insurance and finance due to its role as the natural stochastic counting process, e.g., for the number of claims.
21. Compound Poisson Process
- Definition: Let <Nt> be a Poisson process and let Z1, Z2, Z3, ... be white noise. Then <Xt> is said to be a compound Poisson process where
- Xt = Z1 + Z2 + ... + ZNt (the sum of the first Nt terms),
- with the convention that Xt = 0 when Nt = 0.
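A minimal simulation sketch of Xt at a fixed time t (the claim-size distribution below is an illustrative assumption, not from the slides):

    import numpy as np

    rng = np.random.default_rng(5)
    lam, t = 3.0, 2.0                      # claim rate and time horizon (illustrative)
    n_sims = 10_000

    # N_t ~ Poisson(lam * t); X_t is the sum of N_t iid claim sizes (0 if N_t = 0).
    N_t = rng.poisson(lam * t, size=n_sims)
    X_t = np.array([rng.exponential(scale=1_000.0, size=n).sum() for n in N_t])

    # Sanity check: E[X_t] = lam * t * E[Z] for a compound Poisson process.
    print(X_t.mean(), lam * t * 1_000.0)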
22. Remarks on the Compound Poisson Process
- We are stochastically counting incidences of an event, each with a stochastic payoff.
- The Markov property holds.
- It is important as a model for cumulative claims on an insurance company:
- the Cramér-Lundberg model, after Lundberg's Uppsala thesis of 1903, is the basis of classical risk theory.
- A key problem in classical risk theory is estimating the probability of ruin,
- i.e., ψ(u) = P[u + ct − Xt < 0, for some t > 0], where u is the initial surplus and c is the rate of premium income.
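A rough Monte Carlo sketch of ψ(u) over a finite horizon (a finite-horizon approximation only; the surplus only falls at claim times, so it suffices to check the surplus just after each claim; all parameter values are illustrative):

    import numpy as np

    rng = np.random.default_rng(13)
    lam, u, c, horizon = 1.0, 10.0, 1.2, 100.0   # claim rate, initial surplus, premium rate, horizon
    n_sims = 5_000
    ruined = 0

    for _ in range(n_sims):
        # Given N claims on [0, horizon], the claim times are iid Uniform(0, horizon), sorted.
        n_claims = rng.poisson(lam * horizon)
        times = np.sort(rng.uniform(0.0, horizon, size=n_claims))
        sizes = rng.exponential(scale=1.0, size=n_claims)      # iid claim sizes, mean 1
        surplus_after = u + c * times - np.cumsum(sizes)       # surplus just after each claim
        if n_claims > 0 and surplus_after.min() < 0:
            ruined += 1

    print("Estimated finite-horizon ruin probability:", ruined / n_sims)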
23. Brownian Motion (or Wiener Process)
- Definition: Brownian motion <Bt>, t ≥ 0, is a stochastic process with state space ℝ (the real line) such that
- B0 = 0;
- <Bt> has independent increments;
- Bt − Bs is distributed N(μ(t − s), σ²(t − s)) for s < t.
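A minimal simulation sketch on a discrete time grid, using the defining property that increments are independent and Normal with mean μ·Δt and variance σ²·Δt (parameter values are illustrative):

    import numpy as np

    rng = np.random.default_rng(17)
    mu, sigma = 0.1, 1.0                   # drift and volatility (illustrative)
    T, n_steps = 1.0, 1_000
    dt = T / n_steps

    # Independent Gaussian increments B_{t+dt} - B_t ~ N(mu*dt, sigma^2*dt).
    increments = rng.normal(loc=mu * dt, scale=sigma * np.sqrt(dt), size=n_steps)
    B = np.concatenate(([0.0], np.cumsum(increments)))   # sample path with B_0 = 0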
24. Remarks on Brownian Motion
- Gaussian = Normal distribution.
- μ is known as the drift.
- Standard Brownian motion is Brownian motion with B0 = 0, μ = 0, and σ² = 1.
- Sample paths are continuous and have no jumps.
- This is the continuous-time analogue of a random walk: not obvious, but true.
- By the Central Limit Theorem, <Bt> is the limiting continuous stochastic process for a wide class of discrete-time processes.
- Simpler definition: Brownian motion is a continuous process with independent Gaussian increments. (That this characterisation is equivalent is a deep result.)
25. Question 1
- Let <Xt> be a simple random walk with probability of an upward move given by p. Calculate:
- P(X2 = 2, X5 = 3 | X0 = 0);
- P(X2 = 0, X4 = 2 | X0 = 0).
- Is the random walk stationary?
- What is the joint distribution of X2 and X4, given X0 = 0?
- Prove that <Xt> has the Markov property.
26. Question 2
- Let <Xt>, t ∈ ℤ, be a white noise process with E[Xt] = 0 and E[Xt²] < ∞.
- Prove that the correlation between Xt−1 and (Xt − Xt−1) is −2^(−1/2), i.e., −1/√2.
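A sketch of the calculation behind the stated answer (writing σ² = E[Xt²] and using independence of the Xt):

\[
\operatorname{Cov}(X_{t-1}, X_t - X_{t-1}) = \operatorname{E}[X_{t-1}X_t] - \operatorname{E}[X_{t-1}^2] = -\sigma^2,
\qquad
\operatorname{Var}(X_t - X_{t-1}) = 2\sigma^2,
\]
\[
\text{so}\quad
\operatorname{Corr}(X_{t-1}, X_t - X_{t-1}) = \frac{-\sigma^2}{\sqrt{\sigma^2}\,\sqrt{2\sigma^2}} = -\frac{1}{\sqrt{2}}.
\]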
27. Filtrations
- Let <Xt> be a stochastic process.
- Associated with each stochastic process we have a sample space Ω; each point (or outcome) in Ω corresponds to a sample path (i.e., a single realisation of the process).
- Also associated with each stochastic process we have a set of events F: a collection of subsets of Ω (forming a sigma-algebra), each of which has an associated probability.
- In our case, for each time t, we define Ft (Ft ⊆ F), the collection of events known by time t,
- i.e., A ∈ Ft iff A is determined by (X1, ..., Xt). So, as X1, ..., Xt take known values at time t, whether A occurs is also known by time t.
- The family of (nested) sets (Ft), t ≥ 0, is known as the natural filtration associated with the stochastic process <Xt>.
- It describes the information gained from observing the process up to time t.
28. The Markov Property Defined Again
- Definition: The stochastic process <Xt> is said to have the Markov property iff
- P[Xt = x | Fs] = P[Xt = x | Xs]
- for all t ≥ s,
- where (Ft), t ≥ 0, is the natural filtration associated with <Xt>.
29. Models Based on Markov Chains
- Model 1: The No Claims Discount (NCD) system, where the motor insurance premium depends on the driver's claims record. It is a simple example of a Markov chain.
- Instance: Three states: 0% discount, 25% discount, and 50% discount. A claim-free year results in a transition to the next higher discount level (or remaining at the highest). A year with a claim moves the policyholder to the next lower discount level (or remaining at 0%).
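Writing q for the probability of a claim-free year (an assumed parameter, not given on the slide), a sketch of the one-step transition matrix of this chain:

    import numpy as np

    q = 0.9   # assumed probability of a claim-free year (illustrative)

    # States in order: 0% discount, 25% discount, 50% discount.
    # Claim-free year: up one level (or stay at 50%); year with a claim: down one level (or stay at 0%).
    P = np.array([
        [1 - q,     q,   0.0],   # from 0%
        [1 - q,   0.0,     q],   # from 25%
        [  0.0, 1 - q,     q],   # from 50%
    ])
    assert np.allclose(P.sum(axis=1), 1.0)   # each row is a probability distribution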
30. Model 2
- Consider the 4-state NCD model given by:
- State 0: 0% discount
- State 1: 25% discount
- State 2: 40% discount
- State 3: 60% discount
- Here the transition rules are: move up one discount level (or stay at the maximum) if there was no claim in the previous year; move down one level if there was a claim in the previous year but not the year before; move down two levels if there were claims in the two immediately preceding years.
31. Model 2
- This is not a Markov chain:
- P[Xn+1 = 0 | Xn = 2, Xn−1 = 1] ≠ P[Xn+1 = 0 | Xn = 2, Xn−1 = 3].
- But:
- We can easily construct a Markov chain from Model 2. Consider the 5-state model with states 0, 1, 3 as before, but define:
- State 2+: 40% discount and no claim in the previous year.
- State 2−: 40% discount and a claim in the previous year.
- This is now a 5-state Markov chain.
- Check!
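Under one consistent reading of the rules above (again writing q for the probability of a claim-free year, an assumed parameter), the 5-state chain's transition matrix can be sketched as:

    import numpy as np

    q = 0.9   # assumed probability of a claim-free year (illustrative)

    # State order: 0 (0%), 1 (25%), 2- (40%, claim last year), 2+ (40%, no claim last year), 3 (60%).
    P = np.array([
        [1 - q,     q,   0.0,   0.0,   0.0],   # from 0:  claim -> stay at 0, else up to 1
        [1 - q,   0.0,   0.0,     q,   0.0],   # from 1:  claim -> 0, else up to 2+
        [1 - q,   0.0,   0.0,   0.0,     q],   # from 2-: a further claim drops two levels to 0
        [  0.0, 1 - q,   0.0,   0.0,     q],   # from 2+: a claim drops one level to 1
        [  0.0,   0.0, 1 - q,   0.0,     q],   # from 3:  a claim drops to 40% with a claim on record (2-)
    ])
    assert np.allclose(P.sum(axis=1), 1.0)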
32. Model 3: Another Example of Making a Process into a Markov Chain
- Let us suppose that whether it rains today depends on the weather over the last two days. Specifically:
- If it rained on each of the last two days, then the probability that it will rain today is 0.5.
- If it rained yesterday but not the day before, it will rain today with probability 0.4.
- If it did not rain yesterday but rained the day before that, the probability of rain today is 0.3.
- If it did not rain in the last two days, then the probability of rain today is 0.2.
- The process above can be transformed into a 4-state Markov chain model, with each state recording the weather on the two most recent days.
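A sketch of the resulting 4-state chain, with each state recording (yesterday's weather, the day before's weather), R = rain, D = dry:

    import numpy as np

    states = ["RR", "RD", "DR", "DD"]   # (yesterday, day before)

    # Rain today occurs with the probability given for the current state;
    # the new state is (today's weather, yesterday's weather).
    P = np.array([
        [0.5, 0.0, 0.5, 0.0],   # from RR: rain 0.5 -> RR, dry 0.5 -> DR
        [0.4, 0.0, 0.6, 0.0],   # from RD: rain 0.4 -> RR, dry 0.6 -> DR
        [0.0, 0.3, 0.0, 0.7],   # from DR: rain 0.3 -> RD, dry 0.7 -> DD
        [0.0, 0.2, 0.0, 0.8],   # from DD: rain 0.2 -> RD, dry 0.8 -> DD
    ])
    assert np.allclose(P.sum(axis=1), 1.0)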
33. Models - Stochastic Models
- This completes Chapter 2.
- Shane Whelan
- L527