Introduction to Models - Stochastic Models - PowerPoint PPT Presentation

Transcript and Presenter's Notes

Title: Introduction to Models - Stochastic Models


1
Introduction to Models - Stochastic Models
  • Dr Shane Whelan, FFA
  • L527

2
Review of Chapter 1
  • Real Modelling, Not Mathematics
  • What a model is and what its objective is
  • Classifying Models (deterministic, stochastic)
  • Components of a Model (structural part, parameters)
  • Computers and Modelling (the revolution)
  • Lessons from history of modelling
  • Orders of Complexity in Modelling

3
From now on
  • To model stochastically, we need to appreciate
    the different forms of stochastic models
  • The key aim of this short course is to overview
    the stochastic processes that are widely used by
    actuaries in modelling
  • Introduce key concepts
  • Give some straightforward examples

4
Chapter 2
  • Foundational Concepts in Stochastic Processes

5
Definition of Stochastic Process
  • Definition: A stochastic process is a sequence or
    continuum of random variables indexed by an
    ordered set T.
  • Notes
  • Generally, of course, T records time.
  • A stochastic process is often denoted Xt, t ∈ T
    or, as I prefer, ⟨Xt⟩, t ∈ T.
  • Recall
  • State space; discrete-time process; continuous-time
    process.

6
Examples of Stochastic Processes
  • Discrete White Noise
  • A sequence of independent, identically distributed
    random variables, Z0, Z1, Z2, …
  • Important sub-classifications include zero-mean
    white noise, where E[Zi] = 0, and
  • symmetric white noise, where the (common)
    distribution is symmetric.
  • General random walk
  • Let Z1, Z2, Z3, … be white noise and define
    Xn = X0 + Z1 + Z2 + … + Zn,
  • with, say, X0 a general random variable.
  • Then ⟨Xn⟩ is a random walk.
  • When the Zi can only take the values ±1, the
    process is known as a simple random walk.
  • Generally, we set X0 = 0. (A simulation sketch
    follows below.)
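A minimal simulation sketch of both examples, assuming NumPy is available (the Gaussian choice of white noise and the variable names are illustrative, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1000

# Zero-mean white noise: i.i.d. N(0, 1) variables Z_1, ..., Z_n.
Z = rng.standard_normal(n)

# General random walk: X_0 = 0 and X_k = X_0 + Z_1 + ... + Z_k.
X = np.concatenate(([0.0], np.cumsum(Z)))

# Simple random walk: i.i.d. steps taking the values +1 or -1 (here p = 0.5).
steps = rng.choice([-1, 1], size=n)
S = np.concatenate(([0], np.cumsum(steps)))

print(X[:5])
print(S[:5])
```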

7
Defining a Given Stochastic Process
  • Defining (or wholly understanding) the stochastic
    process ⟨Xt⟩, t ∈ T, amounts to defining the joint
    distribution of (X_{t1}, X_{t2}, …, X_{tn}) for all
    t1, …, tn and all n.
  • This is not easy to do and is very cumbersome.
  • So we generally use indirect means, e.g., by
    defining the transition process.
  • Reconsider how we defined white noise and a
    random walk.

8
Picture of a Joint Distribution: Just Two Variates
  • Density Function of Bivariate Normal

9
Defining a Given Stochastic Process
  • A sample path of the process is a joint
    realisation of the random variables Xt, for all
    t ∈ T.
  • A sample path is a function from T to the state
    space.
  • Each sample path has an associated probability.

10
Some Sample Paths in Stochastic Salary Model
(from before)
11
Segment of Sample Path from Symmetric Random Walk
12
Increments
  • Consider X_{t+m} - Xt. This is known as an
    m-increment of the process.
  • X_{t+1} - Xt is simply known as an increment.
  • Often, defining how the process evolves through
    time is easier to get a handle on, and a more
    natural description of the process (e.g.,
    evolution, many games, etc.).
  • A process is said to have independent increments
    if X_{t+m} - Xt is independent of the past of the
    process, for all t and m.
  • A process is said to have stationary increments
    if increments over intervals of the same length
    have the same distribution (see the example below).
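For example, for the random walk of slide 6 the m-increment is just a sum of future white-noise terms,

```latex
X_{t+m} - X_t \;=\; Z_{t+1} + Z_{t+2} + \cdots + Z_{t+m},
```

which involves only white-noise terms after time t, so the random walk has independent increments; and since the Zi are identically distributed, its increments are stationary as well.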

13
Concept Stationarity
  • Definition: A stochastic process is said to be
    stationary if the joint distributions of
    (X_{t1}, X_{t2}, …, X_{tn}) and
    (X_{t1+k}, X_{t2+k}, …, X_{tn+k}) are the same for
    all t1, …, tn, all k, and all n.
  • Hence its statistical properties are unaffected by
    a time shift.
  • In particular, Xt and X_{t+k} have the same
    distribution,
  • and in particular the same mean and variance.
  • Stationarity is a stringent requirement,
    difficult to test in practice.
  • Note that the assumption of stationarity
    "sweats" the data: it allows maximum use of the
    available data.

14
Concept Weak Stationarity
  • Definition: A stochastic process is said to be
    weakly stationary if
  • E[Xt] = E[Xk] for all t and k, and
  • Cov[Xt, X_{t+m}] is a function of m only, for all
    t and m.
  • Remarks
  • Strong stationarity implies weak stationarity.
  • Weak stationarity is used extensively in time
    series analysis.
  • Remark: Weak stationarity is not a foundational
    concept; it says relatively little about the
    underlying distribution and dependence structure.
    It is more practical, though. (An empirical check
    is sketched below.)
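A rough empirical check of weak stationarity for zero-mean white noise (a sketch assuming NumPy; not from the slides): the sample mean should be close to a constant and the sample autocovariance should depend only on the lag m.

```python
import numpy as np

rng = np.random.default_rng(0)
Z = rng.standard_normal(10_000)   # zero-mean white noise

def sample_autocov(x, m):
    """Sample covariance between x_t and x_{t+m}."""
    x = x - x.mean()
    if m == 0:
        return float(np.mean(x * x))
    return float(np.mean(x[:-m] * x[m:]))

print(round(Z.mean(), 3))                                  # close to 0
print([round(sample_autocov(Z, m), 3) for m in range(4)])  # roughly [1.0, 0, 0, 0]
```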

15
Concept The Markov Property
  • When the future evolution of the system depends
    only on its current state (i.e., it is not
    affected by the past), the system has the Markov
    property.
  • Definition: Let ⟨Xt⟩, t ∈ ℕ (the natural numbers),
    be a (discrete-time) stochastic process. Then ⟨Xt⟩
    is said to have the Markov property if, ∀t,
  • P[X_{t+1} | Xt, X_{t-1}, X_{t-2}, …, X0] =
    P[X_{t+1} | Xt].
  • Definition: Let ⟨Xt⟩, t ∈ ℝ (the real numbers), be
    a (continuous-time) stochastic process. Then ⟨Xt⟩
    is said to have the Markov property if, ∀t and all
    sets A,
  • P[Xt ∈ A | X_{s1} = x1, X_{s2} = x2, …, Xs = x] =
    P[Xt ∈ A | Xs = x],
  • where s1 < s2 < … < s < t.

16
Markov Processes
  • Definition: A stochastic process that has the
    Markov property is known as a Markov process.
  • If the state space and time are both discrete, the
    process is known as a Markov chain.
  • When the state space is discrete but time is
    continuous, it is known as a Markov jump process.

17
Concept Martingales (in Discrete Time)
  • A discrete-time stochastic process ⟨Xt⟩, t ≥ 0, is
    said to be a martingale if
  • E[|Xt|] < ∞ for all t, and
  • E[Xn | X0, …, X_{m-1}, Xm] = Xm for all m < n.
  • Explanation: the current value Xm is the optimal
    estimator of all future values; all information
    known by time m about the future of the process is
    already factored into Xm (see the worked example
    below).
  • A generalisation of the notion of a fair game.
  • Useful concept in probability theory as many
    important limit theorems can be proved for
    martingales.
  • The building block of much of capital market
    theory.
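A standard worked example, consistent with the definitions above: a random walk Xn = X0 + Z1 + … + Zn built from zero-mean white noise with E[|Zi|] < ∞ is a martingale, since for m < n

```latex
\mathbb{E}[X_n \mid X_0,\dots,X_m]
  = \mathbb{E}\Big[X_m + \sum_{i=m+1}^{n} Z_i \,\Big|\, X_0,\dots,X_m\Big]
  = X_m + \sum_{i=m+1}^{n} \mathbb{E}[Z_i]
  = X_m .
```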

18
Simple Property of Martingales
  • Lemma: If ⟨Xt⟩ is a martingale, then E[Xt] = E[X0].
  • Proof: Use the property of iterated expectations
    (a sketch follows below).
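A sketch of that argument (the "on board" proof is not in the transcript): taking m = 0 and n = t in the martingale property gives E[Xt | X0] = X0, so by iterated expectations

```latex
\mathbb{E}[X_t] = \mathbb{E}\big[\,\mathbb{E}[X_t \mid X_0]\,\big] = \mathbb{E}[X_0].
```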

19
Table: Classifying White Noise and Random Walk
  • Key: Y = yes, always; ✗ = no, never; otherwise
    sometimes, depends.

20
To Prove
  • Lemma: A process with independent increments has
    the Markov property.
  • Proof: On board (a sketch follows below).
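A sketch of the discrete-time argument (the board proof is not in the transcript): write X_{t+1} = Xt + (X_{t+1} - Xt) and use the fact that the increment is independent of X0, …, Xt:

```latex
P[X_{t+1} \le x \mid X_0,\dots,X_t]
  = P[X_{t+1} - X_t \le x - X_t \mid X_0,\dots,X_t]
  = P[X_{t+1} - X_t \le x - X_t \mid X_t]
  = P[X_{t+1} \le x \mid X_t],
```

so the conditional distribution of the future depends on the past only through the current value Xt.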

21
Poisson Process
  • Definition: A Poisson process with rate λ is a
    continuous-time process ⟨Nt⟩, t ≥ 0, such that
  • N0 = 0,
  • ⟨Nt⟩ has independent increments, and
  • ⟨Nt⟩ has Poisson distributed increments, i.e.,
    P[N_{t+s} - Ns = n] = e^{-λt} (λt)^n / n!,
  • where n ∈ ℕ. (A simulation sketch follows below.)
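A simulation sketch assuming NumPy (the rate and horizon values are illustrative): the inter-arrival times of a Poisson process are i.i.d. Exponential(λ), and Nt counts the arrivals up to time t.

```python
import numpy as np

rng = np.random.default_rng(1)
rate, horizon = 2.0, 10.0                  # lambda and the time horizon

# Inter-arrival times are i.i.d. Exponential(lambda); arrival times are their cumsum.
gaps = rng.exponential(scale=1.0 / rate, size=1_000)
arrivals = np.cumsum(gaps)
arrivals = arrivals[arrivals <= horizon]

def N(t):
    """Number of arrivals in (0, t]."""
    return int(np.searchsorted(arrivals, t, side="right"))

print(N(5.0), N(10.0))   # E[N_t] = lambda * t, so roughly 10 and 20
```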

22
Remarks on Poisson Process
  • Poisson Process is a Markov jump process, i.e.,
    Markovian with a discrete state space in
    continuous time.
  • It is not weakly stationary.
  • Think of the Poisson process as the stochastic
    generalisation of the deterministic natural
    numbers: stochastic counting.
  • It is a central process in insurance and finance
    modelling due to its role as a natural stochastic
    counting process, e.g., for the number of claims.
  • See the 1903 Uppsala thesis of Filip Lundberg.

23
Compound Poisson Process
  • Definition: Let ⟨Nt⟩ be a Poisson process and let
    Z1, Z2, Z3, … be white noise. Then ⟨Xt⟩ is said to
    be a compound Poisson process when
    Xt = Σ_{i=1}^{Nt} Zi,
  • with the convention that Xt = 0 when Nt = 0.
    (A simulation sketch follows below.)
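A simulation sketch of a single value Xt, assuming NumPy (standard normal jump sizes are just one illustrative choice of white noise):

```python
import numpy as np

rng = np.random.default_rng(2)
rate, t = 2.0, 10.0

n_t = rng.poisson(rate * t)            # N_t ~ Poisson(lambda * t)
Z = rng.standard_normal(n_t)           # Z_1, ..., Z_{N_t}: i.i.d. jump sizes

X_t = Z.sum() if n_t > 0 else 0.0      # convention: X_t = 0 when N_t = 0
print(n_t, X_t)
```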

24
Remarks on Compound Poisson Process
  • We are stochastically counting incidences of an
    event with a stochastic payoff.
  • The Markov property holds.
  • Important as a model for cumulative claims on an
    insurance company: the Cramér-Lundberg model,
    building on Lundberg's Uppsala thesis, is the
    basis of classical risk theory.
  • A key problem in classical risk theory is
    estimating the probability of ruin,
  • i.e., ψ(u) = P[u + ct - Xt < 0 for some t > 0],
    where u is the initial surplus and c the premium
    rate. (A Monte Carlo sketch follows below.)
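A crude Monte Carlo sketch for a finite-horizon version of ψ(u), assuming NumPy; the Exponential(1) claim sizes, the premium rate and all other parameter values are illustrative simplifications of the classical infinite-horizon problem. Since the surplus u + ct - Xt can only fall at claim times, ruin is checked just after each claim.

```python
import numpy as np

rng = np.random.default_rng(3)
u, c, lam, horizon, n_sims = 5.0, 1.2, 1.0, 100.0, 10_000

ruined = 0
for _ in range(n_sims):
    t, claims = 0.0, 0.0
    while True:
        t += rng.exponential(1.0 / lam)     # time of the next claim
        if t > horizon:
            break
        claims += rng.exponential(1.0)      # claim size (illustrative Exp(1))
        if u + c * t - claims < 0:          # surplus just after this claim
            ruined += 1
            break

print(ruined / n_sims)   # estimate of P(ruin before the horizon)
```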

25
More Special Processes: MA(p)
  • Let Z1, Z2, Z3, … be white noise and let
    β1, …, βp be real numbers. Then ⟨Xn⟩ is a moving
    average process of order p iff
    Xn = Zn + β1 Z_{n-1} + β2 Z_{n-2} + … + βp Z_{n-p}.
  • Notes
  • An MA(p) process is stationary but not iid.
  • Moving average processes are stationary but not,
    in general, Markovian. (A simulation sketch
    follows below.)
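A simulation sketch assuming NumPy (the order p = 2 and the coefficient values β1 = 0.6, β2 = 0.3 are illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
beta = np.array([0.6, 0.3])        # beta_1, beta_2 for an illustrative MA(2)
p, n = len(beta), 1_000

Z = rng.standard_normal(n + p)     # white noise, with p warm-up values
# X_n = Z_n + beta_1 Z_{n-1} + ... + beta_p Z_{n-p}
X = np.array([Z[i] + beta @ Z[i - p:i][::-1] for i in range(p, n + p)])

print(X.mean(), X.var())           # mean ~ 0, variance ~ 1 + beta_1^2 + beta_2^2
```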

26
Brownian Motion (or Wiener Process)
  • Definition: Brownian motion ⟨Bt⟩, t ≥ 0, is a
    stochastic process with state space ℝ (the real
    line) such that
  • B0 = 0,
  • ⟨Bt⟩ has independent, stationary increments,
  • and either
  • Bt - Bs is distributed N(μ(t-s), σ²(t-s)),
  • or
  • ⟨Bt⟩ has continuous sample paths.
    (A simulation sketch follows below.)
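A discretised simulation sketch assuming NumPy (the drift, variance and step size are illustrative): over a step of length dt the increment is distributed N(μ·dt, σ²·dt).

```python
import numpy as np

rng = np.random.default_rng(5)
mu, sigma2, dt, n_steps = 0.1, 1.0, 0.01, 10_000

# Independent, stationary Gaussian increments: B_{t+dt} - B_t ~ N(mu*dt, sigma2*dt).
increments = rng.normal(loc=mu * dt, scale=np.sqrt(sigma2 * dt), size=n_steps)
B = np.concatenate(([0.0], np.cumsum(increments)))    # B_0 = 0

T = n_steps * dt
print(B[-1], mu * T, sigma2 * T)   # terminal value vs. its mean and variance
```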

27
Remarks on Brownian Motion
  • Standard Brownian motion is the case B0 = 0, μ = 0,
    and σ² = 1.
  • Simpler definition: Brownian motion is a continuous
    process with independent Gaussian increments.
  • Gaussian = Normal.
  • μ is known as the drift.
  • Sample paths have no jumps (a deep result).
  • This is the continuous-time analogue of a random
    walk.
  • By the Central Limit Theorem, ⟨Bt⟩ is the limiting
    continuous stochastic process for a wide class of
    discrete-time processes (an important result).

28
Quick Questions
  • Is the stochastic process of life stationary?
  • Is White Noise stationary?
  • Is a Random Walk stationary?
  • Try to think of a stationary process which is not
    iid.

29
Question
  • Let ⟨Xt⟩ be a simple random walk with the
    probability of an upward move given by p.
  • Calculate P(X2 = 2 | X0 = 0).
  • Calculate P(X2 = 0, X4 = 2 | X0 = 0). (A sketch of
    these two calculations follows below.)
  • Is the random walk stationary?
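One way to set up the first two calculations (a sketch; the answers are not given in the transcript): X2 = 2 requires two up-moves, while X2 = 0, X4 = 2 requires one up-move and one down-move in the first two steps (in either order) followed by two up-moves, so

```latex
P(X_2 = 2 \mid X_0 = 0) = p^2,
\qquad
P(X_2 = 0,\ X_4 = 2 \mid X_0 = 0) = 2p(1-p)\,p^2 .
```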

30
Review of Chapter 2
  • Basic terminology
  • Stochastic process; sample path; m-increment;
    independent increments.
  • Foundational concepts
  • Stationary process; weak stationarity; Markov
    property; martingale.
  • Some elementary examples
  • White noise; random walk; moving average (MA).
  • Some less-elementary examples
  • Poisson process; compound Poisson process;
    Brownian motion (or Wiener process).