Title: Applied Signal Processing Emphasizing Nonlinear Dynamics
1 Applied Signal Processing Emphasizing Nonlinear Dynamics
- Presentation slides available at
- http://www.viskom.oeaw.ac.at/joydeep/course.html
2 Course Contents
- Prologue
- Modeling Basics
- Nonlinear Dynamical System Theory
- State Space Reconstruction
- Quantification of Dynamics
- Nonlinearity
- Fractal Scaling Analysis
- Modeling of Oscillatory Process: Periodic Decomposition
- Singular Spectrum Analysis
- Time-Frequency Analysis
- Hierarchical Model Based Analysis
- Neural Networks
- Polynomial based Modeling
- Multivariate Signal Processing
- Epilogue
3 Application of Applied Signal Processing: One Example, the Human Brain
4 Nonlinearity: Surrogates
[Figure: histogram, angle-variation plot, and power spectrum of the original signal (S = 0.01) and of a surrogate (S = 0.39)]
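A minimal sketch of how a phase-randomized surrogate of this kind is commonly generated (NumPy assumed; the nonlinearity statistic S quoted in the figure is not defined on the slide and is left out):

```python
import numpy as np

def phase_randomized_surrogate(x, rng=None):
    """Surrogate with the same power spectrum (hence the same linear
    properties) as x, but with Fourier phases randomized, which destroys
    any nonlinear structure present in the original signal."""
    rng = np.random.default_rng() if rng is None else rng
    n = len(x)
    spectrum = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, size=spectrum.shape)
    phases[0] = 0.0                      # keep the DC bin real
    if n % 2 == 0:
        phases[-1] = 0.0                 # keep the Nyquist bin real
    return np.fft.irfft(np.abs(spectrum) * np.exp(1j * phases), n=n)
```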
6 Scaling Analysis
[Figure: log F(n) versus log n with fitted scaling exponents α_red = 0.53, α_blue = 1.02, α_green = 1.46]
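The exponents above are slopes of log F(n) versus log n. Assuming the plot comes from detrended fluctuation analysis (DFA), the usual source of an F(n)-versus-n scaling curve, a minimal NumPy sketch is:

```python
import numpy as np

def dfa_fluctuation(x, n):
    """Root-mean-square fluctuation F(n) of the integrated, detrended series."""
    y = np.cumsum(x - np.mean(x))            # integrate the mean-removed signal
    n_boxes = len(y) // n
    y = y[:n_boxes * n].reshape(n_boxes, n)
    t = np.arange(n)
    f2 = 0.0
    for box in y:                             # detrend each box with a linear fit
        coef = np.polyfit(t, box, 1)
        f2 += np.mean((box - np.polyval(coef, t)) ** 2)
    return np.sqrt(f2 / n_boxes)

def dfa_exponent(x, box_sizes):
    """Slope alpha of log F(n) versus log n."""
    f = [dfa_fluctuation(x, n) for n in box_sizes]
    alpha, _ = np.polyfit(np.log(box_sizes), np.log(f), 1)
    return alpha

# White noise should give alpha close to 0.5:
rng = np.random.default_rng(0)
print(dfa_exponent(rng.standard_normal(10000), box_sizes=[16, 32, 64, 128, 256]))
```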
7Scaling Analysis
ablue1.02
ared0.53
log F(n)
agreen1.46
log n
8 Synchronization
9 Hénon-X: x(k+1) = 1.4 - x^2(k) + 0.3 x(k-1)
Hénon-Y: y(k+1) = 1.4 - (C x(k) + (1 - C) y(k)) y(k) + 0.3 y(k-1)
C: coupling coefficient (coupling from X to Y)
[Figure: Hénon-Y plotted against Hénon-X, without coupling and when coupled]
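A minimal sketch (NumPy assumed) of iterating this unidirectionally coupled pair; the coupling value and series length below are illustrative choices, not values from the slide:

```python
import numpy as np

def coupled_henon(n, C, x_init=(0.1, 0.1), y_init=(0.2, 0.2)):
    """Henon-X drives Henon-Y with coupling coefficient C (C = 0: no coupling)."""
    x = np.zeros(n)
    y = np.zeros(n)
    x[0], x[1] = x_init
    y[0], y[1] = y_init
    for k in range(1, n - 1):
        x[k + 1] = 1.4 - x[k] ** 2 + 0.3 * x[k - 1]
        y[k + 1] = 1.4 - (C * x[k] + (1.0 - C) * y[k]) * y[k] + 0.3 * y[k - 1]
    return x, y

x, y = coupled_henon(5000, C=0.4)   # e.g. a moderately coupled driver/response pair
```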
10 Hénon-X vs. Hénon-Y: no coupling
[Figure: scatter of Hénon-Y against Hénon-X; SI(1,2) = 0.001, SI(2,1) = 0.001]
11 Hénon-X vs. Hénon-Y: very weakly coupled
[Figure: scatter of Hénon-Y against Hénon-X; SI(1,2) = 0.044, SI(2,1) = 0.023]
12 Temporal Synchrony?
Importance of Time-Frequency Analysis
13 What is a Signal?
- Any system or variable, when measured over time, produces a signal
- Signal ≡ Time Series ≡ Data
- Only the output is available; the dynamics and inputs are unknown
- Assumption: there exists strong internal coupling between the variables, so the scalar-valued signal contains enough information about the underlying dynamics
14 Signal Processing
- Manipulation of a signal for the following purposes:
- To remove unwanted signal components or noise
- To extract information by rendering it in an obvious and useful form
- To predict future values
- To detect abnormalities
- To control the dynamics of the source
15 Deterministic vs. Random
- Deterministic: an explicit mathematical description is available, e.g., the height of a ball thrown vertically, the motion of a satellite in orbit, the temperature of a fluid under external heat, etc.
- Random: only a probabilistic description, no explicit description, e.g., turbulent flow, the electrical output of a noise generator, brain signals (EEG), etc.
16 (No Transcript)
17 Sinusoidal Periodic Signal
- Instantaneous value at time t:
- x(t) = A sin(2π fo t + θ), where A = amplitude, fo = frequency, θ = initial phase angle with respect to the time origin
- Tp = period
[Figure: time trace x(t) showing the period Tp, and an amplitude-frequency line spectrum with a single component at fo]
Ex: voltage output of an electrical alternator, vibratory motion of an unbalanced weight, ...
18 Complex Periodic Signal
- x(t) = x(t + nTp), n = 1, 2, 3, ...
- It can be Fourier expanded (standard form shown below)
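The expansion referred to above is the standard Fourier series for a signal of period Tp, with fundamental frequency f1 = 1/Tp:

x(t) = \frac{a_0}{2} + \sum_{n=1}^{\infty} \left[ a_n \cos(2\pi n f_1 t) + b_n \sin(2\pi n f_1 t) \right], \qquad f_1 = \frac{1}{T_p}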
19 Complex Periodic Signal
- Alternatively (amplitude-phase form, shown below),
- Complex periodic signal = DC component (X0) + an infinite number of sinusoidal components, called harmonics, with amplitudes Xn and phases θn.
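The alternative form described above can be written, in standard notation, as

x(t) = X_0 + \sum_{n=1}^{\infty} X_n \cos\!\left(2\pi n f_1 t - \theta_n\right), \qquad X_0 = \frac{a_0}{2}, \quad X_n = \sqrt{a_n^2 + b_n^2}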
20 Almost Periodic Signal
- Sum of two or more periodic signals with incommensurate periods, i.e. the ratio of two periods is not a rational number
- The time profile looks close to periodic, but there is no Tp for which x(t) = x(t + nTp)
- In general it is a sum of sinusoids (see below), where fn/fm is not a rational number for all m ≠ n
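Written out, the general form referred to above is (standard notation, with the incommensurability condition on the fn as stated in the last bullet):

x(t) = \sum_{n=1}^{\infty} X_n \sin\!\left(2\pi f_n t + \theta_n\right)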
21 Transient Aperiodic Signal
- Transient in nature, and not periodic
[Figure: three example transients of amplitude A]
x(t) = A for 0 <= t <= w, 0 otherwise
x(t) = A e^(-at) cos(bt) for t >= 0, 0 for t < 0
x(t) = A e^(-at) for t >= 0, 0 for t < 0
Ex: heat dissipation, damped vibration, stress in a cable which breaks at time w
Discrete spectral representations are not possible, but a continuous spectral representation is possible
22 (No Transcript)
23 Random Signals
- No explicit mathematical relationship
- Each observation is unique
- Also, any given observation represents only one of many possible outputs
- Sample Function (or Sample Record): a single time history representing a random phenomenon
- Random (or Stochastic) Process: the collection of all possible sample functions that the random phenomenon might produce
- Thus, a sample record is one physical realization of a random process
24 Different Sample Records
[Figure: sample records x1(t), x2(t), ..., xN(t) plotted against time t, with the instants t1 and t1 + τ marked]
25 Stationary Random Signal
- Mean (ensemble average; see the definitions after this list)
- Correlation (ensemble average; see the definitions after this list)
- If μx(t1) and Rxx(t1, t1 + τ) vary as the time t1 varies, the random process x(t) is non-stationary
- For a weakly stationary process, μx(t1) is constant and Rxx(t1, t1 + τ) depends only on the time displacement. Thus μx(t1) = μx and Rxx(t1, t1 + τ) = Rxx(τ)
- When all possible moments and joint moments are time invariant, the random process x(t) is strictly stationary
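The ensemble averages referred to in the first two bullets have the standard forms (averaging over the N sample records at the fixed time t1):

\mu_x(t_1) = \lim_{N \to \infty} \frac{1}{N} \sum_{k=1}^{N} x_k(t_1), \qquad
R_{xx}(t_1, t_1 + \tau) = \lim_{N \to \infty} \frac{1}{N} \sum_{k=1}^{N} x_k(t_1)\, x_k(t_1 + \tau)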
26 Ergodic Random Signal
- Sample mean (time average over one record; see the definitions after this list)
- Sample correlation (time average over one record; see the definitions after this list)
- If a random process x(t) is stationary, and μx(k) and Rxx(τ, k) do not differ when computed over different sample records, it is called ergodic, i.e. μx(k) = μx and Rxx(τ, k) = Rxx(τ)
- In short, for an ergodic process, time average = ensemble (space) average
- Ergodicity is practically important because all properties of ergodic random processes can be obtained from a single sample record
- Not all stationary processes are ergodic
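The time averages over the k-th sample record referred to in the first two bullets have the standard forms:

\mu_x(k) = \lim_{T \to \infty} \frac{1}{T} \int_0^T x_k(t)\, dt, \qquad
R_{xx}(\tau, k) = \lim_{T \to \infty} \frac{1}{T} \int_0^T x_k(t)\, x_k(t + \tau)\, dt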
27 Wold Decomposition
- A stationary process x(t) can be decomposed as
- x(t) = xd(t) + xr(t)
- where xd(t) → purely deterministic part
- xr(t) → purely random part
28 Markov Process
- The future depends on the knowledge of the present but not on the knowledge of the past
- A random process x(t) is a Markov process if P[x(t+dt) | x(t), x(t-dt), ..., x(t0)] = P[x(t+dt) | x(t)]
- The order of a Markov process is determined by the duration of the past needed to describe the future
- Some properties:
- A subset of a Markov sequence is also a Markov sequence
- If a Markov sequence is time reversed, it is still Markov: P[x(t+dt) | x(t+2dt), x(t+3dt), ..., x(T)] = P[x(t+dt) | x(t+2dt)]
29 Gaussian Distribution
- A random variable x(k) is called Gaussian, or normally distributed, if its probability density function is given by the form shown below
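The standard form of this density is

p(x) = \frac{1}{\sigma_x \sqrt{2\pi}} \exp\!\left( -\frac{(x - \mu_x)^2}{2\sigma_x^2} \right), \qquad \mu_x = E[x(k)], \quad \sigma_x^2 = E\!\left[(x(k) - \mu_x)^2\right]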
30 N-dimensional Gaussian
- Consider N random variables x1(k), x2(k), ..., xN(k).
- Their joint distribution is N-dimensional Gaussian if the associated N-fold probability density function has the Gaussian form (a standard matrix version is shown below)
- C is the covariance matrix with elements Cij, |C| is its determinant, and C^ij is the cofactor of Cij in the determinant |C|
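A standard matrix form of this N-fold density (equivalent to the cofactor expression described on the next slide), with x = (x1, ..., xN) and mean vector μ, is

p(x_1, \dots, x_N) = \frac{1}{(2\pi)^{N/2}\, |C|^{1/2}} \exp\!\left( -\frac{1}{2} (\mathbf{x} - \boldsymbol{\mu})^{\mathsf{T}} C^{-1} (\mathbf{x} - \boldsymbol{\mu}) \right)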
31 Cofactor C^ij of any element Cij is defined to be the determinant of order N-1 formed by omitting the i-th row and j-th column of |C|, multiplied by (-1)^(i+j).
Outstanding feature of the N-dimensional Gaussian: all of its properties are determined solely by the means and covariances.
32 Random Gaussian Process (RGP)
- A random process xk(t) is said to be an RGP if, for every set of fixed times {tn}, the random variables xk(tn) follow a multidimensional Gaussian (normal) distribution.
- A linearly transformed Gaussian process is also a Gaussian process
33 White Noise Process
- A random sequence x(kn), x(kn-1), ..., x(k1) is said to be a purely white noise sequence if x(ki) and x(kj) are completely independent for i ≠ j.
- Then the conditional density is the same as the marginal density:
- P[x(kn) | x(kn-1)] = P[x(kn)]
- Properties: (i) white noise is memoryless
- (ii) the present is independent of the past, and the future is independent of the present
- (iii) E[x(ki) x(kj)] = R δij, where δ is the Kronecker delta function
- (iv) usually, a white noise process has zero mean, and its power spectrum is constant
34 Wiener Process
- It is a stochastic process with stationary increments, x(s) - x(r), which are normally distributed with zero mean and variance proportional to the time difference (s - r)
- E[x(s) - x(r)] = 0
- E[(x(s) - x(r))^2] = σ^2 (s - r), where σ^2 is a positive constant
- Thus the probability density of the displacement from time r to time s is the same as from time (r + t) to (s + t), because the density depends on the length of the time interval and not on the specific time reference
- Ex: integrated white Gaussian noise (see the sketch below)
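A minimal sketch (NumPy assumed) of the integrated-white-Gaussian-noise example: a discrete Wiener path built as the cumulative sum of independent zero-mean Gaussian increments (the step size dt and σ are illustrative choices):

```python
import numpy as np

def wiener_path(n_steps, dt=0.01, sigma=1.0, rng=None):
    """Discrete Wiener process: x(0) = 0 and increments ~ N(0, sigma^2 * dt),
    so E[x(s) - x(r)] = 0 and E[(x(s) - x(r))^2] = sigma^2 * (s - r)."""
    rng = np.random.default_rng() if rng is None else rng
    increments = rng.normal(0.0, sigma * np.sqrt(dt), size=n_steps)
    return np.concatenate(([0.0], np.cumsum(increments)))

w = wiener_path(10000)
```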
35 Poisson Process
- An integer-valued stochastic process consisting of discrete events occurring at random time intervals.
- Thus x(t1) - x(t2) represents the number of events that occurred in the time interval (t2, t1).
- The probability of m events in a time interval of length t is given by the Poisson law (shown below), where λt is the mean.
- The probability that no event (m = 0) happened in the interval (0,1) = e^(-λ)
- The probability that at least one event happened in the interval (0,1) = 1 - e^(-λ)
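The Poisson law referred to above is

P[\,m \text{ events in an interval of length } t\,] = \frac{(\lambda t)^m}{m!}\, e^{-\lambda t}, \qquad m = 0, 1, 2, \dots

With t = 1 and m = 0 this gives e^(-λ), consistent with the last two bullets.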
36 Process Models
- Primary requirements of a model
- Representativeness
- Parsimony
- Long-term validity
37 Some Choices of Models
- Transfer-function models
- Models based on trigonometric functions
- State-Space models
- Models based on Orthogonal Transform
- Hierarchical models
38 Factors Influencing Choices
- Linearity
- No process is strictly linear, but linear modeling is highly favored due to: (i) simplicity
- (ii) robustness
- (iii) ease of implementation
- (iv) analytical tractability
- (v) nonlinearity can be broken into piecewise linearity
- Periodicity / Regularity
- (i) strictly periodic
- (ii) almost periodic
- (iii) quasi-periodic
- Stationarity
- Most models assume stationarity. Suitable transformations of the data are needed to achieve stationarity.